<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.0 20120330//EN" "JATS-journalpublishing1.dtd">

<article article-type="research-article" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:mml="http://www.w3.org/1998/Math/MathML">
 <front>
    <journal-meta>
	<journal-id journal-id-type="publisher-id">Jemr</journal-id>
      <journal-title-group>
        <journal-title>Journal of Eye Movement Research</journal-title>
      </journal-title-group>
      <issn pub-type="epub">1995-8692</issn>
	  <publisher>								
	  <publisher-name>Bern Open Publishing</publisher-name>
	  <publisher-loc>Bern, Switzerland</publisher-loc>
	</publisher>
    </journal-meta>
    <article-meta>
	<article-id pub-id-type="doi">10.16910/jemr.11.6.1</article-id> 
	  <article-categories>								
				<subj-group subj-group-type="heading">
					<subject>Research Article</subject>
				</subj-group>
		</article-categories>
      <title-group>
        <article-title>Eye-Hand Coordination Patterns of Intermediate and Novice Surgeons in a Simulation-Based Endoscopic Surgery Training Environment</article-title>
      </title-group>
	   <contrib-group> 
				<contrib contrib-type="author">
					<name>
						<surname>Topalli</surname>
						<given-names>Damla</given-names>
					</name>
					<xref ref-type="aff" rid="aff1">1</xref>
				</contrib>
				<contrib contrib-type="author">
					<name>
						<surname>Cagiltay</surname>
						<given-names>Nergiz Ercil</given-names>
					</name>
					<xref ref-type="aff" rid="aff1">1</xref>
				</contrib>				
        <aff id="aff1">
		<institution>Atilim University</institution>,   <country>Turkey</country>
        </aff>
		</contrib-group>   

		
	  <pub-date date-type="pub" publication-format="electronic"> 
		<day>8</day>  
		<month>11</month>
        <year>2018</year>
      </pub-date>
	  <pub-date date-type="collection" publication-format="electronic"> 
	  <year>2018</year>
	</pub-date>
      <volume>11</volume>
      <issue>6</issue>
	 <elocation-id>10.16910/jemr.11.6.1</elocation-id> 
	<permissions> 
	<copyright-year>2018</copyright-year>
	<copyright-holder>Topalli, D., &#x26; Cagiltay, N.E.</copyright-holder>
	<license license-type="open-access">
  <license-p>This work is licensed under a Creative Commons Attribution 4.0 International License, 
  (<ext-link ext-link-type="uri" xlink:href="https://creativecommons.org/licenses/by/4.0/">
    https://creativecommons.org/licenses/by/4.0/</ext-link>), which permits unrestricted use and redistribution provided that the original author and source are credited.</license-p>
</license>
	</permissions>
      <abstract>
        <p>Endoscopic surgery procedures require specific skills, such as eye-hand coordination, to be developed. Current education programs face problems in providing appropriate skill improvement and assessment methods in this field. This study aims to propose objective metrics for hand-movement skills and to assess eye-hand coordination. An experimental study was conducted with 15 surgical residents to test the newly proposed measures. Two computer-based both-handed endoscopic surgery practice scenarios were developed in a simulation environment to gather the participants’ eye-gaze data with the help of an eye tracker as well as the related hand-movement data through haptic interfaces. Additionally, the participants’ eye-hand coordination skills were analyzed. The results indicate higher correlations in the intermediates’ eye-hand movements compared to the novices. An increase in the intermediates’ visual concentration leads to smoother hand movements, whereas the novices’ hand movements tend to remain at a standstill. After the first round of practice, all participants’ eye-hand coordination skills improved on the specific task targeted in this study. According to these results, it can be concluded that the proposed metrics can potentially provide additional insights into trainees’ eye-hand coordination skills and help instructional system designers to better address training requirements.</p>
      </abstract>
      <kwd-group>
        <kwd>Eye movement</kwd>
        <kwd>eye tracking</kwd>
        <kwd>saccades</kwd>
        <kwd>gaze</kwd>
        <kwd>hand-movement</kwd>
        <kwd>haptic device</kwd>
        <kwd>eye-hand coordination</kwd>
        <kwd>surgical skill assessment</kwd>                		
      </kwd-group>
    </article-meta>
  </front>	
  <body>

    <sec id="S1">
      <title>Introduction</title>

<p>Minimally invasive surgery (MIS) is preferred by surgeons and patients
as it has several benefits for the patients, such as a lower infection risk
and a shorter operation time (<xref ref-type="bibr" rid="b1">1</xref>).
However, MIS education for surgeons faces many challenges, such as the
difficulty of learning: learning curves of more than 30 procedures have
been reported (<xref ref-type="bibr" rid="b2 b3">2, 3</xref>).
In this respect, depending on the type of operation, the number
of such procedures can increase up to 100
(<xref ref-type="bibr" rid="b4">4</xref>).</p>

<p>Surgeons involved in these types of operations need to develop
specific psychomotor skills, such as eye-hand coordination and depth
perception, to mention a few
(<xref ref-type="bibr" rid="b5 b6">5, 6</xref>). As the operation site can only be observed through a
monitor in MIS, mislocation (e.g., when the display shown through the
endoscope and the hands controlling the operational tool
are at different locations) can make it impossible for the surgeon to
follow his/her hands as well as the operative scene simultaneously
(<xref ref-type="bibr" rid="b7 b8">7, 8</xref>). This mislocation problem can be even more critical in complex surgery
procedures, such that surgeons might require assistance to control
the endoscope.</p>

<p>Hence, surgical education programs today must impart critical
knowledge and skills. In traditional training programs,
these skills are gained in the operating theatre through practice on
patients. According to earlier studies, critical errors
(<xref ref-type="bibr" rid="b9">9</xref>) can occur when these skills are
not developed properly or are urgently put into practice
(<xref ref-type="bibr" rid="b10">10</xref>). In current education
programs, novice surgeons improve their skills and practical
abilities on real patients under the supervision of experts
(<xref ref-type="bibr" rid="b11">11</xref>), which is an expensive and
very risky method. Apart from these factors, the assessment of surgical
skills is carried out by training specialists through structured
schemes, such as OSATS (Objective Structured Assessment of Technical
Skill), a method for testing specific operative skills in
surgical trainees
(<xref ref-type="bibr" rid="b12 b13">12, 13</xref>). However, as reported by Moorthy et al. (<xref ref-type="bibr" rid="b14">14</xref>), there are certain
constraints in this method, such as the resources and time needed to find
supervising surgeons to observe and evaluate the performance of trainees
(<xref ref-type="bibr" rid="b14">14</xref>). Naturally, since these
operations directly affect human safety, ethical issues may arise. As a
result, in order to guarantee patient safety, trainees should be
educated by means other than practicing on actual patients.</p>

<p>In addition, the use of movement-based measures has been studied
extensively and suggested as an effective method for
monitoring surgery training (<xref ref-type="bibr" rid="b15">15</xref>).
It has been reported that motion analysis devices are useful tools for
assessing performance compared to merely relying on OSATS and time
(<xref ref-type="bibr" rid="b6">6</xref>). Tracking hand and instrument
movements using markers, known as ‘motion analysis’, has been suggested
by earlier studies as an alternative method to OSATS for assessing the
related skills by measuring the economy of movement
(<xref ref-type="bibr" rid="b16">16</xref>). Latko et al. (1997)
videotaped and documented hand activities, rated them from 0 to 10
through observation, and provided definitions for hand movements.
According to them, when no regular exertions are detected, the hand
activity is considered ‘idle’, and when there is infrequent motion,
it is considered ‘steady motion’. Based on the frequency of the
motion, they also propose ‘consistent conspicuous’ (long pauses or slow
motions), ‘slow steady motion’, and ‘rapid steady motion’
(<xref ref-type="bibr" rid="b17">17</xref>).</p>

<p>In addition, several hand-movement metrics have been proposed, such as
path length, motion smoothness, depth perception (the total distance
traveled by the instrument along the instrument’s axis), response
orientation and grasping (<xref ref-type="bibr" rid="b15">15</xref>). In
this vein, some studies have been carried out on motor behaviors in
surgical skills based mainly on path length, the amount of time to
complete a procedure, and idle time, a topic that needs to be considered
but has remained rather neglected
(<xref ref-type="bibr" rid="b18">18</xref>). Oropesa et al. (2011) define
‘idle time’ as the lack of movement in both hands, representing a delay in
motor planning or decision making
(<xref ref-type="bibr" rid="b19">19</xref>). One example of a motion
metric is the smoothness of hand function: Oropesa et al. (2013)
characterize non-smooth motion as abrupt changes in acceleration
resulting in jerky movements of the instrument
(<xref ref-type="bibr" rid="b19 b20">19, 20</xref>). Another proposed metric, ‘working space’, is suggested for the
economy of area and economy of volume efficiency in MIS
(<xref ref-type="bibr" rid="b20">20</xref>), and is determined by using
an electromagnetic sensor to track the participants’ hand movements and
the summation of distances from the sensor’s average spatial location
(<xref ref-type="bibr" rid="b21">21</xref>). However, a recent study
reports that further research is necessary to better understand
the role and usage of psychomotor metrics, such as smoothness, in
assessing performance during certain medical procedures
(<xref ref-type="bibr" rid="b21">21</xref>).</p>

<p>Precisely detecting the location of a given object, such
as a tool or hand, is important for each of these metrics
(<xref ref-type="bibr" rid="b20 b22">20, 22</xref>). In the literature, video-processing methods
(<xref ref-type="bibr" rid="b23 b24">23, 24</xref>) and motion-tracking systems (<xref ref-type="bibr" rid="b19">19</xref>)
have been proposed to detect the position of the tool precisely,
although these give rise to other practical concerns.</p>

<p>Despite the evidence showing a correlation between motor
skills and eye events (<xref ref-type="bibr" rid="b25 b26 b27 b28 b29">25, 26, 27, 28, 29</xref>),
very few studies have been conducted to improve the understanding
and objective measurement of surgical residents’ eye-hand coordination
skills, and there is hardly any standardized measure in this regard,
which limits the interpretation and generalization of related results
(<xref ref-type="bibr" rid="b30">30</xref>). Hence, although
some metrics provide insights into the relationship
between the eye and the hand as well as their coordination, there
is still a need to extend our knowledge of how hand movements are guided
and controlled by vision (<xref ref-type="bibr" rid="b31">31</xref>).
Objective methods for analyzing surgeons’ hand-movement patterns have
not yet led to fully satisfactory methods of assessment
(<xref ref-type="bibr" rid="b32">32</xref>), whereas such metrics are
necessary in order to provide proper feedback and continuous analysis of
psychomotor skills in MIS
(<xref ref-type="bibr" rid="b12 b33">12, 33</xref>).</p>

<p>Today, computer-based simulation environments offer many benefits:
they are cheaper, provide more time for practice, and can be easily
modified for different rare cases
(<xref ref-type="bibr" rid="b34">34</xref>). Apart from such
environments, several studies in the literature show that
eye-tracking technology provides beneficial insights for surgical
training purposes. In a review study, Tien et al. (2014) concluded that
eye tracking offers reliable quantitative data for objective assessment
purposes to improve performance in surgical training
(<xref ref-type="bibr" rid="b35">35</xref>). Several studies have been
conducted to better understand the eye-movement behaviors of surgical
residents for improved skill-assessment purposes. Research suggests that
recording these eye movements may be beneficial both for skill
assessment and training in the field of surgery
(<xref ref-type="bibr" rid="b5">5</xref>). Computer-based
simulation environments also provide several objective assessment
capabilities, which can be applied continuously during the training
period and without expert supervision
(<xref ref-type="bibr" rid="b36">36</xref>). However, the calculation
methods for many of these metrics are implicit and embedded in
commercial simulators, making them difficult to access or modify.</p>

<p>In light of these shortcomings and needs within the field of
surgery, the present study first attempts to adapt eye-movement
event-analysis approaches to hand motion and to propose new
objective metrics for hand-movement events. Additionally, through
haptic devices, the data pertaining to the hand locations of the surgical
residents are collected along with their eye movements while they perform
surgical tasks in a virtual reality environment. By analyzing both the
eye and hand behaviors of the surgical residents, this study aims to
understand their eye-hand coordination skills in a more elaborate
way.</p>
    </sec>
	
    <sec id="S2">
      <title>Methods</title>

<p>The aim of this study is to assess the relationship between eye-gaze
and hand-motion metrics to understand the eye-hand coordination behavior
differences of intermediate and novice surgeons in a simulation-based
endoscopic surgery environment.</p>

<p>Findings in the literature imply that there are several different
algorithms that use constant threshold values to classify eye events
into fixations and saccades. However, it has also been reported that an
ideal algorithm should identify the threshold values automatically,
without requiring any parameter setting from the user
(<xref ref-type="bibr" rid="b37">37</xref>). For instance, the BIT
(Binocular-Individual Threshold) algorithm is a fully automatic,
velocity-based algorithm for determining fixations (i.e., fixation duration
and fixation number) and saccades from the eye data, using task- and
individual-specific thresholds
(<xref ref-type="bibr" rid="b38">38</xref>). The algorithm is regarded as
‘machine and sampling frequency independent’
(<xref ref-type="bibr" rid="b38">38</xref>).</p>

<p>Due to its ability to automatically define task-specific
thresholds, this algorithm is well suited to skill-based studies.
Hence, in this study, the BIT algorithm is used to identify the fixation
duration, fixation number and saccades from the eye-gaze data. BIT is
also used to classify hand-movement events using the data collected
within the surgical simulation environment. Since the eye and hand can
move at different velocities, the threshold values are determined
automatically for each case by this algorithm. The MATLAB source code of
the algorithm is available on its authors’ website
(<xref ref-type="bibr" rid="b38">38</xref>).</p>
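<p>For illustration only, the velocity-threshold idea underlying such event classification can be sketched as follows. The mean-plus-k-standard-deviations threshold rule used here is an assumed simplification, not the published BIT procedure, which derives binocular, individual- and task-specific thresholds:</p>

```python
import numpy as np

def classify_events(x, y, rate_hz, k=2.0):
    """Split a gaze (or hand) trajectory into fixation and saccade samples.

    x, y    : sample coordinates
    rate_hz : sampling frequency (e.g., 60 for the eye tracker, 100 for the haptics)
    k       : multiplier for the data-driven threshold (assumed heuristic)
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    # sample-to-sample velocity (coordinate units per second)
    v = np.hypot(np.diff(x), np.diff(y)) * rate_hz
    # data-driven threshold: no user-supplied constant, as advocated for BIT
    threshold = v.mean() + k * v.std()
    is_saccade = v > threshold
    return is_saccade, threshold

# toy trajectory: two fixations joined by one fast jump
x = [100] * 30 + [400] * 30
y = [100] * 30 + [400] * 30
is_saccade, thr = classify_events(x, y, rate_hz=60)
```

<p>Because the threshold is re-estimated per trajectory, the same sketch can be applied to eye and hand streams even though they move at different velocities.</p>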

    <sec id="S2a">
      <title>Participants</title>

<p>A total of 15 surgical residents (ten surgeons and five interns) from
the Departments of Neurosurgery (six participants) and Otolaryngology
(ENT) (four participants) of Hacettepe Medical School in Ankara,
Turkey participated in this study. The participants formed two skill-level
groups, intermediate and novice, based on the categorization
presented in Silvennoinen et al.’s study
(<xref ref-type="bibr" rid="b39">39</xref>). Accordingly, those who
had performed at least one endoscopic surgery were considered
‘intermediate’, whereas those who had only observed and assisted in
endoscopic operations, but had not performed any surgeries by
themselves, were considered ‘novice’. As shown in Table 1, most of the
participants were male (86.66%).</p>

<table-wrap id="t01" position="float">
					<label>Table 1.</label>
					<caption>
						<p>Information about Participants</p>
					</caption>
					<table frame="hsides" rules="groups" cellpadding="3">

    <thead>
      <tr>
        <th><bold>Skill Level</bold></th>
        <th><bold>Age</bold></th>
        <th colspan="2"><bold>Department</bold></th>
        <th colspan="2"><bold>Gender</bold></th>
      </tr>

      <tr>
        <td></td>
        <td></td>
        <td><bold>NRS</bold></td>
        <td><bold>ENT</bold></td>
        <td><bold>F</bold></td>
        <td><bold>M</bold></td>
      </tr>
    </thead>
    <tbody>      
      <tr>
        <td>Intermediate</td>
        <td>28.4</td>
        <td>1</td>
        <td>5</td>
        <td>1</td>
        <td>4</td>
      </tr>
      <tr>
        <td>Novice</td>
        <td>25.6</td>
        <td>4</td>
        <td>5</td>
        <td>1</td>
        <td>9</td>
      </tr>
    </tbody>
  </table>
  <table-wrap-foot>
    <fn><p>NRS: Neurosurgery; ENT: Ear Nose Throat</p></fn>
  </table-wrap-foot>
</table-wrap>

<p>Detailed information about these participants according to their
endoscopic surgical expertise (average number of operations observed,
assisted and performed) is given in Table 2.</p>

<table-wrap id="t02" position="float">
					<label>Table 2.</label>
					<caption>
						<p>Participants’ Surgical Experience</p>
					</caption>
					<table frame="hsides" rules="groups" cellpadding="3">
    <thead>
      <tr>
        <th><bold>Skill Level</bold></th>
        <th colspan="3"><bold>Average Number of Endoscopic Surgeries</bold></th>
      </tr>

      <tr>
        <td></td>
        <td><bold>Observed</bold></td>
        <td><bold>Assisted</bold></td>
        <td><bold>Performed</bold></td>
      </tr>
    </thead>
    <tbody>      
      <tr>
        <td>Intermediate</td>
        <td>52.0</td>
        <td>39.6</td>
        <td>23.8</td>
      </tr>
      <tr>
        <td>Novice</td>
        <td>8.2</td>
        <td>1.0</td>
        <td>0.0</td>
      </tr>
    </tbody>
  </table>
</table-wrap>

    </sec>
	
    <sec id="S2b">
      <title>Apparatus</title>

<p>Haptic devices can be integrated into training simulations for MIS
procedures (<xref ref-type="bibr" rid="b40">40</xref>) in order to
provide practice similar to the real-world setting as well as a realistic
sense of touch. Accordingly, in this study, a mid-range professional
‘Geomagic Touch’ haptic device is used to perform the tasks in the
scenarios. The simulation software recorded one hundred data points per
second as the hand coordinates for both hands with the help of these
haptic devices. Additionally, the eye-movement data are gathered using a
60 Hz eye-tracking device, the Eye Tribe
(<xref ref-type="bibr" rid="b41">41</xref>), which is easy to set up,
transportable and reported as appropriate for use in scientific research
(<xref ref-type="bibr" rid="b42">42</xref>). This tool is used to track
the user’s eye movements and calculate the on-screen gaze coordinates
with an accuracy of 0.5°-0.7°. During the experiment, eye-gaze
coordinates were gathered in a top-left-oriented 2D coordinate system.
The screen resolution is 1920 × 1080 pixels; in other words, the
horizontal field of view (x-coordinate) is 1920 pixels, whereas the
vertical field of view (y-coordinate) is 1080 pixels. The field of view
(FOV) of the camera, from the left perspective for Scenario-2, can be
seen in Figure 1.</p>

<fig id="fig01" fig-type="figure" position="float">
					<label>Figure 1.</label>
					<caption>
						<p>Camera’s FOV in Simulation Environment</p>
					</caption>
					<graphic id="graph01" xlink:href="jemr-11-06-a-figure-01.png"/>
				</fig>


<p>In our software, hand-motion coordinates, regarded as the positions of
the tool and the camera in the scenarios, were represented as 3D vectors.
Accordingly, the origins for the eye (O<sub>eye</sub>) and hand
(O<sub>hand</sub>) coordinates are represented in a 2D scene
in Figure 2-A and B.</p>

<fig id="fig02" fig-type="figure" position="float">
					<label>Figure 2.</label>
					<caption>
						<p>Eye-Gaze and Hand Motion Coordinates in Scenario-2</p>
					</caption>
					<graphic id="graph02" xlink:href="jemr-11-06-a-figure-02.png"/>
				</fig>

<p>Due to the low sampling rate, the analysis of saccades may become
problematic when collecting data with the Eye Tribe tracker. It is
reported that measuring saccade metrics requires high-frequency sampling
(<xref ref-type="bibr" rid="b43">43</xref>). However, it has also been
stated that velocity-based algorithms for saccade detection that work
on high-frequency data perform relatively well on Eye Tribe data
(<xref ref-type="bibr" rid="b43 b44">43, 44</xref>), although they fail to be accurate when it comes to saccadic
positions (<xref ref-type="bibr" rid="b43">43</xref>). Moreover, there are
a number of studies in the literature using 50 Hz
(<xref ref-type="bibr" rid="b38">38</xref>) and 60 Hz sampling-rate
trackers to classify eye events into fixations and saccades
(<xref ref-type="bibr" rid="b44">44</xref>).</p>
    </sec>
	
    <sec id="S2c">
      <title>Design</title>

<p>In the experimental study, two both-handed scenarios were used for
the surgical training process. The first was prepared to
practice general skills, such as learning the use of surgical tools
with the endoscope and developing depth perception in a simulated 3D
environment (Scenario-1). The other scenario, closer to the operational
procedures, uses a simulated anatomical nose model (Scenario-2). Each
scenario consists of ten repetitive tasks. The layouts of the scenarios
are shown in Figure 3-A and B, respectively.</p>

<fig id="fig03" fig-type="figure" position="float">
					<label>Figure 3.</label>
					<caption>
						<p>Scenarios for Experimental Study</p>         
					</caption>
					<graphic id="graph03" xlink:href="jemr-11-06-a-figure-03.png"/>
				</fig>
<p>A: Scenario-1: Moving the Red Ball into the Box</p>
<p>B: Scenario-2: Clearing the Nose</p>


    </sec>
	
    <sec id="S2d">
      <title>‘Moving the Red Ball into the Box’ Scenario</title>

<p>In this scenario (Scenario-1), each participant is asked to approach
the red ball with the haptic device, catch it, and then move it into the
green box as shown in Figure 3-A. The tool is controlled with the
dominant hand of the participant, whereas the camera is controlled with
his/her non-dominant hand. The positions of the ball and the box change
arbitrarily in each trial. The participant must successfully complete
this process, which includes 10 tasks, within the allocated time period.
If a task is not completed within 10 seconds, the ball and the box
disappear.</p>
    </sec>
	
    <sec id="S2e">
      <title>‘Clearing the Nose’ Scenario</title>

<p>In this scenario (Scenario-2), the participant must remove the green
ball-like objects, which are spread through the nose model as shown in
Figure 3-B. The camera is used as the light source and the cautery model
as the tool to collect the objects. In case of a collision - that is, if
the haptic device touches the tissue - force feedback is provided that
feels as if the device is pushing back in the hands of the user.</p>

<p>This study is conducted as part of a research project, and the content
of both scenarios was prepared based on the opinions of neurosurgery and
ENT domain experts. The scenarios mainly target endoscopic pituitary
surgery procedures, which fall within the scope of both domains and are
performed starting from the nostrils through to the pituitary area.
Accordingly, the scenarios are designed for beginners in these
operations.</p>
    </sec>
	
    <sec id="S2f">
      <title>Procedure</title>

<p>As shown in Figure 4, in this study, the research procedure mainly
consisted of six stages. These are S1: Experimental study; S2:
Regenerated simulation version; S3: Observation; S4: BIT algorithm;
S5: Eye and hand metrics; and S6: Eye-hand coordination analyses.</p>

<fig id="fig04" fig-type="figure" position="float">
					<label>Figure 4.</label>
					<caption>
						<p>Procedure of the Experimental Study</p>
					</caption>
					<graphic id="graph04" xlink:href="jemr-11-06-a-figure-04.png"/>
				</fig>

<p>At the beginning of the experiment, the participants were asked to
fill out a questionnaire covering their demographic information,
dominant hand and experience level (i.e., years in the department, and
the number of operations observed, assisted, and performed). Prior to
the experimental study, each participant was seated 70 cm away from the
screen. First, each participant was informed about the calibration
process; more specifically, that they should maintain this distance and
that they were not allowed to move their heads or bodies once calibration
was completed. After this briefing, the calibration process started. Nine
calibration points appeared on the screen one after the other, with a
viewing time of two seconds each. At the end, a five-star rating was
displayed to indicate the accuracy of the calibration. If the result of
the calibration was four stars (&#x3C;0.7°) or five stars (&#x3C;0.5°), it
was regarded as acceptable for the experimental study. After that, a
brief instructional video explaining how to perform the task was shown
to each participant separately. Next, each
participant was asked to perform the two scenarios, Scenario 1 and 2,
using both their dominant and non-dominant hands at the same time. The
eye-gaze data (i.e., pupil size, fixation, raw and smoothed X and Y
coordinates of both the left and right eye) and hand-motion data (i.e.,
tool and camera position, and tool and camera rotation as 3D vectors)
were collected and stored using special software (Figure 4: S1). The
experimental setup can be seen in Figure 5.</p>

<fig id="fig05" fig-type="figure" position="float">
					<label>Figure 5.</label>
					<caption>
						<p>Experimental Setup</p>
					</caption>
					<graphic id="graph05" xlink:href="jemr-11-06-a-figure-05.png"/>
				</fig>
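<p>The calibration acceptance rule described above can be sketched as follows. Only the four- and five-star cut-offs are stated in the text; the behavior for lower accuracies in this sketch is an assumption for illustration:</p>

```python
def calibration_stars(accuracy_deg):
    """Map calibration accuracy (degrees of visual angle) to a star rating.

    Only the 4- and 5-star cut-offs are given in the text; the lower
    ratings here are assumed for illustration.
    """
    if accuracy_deg < 0.5:
        return 5
    if accuracy_deg < 0.7:
        return 4
    return 3  # assumed: any coarser calibration would be repeated

def acceptable(accuracy_deg):
    """The study accepts a calibration rated four stars or better."""
    return calibration_stars(accuracy_deg) >= 4
```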

<p>Afterwards, the performance of each participant in both scenarios was
regenerated in the simulation environment using software developed in the
Unity environment (Figure 4: S2). This regeneration software took each
participant’s eye data gathered by the tracker device and the hand
data from the haptic device while they were performing each task during
the experimental study. In this way, each participant’s eye and hand
coordinates were aligned on the same time scale, with the
eye location represented by an eye icon and the hand location by a small
blue sphere in the regenerated simulation replay (see Figure 6).</p>

<fig id="fig06" fig-type="figure" position="float">
					<label>Figure 6.</label>
					<caption>
						<p>‘Simulated’ Performance of a Participant (Scenario-2)</p>
					</caption>
					<graphic id="graph06" xlink:href="jemr-11-06-a-figure-06.png"/>
				</fig>

<p>This regenerated simulation allows researchers to determine the
extent of tissue contact, left-right hand coordination and the
eye-hand coordination of each individual. In the meantime,
observation data were gathered through a questionnaire to understand such
behavior differences between the novice and intermediate groups. Five
observers (other than the researchers) evaluated the participants’
performances in Scenario-2. As this scenario is based on an environment
similar to the operational procedures (Figure 4: S3), the observations
were conducted on this scenario. All of the observers were graduate
students in the field of engineering and were briefed about the
observation procedure before they started the evaluations.</p>

<p>Using this approach, the raw coordinates for both the eye and hand
movement data, aligned on the time scale, were obtained as the output of
the regenerated simulated version of the scenarios. These coordinates
were then given as input to the BIT algorithm (Figure 4: S4). Afterwards,
the eye and hand movements were classified as either fixation or saccade
events by running the algorithm separately on the eye data and the hand
data. The output of the BIT algorithm run on the eye data is the
fixation and saccade metrics, whereas the metrics related to hand
movement were obtained as the outputs from the hand data (Figure 4: S5).
Finally, all collected data are analyzed to better understand the
efficiency of the proposed metrics and to objectively measure the
eye-hand coordination skills of the participants
(Figure 4: S6).</p>
    </sec>
	
    <sec id="S2g">
      <title>Metrics</title>


<p>In this study, using the BIT algorithm, three metrics are identified,
namely Fixation Duration (FD) (the time from one saccade to another),
Fixation Number (FN) (the number of fixations in an interval) and
Saccade Number (SN) (the number of saccades in an interval).</p>

<p>As explained in the procedure, the eye movements and hand
movements of the participants are aligned on the same time scale. The
authors believe that the event-analysis approach used for eye movements
can also be applied to hand-movement events. Our main aim
in this study is to test this assumption by comparing the results
obtained in this study with those reported in earlier
studies on the eye-hand coordination skills of surgeons. Accordingly, in
the present work, new hand metrics are introduced to identify the hand
movements of the participants, as explained below:</p>

<p>The ‘Stand Still’ metric is proposed in this study as the period
during which the hand movement remains within a very small range and at
low velocity for some time. In other words, it determines the ‘idle
state’ of the hand movement. By running the BIT algorithm, such events
can be classified into ‘Stand Still Duration’ (SSD) and ‘Stand Still
Number’ (SSN) for hand movements. In addition, the ‘Sudden Sharp
Movement’ (SSM) metric is proposed to identify very fast, sharp hand
movements while performing a given task.</p>
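<p>As a minimal sketch, assuming fixed velocity cut-offs (the study instead derives its thresholds automatically via the BIT algorithm), the three hand metrics could be computed from the haptic position stream along these lines:</p>

```python
import numpy as np

def hand_metrics(pos, rate_hz=100, still_thresh=5.0, sharp_thresh=200.0):
    """Compute illustrative Stand Still and Sudden Sharp Movement metrics.

    pos          : (n, 3) array of 3D hand positions from the haptic device
    rate_hz      : sampling frequency of the haptic stream
    still_thresh : velocity below which the hand is 'standing still' (assumed)
    sharp_thresh : velocity above which a movement is 'sudden sharp' (assumed)
    """
    pos = np.asarray(pos, float)
    # sample-to-sample 3D velocity (coordinate units per second)
    v = np.linalg.norm(np.diff(pos, axis=0), axis=1) * rate_hz
    still = v < still_thresh
    # count runs of consecutive still samples -> Stand Still Number (SSN)
    edges = np.diff(still.astype(int))
    ssn = int((edges == 1).sum() + (1 if still[0] else 0))
    ssd = still.sum() / rate_hz * 1000.0   # total stand-still duration, ms
    ssm = int((v > sharp_thresh).sum())    # Sudden Sharp Movement count
    return {"SSD_ms": ssd, "SSN": ssn, "SSM": ssm}

# toy stream: hand at rest, one abrupt jump, then at rest again
pos = np.vstack([np.zeros((50, 3)), np.full((50, 3), 100.0)])
m = hand_metrics(pos)
```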
    </sec>
    </sec>

    <sec id="S3">
      <title>Results</title>

<p>A correlation analysis is performed to assess the relationship
between the eye-gaze and hand-motion metrics, considering three pairs:
the fixation duration of the eye-gaze (FD) and the stand still duration
of the hand motion (SSD); the fixation number of the eye-gaze (FN) and
the stand still number of the hand motion (SSN); and the saccade number
of the eye-gaze (SN) and the sudden sharp movements of the hand motion
(SSM). For each pair, the correlation coefficient r is calculated. A
value of | r | from 0.1 to 0.3 represents a small correlation, a value
from 0.3 to 0.5 a moderate correlation, and a value larger than 0.5 a
strong correlation, as reported by Cohen
(<xref ref-type="bibr" rid="b45 b46">45, 46</xref>
).</p>
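<p>As a concrete illustration of these benchmarks, the following minimal Python sketch computes the Pearson coefficient r and maps | r | onto Cohen's bands. The helper names are hypothetical, and labeling values below 0.1 as "negligible" is our own assumption, since the text only defines the bands from 0.1 upward.</p>

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient between two samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def cohen_label(r):
    """Map |r| onto Cohen's (1988) strength bands cited in the text."""
    a = abs(r)
    if a > 0.5:
        return "strong"
    if a > 0.3:
        return "moderate"
    if a >= 0.1:
        return "small"
    return "negligible"  # below the bands defined in the text (assumption)
```

<p>For instance, cohen_label(pearson_r(fd_values, ssd_values)) would reproduce the strength labels used in Tables 4 and 6.</p>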

    <sec id="S3a">
      <title>Eye-Hand Correlation Results for Scenario-1</title>


<p>In Scenario-1, the descriptive statistics for the two groups of
participants (intermediate and novice) for both the eye metrics (FD, FN,
and SN) and the hand metrics (SSD, SSM, and SSN) are depicted in Figure
7. Additionally, the means and standard deviations for all metrics in
Scenario-1 are shown in Table 3.</p>

<fig id="fig07" fig-type="figure" position="float">
					<label>Figure 7.</label>
					<caption>
						<p>Mean for Eye and Hand Metrics for each group in Scenario-1</p>
					</caption>
					<graphic id="graph07" xlink:href="jemr-11-06-a-figure-07.png"/>
				</fig>
<p>A: Scenario-1: Mean FD (ms) of eye and Mean SSD (ms) of hand</p>
<p>B: Scenario-1: Mean FN of eye and Mean SSN of hand</p>
<p>C: Scenario-1: Mean SN of eye and Mean SSM of hand </p> 

<table-wrap id="t03" position="float">
					<label>Table 3.</label>
					<caption>
						<p>Descriptive Results for the Metrics in Scenario-1</p>
					</caption>
					<table frame="hsides" rules="groups" cellpadding="3">

    <thead>
      <tr>
        <th></th>
        <th colspan="2"><bold>Intermediate</bold></th>
        <th colspan="2"><bold>Novice</bold></th>
        <th></th>
        <th></th>
      </tr>
    </thead>
    <tbody>
      <tr>
        <td><italic><bold>Eye Metrics</bold></italic></td>
        <td><bold>M</bold></td>
        <td><bold>SD</bold></td>
        <td><bold>M</bold></td>
        <td><bold>SD</bold></td>
      </tr>
      <tr>
        <td>FD</td>
        <td>121.53</td>
        <td>19.05</td>
        <td>112.70</td>
        <td>26.55</td>
      </tr>
      <tr>
        <td>FN</td>
        <td>12.14</td>
        <td>1.91</td>
        <td>11.26</td>
        <td>2.66</td>
      </tr>
      <tr>
        <td>SN</td>
        <td>20.80</td>
        <td>13.46</td>
        <td>28.60</td>
        <td>32.46</td>
      </tr>
      <tr>
        <td><italic><bold>Hand Metrics</bold></italic></td>
        <td><bold>M</bold></td>
        <td><bold>SD</bold></td>
        <td><bold>M</bold></td>
        <td><bold>SD</bold></td>
      </tr>
      <tr>
        <td>SSD</td>
        <td>92.99</td>
        <td>10.25</td>
        <td>101.98</td>
        <td>15.03</td>
      </tr>
      <tr>
        <td>SSN</td>
        <td>9.30</td>
        <td>1.01</td>
        <td>10.19</td>
        <td>1.51</td>
      </tr>
      <tr>
        <td>SSM</td>
        <td>55.80</td>
        <td>37.48</td>
        <td>55.10</td>
        <td>15.82</td>
      </tr>
    </tbody>
  </table>
</table-wrap>

    </sec>

    <sec id="S3b">
      <title>Eye-Hand Correlation for Intermediates</title>

<p>A Pearson's product-moment correlation was run to assess the
relationship between the eye-gaze and hand-motion metrics for the
fixation pairs (FD-SSD and FN-SSN) and the saccade pair (SN-SSM) for the
intermediates. For all three pairs, preliminary analyses showed the
relationship to be linear with both variables normally distributed, as
assessed by the Shapiro-Wilk test (p &#x3E; .05), and there were no
outliers. There was a strong negative correlation between the FD-SSD and
FN-SSN metrics among the intermediates, r = -.836 and r = -.837,
respectively. On the other hand, a strong positive correlation existed
between the saccade metrics SN and SSM in the same group, r = .755
(Table 4).</p>
    </sec>

    <sec id="S3c">
      <title>Eye-Hand Correlation for Novices</title>

<p>Again, a Pearson's product-moment correlation was run to assess the
relationship between the eye-gaze and hand-motion metrics for the
fixation pairs (FD-SSD and FN-SSN) and the saccade pair (SN-SSM) for the
novices. For the first two pairs, related to fixation, preliminary
analyses showed the relationship to be linear with both variables
normally distributed, as assessed by the Shapiro-Wilk test (p &#x3E; .05),
with no outliers. There was a moderate positive correlation for both the
FD-SSD and FN-SSN metrics among the novice participants, r = .448.
However, not all variables were normally distributed for the saccade
metrics SN and SSM, as assessed by the Shapiro-Wilk test (p &#x3C; .05).
Accordingly, a Spearman's rank-order correlation was run to assess the
relationship between the saccade number of the eye-gaze data and the
sudden sharp movements of the hand data, with the results showing a
strong positive correlation for the SN and SSM measures, rs = .590
(Table 4).</p>
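<p>When the Shapiro-Wilk test rejects normality, as above, the analysis falls back to Spearman's rank-order correlation. A minimal sketch, assuming no tied values, uses the classic rank-difference formula:</p>

```python
def spearman_rs(xs, ys):
    """Spearman's rank-order correlation via
    rs = 1 - 6 * sum(d^2) / (n * (n^2 - 1)).
    Assumes no tied values within xs or within ys; with ties one would
    instead compute Pearson's r on average ranks."""
    n = len(xs)
    # 1-based rank of each value within its own sample
    rx = {v: i + 1 for i, v in enumerate(sorted(xs))}
    ry = {v: i + 1 for i, v in enumerate(sorted(ys))}
    d2 = sum((rx[x] - ry[y]) ** 2 for x, y in zip(xs, ys))
    return 1.0 - 6.0 * d2 / (n * (n * n - 1))
```

<p>Because it depends only on ranks, the coefficient equals 1.0 for any strictly increasing relationship, linear or not, which is why it is robust to the non-normal SN and SSM distributions.</p>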

<table-wrap id="t04" position="float">
					<label>Table 4.</label>
					<caption>
						<p>Eye-Hand Correlation Results for Scenario-1</p>
					</caption>
					<table frame="hsides" rules="groups" cellpadding="3">

    <thead>
      <tr>
        <th><bold>Skill Level</bold></th>
        <th><bold>FD - SSD</bold></th>
        <th><bold>FN - SSN</bold></th>
        <th><bold>SN - SSM</bold></th>
      </tr>
    </thead>
    <tbody>
      <tr>
        <td>Intermediate</td>
        <td>-.836 Strong-</td>
        <td>-.837 Strong-</td>
        <td> .755 Strong+</td>
      </tr>
      <tr>
        <td>Novice</td>
        <td>.448 Moderate+</td>
        <td>.448 Moderate+</td>
        <td>.590 Strong+</td>
      </tr>
    </tbody>
  </table>
</table-wrap>

    </sec>

    <sec id="S3d">
      <title>Eye-Hand Correlation Results for Scenario-2</title>


<p>Figure 8 shows the descriptive statistics for the eye metrics (FD,
FN, and SN) and the hand metrics (SSD, SSM, and SSN) for the two groups
in Scenario-2. Additionally, the means and standard deviations for all
metrics in Scenario-2 are depicted in Table 5.</p>

<fig id="fig08" fig-type="figure" position="float">
					<label>Figure 8.</label>
					<caption>
						<p>Mean for Eye and Hand Metrics for each group in Scenario-2</p>
					</caption>
					<graphic id="graph08" xlink:href="jemr-11-06-a-figure-08.png"/>
				</fig>
<p>A: Scenario-2: Mean FD (ms) of eye and Mean SSD (ms) of hand</p>
<p>B: Scenario-2: Mean FN of eye and Mean SSN of hand</p>
<p>C: Scenario-2: Mean SN of eye and Mean SSM of hand</p>        

<table-wrap id="t05" position="float">
					<label>Table 5.</label>
					<caption>
						<p>Descriptive Results for the Metrics in Scenario-2</p>
					</caption>
					<table frame="hsides" rules="groups" cellpadding="3">

    <thead>
      <tr>
        <th></th>
        <th colspan="2"><bold>Intermediate</bold></th>
        <th colspan="2"><bold>Novice</bold></th>
      </tr>
    </thead>
    <tbody>
      <tr>
        <td><italic><bold>Eye Metrics</bold></italic></td>
        <td><bold>M</bold></td>
        <td><bold>SD</bold></td>
        <td><bold>M</bold></td>
        <td><bold>SD</bold></td>
      </tr>
      <tr>
        <td>FD</td>
        <td>151.25</td>
        <td>30.48</td>
        <td>133.42</td>
        <td>42.49</td>
      </tr>
      <tr>
        <td>FN</td>
        <td>15.11</td>
        <td>3.04</td>
        <td>13.34</td>
        <td>4.24</td>
      </tr>
      <tr>
        <td>SN</td>
        <td>41.40</td>
        <td>23.58</td>
        <td>91.90</td>
        <td>82.30</td>
      </tr>
      <tr>
        <td><italic><bold>Hand Metrics</bold></italic></td>
        <td><bold>M</bold></td>
        <td><bold>SD</bold></td>
        <td><bold>M</bold></td>
        <td><bold>SD</bold></td>
      </tr>
      <tr>
        <td>SSD</td>
        <td>121.37</td>
        <td>16.60</td>
        <td>118.28</td>
        <td>15.13</td>
      </tr>
      <tr>
        <td>SSN</td>
        <td>12.15</td>
        <td>1.65</td>
        <td>11.84</td>
        <td>1.52</td>
      </tr>
      <tr>
        <td>SSM</td>
        <td>395.00</td>
        <td>143.63</td>
        <td>486.00</td>
        <td>143.86</td>
      </tr>
    </tbody>
  </table>
</table-wrap>

    </sec>

    <sec id="S3e">
      <title>Eye-Hand Correlation for Intermediates</title>


<p>A Spearman's rank-order correlation was run to assess the
relationship between the FD-SSD and FN-SSN measures, since not all the
variables were normally distributed, as assessed by the Shapiro-Wilk
test (p &#x3C; .05). There was a strong negative correlation for the pairs
related to fixation, that is, the FD-SSD and FN-SSN metrics. In other
words, an increase in the eye fixation duration and fixation number was
strongly correlated with a decrease in the hand stand still duration and
stand still number among the intermediates, rs = -.900, p &#x3C; .05
(Table 6). However, both variables of the saccade metrics, SN and SSM,
were normally distributed, as assessed by the Shapiro-Wilk test (p
&#x3E; .05). Accordingly, a Pearson's correlation was run to assess the
relationship between the saccade number of the eye-gaze data and the
sudden sharp movements of the hand data, which showed a strong positive
correlation for the SN and SSM metrics, r = .846 (Table 6).</p>
    </sec>

    <sec id="S3f">
      <title>Eye-Hand Correlation for Novices</title>

<p>A Pearson's product-moment correlation was run to assess the
relationship between the eye-gaze and hand-motion metrics for the
fixation pairs (FD-SSD and FN-SSN) and the saccade pair (SN-SSM) for the
novices. For the first two pairs, related to fixation, preliminary
analyses showed the relationship to be linear with both variables
normally distributed, as assessed by the Shapiro-Wilk test (p &#x3E; .05),
and there were no outliers. There was a moderate negative correlation
for both the FD-SSD and FN-SSN metrics among the novices, r = -.443 and
-.441, respectively. However, not all variables were normally
distributed for the saccade metrics SN and SSM, as assessed by the same
test (p &#x3C; .05). Accordingly, a Spearman's rank-order correlation was
run to assess the relationship between the saccade number of the
eye-gaze data and the sudden sharp movements of the hand data. The
results show a small positive correlation for the SN and SSM metrics for
the novices, rs = .06 (Table 6).</p>

<table-wrap id="t06" position="float">
					<label>Table 6.</label>
					<caption>
						<p>Eye-Hand Correlation Results for Scenario-2</p>
					</caption>
					<table frame="hsides" rules="groups" cellpadding="3">

    <thead>
      <tr>
        <th><bold>Skill Level</bold></th>
        <th><bold>FD - SSD</bold></th>
        <th><bold>FN – SSN</bold></th>
        <th><bold>SN - SSM</bold></th>
      </tr>
    </thead>
    <tbody>
      <tr>
        <td>Intermediate</td>
        <td>-.900* Strong-</td>
        <td>-.900* Strong-</td>
        <td> .846  Strong+</td>
      </tr>
      <tr>
        <td>Novice</td>
        <td>-.443 Moderate-</td>
        <td>-.441 Moderate-</td>
        <td> .06 Small+</td>
      </tr>
    </tbody>
  </table>
					<table-wrap-foot>
						<fn id="FN1">
						<p>*Note that the correlation is significant at the 0.05 level</p>
						</fn>
					</table-wrap-foot>  
</table-wrap>

    </sec>

    <sec id="S3g">
      <title>Analyzing the Questionnaire Data</title>

<p>Five observers monitored the participants&#8217; performances in
Scenario-2 in order to assess their left-right hand coordination and
eye-hand coordination skills. Their evaluation is based on the items
given in Table 7, rated on a five-point Likert-type scale (1: Strongly
disagree, 5: Strongly agree). Eleven of the 15 participants (3
intermediates, 8 novices) were evaluated using the questionnaire. The
descriptive results appear in Table 7.</p>


<table-wrap id="t07" position="float">
					<label>Table 7.</label>
					<caption>
						<p>Descriptive Results for Questionnaire Analysis of the
Observers</p>
					</caption>
					<table frame="hsides" rules="groups" cellpadding="3">

    <thead>
      <tr>
        <th></th>
        <th colspan="2"><bold>Intermediate</bold></th>
        <th colspan="2"><bold>Novice</bold></th>
      </tr>

      <tr>
        <td><bold>Questionnaire Item</bold></td>
        <td><bold>M</bold></td>
        <td><bold>SD</bold></td>
        <td><bold>M</bold></td>
        <td><bold>SD</bold></td>
      </tr>
    </thead>
    <tbody>      
      <tr>
        <td>The participant shows developed depth perception skills in a
        3D environment.</td>
        <td>3.33</td>
        <td>.42</td>
        <td>2.12</td>
        <td>.34</td>
      </tr>
      <tr>
        <td>The participant shows developed skills in eye-hand
        coordination.</td>
        <td>3.53</td>
        <td>.30</td>
        <td>1.92</td>
        <td>.52</td>
      </tr>
    </tbody>
  </table>
</table-wrap>

<p>A Mann-Whitney U test was run to determine whether the scores for the
given expressions (Table 7) differed between the intermediate and novice
groups. Visual inspection showed the score distributions of the two
groups to be dissimilar. According to the results, the 3D depth
perception and eye-hand coordination skills of the intermediates (mean
rank = 10.00) were rated significantly higher than those of the novices
(mean rank = 4.50), U = 0, z = -2.461, p = .012.</p>
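<p>For reference, the U statistic and mean ranks reported above can be computed from the pooled ranks of the two groups. This is a minimal sketch with a hypothetical function name; it returns U for the first group only, and a full analysis (as here) would additionally compute the z approximation and p value.</p>

```python
def mann_whitney_u(group_a, group_b):
    """Mann-Whitney U from average ranks of the pooled sample.
    Returns (U for group_a, mean rank of group_a, mean rank of group_b)."""
    pooled = [(v, "a") for v in group_a] + [(v, "b") for v in group_b]
    pooled.sort(key=lambda t: t[0])
    # Assign 1-based ranks; tied values share their average rank.
    rank = [0.0] * len(pooled)
    i = 0
    while i < len(pooled):
        j = i
        while j + 1 < len(pooled) and pooled[j + 1][0] == pooled[i][0]:
            j += 1
        avg = (i + j) / 2 + 1          # mean of ranks i+1 .. j+1
        for k in range(i, j + 1):
            rank[k] = avg
        i = j + 1
    ra = sum(r for r, (v, g) in zip(rank, pooled) if g == "a")
    na, nb = len(group_a), len(group_b)
    u_a = ra - na * (na + 1) / 2       # U statistic for group_a
    rb = sum(rank) - ra
    return u_a, ra / na, rb / nb
```

<p>With 3 intermediate scores all above 8 novice scores, this yields mean ranks of 10.0 and 4.5 and U = 0 for the novice group, matching the values reported above.</p>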
    </sec>
    </sec>

    <sec id="S4">
      <title>Discussion</title>

<p>The present study involved two main goals: first, to propose new
objective metrics by adapting our knowledge in the field of eye-movement
events; and, second, to test the appropriateness of these metrics in the
endoscopic surgery field to objectively measure the eye-hand
coordination skill levels of the surgeons.</p>

<p>Previous studies have attempted to evaluate surgeons&#8217; eye-hand
coordination in the field of endoscopic surgery through related task
analyses (<xref ref-type="bibr" rid="b47 b48 b49">47, 48, 49</xref>),
through eye-movement analysis (<xref ref-type="bibr" rid="b31">31</xref>),
or through eye and arm movements
(<xref ref-type="bibr" rid="b50">50</xref>). However, there are no
current studies designed to understand eye-hand coordination skills by
analyzing both the hand movements and the eye movements of surgeons
working in coordination. Accordingly, the proposed measures provide
alternatives for understanding such eye-hand coordination skills by
analyzing the hand movements and eye movements of surgeons.</p>

<p>Based on the results of this study, in both scenarios there exists a
correlation between the average measured values of the three eye-gaze
and hand-motion metric pairs. The fixation metrics (FD-SSD and FN-SSN)
are found to be strongly correlated for the intermediates and moderately
correlated for the novices. This outcome indicates that the novices
require improvement in the related eye-hand skills; in other words, the
intermediate participants&#8217; eye-hand coordination skills are more
developed than those of the novices. The results also show that, in
Scenario-1, as the average fixation duration (FD) and fixation number
(FN) of the intermediates increase, their average stand still duration
(SSD) and stand still number (SSN) decrease. The increase in fixation
duration may reflect increased concentration while performing the
designated tasks. In other words, when their concentration increases,
their hand movements become smoother and fewer stand-still occurrences
take place, indicating serial and smoother hand movements. This result
supports earlier studies reporting that skilled surgeons&#8217; hand
performances are more stable than those of less-experienced surgeons
(<xref ref-type="bibr" rid="b51">51</xref>), and that the fixation
number of experts is higher in task-relevant areas but lower in
task-redundant areas (<xref ref-type="bibr" rid="b52">52</xref>).</p>

<p>Differently from the intermediates, in Scenario-1 an increase in the
average fixation duration (FD) and fixation number (FN) of the eye-gaze
among the novice participants correlates with an increase in their
hand-motion metrics (stand still duration (SSD) and stand still number
(SSN)). At this stage, it can be inferred that the novices&#8217; hand
movements experience more idle, or stand still, states when their eye
fixation increases. This finding supports earlier results reporting
that, while manipulating the tool, experts look directly at the target
location, whereas novices track the movement of the tool until it
reaches the target location
(<xref ref-type="bibr" rid="b53">53</xref>). On the other hand, in
Scenario-2 the novices performed better in terms of eye-hand
correlation: the strength of the correlation remains moderate, but its
direction changes from positive to negative. This result indicates that
in Scenario-2, when their eye fixation increases, their hand movements
also become smoother. This may be an indicator that their eye-hand
coordination improved for this specific task in Scenario-2 after
practicing in Scenario-1.</p>

<p>A strong correlation occurs between SN and SSM in both scenarios for
the intermediates, and in Scenario-2 the correlation coefficient
increases slightly for this group. The result shows that, when there is
an increase in the average number of saccades in the intermediates&#8217; eye
movements, their average number of sudden sharp movements also
increases. Similarly, in Scenario-1 a strong correlation is also found
for the novices on SN and SSM. However, in Scenario-2 this value is
markedly smaller for the novice group, indicating that the novices need
to improve their eye-hand coordination skills. This further supports an
earlier study reporting that novices&#8217; control over the environment and
the monitor is lower than that of experts
(<xref ref-type="bibr" rid="b54">54</xref>). In that study, the
researchers analyzed the eye movements of experts and novices and
reported that novices concentrate on the surgical display and, as such,
lose track of the patient&#8217;s status, whereas experts observe these
conditions at the same time (<xref ref-type="bibr" rid="b54">54</xref>).</p>

<p>Similarly, based on the questionnaire data, the observers rated the
intermediates&#8217; 3D depth perception and eye-hand coordination skills
higher than those of the novices.</p>

    <sec id="S4a">
      <title>Conclusion</title>

<p>This study makes two major contributions. First, new hand-movement
metrics are proposed by adapting an open-source eye-movement
classification algorithm (BIT) to data collected through computer-based
simulation software using a haptic interface and an eye-tracker. Second,
such metrics and eye-hand correlation analyses can be used for the
objective assessment of the skill levels required of endoscopic surgery
trainees. As studies on eye-hand coordination in the literature are very
limited, we believe that analyzing surgeons' hand and eye data jointly
is an important contribution to the objective evaluation and assessment
of their skills and to the development of standards for surgical
education programs.</p>

<p>In the literature, dispersion-based algorithms are recommended for
eye-movement event classification when using a low-frequency eye-tracking
device. However, in this study an adaptive algorithm is used because the
velocities of eye and hand movements can differ. Owing to its ability to
automatically define task- and individual-specific thresholds, and
because it is machine- and sampling-frequency-independent, we believe
that the BIT algorithm (<xref ref-type="bibr" rid="b38">38</xref>) is
suitable for skill-based studies.</p>
    </sec>

    <sec id="S4b">
      <title>Limitations and Future Work</title>

<p>As is commonly the case, the number of surgeons in the neurosurgery
and ENT departments is very limited, for which reason this study was
conducted with 15 participants and only intermediate and novice
surgeons. In future attempts, it may be possible to validate the results
of this study with a larger number of participants and over a wider
range of skill levels. Additionally, future experimental studies should
employ scenarios of larger scope, targeting different tasks with
different difficulty levels and under different conditions. For
instance, in the present work only both-handed task performance was
evaluated; later, the participants&#8217; performance could also be compared
across dominant-hand, non-dominant-hand, and both-hand settings.</p>

<p>As we were unable to obtain the exact time stamps of the detected
events, only the average values for each metric, as provided by the BIT
algorithm, could be analyzed in this study. Due to this limitation,
further analysis needs to be carried out with other appropriate
algorithms to examine whether the eye and hand events occur in a
synchronized way. The differences among the results of different
algorithms can also be further analyzed.</p>

<p>Lastly, Scenario-1 in this study was designed in a more general
sense, to develop endoscopic skills, while Scenario-2 was arranged to be
more specific to endo-neurosurgery tasks. Accordingly, the experimental
study was conducted with the scenarios in a fixed order (Scenario-1
performed first, followed by Scenario-2). However, the resulting order
effect in learning could be brought under control in future studies with
more scenarios.</p>
    </sec>

    <sec id="S4c" sec-type="COI-statement">
      <title>Ethics and Conflict of Interest</title>

<p>The authors declare that the contents of the article are in agreement
with the ethics described in
<ext-link ext-link-type="uri" xlink:href="http://biblio.unibe.ch/portale/elibrary/BOP/jemr/ethics.html" xlink:show="new">http://biblio.unibe.ch/portale/elibrary/BOP/jemr/ethics.html</ext-link>
and that there is no conflict of interest regarding the publication of
this paper.</p>
    </sec>

    <sec id="S4d">
      <title>Acknowledgements</title>

<p>This study was conducted to improve the scenario designs applied in
educational materials developed for the endo-neurosurgery education
project (ECE: Tubitak 1001, Project No: 112K287). The authors would like
to thank the support of TÜBİTAK 1001 program for realizing this
research. Our thanks also go to the ECE project team and the Hacettepe
University Medical School for their valuable support throughout the
research. We also wish to express our sincere thanks to Payam Danesh for
his valuable comments on an earlier version of this manuscript.</p>
    </sec>
    </sec>
</body>
<back>
<ref-list>
<ref id="b37"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Andersson</surname>, <given-names>R.</given-names></name>, <name><surname>Larsson</surname>, <given-names>L.</given-names></name>, <name><surname>Holmqvist</surname>, <given-names>K.</given-names></name>, <name><surname>Stridh</surname>, <given-names>M.</given-names></name>, &#x26; <name><surname>Nystr&#246;m</surname>, <given-names>M.</given-names></name></person-group> (<year>2017</year>). <article-title>One algorithm to rule them all? An evaluation and discussion of ten eye movement event-detection algorithms.</article-title> <source>Behavior Research Methods</source>, <volume>49</volume>(<issue>2</issue>), <fpage>616</fpage>&#8211;<lpage>637</lpage>. <pub-id pub-id-type="doi">10.3758/s13428-016-0738-9</pub-id><pub-id pub-id-type="pmid">27193160</pub-id><issn>1554-351X</issn></mixed-citation></ref>
<ref id="b47"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Andreatta</surname>, <given-names>P. B.</given-names></name>, <name><surname>Woodrum</surname>, <given-names>D. T.</given-names></name>, <name><surname>Gauger</surname>, <given-names>P. G.</given-names></name>, &#x26; <name><surname>Minter</surname>, <given-names>R. M.</given-names></name></person-group> (<year>2008</year>). <article-title>LapMentor metrics possess limited construct validity.</article-title> <source>Simulation in Healthcare</source>, <volume>3</volume>(<issue>1</issue>), <fpage>16</fpage>&#8211;<lpage>25</lpage>. <pub-id pub-id-type="doi">10.1097/SIH.0b013e31816366b9</pub-id><pub-id pub-id-type="pmid">19088638</pub-id><issn>1559-2332</issn></mixed-citation></ref>
<ref id="b36"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Ayodeji</surname>, <given-names>I. D.</given-names></name>, <name><surname>Schijven</surname>, <given-names>M.</given-names></name>, <name><surname>Jakimowicz</surname>, <given-names>J.</given-names></name>, &#x26; <name><surname>Greve</surname>, <given-names>J. W.</given-names></name></person-group> (<year>2007</year>). <article-title>Face validation of the Simbionix LAP Mentor virtual reality training module and its applicability in the surgical curriculum.</article-title> <source>Surgical Endoscopy</source>, <volume>21</volume>(<issue>9</issue>), <fpage>1641</fpage>&#8211;<lpage>1649</lpage>. <pub-id pub-id-type="doi">10.1007/s00464-007-9219-7</pub-id><pub-id pub-id-type="pmid">17356944</pub-id><issn>0930-2794</issn></mixed-citation></ref>
<ref id="b40"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Basdogan</surname>, <given-names>C.</given-names></name>, <name><surname>De</surname>, <given-names>S.</given-names></name>, <name><surname>Kim</surname>, <given-names>J.</given-names></name>, <name><surname>Muniyandi</surname>, <given-names>M.</given-names></name>, <name><surname>Kim</surname>, <given-names>H.</given-names></name>, &#x26; <name><surname>Srinivasan</surname>, <given-names>M. A.</given-names></name></person-group> (<year>2004</year>). <article-title>Haptics in minimally invasive surgical simulation and training.</article-title> <source>IEEE Computer Graphics and Applications</source>, <volume>24</volume>(<issue>2</issue>), <fpage>56</fpage>&#8211;<lpage>64</lpage>. <pub-id pub-id-type="doi">10.1109/MCG.2004.1274062</pub-id><pub-id pub-id-type="pmid">15387229</pub-id><issn>0272-1716</issn></mixed-citation></ref>
<ref id="b7"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Batmaz</surname>, <given-names>A. U.</given-names></name>, <name><surname>de Mathelin</surname>, <given-names>M.</given-names></name>, &#x26; <name><surname>Dresp-Langley</surname>, <given-names>B.</given-names></name></person-group> (<year>2017</year>). <article-title>Seeing virtual while acting real: Visual display and strategy effects on the time and precision of eye-hand coordination.</article-title> <source>PLoS One</source>, <volume>12</volume>(<issue>8</issue>), <fpage>e0183789</fpage>. <pub-id pub-id-type="doi">10.1371/journal.pone.0183789</pub-id><pub-id pub-id-type="pmid">28859092</pub-id><issn>1932-6203</issn></mixed-citation></ref>
<ref id="b10"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Berkenstadt</surname>, <given-names>H.</given-names></name>, <name><surname>Ziv</surname>, <given-names>A.</given-names></name>, <name><surname>Barsuk</surname>, <given-names>D.</given-names></name>, <name><surname>Levine</surname>, <given-names>I.</given-names></name>, <name><surname>Cohen</surname>, <given-names>A.</given-names></name>, &#x26; <name><surname>Vardi</surname>, <given-names>A.</given-names></name></person-group> (<year>2003</year>). <article-title>The use of advanced simulation in the training of anesthesiologists to treat chemical warfare casualties.</article-title> <source>Anesthesia and Analgesia</source>, <volume>96</volume>(<issue>6</issue>), <fpage>1739</fpage>&#8211;<lpage>1742</lpage>. <pub-id pub-id-type="doi">10.1213/01.ANE.0000057027.52664.0B</pub-id><pub-id pub-id-type="pmid">12761005</pub-id><issn>0003-2999</issn></mixed-citation></ref>
<ref id="b25"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Bishop</surname>, <given-names>D.</given-names></name>, <name><surname>Kuhn</surname>, <given-names>G.</given-names></name>, &#x26; <name><surname>Maton</surname>, <given-names>C.</given-names></name></person-group> (<year>2014</year>). <article-title>Telling people where to look in a soccer-based decision task: A nomothetic approach.</article-title> <source>Journal of Eye Movement Research</source>, <volume>7</volume>(<issue>2</issue>).<issn>1995-8692</issn></mixed-citation></ref>
<ref id="b12"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Cagiltay</surname>, <given-names>N. E.</given-names></name>, <name><surname>Ozcelik</surname>, <given-names>E.</given-names></name>, <name><surname>Sengul</surname>, <given-names>G.</given-names></name>, &#x26; <name><surname>Berker</surname>, <given-names>M.</given-names></name></person-group> (<year>2017</year>). <article-title>Construct and face validity of the educational computer-based environment (ECE) assessment scenarios for basic endoneurosurgery skills.</article-title> <source>Surgical Endoscopy</source>, <volume>31</volume>(<issue>11</issue>), <fpage>4485</fpage>&#8211;<lpage>4495</lpage>. <pub-id pub-id-type="doi">10.1007/s00464-017-5502-4</pub-id><pub-id pub-id-type="pmid">28389794</pub-id><issn>0930-2794</issn></mixed-citation></ref>
<ref id="b45"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Cohen</surname>, <given-names>J.</given-names></name></person-group> (<year>1988</year>). <article-title>Set correlation and contingency tables.</article-title> <source>Applied Psychological Measurement</source>, <volume>12</volume>(<issue>4</issue>), <fpage>425</fpage>&#8211;<lpage>434</lpage>. <pub-id pub-id-type="doi">10.1177/014662168801200410</pub-id><issn>0146-6216</issn></mixed-citation></ref>
<ref id="b43"><mixed-citation publication-type="preprint" specific-use="unparsed"><person-group person-group-type="author"><name><surname>Dalmaijer</surname>, <given-names>E.</given-names></name></person-group> (<year>2014</year>). <article-title>Is the low-cost EyeTribe eye tracker any good for research?</article-title> <source>PeerJ PrePrints</source>. <issn>2167-9843</issn></mixed-citation></ref>
<ref id="b2"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Dankelman</surname>, <given-names>J.</given-names></name>, <name><surname>Grimbergen</surname>, <given-names>C. K. A.</given-names></name>, &#x26; <name><surname>Stassen</surname>, <given-names>H. G.</given-names></name></person-group> (<year>2007</year>). <article-title>New technologies supporting surgical interventions and training of surgical skills-a look at projects in Europe supporting minimally invasive techniques.</article-title> <source>IEEE Engineering in Medicine and Biology Magazine</source>, <volume>26</volume>(<issue>3</issue>), <fpage>47</fpage>&#8211;<lpage>52</lpage>. <pub-id pub-id-type="doi">10.1109/MEMB.2007.364929</pub-id><pub-id pub-id-type="pmid">17549920</pub-id><issn>0739-5175</issn></mixed-citation></ref>
<ref id="b16"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Datta</surname>, <given-names>V.</given-names></name>, <name><surname>Chang</surname>, <given-names>A.</given-names></name>, <name><surname>Mackay</surname>, <given-names>S.</given-names></name>, &#x26; <name><surname>Darzi</surname>, <given-names>A.</given-names></name></person-group> (<year>2002</year>). <article-title>The relationship between motion analysis and surgical technical assessments.</article-title> <source>American Journal of Surgery</source>, <volume>184</volume>(<issue>1</issue>), <fpage>70</fpage>&#8211;<lpage>73</lpage>. <pub-id pub-id-type="doi">10.1016/S0002-9610(02)00891-7</pub-id><pub-id pub-id-type="pmid">12135725</pub-id><issn>0002-9610</issn></mixed-citation></ref>
<ref id="b18"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>D&#8217;Angelo</surname>, <given-names>A.-L. D.</given-names></name>, <name><surname>Rutherford</surname>, <given-names>D. N.</given-names></name>, <name><surname>Ray</surname>, <given-names>R. D.</given-names></name>, <name><surname>Laufer</surname>, <given-names>S.</given-names></name>, <name><surname>Kwan</surname>, <given-names>C.</given-names></name>, <name><surname>Cohen</surname>, <given-names>E. R.</given-names></name>, <etal>. . .</etal> <name><surname>Pugh</surname>, <given-names>C. M.</given-names></name></person-group> (<year>2015</year>). <article-title>Idle time: An underdeveloped performance metric for assessing surgical skill.</article-title> <source>American Journal of Surgery</source>, <volume>209</volume>(<issue>4</issue>), <fpage>645</fpage>&#8211;<lpage>651</lpage>. <pub-id pub-id-type="doi">10.1016/j.amjsurg.2014.12.013</pub-id><pub-id pub-id-type="pmid">25725505</pub-id><issn>0002-9610</issn></mixed-citation></ref>
<ref id="b11"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Evgeniou</surname>, <given-names>E.</given-names></name>, &#x26; <name><surname>Loizou</surname>, <given-names>P.</given-names></name></person-group> (<year>2013</year>). <article-title>Simulation-based surgical education.</article-title> <source>ANZ Journal of Surgery</source>, <volume>83</volume>(<issue>9</issue>), <fpage>619</fpage>&#8211;<lpage>623</lpage>. <pub-id pub-id-type="doi">10.1111/j.1445-2197.2012.06315.x</pub-id><pub-id pub-id-type="pmid">23088646</pub-id><issn>1445-1433</issn></mixed-citation></ref>
<ref id="b52"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Gegenfurtner</surname>, <given-names>A.</given-names></name>, <name><surname>Lehtinen</surname>, <given-names>E.</given-names></name>, &#x26; <name><surname>S&#228;lj&#246;</surname>, <given-names>R.</given-names></name></person-group> (<year>2011</year>). <article-title>Expertise differences in the comprehension of visualizations: A meta-analysis of eye-tracking research in professional domains.</article-title> <source>Educational Psychology Review</source>, <volume>23</volume>(<issue>4</issue>), <fpage>523</fpage>&#8211;<lpage>552</lpage>. <pub-id pub-id-type="doi">10.1007/s10648-011-9174-7</pub-id><issn>1040-726X</issn></mixed-citation></ref>
<ref id="b9"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Gordon</surname>, <given-names>J. A.</given-names></name>, <name><surname>Wilkerson</surname>, <given-names>W. M.</given-names></name>, <name><surname>Shaffer</surname>, <given-names>D. W.</given-names></name>, &#x26; <name><surname>Armstrong</surname>, <given-names>E. G.</given-names></name></person-group> (<year>2001</year>). <article-title>&#8220;Practicing&#8221; medicine without risk: Students&#8217; and educators&#8217; responses to high-fidelity patient simulation.</article-title> <source>Academic Medicine</source>, <volume>76</volume>(<issue>5</issue>), <fpage>469</fpage>&#8211;<lpage>472</lpage>. <pub-id pub-id-type="doi">10.1097/00001888-200105000-00019</pub-id><pub-id pub-id-type="pmid">11346525</pub-id><issn>1040-2446</issn></mixed-citation></ref>
<ref id="b22"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Helsen</surname>, <given-names>W. F.</given-names></name>, <name><surname>Elliott</surname>, <given-names>D.</given-names></name>, <name><surname>Starkes</surname>, <given-names>J. L.</given-names></name>, &#x26; <name><surname>Ricker</surname>, <given-names>K. L.</given-names></name></person-group> (<year>2000</year>). <article-title>Coupling of eye, finger, elbow, and shoulder movements during manual aiming.</article-title> <source>Journal of Motor Behavior</source>, <volume>32</volume>(<issue>3</issue>), <fpage>241</fpage>&#8211;<lpage>248</lpage>. <pub-id pub-id-type="doi">10.1080/00222890009601375</pub-id><pub-id pub-id-type="pmid">10975272</pub-id><issn>0022-2895</issn></mixed-citation></ref>
<ref id="b5"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Hermens</surname>, <given-names>F.</given-names></name>, <name><surname>Flin</surname>, <given-names>R.</given-names></name>, &#x26; <name><surname>Ahmed</surname>, <given-names>I.</given-names></name></person-group> (<year>2013</year>). <article-title>Eye movements in surgery: A literature review.</article-title> <source>Journal of Eye Movement Research</source>, <volume>6</volume>(<issue>4</issue>).<issn>1995-8692</issn></mixed-citation></ref>
<ref id="b6"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Hernandez</surname>, <given-names>J. D.</given-names></name>, <name><surname>Bann</surname>, <given-names>S. D.</given-names></name>, <name><surname>Munz</surname>, <given-names>Y.</given-names></name>, <name><surname>Moorthy</surname>, <given-names>K.</given-names></name>, <name><surname>Datta</surname>, <given-names>V.</given-names></name>, <name><surname>Martin</surname>, <given-names>S.</given-names></name>, <etal>. . .</etal> <name><surname>Rockall</surname>, <given-names>T.</given-names></name></person-group> (<year>2004</year>). <article-title>Qualitative and quantitative analysis of the learning curve of a simulated surgical task on the da Vinci system.</article-title> <source>Surgical Endoscopy</source>, <volume>18</volume>(<issue>3</issue>), <fpage>372</fpage>&#8211;<lpage>378</lpage>. <pub-id pub-id-type="doi">10.1007/s00464-003-9047-3</pub-id><pub-id pub-id-type="pmid">14752634</pub-id><issn>0930-2794</issn></mixed-citation></ref>
<ref id="b23"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Jiang</surname>, <given-names>X.</given-names></name>, <name><surname>Zheng</surname>, <given-names>B.</given-names></name>, &#x26; <name><surname>Atkins</surname>, <given-names>M. S.</given-names></name></person-group> (<year>2015</year>). <article-title>Video processing to locate the tooltip position in surgical eye-hand coordination tasks.</article-title> <source>Surgical Innovation</source>, <volume>22</volume>(<issue>3</issue>), <fpage>285</fpage>&#8211;<lpage>293</lpage>. <pub-id pub-id-type="doi">10.1177/1553350614541859</pub-id><pub-id pub-id-type="pmid">25049318</pub-id><issn>1553-3506</issn></mixed-citation></ref>
<ref id="b26"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Jiang</surname>, <given-names>X.</given-names></name>, <name><surname>Zheng</surname>, <given-names>B.</given-names></name>, <name><surname>Bednarik</surname>, <given-names>R.</given-names></name>, &#x26; <name><surname>Atkins</surname>, <given-names>M. S.</given-names></name></person-group> (<year>2015</year>). <article-title>Pupil responses to continuous aiming movements.</article-title> <source>International Journal of Human-Computer Studies</source>, <volume>83</volume>, <fpage>1</fpage>&#8211;<lpage>11</lpage>. <pub-id pub-id-type="doi">10.1016/j.ijhcs.2015.05.006</pub-id><issn>1071-5819</issn></mixed-citation></ref>
<ref id="b27"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Johnson</surname>, <given-names>B. P.</given-names></name>, <name><surname>Lum</surname>, <given-names>J. A.</given-names></name>, <name><surname>Rinehart</surname>, <given-names>N. J.</given-names></name>, &#x26; <name><surname>Fielding</surname>, <given-names>J.</given-names></name></person-group> (<year>2016</year>). <article-title>Ocular motor disturbances in autism spectrum disorders: Systematic review and comprehensive meta-analysis.</article-title> <source>Neuroscience and Biobehavioral Reviews</source>, <volume>69</volume>, <fpage>260</fpage>&#8211;<lpage>279</lpage>. <pub-id pub-id-type="doi">10.1016/j.neubiorev.2016.08.007</pub-id><pub-id pub-id-type="pmid">27527824</pub-id><issn>0149-7634</issn></mixed-citation></ref>
<ref id="b34"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Kirk</surname>, <given-names>R. M.</given-names></name></person-group> (<year>1996</year>). <article-title>Teaching the craft of operative surgery.</article-title> <source>Annals of the Royal College of Surgeons of England</source>, <volume>78</volume>(<issue>1</issue>, <supplement>Suppl</supplement>), <fpage>25</fpage>&#8211;<lpage>28</lpage>.<pub-id pub-id-type="pmid">8659997</pub-id><issn>0035-8843</issn></mixed-citation></ref>
<ref id="b46"><mixed-citation publication-type="web-page" specific-use="unparsed"><person-group person-group-type="author"><collab>Laerd Statistics</collab></person-group>. <article-title>Pearson’s product-moment correlation using SPSS Statistics.</article-title> Statistical tutorials and software guides. <year>2017</year>. Available from: <ext-link ext-link-type="uri" xlink:href="https://statistics.laerd.com/">https://statistics.laerd.com/</ext-link></mixed-citation></ref>
<ref id="b1"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Lanfranco</surname>, <given-names>A. R.</given-names></name>, <name><surname>Castellanos</surname>, <given-names>A. E.</given-names></name>, <name><surname>Desai</surname>, <given-names>J. P.</given-names></name>, &#x26; <name><surname>Meyers</surname>, <given-names>W. C.</given-names></name></person-group> (<year>2004</year>). <article-title>Robotic surgery: A current perspective.</article-title> <source>Annals of Surgery</source>, <volume>239</volume>(<issue>1</issue>), <fpage>14</fpage>&#8211;<lpage>21</lpage>. <pub-id pub-id-type="doi">10.1097/01.sla.0000103020.19595.7d</pub-id><pub-id pub-id-type="pmid">14685095</pub-id><issn>0003-4932</issn></mixed-citation></ref>
<ref id="b17"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Latko</surname>, <given-names>W. A.</given-names></name>, <name><surname>Armstrong</surname>, <given-names>T. J.</given-names></name>, <name><surname>Foulke</surname>, <given-names>J. A.</given-names></name>, <name><surname>Herrin</surname>, <given-names>G. D.</given-names></name>, <name><surname>Rabourn</surname>, <given-names>R. A.</given-names></name>, &#x26; <name><surname>Ulin</surname>, <given-names>S. S.</given-names></name></person-group> (<year>1997</year>). <article-title>Development and evaluation of an observational method for assessing repetition in hand tasks.</article-title> <source>American Industrial Hygiene Association Journal</source>, <volume>58</volume>(<issue>4</issue>), <fpage>278</fpage>&#8211;<lpage>285</lpage>. <pub-id pub-id-type="doi">10.1080/15428119791012793</pub-id><pub-id pub-id-type="pmid">9115085</pub-id><issn>0002-8894</issn></mixed-citation></ref>
<ref id="b53"><mixed-citation publication-type="conference" specific-use="linked"><person-group person-group-type="author"><name><surname>Law</surname> <given-names>B</given-names></name>, <name><surname>Atkins</surname> <given-names>MS</given-names></name>, <name><surname>Kirkpatrick</surname> <given-names>AE</given-names></name>, <name><surname>Lomax</surname> <given-names>AJ</given-names></name></person-group>. <article-title>Eye gaze patterns differentiate novice and experts in a virtual laparoscopic surgery training environment.</article-title> Proceedings of the 2004 symposium on Eye tracking research &#x26; applications; <year>2004</year>: ACM. <pub-id pub-id-type="doi">10.1145/968363.968370</pub-id></mixed-citation></ref>
<ref id="b4"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Lehmann</surname>, <given-names>K. S.</given-names></name>, <name><surname>Ritz</surname>, <given-names>J. P.</given-names></name>, <name><surname>Maass</surname>, <given-names>H.</given-names></name>, <name><surname>Cakmak</surname>, <given-names>H. K.</given-names></name>, <name><surname>Kuehnapfel</surname>, <given-names>U. G.</given-names></name>, <name><surname>Germer</surname>, <given-names>C. T.</given-names></name>, <etal>. . .</etal> <name><surname>Buhr</surname>, <given-names>H. J.</given-names></name></person-group> (<year>2005</year>). <article-title>A prospective randomized study to test the transfer of basic psychomotor skills from virtual reality to physical reality in a comparable training setting.</article-title> <source>Annals of Surgery</source>, <volume>241</volume>(<issue>3</issue>), <fpage>442</fpage>&#8211;<lpage>449</lpage>. <pub-id pub-id-type="doi">10.1097/01.sla.0000154552.89886.91</pub-id><pub-id pub-id-type="pmid">15729066</pub-id><issn>0003-4932</issn></mixed-citation></ref>
<ref id="b13"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Martin</surname>, <given-names>J. A.</given-names></name>, <name><surname>Regehr</surname>, <given-names>G.</given-names></name>, <name><surname>Reznick</surname>, <given-names>R.</given-names></name>, <name><surname>MacRae</surname>, <given-names>H.</given-names></name>, <name><surname>Murnaghan</surname>, <given-names>J.</given-names></name>, <name><surname>Hutchison</surname>, <given-names>C.</given-names></name>, &#x26; <name><surname>Brown</surname>, <given-names>M.</given-names></name></person-group> (<year>1997</year>). <article-title>Objective structured assessment of technical skill (OSATS) for surgical residents.</article-title> <source>British Journal of Surgery</source>, <volume>84</volume>(<issue>2</issue>), <fpage>273</fpage>&#8211;<lpage>278</lpage>.<pub-id pub-id-type="pmid">9052454</pub-id><issn>0007-1323</issn></mixed-citation></ref>
<ref id="b48"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>McDougall</surname>, <given-names>E. M.</given-names></name>, <name><surname>Corica</surname>, <given-names>F. A.</given-names></name>, <name><surname>Boker</surname>, <given-names>J. R.</given-names></name>, <name><surname>Sala</surname>, <given-names>L. G.</given-names></name>, <name><surname>Stoliar</surname>, <given-names>G.</given-names></name>, <name><surname>Borin</surname>, <given-names>J. F.</given-names></name>, <etal>. . .</etal> <name><surname>Clayman</surname>, <given-names>R. V.</given-names></name></person-group> (<year>2006</year>). <article-title>Construct validity testing of a laparoscopic surgical simulator.</article-title> <source>Journal of the American College of Surgeons</source>, <volume>202</volume>(<issue>5</issue>), <fpage>779</fpage>&#8211;<lpage>787</lpage>. <pub-id pub-id-type="doi">10.1016/j.jamcollsurg.2006.01.004</pub-id><pub-id pub-id-type="pmid">16648018</pub-id><issn>1072-7515</issn></mixed-citation></ref>
<ref id="b44"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Menekse Dalveren</surname>, <given-names>G. G.</given-names></name>, &#x26; <name><surname>Cagiltay</surname>, <given-names>N. E.</given-names></name></person-group> (<year>2018</year>). <article-title>Insights from surgeons&#8217; eye-movement data in a virtual simulation surgical training environment: Effect of experience level and hand conditions.</article-title> <source>Behaviour &#x26; Information Technology</source>, <volume>37</volume>(<issue>5</issue>), <fpage>517</fpage>&#8211;<lpage>537</lpage>. <pub-id pub-id-type="doi">10.1080/0144929X.2018.1460399</pub-id><issn>0144-929X</issn></mixed-citation></ref>
<ref id="b21"><mixed-citation publication-type="conference" specific-use="linked"><person-group person-group-type="author"><name><surname>Mohamadipanah</surname> <given-names>H</given-names></name>, <name><surname>Parthiban</surname> <given-names>C</given-names></name>, <name><surname>Law</surname> <given-names>K</given-names></name>, <name><surname>Nathwani</surname> <given-names>J</given-names></name>, <name><surname>Maulson</surname> <given-names>L</given-names></name>, <name><surname>DiMarco</surname> <given-names>S</given-names></name>, <etal>et al.</etal></person-group>. <article-title>Hand smoothness in laparoscopic surgery correlates to psychomotor skills in virtual reality.</article-title> 2016 IEEE 13th International Conference on Wearable and Implantable Body Sensor Networks (BSN); <year>2016</year>: IEEE. <pub-id pub-id-type="doi">10.1109/BSN.2016.7516267</pub-id></mixed-citation></ref>
<ref id="b3"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Moore</surname>, <given-names>M. J.</given-names></name>, &#x26; <name><surname>Bennett</surname>, <given-names>C. L.</given-names></name></person-group> (<year>1995</year>). <article-title>The learning curve for laparoscopic cholecystectomy. The Southern Surgeons Club.</article-title> <source>American Journal of Surgery</source>, <volume>170</volume>(<issue>1</issue>), <fpage>55</fpage>&#8211;<lpage>59</lpage>. <pub-id pub-id-type="doi">10.1016/S0002-9610(99)80252-9</pub-id><pub-id pub-id-type="pmid">7793496</pub-id><issn>0002-9610</issn></mixed-citation></ref>
<ref id="b14"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Moorthy</surname>, <given-names>K.</given-names></name>, <name><surname>Munz</surname>, <given-names>Y.</given-names></name>, <name><surname>Sarker</surname>, <given-names>S. K.</given-names></name>, &#x26; <name><surname>Darzi</surname>, <given-names>A.</given-names></name></person-group> (<year>2003</year>). <article-title>Objective assessment of technical skills in surgery.</article-title> <source>BMJ (Clinical Research Ed.)</source>, <volume>327</volume>(<issue>7422</issue>), <fpage>1032</fpage>&#8211;<lpage>1037</lpage>. <pub-id pub-id-type="doi">10.1136/bmj.327.7422.1032</pub-id><pub-id pub-id-type="pmid">14593041</pub-id><issn>0959-8138</issn></mixed-citation></ref>
<ref id="b42"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Ooms</surname>, <given-names>K.</given-names></name>, <name><surname>Dupont</surname>, <given-names>L.</given-names></name>, <name><surname>Lapon</surname>, <given-names>L.</given-names></name>, &#x26; <name><surname>Popelka</surname>, <given-names>S.</given-names></name></person-group> (<year>2015</year>). <article-title>Accuracy and precision of fixation locations recorded with the low-cost Eye Tribe tracker in different experimental setups.</article-title> <source>Journal of Eye Movement Research</source>, <volume>8</volume>(<issue>1</issue>).<issn>1995-8692</issn></mixed-citation></ref>
<ref id="b19"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Oropesa</surname>, <given-names>I.</given-names></name>, <name><surname>S&#225;nchez-Gonz&#225;lez</surname>, <given-names>P.</given-names></name>, <name><surname>Lamata</surname>, <given-names>P.</given-names></name>, <name><surname>Chmarra</surname>, <given-names>M. K.</given-names></name>, <name><surname>Pagador</surname>, <given-names>J. B.</given-names></name>, <name><surname>S&#225;nchez-Margallo</surname>, <given-names>J. A.</given-names></name>, <etal>. . .</etal> <name><surname>G&#243;mez</surname>, <given-names>E. J.</given-names></name></person-group> (<year>2011</year>). <article-title>Methods and tools for objective assessment of psychomotor skills in laparoscopic surgery.</article-title> <source>The Journal of Surgical Research</source>, <volume>171</volume>(<issue>1</issue>), <fpage>e81</fpage>&#8211;<lpage>e95</lpage>. <pub-id pub-id-type="doi">10.1016/j.jss.2011.06.034</pub-id><pub-id pub-id-type="pmid">21924741</pub-id><issn>0022-4804</issn></mixed-citation></ref>
<ref id="b20"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Oropesa</surname>, <given-names>I.</given-names></name>, <name><surname>Chmarra</surname>, <given-names>M. K.</given-names></name>, <name><surname>S&#225;nchez-Gonz&#225;lez</surname>, <given-names>P.</given-names></name>, <name><surname>Lamata</surname>, <given-names>P.</given-names></name>, <name><surname>Rodrigues</surname>, <given-names>S. P.</given-names></name>, <name><surname>Enciso</surname>, <given-names>S.</given-names></name>, <etal>. . .</etal> <name><surname>G&#243;mez</surname>, <given-names>E. J.</given-names></name></person-group> (<year>2013</year>). <article-title>Relevance of motion-related assessment metrics in laparoscopic surgery.</article-title> <source>Surgical Innovation</source>, <volume>20</volume>(<issue>3</issue>), <fpage>299</fpage>&#8211;<lpage>312</lpage>. <pub-id pub-id-type="doi">10.1177/1553350612459808</pub-id><pub-id pub-id-type="pmid">22983805</pub-id><issn>1553-3506</issn></mixed-citation></ref>
<ref id="b33"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Oropesa</surname>, <given-names>I.</given-names></name>, <name><surname>S&#225;nchez-Gonz&#225;ez</surname>, <given-names>P.</given-names></name>, <name><surname>Chmarra</surname>, <given-names>M. K.</given-names></name>, <name><surname>Lamata</surname>, <given-names>P.</given-names></name>, <name><surname>P&#233;rez-Rodr&#237;guez</surname>, <given-names>R.</given-names></name>, <name><surname>Jansen</surname>, <given-names>F. W.</given-names></name>, <etal>. . .</etal> <name><surname>G&#243;mez</surname>, <given-names>E. J.</given-names></name></person-group> (<year>2014</year>). <article-title>Supervised classification of psychomotor competence in minimally invasive surgery based on instruments motion analysis.</article-title> <source>Surgical Endoscopy</source>, <volume>28</volume>(<issue>2</issue>), <fpage>657</fpage>&#8211;<lpage>670</lpage>. <pub-id pub-id-type="doi">10.1007/s00464-013-3226-7</pub-id><pub-id pub-id-type="pmid">24122243</pub-id><issn>0930-2794</issn></mixed-citation></ref>
<ref id="b24"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Oropesa</surname>, <given-names>I.</given-names></name>, <name><surname>S&#225;nchez-Gonz&#225;lez</surname>, <given-names>P.</given-names></name>, <name><surname>Chmarra</surname>, <given-names>M. K.</given-names></name>, <name><surname>Lamata</surname>, <given-names>P.</given-names></name>, <name><surname>Fern&#225;ndez</surname>, <given-names>A.</given-names></name>, <name><surname>S&#225;nchez-Margallo</surname>, <given-names>J. A.</given-names></name>, <etal>. . .</etal> <name><surname>G&#243;mez</surname>, <given-names>E. J.</given-names></name></person-group> (<year>2013</year>). <article-title>EVA: Laparoscopic instrument tracking based on Endoscopic Video Analysis for psychomotor skills assessment.</article-title> <source>Surgical Endoscopy</source>, <volume>27</volume>(<issue>3</issue>), <fpage>1029</fpage>&#8211;<lpage>1039</lpage>. <pub-id pub-id-type="doi">10.1007/s00464-012-2513-z</pub-id><pub-id pub-id-type="pmid">23052495</pub-id><issn>0930-2794</issn></mixed-citation></ref>
<ref id="b28"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Parr</surname>, <given-names>J. V. V.</given-names></name>, <name><surname>Vine</surname>, <given-names>S. J.</given-names></name>, <name><surname>Harrison</surname>, <given-names>N. R.</given-names></name>, &#x26; <name><surname>Wood</surname>, <given-names>G.</given-names></name></person-group> (<year>2018</year>). <article-title>Examining the spatiotemporal disruption to gaze when using a myoelectric prosthetic hand.</article-title> <source>Journal of Motor Behavior</source>, <volume>50</volume>(<issue>4</issue>), <fpage>416</fpage>&#8211;<lpage>425</lpage>. <pub-id pub-id-type="doi">10.1080/00222895.2017.1363703</pub-id><pub-id pub-id-type="pmid">28925815</pub-id><issn>0022-2895</issn></mixed-citation></ref>
<ref id="b32"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Reiley</surname>, <given-names>C. E.</given-names></name>, <name><surname>Lin</surname>, <given-names>H. C.</given-names></name>, <name><surname>Yuh</surname>, <given-names>D. D.</given-names></name>, &#x26; <name><surname>Hager</surname>, <given-names>G. D.</given-names></name></person-group> (<year>2011</year>). <article-title>Review of methods for objective surgical skill evaluation.</article-title> <source>Surgical Endoscopy</source>, <volume>25</volume>(<issue>2</issue>), <fpage>356</fpage>&#8211;<lpage>366</lpage>. <pub-id pub-id-type="doi">10.1007/s00464-010-1190-z</pub-id><pub-id pub-id-type="pmid">20607563</pub-id><issn>0930-2794</issn></mixed-citation></ref>
<ref id="b29"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Schmitt</surname>, <given-names>L. M.</given-names></name>, <name><surname>Cook</surname>, <given-names>E. H.</given-names></name>, <name><surname>Sweeney</surname>, <given-names>J. A.</given-names></name>, &#x26; <name><surname>Mosconi</surname>, <given-names>M. W.</given-names></name></person-group> (<year>2014</year>). <article-title>Saccadic eye movement abnormalities in autism spectrum disorder indicate dysfunctions in cerebellum and brainstem.</article-title> <source>Molecular Autism</source>, <volume>5</volume>(<issue>1</issue>), <fpage>47</fpage>. <pub-id pub-id-type="doi">10.1186/2040-2392-5-47</pub-id><pub-id pub-id-type="pmid">25400899</pub-id><issn>2040-2392</issn></mixed-citation></ref>
<ref id="b39"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Silvennoinen</surname>, <given-names>M.</given-names></name>, <name><surname>Mecklin</surname>, <given-names>J.-P.</given-names></name>, <name><surname>Saariluoma</surname>, <given-names>P.</given-names></name>, &#x26; <name><surname>Antikainen</surname>, <given-names>T.</given-names></name></person-group> (<year>2009</year>). <article-title>Expertise and skill in minimally invasive surgery.</article-title> <source>Scandinavian Journal of Surgery</source>, <volume>98</volume>(<issue>4</issue>), <fpage>209</fpage>&#8211;<lpage>213</lpage>. <pub-id pub-id-type="doi">10.1177/145749690909800403</pub-id><pub-id pub-id-type="pmid">20218416</pub-id><issn>1457-4969</issn></mixed-citation></ref>
<ref id="b50"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Snyder</surname>, <given-names>L. H.</given-names></name>, <name><surname>Calton</surname>, <given-names>J. L.</given-names></name>, <name><surname>Dickinson</surname>, <given-names>A. R.</given-names></name>, &#x26; <name><surname>Lawrence</surname>, <given-names>B. M.</given-names></name></person-group> (<year>2002</year>). <article-title>Eye-hand coordination: Saccades are faster when accompanied by a coordinated arm movement.</article-title> <source>Journal of Neurophysiology</source>, <volume>87</volume>(<issue>5</issue>), <fpage>2279</fpage>&#8211;<lpage>2286</lpage>. <pub-id pub-id-type="doi">10.1152/jn.00854.2001</pub-id><pub-id pub-id-type="pmid">11976367</pub-id><issn>0022-3077</issn></mixed-citation></ref>
<ref id="b15"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Stylopoulos</surname>, <given-names>N.</given-names></name>, &#x26; <name><surname>Vosburgh</surname>, <given-names>K. G.</given-names></name></person-group> (<year>2007</year>). <article-title>Assessing technical skill in surgery and endoscopy: A set of metrics and an algorithm (C-PASS) to assess skills in surgical and endoscopic procedures.</article-title> <source>Surgical Innovation</source>, <volume>14</volume>(<issue>2</issue>), <fpage>113</fpage>&#8211;<lpage>121</lpage>. <pub-id pub-id-type="doi">10.1177/1553350607302330</pub-id><pub-id pub-id-type="pmid">17558017</pub-id><issn>1553-3506</issn></mixed-citation></ref>
<ref id="b41"><mixed-citation publication-type="web-page" specific-use="unparsed"><person-group person-group-type="author"><collab>The Eye Tribe</collab></person-group>. <article-title>Basics.</article-title> <year>2014</year>. Available from: <ext-link ext-link-type="uri" xlink:href="http://theeyetribe.com/dev.theeyetribe.com/dev.theeyetribe.com/general/index.html">http://theeyetribe.com/dev.theeyetribe.com/dev.theeyetribe.com/general/index.html</ext-link></mixed-citation></ref>
<ref id="b35"><mixed-citation publication-type="unknown" specific-use="unparsed"><person-group person-group-type="author"><name><surname>Tien</surname> <given-names>T</given-names></name>, <name><surname>Pucher</surname> <given-names>PH</given-names></name>, <name><surname>Sodergren</surname> <given-names>MH</given-names></name>, <name><surname>Sriskandarajah</surname> <given-names>K</given-names></name>, <name><surname>Yang</surname> <given-names>G-Z</given-names></name>, <name><surname>Darzi</surname> <given-names>A</given-names></name></person-group>. Eye tracking for skills assessment and training: a systematic review. Journal of Surgical Research. <year>2014</year>;191(1):169-78.</mixed-citation></ref>
<ref id="b54"><mixed-citation publication-type="conference" specific-use="linked"><person-group person-group-type="author"><name><surname>Tien</surname> <given-names>G</given-names></name>, <name><surname>Atkins</surname> <given-names>MS</given-names></name>, <name><surname>Zheng</surname> <given-names>B</given-names></name>, <name><surname>Swindells</surname> <given-names>C</given-names></name></person-group>. <article-title>Measuring situation awareness of surgeons in laparoscopic training.</article-title> Proceedings of the 2010 symposium on eye-tracking research &#x26; applications; <year>2010</year>: ACM. <pub-id pub-id-type="doi">10.1145/1743666.1743703</pub-id></mixed-citation></ref>
<ref id="b51"><mixed-citation publication-type="unknown" specific-use="unparsed"><person-group person-group-type="author"><name><surname>Uemura</surname> <given-names>M</given-names></name>, <name><surname>Tomikawa</surname> <given-names>M</given-names></name>, <name><surname>Kumashiro</surname> <given-names>R</given-names></name>, <name><surname>Miao</surname> <given-names>T</given-names></name>, <name><surname>Souzaki</surname> <given-names>R</given-names></name>, <name><surname>Ieiri</surname> <given-names>S</given-names></name>, <etal>et al.</etal></person-group> Analysis of hand motion differentiates expert and novice surgeons. Journal of Surgical Research. <year>2014</year>;188(1):8-13.</mixed-citation></ref>
<ref id="b30"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Wang</surname>, <given-names>T.-N.</given-names></name>, <name><surname>Howe</surname>, <given-names>T.-H.</given-names></name>, <name><surname>Lin</surname>, <given-names>K.-C.</given-names></name>, &#x26; <name><surname>Hsu</surname>, <given-names>Y.-W.</given-names></name></person-group> (<year>2014</year>). <article-title>Hand function and its prognostic factors of very low birth weight preterm children up to a corrected age of 24 months.</article-title> <source>Research in Developmental Disabilities</source>, <volume>35</volume>(<issue>2</issue>), <fpage>322</fpage>&#8211;<lpage>329</lpage>. <pub-id pub-id-type="doi">10.1016/j.ridd.2013.11.023</pub-id><pub-id pub-id-type="pmid">24316589</pub-id><issn>0891-4222</issn></mixed-citation></ref>
<ref id="b8"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Wentink</surname>, <given-names>M.</given-names></name></person-group> (<year>2001</year>). <article-title>Eye-hand coordination in laparoscopy - an overview of experiments and supporting aids.</article-title> <source>Minimally Invasive Therapy &#x26; Allied Technologies</source>, <volume>10</volume>(<issue>3</issue>), <fpage>155</fpage>&#8211;<lpage>162</lpage>. <pub-id pub-id-type="doi">10.1080/136457001753192277</pub-id><pub-id pub-id-type="pmid">16754008</pub-id><issn>1364-5706</issn></mixed-citation></ref>
<ref id="b31"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Wilson</surname>, <given-names>M.</given-names></name>, <name><surname>McGrath</surname>, <given-names>J.</given-names></name>, <name><surname>Vine</surname>, <given-names>S.</given-names></name>, <name><surname>Brewer</surname>, <given-names>J.</given-names></name>, <name><surname>Defriend</surname>, <given-names>D.</given-names></name>, &#x26; <name><surname>Masters</surname>, <given-names>R.</given-names></name></person-group> (<year>2010</year>). <article-title>Psychomotor control in a virtual laparoscopic surgery training environment: Gaze control parameters differentiate novices from experts.</article-title> <source>Surgical Endoscopy</source>, <volume>24</volume>(<issue>10</issue>), <fpage>2458</fpage>&#8211;<lpage>2464</lpage>. <pub-id pub-id-type="doi">10.1007/s00464-010-0986-1</pub-id><pub-id pub-id-type="pmid">20333405</pub-id><issn>0930-2794</issn></mixed-citation></ref>
<ref id="b49"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Yamaguchi</surname>, <given-names>S.</given-names></name>, <name><surname>Konishi</surname>, <given-names>K.</given-names></name>, <name><surname>Yasunaga</surname>, <given-names>T.</given-names></name>, <name><surname>Yoshida</surname>, <given-names>D.</given-names></name>, <name><surname>Kinjo</surname>, <given-names>N.</given-names></name>, <name><surname>Kobayashi</surname>, <given-names>K.</given-names></name>, <etal>. . .</etal> <name><surname>Hashizume</surname>, <given-names>M.</given-names></name></person-group> (<year>2007</year>). <article-title>Construct validity for eye-hand coordination skill on a virtual reality laparoscopic surgical simulator.</article-title> <source>Surgical Endoscopy</source>, <volume>21</volume>(<issue>12</issue>), <fpage>2253</fpage>&#8211;<lpage>2257</lpage>. <pub-id pub-id-type="doi">10.1007/s00464-007-9362-1</pub-id><pub-id pub-id-type="pmid">17479319</pub-id><issn>0930-2794</issn></mixed-citation></ref>
<ref id="b38"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>van der Lans</surname>, <given-names>R.</given-names></name>, <name><surname>Wedel</surname>, <given-names>M.</given-names></name>, &#x26; <name><surname>Pieters</surname>, <given-names>R.</given-names></name></person-group> (<year>2011</year>). <article-title>Defining eye-fixation sequences across individuals and tasks: The Binocular-Individual Threshold (BIT) algorithm.</article-title> <source>Behavior Research Methods</source>, <volume>43</volume>(<issue>1</issue>), <fpage>239</fpage>&#8211;<lpage>257</lpage>. <pub-id pub-id-type="doi">10.3758/s13428-010-0031-2</pub-id><pub-id pub-id-type="pmid">21287116</pub-id><issn>1554-351X</issn></mixed-citation></ref>
</ref-list>
</back>
</article>
