<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.0 20120330//EN" "JATS-journalpublishing1.dtd">

<article article-type="research-article" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:mml="http://www.w3.org/1998/Math/MathML">
 <front>
    <journal-meta>
	<journal-id journal-id-type="publisher-id">Jemr</journal-id>
      <journal-title-group>
        <journal-title>Journal of Eye Movement Research</journal-title>
      </journal-title-group>
      <issn pub-type="epub">1995-8692</issn>
	  <publisher>								
	  <publisher-name>Bern Open Publishing</publisher-name>
	  <publisher-loc>Bern, Switzerland</publisher-loc>
	</publisher>
    </journal-meta>
    <article-meta>
	<article-id pub-id-type="doi">10.16910/jemr.10.5.11</article-id> 
	  <article-categories>								
				<subj-group subj-group-type="heading">
					<subject>Research Article</subject>
				</subj-group>
		</article-categories>
      <title-group>
        <article-title>Using simultaneous scanpath visualization to investigate the relationship between accuracy and eye movement during medical image interpretation</article-title>
      </title-group>
	   <contrib-group> 
				<contrib contrib-type="author">
					<name>
						<surname>Davies</surname>
						<given-names>Alan</given-names>
					</name>
					<xref ref-type="aff" rid="aff1"></xref>
				</contrib>
				<contrib contrib-type="author">
					<name>
						<surname>Harper</surname>
						<given-names>Simon</given-names>
					</name>
					<xref ref-type="aff" rid="aff1"></xref>
				</contrib>				
				<contrib contrib-type="author">
					<name>
						<surname>Vigo</surname>
						<given-names>Markel</given-names>
					</name>
					<xref ref-type="aff" rid="aff1"></xref>
				</contrib>
				<contrib contrib-type="author">
					<name>
						<surname>Jay</surname>
						<given-names>Caroline</given-names>
					</name>
					<xref ref-type="aff" rid="aff1"></xref>
				</contrib>				
        <aff id="aff1">
		<institution>University of Manchester</institution>,  <country>UK</country>
        </aff>
		</contrib-group>
     
	  <pub-date date-type="pub" publication-format="electronic"> 
		<day>24</day>  
		<month>2</month>
        <year>2018</year>
      </pub-date>
	  <pub-date date-type="collection" publication-format="electronic"> 
	  <year>2017</year>
	</pub-date>
      <volume>10</volume>
      <issue>5</issue>
	 <elocation-id>10.16910/jemr.10.5.11</elocation-id>
	<permissions> 
	<copyright-year>2018</copyright-year>
	<copyright-holder>Davies, Harper, Vigo and Jay</copyright-holder>
	<license license-type="open-access">
  <license-p>This work is licensed under a Creative Commons Attribution 4.0 International License, 
  (<ext-link ext-link-type="uri" xlink:href="https://creativecommons.org/licenses/by/4.0/">
    https://creativecommons.org/licenses/by/4.0/</ext-link>), which permits unrestricted use and redistribution provided that the original author and source are credited.</license-p>
</license>
	</permissions>
      <abstract>
        <p>In this paper, we explore how a number of novel methods for visualizing and analyzing differences in eye-tracking data, including scanpath length, Levenshtein distance, and visual transition frequency, can help to elucidate the methods clinicians use for interpreting 12-lead electrocardiograms (ECGs). Visualizing the differences between multiple participants' scanpaths simultaneously allowed us to answer questions including: do clinicians fixate randomly on the ECG, or do they apply a systematic approach? Is there a relationship between interpretation accuracy and visual behavior? Results indicate that practitioners have very different visual search strategies. Clinicians who incorrectly interpret the image have greater scanpath variability than those who correctly interpret it, indicating that differences between practitioners in terms of accuracy are reflected in different eye-movement behaviors. The variation across practitioners is likely to be the result of differential training, clinical role and expertise.</p>
      </abstract>
      <kwd-group>
        <kwd>Eye movement</kwd>
        <kwd>eye tracking</kwd>
        <kwd>visualization</kwd>
        <kwd>electrocardiogram</kwd>
        <kwd>ECG</kwd>
        <kwd>EKG</kwd>
      </kwd-group>
    </article-meta>
  </front>  
  <body>

    <sec id="S1">
      <title>Introduction</title>
	  
      <p>Scanpath analysis -- examination of the sequence in
which people fixate on different parts of a stimulus -- is
widely used in eye-tracking research (
        <xref ref-type="bibr" rid="b1">1</xref>
		). Scanpaths can
be considered in terms of the sequence of AOIs (Areas Of
Interest defined by the researcher) that a participant
visits, which can be compared with string metrics such as
the Levenshtein distance, or in terms of the spatial
positions/alignment of fixations (vector sequence alignment).
Methods such as vector strings can also include temporal
aspects like fixation duration and saccadic amplitude (
        <xref ref-type="bibr" rid="b1">1</xref>
		).
Scanpath analysis attempts to provide insight into the
cognitive processes of users interacting with a visual
stimulus, as eye movements have been linked to decision
making (
        <xref ref-type="bibr" rid="b2">2</xref>
		).</p>

      <p>A basic method for enabling the visual comparison of
scanpaths is the gaze plot, which displays all fixation data
for a participant or set of participants over the stimulus.
While this is comprehensive in the information it
supplies, it can quickly become difficult to interpret, due to
the complexity of gaze data.</p>

      <p>Here we present a method for scanpath analysis,
which combines the Levenshtein distance and other
visualization methods to produce summary data that can be
simultaneously visualized for multiple participants in a
simple matrix form. This allows us to query the data
visually, and identify similarities and differences between
participants at a glance.</p>

      <p>The particular case we examine is clinician
interpretation of electrocardiogram (ECG) images. Eye tracking has
been used to explore how humans interact with data in a
variety of medical domains, most notably in radiology (
        <xref ref-type="bibr" rid="b3 b4 b5">3, 4, 5</xref>
		).</p>

      <p>This work has primarily provided a qualitative
interpretation of the diagnostic process, however. Here, we
apply our methods to quantitatively analyze clinicians&#x2019;
visual behavior in the medical sub-domain of
electrocardiology. This field particularly lends itself to scanpath
analysis, as electrocardiogram (ECG) data consists of
signals from 12 sources, which are presented in different
equal-sized areas on a single output. These areas
naturally form pre-existing &#x201C;Areas of Interest&#x201D; (AOIs) which can
be interrogated for quantitative analysis. Here we
examine the scanpaths of clinicians as they attempt to interpret
ECGs. To do this we consider the transition behavior
between the leads by determining and visualizing the
Levenshtein distance. We do this to identify any
systematic and consistent approaches taken to interpretation that
are modelled by visual behavior, especially to determine
if there are differences in this behavior that are attributed
to the correct or incorrect interpretation of the ECG.</p>

      <sec id="S1a">
        <title>Electrocardiology</title>
		
        <p>The electrical activity generated by the myocardium
(heart) can be represented in graphical form by the
12-lead electrocardiogram (ECG) (
        <xref ref-type="bibr" rid="b6">6</xref>
		).</p>

        <p>The ECG is one of the most commonly used medical
tests and is carried out in a large variety of clinical
environments (
        <xref ref-type="bibr" rid="b6">6</xref>
		). This is primarily due to its low cost and
availability. The electrical output is displayed as a
waveform that is composed of various waves (P, Q, R, S, T,
U), intervals (PR, QT, QRS) and the ST segment that
represent the depolarization and repolarization of the
constituent components of the cardiac conduction system (
        <xref ref-type="bibr" rid="b6 b7">6, 7</xref>
		). The waveform is displayed on a grid (Figure 1),
where time in seconds is represented on the x-axis and
amplitude in millivolts on the y-axis (
        <xref ref-type="bibr" rid="b8">8</xref>
		).</p>

<fig id="fig01" fig-type="figure" position="float">
					<label>Figure 1.</label>
					<caption>
						<p>A &#x201C;normal&#x201D; 12-lead ECG</p>
					</caption>
					<graphic id="graph01" xlink:href="jemr-10-05-k-figure-01.png"/>
				</fig>

        <p>The different &#x201C;leads&#x201D; are displayed as 12 equally
sized, labelled regions on the graph. The leads labelled
I, II, III, aVR, aVL, aVF display activity &#x201C;viewed&#x201D; from
the coronal/frontal plane. Leads V1 to V6 view the
transverse plane. The waveforms are presented differently in
the different leads due to the direction of the electrical
impulse relative to the poles of the electrodes that are
attached to the surface of the patient (
        <xref ref-type="bibr" rid="b9">9</xref>
		).</p>

        <p>Interpretation of the ECG is considered a complicated
task and is carried out by a number of healthcare
practitioners, including doctors, nurses and allied health
professionals, paramedics and specially trained cardiac
physiologists/technicians. Failing to make a correct
interpretation of the underlying medical conditions presented on
the ECG can lead to inappropriate/incorrect or no
treatment being given, leading in some cases to injury and
even death (
        <xref ref-type="bibr" rid="b10">10</xref>
		). Despite ongoing improvements in the
field of automated ECG interpretation, humans are still
more reliable (
        <xref ref-type="bibr" rid="b11">11</xref>
		) and remain the end point in
interpretation as automated solutions are frequently inaccurate (
        <xref ref-type="bibr" rid="b12">12</xref>
		). The study presented in this paper represents a
subsection of wider exploratory work related to the visual
behavior of humans interpreting ECGs using eye-tracking
technology. Understanding this process could provide
essential information for improving automated
interpretation software. This work synthesizes varied disciplines,
including computer science, medicine and psychology.
The initial stage reported in this paper concerns visual
analysis of eye-movement data for hypothesis generation.</p>
      </sec>
	  
      <sec id="S1b">
        <title>Scanpath analysis techniques</title>
		
        <p>Similarity between two or more scanpaths can be
estimated by applying scanpath comparison measures (
        <xref ref-type="bibr" rid="b1">1</xref>
		).</p>

        <p>The scanpath can also be formed from a set of
locations represented by the order in which the AOIs are visited
(in computing terms, a string). One such method for the
calculation of differences between two string sequences
is the Levenshtein distance. It works by imposing a cost
(penalty) for each operation (insertion, deletion or
substitution) required to transform one string into the
other, so that both contain the same tokens in the same
sequence (
        <xref ref-type="bibr" rid="b13">13</xref>
		). The Levenshtein distance is still one of the
most frequently used methods applied to scanpath
comparison (
        <xref ref-type="bibr" rid="b1 b14">1, 14</xref>
		) with applications spanning multiple
domains, including the scanning of websites (
        <xref ref-type="bibr" rid="b15">15</xref>
		) and
reasoning about others&#x2019; mental states (
        <xref ref-type="bibr" rid="b16">16</xref>
		).</p>
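To make the operation costs concrete, the following is a minimal Python sketch of the Levenshtein distance over AOI strings; the example sequences are illustrative, not study data:

```python
def levenshtein(a, b):
    """Minimum number of insertions, deletions and substitutions
    (unit cost each) needed to transform string a into string b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

# Two short AOI-sequence strings: deleting 'A' and one 'B' from the
# first yields the second, so the distance is 2.
print(levenshtein("MBABC", "MBC"))  # -> 2
```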

        <p>Other string edit distances also exist, including the
Damerau-Levenshtein distance, Hamming distance and
Longest Common Subsequence (LCS) technique (
        <xref ref-type="bibr" rid="b14">14</xref>
		).
The initial Levenshtein distance has been adapted and
improved. In one such example, Galgani et al. (
        <xref ref-type="bibr" rid="b17">17</xref>
		)
augmented the Levenshtein distance with the
Needleman-Wunsch approach, which allows the definition of
custom cost functions. This approach was applied to
improve evaluation and diagnostic methods for
classification of attention disorders (
        <xref ref-type="bibr" rid="b17">17</xref>
		). Alternative methods for
the visualization of scanpaths include the Voronoi
method, a spatial method comparable to clustering fixations (
        <xref ref-type="bibr" rid="b18">18</xref>
		). Dotplots have also been used to visualize scanpath
similarities for the purpose of validation and exploration (
        <xref ref-type="bibr" rid="b19">19</xref>
		).</p>

        <p>In this work we apply visualization methods to
explore similarities and differences between participants&#x2019;
scanpaths as they carry out an ECG interpretation task.</p>
      </sec>
    </sec>
	
    <sec id="S2">
      <title>Methods</title>
      <sec id="S2a">
        <title>Participants</title>
		
        <p>Thirty-one participants (males=8, females=23) whose
clinical role includes regularly interpreting ECGs took
part in the study. Participants had an average of 9 years&#x2019;
experience in interpreting ECGs (range=29). Participants
were recruited from 3 hospitals in the north-west of
England. They belonged to 3 main professional categories:
cardiac physiologists/technicians (n=19), doctors/nurses
(n=7) and students (n=5).</p>
      </sec>
	  
      <sec id="S2b">
        <title>Stimuli</title>
		
        <p>Participants viewed eleven 12-lead ECGs taken from
open access on-line libraries (
<ext-link ext-link-type="uri" xlink:href="http://lifeinthefastlane.com/ecg-library/" xlink:show="new">http://lifeinthefastlane.com/ecg-library/</ext-link> 
and
<ext-link ext-link-type="uri" xlink:href="www.emedu.org/ecg_lib/index.htm" xlink:show="new">www.emedu.org/ecg_lib/index.htm</ext-link>) 
and displayed in a random order on a computer screen. The ECGs
represented a selection of conditions that would be
encountered in clinical and training scenarios:</p>

        <p>&#x2022; Anterolateral STEMI (ST-segment elevation
myocardial infarction)</p>
        <p>&#x2022; Atrial Flutter</p>
        <p>&#x2022; Hyperkalaemia</p>
        <p>&#x2022; Torsades de pointes (polymorphic ventricular
tachycardia)</p>
        <p>&#x2022; Wolff-Parkinson-White syndrome (WPW)</p>
        <p>&#x2022; Ventricular tachycardia (VT)</p>
        <p>&#x2022; Left bundle branch block (LBBB)</p>
        <p>&#x2022; Normal sinus rhythm (NSR)</p>
        <p>&#x2022; Supra-ventricular tachycardia (SVT)</p>
        <p>&#x2022; Ventricular paced rhythm</p>
        <p>&#x2022; Sinus tachycardia</p>
        </sec>

      <sec id="S2c">
        <title>Procedure</title>
		
        <p>The ECGs were presented in random sequence. No
time limit was imposed, allowing participants to take as
much time as they needed to reach an interpretation.
Their interpretation, which was spoken aloud, was
recorded with a voice recorder. Tobii X2-60 and 1750
eye trackers were used to capture gaze data as participants
viewed the ECGs. Areas of interest (AOIs) labelled A-M
were generated with Tobii studio software (V.1.2) for
each of the 12-leads and the rhythm strip, which is an
existing lead that is displayed for a longer time period at
the bottom of the image (see Figure 2). Following the
study, participants&#x2019; interpretations were rated as correct
or incorrect for each ECG by two expert interpreters. The
full stimuli, protocol, data and analysis code are available
from our data repository (
<ext-link ext-link-type="uri" xlink:href="http://iamdata.cs.manchester.ac.uk/investigations/12" xlink:show="new">http://iamdata.cs.manchester.ac.uk/investigations/12</ext-link>).</p>
 
<fig id="fig02" fig-type="figure" position="float">
					<label>Figure 2.</label>
					<caption>
						<p>AOIs mapped onto ECG leads, labelled A-M.</p>
					</caption>
					<graphic id="graph02" xlink:href="jemr-10-05-k-figure-02.png"/>
				</fig>

 </sec>
    </sec>
	
    <sec id="S3">
      <title>Analysis</title>
	  
      <p>Many studies focus on determining the similarity of
eye-movements across participants (
        <xref ref-type="bibr" rid="b20">20</xref>
		). Standard
techniques, including heat/focus maps and gaze plots, are
limited, as they often fail to properly display the
sequential/temporal nature of these eye-movements (
        <xref ref-type="bibr" rid="b20">20</xref>
		),
or to support comparisons across multiple
participants without introducing excessive
visual complexity. Gaze plots display the fixation
sequence superimposed on a stimulus, and therefore
potentially allow a visual comparison between
participants. Gaze plots can, however, become overly
complicated, and even meaningless, with large group
comparisons (or even just a small subset of participants).</p>

      <p>Scanpaths can be represented as a set of tokens or
characters, referred to as &#x201C;strings&#x201D;. The string contains
the sequence of AOIs visited by a participant. This can be
seen in an example from two participants in this study
who viewed the anterolateral STEMI ECG.</p>

      <p><italic>P<sub>25</sub> = {M,M,I,I,M,G,G,E,E,B,A,A,M,M,I,I}</italic></p>
      <p><italic>P<sub>19</sub> = {H,E,D,D,E,H,H,G,G,G,I,F,F,F,D}</italic></p>
	  
      <p>Differences in fixation duration, fixation count, or the
total amount of time spent viewing a particular AOI can
be used to identify participant similarity. This does not,
however, capture the similarity in the way participants
visually transition around the ECG. This is a potentially
important factor, as cross referencing different leads of
the ECG is crucial to the correct interpretation of certain
conditions, such as heart attacks. To examine these
similarities we apply the Levenshtein distance to compute
the distance (a measure of dissimilarity) between each
participant and all the other participants in the study or sub-group.
The distance is determined by the minimum number of
insertion, deletion and substitution operations required to
transform one string into another (
        <xref ref-type="bibr" rid="b1 b13">1, 13</xref>
		).</p>

      <p>When viewing the scanpath lengths for each stimulus
we truncate (collapse) the scanpath by removing repeated
consecutive tokens. This is done to focus on the sequence
of AOIs visited, essentially removing fixation frequency,
e.g. a scanpath string consisting of <italic>{M,M,M,B,B,A,B,C}</italic>
would become <italic>{M,B,A,B,C}</italic>. Unless specifically stated,
the results represent the un-truncated scanpaths.</p>
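The truncation step can be sketched in Python with the standard library's itertools.groupby; the function name is illustrative:

```python
from itertools import groupby

def truncate_scanpath(scanpath):
    # Collapse runs of repeated consecutive AOI tokens, keeping only
    # the order in which distinct AOIs were visited.
    return [aoi for aoi, _run in groupby(scanpath)]

print(truncate_scanpath(["M", "M", "M", "B", "B", "A", "B", "C"]))
# -> ['M', 'B', 'A', 'B', 'C']
```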

      <p>The scanpath analysis reported here focuses primarily
on the anterolateral STEMI ECG, as the identification of
a &#x201C;heart attack&#x201D; is a critical skill that is taught to
ECG interpreters of all levels, not only specialists
(cardiologists). In order to identify the STEMI, one needs
to first identify ST-segment elevation, then rule out other
causes (i.e. pericarditis, pacemaker, bundle branch block)
before finally identifying the leads affected (
        <xref ref-type="bibr" rid="b9">9</xref>
		). The
pattern of ST elevation in certain leads identifies what
type of STEMI it is. Table 1 shows the portion of the
heart that the changes reflect. For example ST elevation
in the inferior leads (II, III and aVF) would indicate an
inferior STEMI. There can also be combinations of areas
affected. The anterolateral STEMI would involve ST
elevation in both the lateral and anterior leads. In order to
make the correct interpretation, ST elevation needs to be
identified in each relevant lead. This makes the STEMI
stimulus a good starting point for exploratory analysis,
as with other conditions the salient features can be
identified in different leads either on an individual basis
or systematically.</p>

<table-wrap id="t01" position="float">
					<label>Table 1.</label>
					<caption>
						<p>ECG leads and the portion of the heart affected.</p>
					</caption>
					<table frame="hsides" rules="groups" cellpadding="3">
						<tbody>
          <tr>
            <td rowspan="1" colspan="1">
              <bold>STEMI leads</bold>
            </td>
            <td rowspan="1" colspan="1">
              <bold>Myocardial area</bold>
            </td>
          </tr>
						</tbody>
						<tbody>
          <tr>
            <td rowspan="1" colspan="1">II, III, aVF</td>
            <td rowspan="1" colspan="1">Inferior</td>
          </tr>
          <tr>
            <td rowspan="1" colspan="1">I, aVL, V5, V6</td>
            <td rowspan="1" colspan="1">Lateral</td>
          </tr>
          <tr>
            <td rowspan="1" colspan="1">V1, V2, V3, V4</td>
            <td rowspan="1" colspan="1">Anterior</td>
          </tr>
						</tbody>
					</table>
					</table-wrap>					

      <p>To this end each of the participants&#x2019; scanpaths were
compared against all the other participants&#x2019; for this
stimulus and the results were displayed using a matrix to
allow for rapid visual comparison. The darker the matrix
cell the greater the difference between compared
scanpaths; conversely the lighter the cell the greater the
scanpath similarity. This method of visualization also
makes it easier to spot outliers and make multiple
comparisons simultaneously. In addition to this, we were
interested in the specific areas of the stimulus that were
fixated the most. It was hypothesized that these areas may
be different from the top down researcher-defined AOIs
that were mapped onto each ECG lead. This is because
we know from ECG training texts that in order to
interpret the ECG correctly one needs to focus on specific
parts of the ECG waveform (the various waves, intervals
and segments). In order to define these areas in a
non-arbitrary, data-driven way we use the DBSCAN
(density-based spatial clustering of applications with
noise) clustering algorithm (
        <xref ref-type="bibr" rid="b21">21</xref>
		). This allowed us to cluster
fixations and then determine the smallest radius around
a &#x201C;core point&#x201D; (a point with at least a threshold number
of neighboring points within a given radius). We use
this value to inform the cell size for a
grid (minimum cell dimension = core point diameter). As
the stimulus is rectangular, the smallest cell dimension is
used to determine the width of the cell. This allows cells
to be rectangular, in order to increase coverage of the
stimulus. We are then able to detect fixations in each grid
cell and generate heat maps based on these values. As the
cell sizes for each stimulus are the same, we can then
produce heatmaps for the correct and incorrect groups for
each ECG and directly compare differences between cells
to quantify how similar or different they are as well as
using them to identify key areas of attention. All
statistical analysis was carried out using the R project for
statistical computing, version 3.3.2. (
        <xref ref-type="bibr" rid="b22">22</xref>
		), with &#x3B1; = 0.05.
Mann-Whitney U tests were used to compare groups with
non-parametric data. We also demonstrate the utility of
web diagrams for analyzing scanpath length, and chord
diagrams for showing differences in transition behavior.</p>
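The grid-binning step behind the data-driven heat maps can be sketched as follows; the fixation coordinates and the 50x40 cell size are hypothetical stand-ins (in the study the cell dimensions are derived from the DBSCAN core-point diameter):

```python
from collections import Counter

def grid_counts(fixations, cell_w, cell_h):
    """Bin (x, y) fixation coordinates into rectangular grid cells;
    the per-cell counts drive a heat-map overlay on the stimulus."""
    return Counter((int(x // cell_w), int(y // cell_h))
                   for x, y in fixations)

# Hypothetical fixations (in pixels) and a hypothetical cell size.
fixations = [(10, 5), (45, 30), (60, 5), (10, 75)]
print(grid_counts(fixations, 50, 40))
# cell (0, 0) holds two fixations; (1, 0) and (0, 1) hold one each
```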
    </sec>
	
    <sec id="S4">
      <title>Results</title>
	  
      <p>We present the results in terms of the scanpath lengths
and differences between scanpaths across all stimuli
using the Levenshtein distance. We then focus on the
anterolateral STEMI stimuli, looking at scanpath
similarities for the correct and incorrect interpretation groups.
Finally, we look at the distribution of attention using
data-driven heatmaps, and differences in visual transition
behavior between the salient leads.</p>

      <p>The aggregated scanpath lengths (with truncation)
representing the scanpath as the sequence of AOIs visited are
shown in the web diagram in Figure 3 for both groups for
each ECG. The average length of the scanpaths across all
stimuli for the combined groups was 23 AOIs (SD=
18.25, Mo=9, range=134).</p>

<fig id="fig03" fig-type="figure" position="float">
					<label>Figure 3.</label>
					<caption>
						<p>Average scanpath lengths for each stimulus for correct and incorrect groups</p>
					</caption>
					<graphic id="graph03" xlink:href="jemr-10-05-k-figure-03.png"/>
				</fig>

      <p>Figure 4 shows the average Levenshtein distance per
group for each ECG. As the number of participants
making correct and incorrect interpretations varies
considerably across the different ECGs (Table 2), using standard
statistical comparisons is problematic in all but one case.
It is necessary to group participants into correct and
incorrect interpretation groups per stimulus on a post hoc
basis, as they may get a certain ECG right and another
wrong and vice versa, making it impossible to assign
them to groups prior to beginning the task. The
Anterolateral STEMI (heart attack) has fairly evenly sized
groups, making comparison possible. We compared the average
Levenshtein distance for the correct and incorrect groups
using a Mann-Whitney U test for this stimulus, which
highlights a significant difference (W = 21284, p = .004),
with the incorrect group having a larger Levenshtein
distance on average (M=86, SD=102.63) than the correct
group (M=46, SD=12.53).</p>
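The form of this group comparison can be sketched in pure Python using the normal approximation to the Mann-Whitney U test; the distance values below are synthetic placeholders, not the study data, and the function name is illustrative:

```python
import math

def mann_whitney_u(x, y):
    """Mann-Whitney U statistic for sample x, with a two-sided
    p-value from the normal approximation (no tie correction;
    adequate for moderate sample sizes)."""
    n1, n2 = len(x), len(y)
    # U counts pairs where x beats y; ties contribute one half.
    u = sum((xi > yj) + 0.5 * (xi == yj) for xi in x for yj in y)
    mu = n1 * n2 / 2
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (u - mu) / sigma
    # Two-sided p-value via the standard normal CDF.
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return u, p

# Synthetic Levenshtein distances for two illustrative groups:
# the "incorrect" group is deliberately larger and more variable.
correct = [40, 43, 44, 45, 46, 47, 48, 50]
incorrect = [55, 60, 70, 85, 90, 120, 150, 200]
print(mann_whitney_u(correct, incorrect))
```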

<fig id="fig04" fig-type="figure" position="float">
					<label>Figure 4.</label>
					<caption>
						<p>The average Levenshtein distance for both groups for each ECG (errorbars represent the SE)</p>
					</caption>
					<graphic id="graph04" xlink:href="jemr-10-05-k-figure-04.png"/>
				</fig>
				
<table-wrap id="t02" position="float">
					<label>Table 2.</label>
					<caption>
						<p>The number of participants making correct and incorrect interpretations per ECG.</p>
					</caption>
					<table frame="hsides" rules="groups" cellpadding="3">
						<tbody>
          <tr>
            <td rowspan="1" colspan="1">
              <bold>Stimuli (ECG)</bold>
            </td>
            <td rowspan="1" colspan="1">
              <bold>Correct (n)</bold>
            </td>
            <td rowspan="1" colspan="1">
              <bold>Incorrect (n)</bold>
            </td>
          </tr>
						</tbody>
						<tbody>
          <tr>
            <td rowspan="1" colspan="1">Anterolateral STEMI</td>
            <td rowspan="1" colspan="1">16</td>
            <td rowspan="1" colspan="1">14</td>
          </tr>
          <tr>
            <td rowspan="1" colspan="1">Atrial Flutter</td>
            <td rowspan="1" colspan="1">26</td>
            <td rowspan="1" colspan="1">5</td>
          </tr>
          <tr>
            <td rowspan="1" colspan="1">Hyperkalaemia</td>
            <td rowspan="1" colspan="1">2</td>
            <td rowspan="1" colspan="1">30</td>
          </tr>
          <tr>
            <td rowspan="1" colspan="1">Torsades de pointes</td>
            <td rowspan="1" colspan="1">5</td>
            <td rowspan="1" colspan="1">27</td>
          </tr>
          <tr>
            <td rowspan="1" colspan="1">WPW</td>
            <td rowspan="1" colspan="1">13</td>
            <td rowspan="1" colspan="1">18</td>
          </tr>
          <tr>
            <td rowspan="1" colspan="1">VT</td>
            <td rowspan="1" colspan="1">27</td>
            <td rowspan="1" colspan="1">5</td>
          </tr>
          <tr>
            <td rowspan="1" colspan="1">LBBB</td>
            <td rowspan="1" colspan="1">24</td>
            <td rowspan="1" colspan="1">8</td>
          </tr>
          <tr>
            <td rowspan="1" colspan="1">NSR</td>
            <td rowspan="1" colspan="1">24</td>
            <td rowspan="1" colspan="1">7</td>
          </tr>
          <tr>
            <td rowspan="1" colspan="1">SVT</td>
            <td rowspan="1" colspan="1">10</td>
            <td rowspan="1" colspan="1">21</td>
          </tr>
          <tr>
            <td rowspan="1" colspan="1">Ventricular paced</td>
            <td rowspan="1" colspan="1">9</td>
            <td rowspan="1" colspan="1">22</td>
          </tr>
          <tr>
            <td rowspan="1" colspan="1">Sinus tachycardia</td>
            <td rowspan="1" colspan="1">12</td>
            <td rowspan="1" colspan="1">20</td>
          </tr>
						</tbody>
					</table>
					</table-wrap>				

      <p>Matrix visualizations (Figure 5) are used to compare each
participant against every other participant in the group
(correct or incorrect). The darker the cell, the greater the
distance, meaning that the compared scanpaths are less
similar. The plots are normalized by the maximum
Levenshtein distance to aid visual comparison. Participant
13 (P13M, a student cardiac physiologist) in the incorrect
group has a very different scanpath to all of the other
participants. This participant also has the longest
individual scanpath length (377) and the longest average fixation
duration (M=312.97, SD=384.86). Figure 6 shows the
average fixation duration per participant for each group
for the STEMI ECG.</p>
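The normalized matrix underlying this visualization can be sketched as follows; the AOI strings are illustrative, and the edit distance is recomputed inline so the sketch is self-contained:

```python
from functools import lru_cache

def lev(a, b):
    # Levenshtein distance via memoized recursion
    # (fine for short AOI strings).
    @lru_cache(maxsize=None)
    def d(i, j):
        if i == 0:
            return j
        if j == 0:
            return i
        return min(d(i - 1, j) + 1,                           # deletion
                   d(i, j - 1) + 1,                           # insertion
                   d(i - 1, j - 1) + (a[i - 1] != b[j - 1]))  # substitution
    return d(len(a), len(b))

def distance_matrix(paths):
    # Compare every participant's scanpath with every other one and
    # normalize by the maximum distance, so cell shading lies in [0, 1].
    m = [[lev(p, q) for q in paths] for p in paths]
    top = max(v for row in m for v in row) or 1
    return [[v / top for v in row] for row in m]

# Three illustrative AOI strings (one per hypothetical participant).
print(distance_matrix(["MBABC", "MBC", "HEDGE"]))
```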

<fig id="fig05" fig-type="figure" position="float">
					<label>Figure 5.</label>
					<caption>
						<p>Levenshtein distance plots for correct (left) and incorrect (right) groups for the anterolateral STEMI ECG.</p>
					</caption>
					<graphic id="graph05" xlink:href="jemr-10-05-k-figure-05.png"/>
				</fig>
				
<fig id="fig06" fig-type="figure" position="float">
					<label>Figure 6.</label>
					<caption>
						<p>Average fixation duration for each participant for anterolateral STEMI ECG by group</p>
					</caption>
					<graphic id="graph06" xlink:href="jemr-10-05-k-figure-06.png"/>
				</fig>				

      <p>The average fixation duration for each lead of the ECG
for the anterolateral STEMI (Figure 7) is then examined.
For the fixation duration we apply pairwise comparisons
with Bonferroni correction (&#x3B1; = 0.004). A significant
difference between groups for lead I (W = 628.5, p =
0.002) was identified (Table 3). For both groups, the most
fixations were made in lead V1, followed by the rhythm
strip.</p>

<fig id="fig07" fig-type="figure" position="float">
					<label>Figure 7.</label>
					<caption>
						<p>Average fixation duration for both groups, per lead (errorbars represent the SE)</p>
					</caption>
					<graphic id="graph07" xlink:href="jemr-10-05-k-figure-07.png"/>
				</fig>
				
<table-wrap id="t03" position="float">
					<label>Table 3.</label>
					<caption>
						<p>Pairwise comparisons for each lead (Mann-Whitney U) with Bonferroni correction &#x3B1; = 0.004.</p>
					</caption>
					<table frame="hsides" rules="groups" cellpadding="3">
						<tbody>
          <tr>
            <td rowspan="1" colspan="1">
              <bold>ECG lead name</bold>
            </td>
            <td rowspan="1" colspan="1">W</td>
            <td rowspan="1" colspan="1">p-value</td>
          </tr>
						</tbody>
						<tbody>
          <tr>
            <td rowspan="1" colspan="1">I</td>
            <td rowspan="1" colspan="1">628.5</td>
            <td rowspan="1" colspan="1">0.002*</td>
          </tr>
          <tr>
            <td rowspan="1" colspan="1">II</td>
            <td rowspan="1" colspan="1">467.5</td>
            <td rowspan="1" colspan="1">0.071</td>
          </tr>
          <tr>
            <td rowspan="1" colspan="1">III</td>
            <td rowspan="1" colspan="1">591</td>
            <td rowspan="1" colspan="1">0.741</td>
          </tr>
          <tr>
            <td rowspan="1" colspan="1">aVR</td>
            <td rowspan="1" colspan="1">253</td>
            <td rowspan="1" colspan="1">0.186</td>
          </tr>
          <tr>
            <td rowspan="1" colspan="1">aVL</td>
            <td rowspan="1" colspan="1">2021</td>
            <td rowspan="1" colspan="1">0.702</td>
          </tr>
          <tr>
            <td rowspan="1" colspan="1">aVF</td>
            <td rowspan="1" colspan="1">1532.5</td>
            <td rowspan="1" colspan="1">0.594</td>
          </tr>
          <tr>
            <td rowspan="1" colspan="1">V1</td>
            <td rowspan="1" colspan="1">3479.5</td>
            <td rowspan="1" colspan="1">0.695</td>
          </tr>
          <tr>
            <td rowspan="1" colspan="1">V2</td>
            <td rowspan="1" colspan="1">12994</td>
            <td rowspan="1" colspan="1">0.346</td>
          </tr>
          <tr>
            <td rowspan="1" colspan="1">V3</td>
            <td rowspan="1" colspan="1">5679.5</td>
            <td rowspan="1" colspan="1">0.452</td>
          </tr>
          <tr>
            <td rowspan="1" colspan="1">V4</td>
            <td rowspan="1" colspan="1">675</td>
            <td rowspan="1" colspan="1">0.017</td>
          </tr>
          <tr>
            <td rowspan="1" colspan="1">V5</td>
            <td rowspan="1" colspan="1">1294</td>
            <td rowspan="1" colspan="1">0.022</td>
          </tr>
          <tr>
            <td rowspan="1" colspan="1">V6</td>
            <td rowspan="1" colspan="1">363</td>
            <td rowspan="1" colspan="1">0.046</td>
          </tr>
          <tr>
            <td rowspan="1" colspan="1">Rhythm strip (II)</td>
            <td rowspan="1" colspan="1">6777</td>
            <td rowspan="1" colspan="1">0.016</td>
          </tr>
						</tbody>
					</table>
					</table-wrap>				
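<p>The pairwise testing summarized in Table 3 can be sketched as follows. This is an illustrative reimplementation, not the authors' original analysis (the paper's statistics were computed in R): a two-sided Mann-Whitney U with a normal approximation, compared against the Bonferroni-corrected threshold. The sample values are hypothetical.</p>

```python
import math

def mann_whitney_u(x, y):
    """Two-sided Mann-Whitney U test using rank sums and a normal
    approximation for p (midranks handle ties; no continuity correction)."""
    pooled = sorted(x + y)
    rank = {}
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        mid = (i + 1 + j) / 2.0            # average of 1-based ranks i+1 .. j
        for k in range(i, j):
            rank[pooled[k]] = mid
        i = j
    r1 = sum(rank[v] for v in x)           # rank sum of the first sample
    n1, n2 = len(x), len(y)
    u1 = r1 - n1 * (n1 + 1) / 2.0
    mu = n1 * n2 / 2.0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (u1 - mu) / sigma
    p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
    return u1, p

# 0.05 divided by the 13 pairwise tests in Table 3 rounds to the
# alpha = 0.004 reported in the caption
alpha = 0.05 / 13
```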

      <p>Figure 8 highlights differences between the correct and
incorrect groups for the anterolateral STEMI stimulus in
relation to the dwell time (total fixation time) for each
grid cell (displayed in each cell). The correct group has a
greater dwell time in leads V1 and V2, two of the most
important leads for providing clues to the interpretation of
this particular stimulus (ST-segment elevation in the
anterior and lateral leads). In contrast, the incorrect group
dwells mostly on a less useful lead (aVL). By segmenting
the stimulus into equal-sized regions and providing a
numerical overlay on each cell, specific areas of stimuli
can be compared more readily, with measurable
differences between cells easily computed. This also provides
an overview of both groups&#x27; focus of attention.</p>
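<p>The per-cell dwell overlay described above amounts to binning fixations into equal-sized grid cells and summing their durations. The sketch below illustrates this; the tuple layout, units, and example fixations are assumptions, not the study's actual data format.</p>

```python
from collections import defaultdict

def dwell_per_cell(fixations, cell_w, cell_h):
    """Sum fixation durations into equal-sized grid cells.

    fixations: iterable of (x, y, duration) tuples in pixel coordinates.
    Returns {(col, row): total_dwell} -- the numerical overlay
    displayed in each cell of the gridded heatmap.
    """
    dwell = defaultdict(float)
    for x, y, duration in fixations:
        cell = (int(x // cell_w), int(y // cell_h))
        dwell[cell] += duration
    return dict(dwell)

# Hypothetical fixations on a stimulus divided into 100 x 100 px cells
fixations = [(10, 10, 100), (20, 15, 50), (110, 10, 200)]
overlay = dwell_per_cell(fixations, 100, 100)
```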

<fig id="fig08" fig-type="figure" position="float">
					<label>Figure 8.</label>
					<caption>
						<p>Heatmaps showing the total fixation duration in each grid cell for the anterolateral STEMI stimulus. (a) Correct group, (b) Incorrect group</p>
					</caption>
					<graphic id="graph08" xlink:href="jemr-10-05-k-figure-08.png"/>
					<graphic id="graph09" xlink:href="jemr-10-05-k-figure-09.png"/>							
				</fig>

      <p>Finally, the transitions between the leads (V1-V4)
presenting the most relevant salient information (the highest
degree of ST-segment elevation) were computed for both
groups. Figure 9 shows the number of transitions from
one lead to another, or within the same lead. The number
of transitions is represented by the thickness of the arrow,
with the arrow head showing the direction of the
transition (from - to). The actual number of transitions is also
displayed on the arrow heads. The incorrect group made a
greater number of transitions overall (n=2307) than the
correct group (n=2146).</p>
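<p>Transition counts of the kind plotted in the chord diagrams can be derived from the ordered sequence of fixated AOI labels: each consecutive pair gives one directed (from, to) transition, with matching labels counting as a self-transition within a lead. A minimal sketch, with a hypothetical fixation sequence:</p>

```python
from collections import Counter

def count_transitions(aoi_sequence):
    """Count directed transitions between consecutively fixated AOIs.

    Pairs whose two labels match are self-transitions within a lead.
    Returns a Counter mapping (from_aoi, to_aoi) -> count.
    """
    return Counter(zip(aoi_sequence, aoi_sequence[1:]))

# Hypothetical fixation sequence over the anterior leads
seq = ["V1", "V1", "V2", "V3", "V2", "V2"]
transitions = count_transitions(seq)
# the total is always one less than the number of fixations
total = sum(transitions.values())
```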

<fig id="fig09" fig-type="figure" position="float">
					<label>Figure 9.</label>
					<caption>
						<p>Chord diagrams representing the number of visual transitions from one lead to another (or within the same lead) for the incorrect (left) and correct (right) groups. The thicker the line, the more transitions occurred. The arrow head displays the direction of the transitions.</p>
					</caption>
					<graphic id="graph10" xlink:href="jemr-10-05-k-figure-10.png"/>
				</fig>

    </sec>
	
    <sec id="S5">
      <title>Discussion</title>
	  
      <p>Data-driven analysis can be challenging, especially when
exploring factors such as accuracy, which can only be
determined on a post hoc basis. The various visualizations
applied to the data in this work provide useful insights
into the differences in visual behavior between the two
groups. The &#x201C;heart attack&#x201D; stimulus is of special
interest due to the clinical urgency of the condition:
death from ischemic heart disease remains the leading
cause of mortality globally (
        <xref ref-type="bibr" rid="b23">23</xref>
		).</p>

      <p>Overall, we see greater variability in the scanpaths
between, rather than within, the two groups. When we
consider differences in fixations on the leads of the ECG, we
identify a significant difference between the accurate and
inaccurate groups for lead I using a conservative
approach. Lead I is not one of the leads showing the
greatest degree of ST-segment elevation. It does, however,
help the interpreter to see that there is elevation in the
lateral leads as well as the anterior leads, leading to the
conclusion that the interpretation should reflect lateral as
well as anterior involvement. Comparing the leads on a
pairwise basis may also be overly simplistic, as the time
spent viewing different leads may affect the time spent
viewing subsequent leads. The heatmaps do, however,
indicate that the correct group focuses more attentional
resources on the lead showing the greatest degree of
ST-segment elevation (the salient clue essential to
identifying a heart attack).</p>

      <p>The results of the analysis show large differences
between the participants&#x27; individual scanpaths, which is
indicative of differing search strategies. This difference
could be attributable to the disparate backgrounds of the
participants. There are many different methods of
teaching ECG interpretation that vary in approach and duration (
        <xref ref-type="bibr" rid="b24">24</xref>
		). These methods also differ between countries and
institutions as well as varying according to the medical
discipline that the practitioner belongs to (
        <xref ref-type="bibr" rid="b25">25</xref>
		). Using a
matrix to visualize the similarities/differences between
the scanpaths with the Levenshtein distance is a helpful
initial way of gaining a comparative overview of multiple
participants in a study, and locating outliers who have
markedly different or similar scanpaths. This can
complement traditional methods, such as box plots.</p>
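<p>A pairwise matrix of this kind can be sketched in a few lines: each scanpath is encoded as a string of AOI labels and compared with the classic dynamic-programming edit distance. The one-character-per-AOI encoding and the example paths are illustrative assumptions.</p>

```python
def levenshtein(a, b):
    """Edit distance between two AOI strings (two-row DP formulation)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def distance_matrix(scanpaths):
    """Symmetric matrix of pairwise edit distances between scanpaths,
    as visualized in the Levenshtein distance matrix."""
    return [[levenshtein(p, q) for q in scanpaths] for p in scanpaths]

# Hypothetical scanpaths, one character per fixated AOI
paths = ["ABCD", "ABCE", "DDDD"]
m = distance_matrix(paths)   # an outlier shows up as a row of large values
```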

      <p>An example of this can be seen in the Levenshtein
distance matrix (Figure 5), where participant 13 is a clear
outlier, with a markedly different scanpath from all of the
other participants in the group. This shows that metrics
such as dwell time and fixation duration alone do not give
us the whole picture with regard to behavior and strategy.
A richer understanding can be obtained by combining
approaches to explore different aspects, such as temporal
and sequential factors.</p>

      <p>Scanpath analysis suffers from some limitations,
including the issue of scanpath length, with very different
lengths confounding alignment calculations (
        <xref ref-type="bibr" rid="b19">19</xref>
		). It
should also be noted that visual behavior is very rich, and
&#x201C;na&#xEF;ve&#x201D; scanpath analysis will not tell the whole story.
Future work will focus on refining this approach, by
considering visual transitions between leads, which is
discussed in more detail in other work (
        <xref ref-type="bibr" rid="b26">26</xref>
		), and will also
consider how factors such as accuracy of interpretation
affect the results in greater detail.</p>

      <p>The gridded heatmap visualizations serve a qualitative
function, as visual differences in fixation duration can be
quite striking. Because the areas (cells) share the same
size, direct quantitative comparisons can also be made
between specific areas. Gridded AOIs also allow analysis
to take place in a content-independent manner (
        <xref ref-type="bibr" rid="b27">27</xref>
		). The use
of gridded AOIs and the segmentation approach is
consistent with the recommendation of (
        <xref ref-type="bibr" rid="b28">28</xref>
		) that AOI
margins should be predefined or based on data. In this
case we can see that the fixations are clustered around
smaller areas within the leads. This is consistent with the
fact that practitioners need to measure changes in the
durations and morphologies of different parts of the ECG
waveform in different conditions (
        <xref ref-type="bibr" rid="b29">29</xref>
		).</p>

      <p>This is in keeping with previous work demonstrating
that people tend to focus on some leads more than others (
        <xref ref-type="bibr" rid="b30">30</xref>
		).
This indicates that participants were drawn toward
specific features, possibly the lead or a component of the
waveform that displays features of the ECG abnormality. This
may also be the case regardless of making a correct or
incorrect interpretation, as a participant may notice an
abnormal feature without necessarily understanding its
significance. Eye tracking data is frequently used to
augment usability studies (
        <xref ref-type="bibr" rid="b2">2</xref>
		). The small sample sizes
frequently used in usability studies, coupled with the
richness of eye-tracking data, can make analysis of datasets
such as the one used in this study challenging and often
not amenable to traditional statistical approaches. The
techniques described in this paper go some way toward
providing a quantitative approach to exploring this type
of data, and we therefore anticipate that they will have
scope wider than the ECG sub-domain, as they provide a
means of understanding whether individuals are
employing a systematic approach, or have some intrinsic
similarity in their visual behavior.</p>

      <sec id="S5a">
        <title>Conclusions and future work</title>
		
        <p>The methods presented here offer a way of exploring
and visualizing the visual behavior of practitioners
viewing ECGs. They allow us to visualize differences in
scanpaths that can indicate different search strategies, which
may result from different training or experience. A
weighted distance metric could also be introduced to
incorporate the effect of time spent viewing the areas, as
well as transitions between them. The techniques in this
work provide a way of viewing the similarities and
differences in multiple scanpaths and stimuli
simultaneously, providing a quantifiable measure of difference without
increasing visual complexity. The results of this study
may be of future use in clinical practice, as differences in
visual behavior may be used to identify potential failures
to correctly interpret ECGs, which could be fed back to
the practitioner in training scenarios.</p>
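<p>One possible form of the weighted distance metric proposed above (a sketch under assumed inputs, not part of the present study) is an edit distance over (AOI, dwell) pairs whose substitution cost grows with the difference in time spent viewing the areas:</p>

```python
def weighted_distance(a, b, cost):
    """Edit distance over sequences of (aoi_label, dwell) pairs, with a
    user-supplied substitution cost; insertions/deletions cost 1."""
    prev = list(range(len(b) + 1))
    for i, pa in enumerate(a, 1):
        cur = [i]
        for j, pb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,            # deletion
                           cur[j - 1] + 1,         # insertion
                           prev[j - 1] + cost(pa, pb)))
        prev = cur
    return prev[-1]

def dwell_cost(u, v):
    """Illustrative substitution cost: 1 for different AOIs, otherwise
    the relative difference in dwell time on the same AOI."""
    (la, da), (lb, db) = u, v
    if la != lb:
        return 1.0
    if da == db:
        return 0.0
    return abs(da - db) / max(da, db)

# Two hypothetical scanpaths over the same lead with different dwell times
d = weighted_distance([("V1", 100)], [("V1", 50)], dwell_cost)
```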
      </sec>
	  
      <sec id="S5b">
        <title>Ethics and Conflict of Interest</title>
		
        <p>The authors declare that the contents of the article are
in agreement with the ethics described in
<ext-link ext-link-type="uri" xlink:href="http://biblio.unibe.ch/portale/elibrary/BOP/jemr/ethics.html" xlink:show="new">http://biblio.unibe.ch/portale/elibrary/BOP/jemr/ethics.html</ext-link> 
and that there is no conflict of interest regarding the
publication of this paper.</p>
      </sec>
	  
       <sec id="S5c" sec-type="COI-statement">	  
        <title>Acknowledgements</title>
        <p><bold>EPSRC</bold>: EP/K502947/1 and EP/L504877/1</p>
        </sec>
      </sec>
  </body>
  <back>  
<ref-list>
<ref id="b1"><mixed-citation publication-type="book" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Holmqvist</surname> <given-names>K</given-names></string-name>, <string-name><surname>Nystrom</surname> <given-names>M</given-names></string-name>, <string-name><surname>Anderson</surname> <given-names>R</given-names></string-name>, <string-name><surname>Dewhurst</surname> <given-names>R</given-names></string-name>, <string-name><surname>Jarodzka</surname> <given-names>H</given-names></string-name>, <string-name><surname>Van de Weijer</surname> <given-names>J</given-names></string-name></person-group>. <source>Eye tracking: A comprehensive guide to methods and measures</source>. <publisher-loc>New York</publisher-loc>: <publisher-name>Oxford University Press</publisher-name>; <year>2011</year>. <size units="page">537</size> pp.</mixed-citation></ref>
<ref id="b2"><mixed-citation publication-type="book" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Ehmke</surname> <given-names>C</given-names></string-name>, <string-name><surname>Wilson</surname> <given-names>S</given-names></string-name></person-group>. <source>Identifying web usability problems from eye-tracking data</source>. <publisher-name>HCI</publisher-name>; <year>2007</year>. pp. <fpage>119</fpage>–<lpage>28</lpage>. <pub-id pub-id-type="doi">10.14236/ewic/HCI2007.12</pub-id></mixed-citation></ref>
<ref id="b3"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Krupinski</surname> <given-names>EA</given-names></string-name>, <string-name><surname>Nodine</surname> <given-names>CF</given-names></string-name>, <string-name><surname>Kundel</surname> <given-names>HL</given-names></string-name></person-group>. <article-title>Enhancing recognition of lesions in radiographic images using perceptual feedback</article-title>. <source>Opt Eng</source>. <year>1998</year>;<volume>37</volume>(<issue>3</issue>):<fpage>813</fpage>–<lpage>8</lpage>.<issn>0091-3286</issn></mixed-citation></ref>
<ref id="b4"><mixed-citation publication-type="book-chapter" specific-use="linked"><person-group person-group-type="author"><string-name><surname>Litchfield</surname> <given-names>D</given-names></string-name>, <string-name><surname>Ball</surname> <given-names>LJ</given-names></string-name>, <string-name><surname>Donovan</surname> <given-names>T</given-names></string-name>, <string-name><surname>Manning</surname> <given-names>DJ</given-names></string-name>, <string-name><surname>Crawford</surname> <given-names>T</given-names></string-name></person-group>. <chapter-title>Learning from others: effects of viewing another person’s eye movements while searching for chest nodules.</chapter-title> Sahiner B, Manning DJ, editors. Med Imaging. <year>2008</year> Mar 6;6917:691715-691715–9. <pub-id pub-id-type="doi">10.1117/12.768812</pub-id></mixed-citation></ref>
<ref id="b5"><mixed-citation publication-type="conference" specific-use="linked"><person-group person-group-type="author"><string-name><surname>Law</surname> <given-names>B</given-names></string-name>, <string-name><surname>Atkins</surname> <given-names>MS</given-names></string-name>, <string-name><surname>Lomax</surname> <given-names>AJ</given-names></string-name>, <string-name><surname>Mackenzie</surname> <given-names>CL</given-names></string-name></person-group>. <article-title>Eye Gaze Patterns Differentiate Novice and Experts in a Virtual Laparoscopic Surgery Training Environment.</article-title> ETRA ’04 Proc 2004 Symp Eye Track Res Appl. <year>2004</year>;1:41–8. <pub-id pub-id-type="doi">10.1145/968363.968370</pub-id></mixed-citation></ref>
<ref id="b6"><mixed-citation publication-type="book" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Davies</surname> <given-names>A</given-names></string-name>, <string-name><surname>Scott</surname> <given-names>A</given-names></string-name></person-group>. <source>Starting to read ECGs: The Basics</source>. <publisher-loc>London</publisher-loc>: <publisher-name>Springer-Verlag</publisher-name>; <year>2014</year>. <size units="page">166</size> pp. <pub-id pub-id-type="doi">10.1007/978-1-4471-4962-0</pub-id></mixed-citation></ref>
<ref id="b7"><mixed-citation publication-type="book" specific-use="unparsed"><person-group person-group-type="author"><string-name><surname>Wagner</surname> <given-names>G</given-names></string-name></person-group>. <source>Marriott’s Practical Electrocardiography.</source> 11th ed. Lippincott Williams &amp; Wilkins; <year>2008</year>. 468 p.</mixed-citation></ref>
<ref id="b8"><mixed-citation publication-type="book" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Clifford</surname> <given-names>GD</given-names></string-name>, <string-name><surname>Azuaje</surname> <given-names>F</given-names></string-name>, <string-name><surname>McSharry</surname> <given-names>PE</given-names></string-name></person-group>. <source>Advanced Methods and Tools for ECG Data Analysis</source>. <publisher-loc>Boston</publisher-loc>: <publisher-name>Artech house, Inc</publisher-name>; <year>2006</year>. <size units="page">384</size> pp.</mixed-citation></ref>
<ref id="b9"><mixed-citation publication-type="book" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Davies</surname> <given-names>A</given-names></string-name>, <string-name><surname>Scott</surname> <given-names>A</given-names></string-name></person-group>. <source>Starting to read ECGs: A Comprehensive Guide to Theory and Practice</source>. <publisher-loc>London</publisher-loc>: <publisher-name>Springer-Verlag</publisher-name>; <year>2015</year>. <size units="page">207</size> pp.</mixed-citation></ref>
<ref id="b10"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Holst</surname> <given-names>H</given-names></string-name>, <string-name><surname>Ohlsson</surname> <given-names>M</given-names></string-name>, <string-name><surname>Peterson</surname> <given-names>C</given-names></string-name>, <string-name><surname>Edenbrandt</surname> <given-names>L</given-names></string-name></person-group>. <article-title>A confident decision support system for interpreting electrocardiograms</article-title>. <source>Clin Physiol</source>. <year>1999</year> <month>Sep</month>;<volume>19</volume>(<issue>5</issue>):<fpage>410</fpage>–<lpage>8</lpage>. <pub-id pub-id-type="doi">10.1046/j.1365-2281.1999.00195.x</pub-id><pub-id pub-id-type="pmid">10516892</pub-id><issn>0144-5979</issn></mixed-citation></ref>
<ref id="b11"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Salerno</surname> <given-names>SM</given-names></string-name>, <string-name><surname>Alguire</surname> <given-names>PC</given-names></string-name>, <string-name><surname>Waxman</surname> <given-names>HS</given-names></string-name></person-group>. <article-title>Competency in interpretation of 12-lead electrocardiograms: a summary and appraisal of published evidence</article-title>. <source>Ann Intern Med</source>. <year>2003</year> <month>May</month>;<volume>138</volume>(<issue>9</issue>):<fpage>751</fpage>–<lpage>60</lpage>. <pub-id pub-id-type="doi">10.7326/0003-4819-138-9-200305060-00013</pub-id><pub-id pub-id-type="pmid">12729431</pub-id><issn>0003-4819</issn></mixed-citation></ref>
<ref id="b12"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Anh</surname> <given-names>D</given-names></string-name>, <string-name><surname>Krishnan</surname> <given-names>S</given-names></string-name>, <string-name><surname>Bogun</surname> <given-names>F</given-names></string-name></person-group>. <article-title>Accuracy of electrocardiogram interpretation by cardiologists in the setting of incorrect computer analysis</article-title>. <source>J Electrocardiol</source>. <year>2006</year> <month>Jul</month>;<volume>39</volume>(<issue>3</issue>):<fpage>343</fpage>–<lpage>5</lpage>. <pub-id pub-id-type="doi">10.1016/j.jelectrocard.2006.02.002</pub-id><pub-id pub-id-type="pmid">16777525</pub-id><issn>0022-0736</issn></mixed-citation></ref>
<ref id="b13"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Levenshtein</surname> <given-names>VI</given-names></string-name></person-group>. <article-title>Binary codes capable of correcting deletions, insertions, and reversals</article-title>. <source>Sov Phys Dokl</source>. <year>1966</year>;<volume>10</volume>:<fpage>707</fpage>–<lpage>10</lpage>.<issn>0038-5689</issn></mixed-citation></ref>
<ref id="b14"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Le Meur</surname> <given-names>O</given-names></string-name>, <string-name><surname>Baccino</surname> <given-names>T</given-names></string-name></person-group>. <article-title>Methods for comparing scanpaths and saliency maps: strengths and weaknesses</article-title> <comment>[Internet]</comment>. <source>Behav Res Methods</source>. <year>2013</year> <month>Mar</month>;<volume>45</volume>(<issue>1</issue>):<fpage>251</fpage>–<lpage>66</lpage>. Available from: <ext-link ext-link-type="uri" xlink:href="http://www.ncbi.nlm.nih.gov/pubmed/22773434">http://www.ncbi.nlm.nih.gov/pubmed/22773434</ext-link> <pub-id pub-id-type="doi">10.3758/s13428-012-0226-9</pub-id><pub-id pub-id-type="pmid">22773434</pub-id><issn>1554-351X</issn></mixed-citation></ref>
<ref id="b15"><mixed-citation publication-type="conference" specific-use="linked"><person-group person-group-type="author"><string-name><surname>Pan</surname> <given-names>B</given-names></string-name>, <string-name><surname>Hembrooke</surname> <given-names>HA</given-names></string-name>, <string-name><surname>Gay</surname> <given-names>GK</given-names></string-name>, <string-name><surname>Granka</surname> <given-names>LA</given-names></string-name>, <string-name><surname>Feusner</surname> <given-names>MK</given-names></string-name>, <string-name><surname>Newman</surname> <given-names>JK</given-names></string-name></person-group>. <article-title>The determinants of web page viewing behavior: an eye-tracking study.</article-title> <source>Proc ETRA ’04 Symp Eye Track Res Appl</source>. <year>2004</year>;1(212):<fpage>147</fpage>–<lpage>54</lpage>. <pub-id pub-id-type="doi">10.1145/968363.968391</pub-id></mixed-citation></ref>
<ref id="b16"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Meijering</surname> <given-names>B</given-names></string-name>, <string-name><surname>van Rijn</surname> <given-names>H</given-names></string-name>, <string-name><surname>Taatgen</surname> <given-names>NA</given-names></string-name>, <string-name><surname>Verbrugge</surname> <given-names>R</given-names></string-name></person-group>. <article-title>What eye movements can tell about theory of mind in a strategic game</article-title>. <source>PLoS One</source>. <year>2012</year>;<volume>7</volume>(<issue>9</issue>):<fpage>e45961</fpage>. <pub-id pub-id-type="doi">10.1371/journal.pone.0045961</pub-id><pub-id pub-id-type="pmid">23029341</pub-id><issn>1932-6203</issn></mixed-citation></ref>
<ref id="b17"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><string-name><surname>Galgani</surname> <given-names>F</given-names></string-name>, <string-name><surname>Sun</surname> <given-names>Y</given-names></string-name>, <string-name><surname>Lanzi</surname> <given-names>PL</given-names></string-name>, <string-name><surname>Leigh</surname> <given-names>J</given-names></string-name></person-group>. <article-title>Automatic Analysis of Eye Tracking Data for Medical Diagnosis.</article-title> <year>2009</year>; <pub-id pub-id-type="doi">10.1109/CIDM.2009.4938649</pub-id></mixed-citation></ref>
<ref id="b18"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Over</surname> <given-names>EA</given-names></string-name>, <string-name><surname>Hooge</surname> <given-names>IT</given-names></string-name>, <string-name><surname>Erkelens</surname> <given-names>CJ</given-names></string-name></person-group>. <article-title>A quantitative measure for the uniformity of fixation density: the Voronoi method</article-title>. <source>Behav Res Methods</source>. <year>2006</year> <month>May</month>;<volume>38</volume>(<issue>2</issue>):<fpage>251</fpage>–<lpage>61</lpage>. <pub-id pub-id-type="doi">10.3758/BF03192777</pub-id><pub-id pub-id-type="pmid">16956102</pub-id><issn>1554-351X</issn></mixed-citation></ref>
<ref id="b19"><mixed-citation publication-type="conference" specific-use="linked"><person-group person-group-type="author"><string-name><surname>Goldberg</surname> <given-names>JH</given-names></string-name>, <string-name><surname>Helfman</surname> <given-names>JI</given-names></string-name></person-group>. <article-title>Scanpath clustering and aggregation.</article-title> Proc 2010 Symp Eye-Tracking Res Appl - ETRA ’10. <year>2010</year>;227. <pub-id pub-id-type="doi">10.1145/1743666.1743721</pub-id></mixed-citation></ref>
<ref id="b20"><mixed-citation publication-type="unknown" specific-use="parsed"><person-group person-group-type="author"><string-name><surname>Cirimele</surname> <given-names>J</given-names></string-name>, <string-name><surname>Heer</surname> <given-names>J</given-names></string-name>, <string-name><surname>Card</surname> <given-names>SK</given-names></string-name></person-group>. <article-title>The VERP Explorer: A Tool for Exploring Eye Movements of Visual-Cognitive Tasks Using Recurrence Plots.</article-title> <year>2014</year>;</mixed-citation></ref>
<ref id="b21"><mixed-citation publication-type="conference" specific-use="parsed"><person-group person-group-type="author"><string-name><surname>Ester</surname> <given-names>M</given-names></string-name>, <string-name><surname>Kriegel</surname> <given-names>HP</given-names></string-name>, <string-name><surname>Sander</surname> <given-names>J</given-names></string-name>, <string-name><surname>Xu</surname> <given-names>X</given-names></string-name></person-group>. <article-title>A Density-Based Algorithm for Discovering Clusters in Large Spatial Databases with Noise.</article-title> In: <source>KDD-96 Proceedings</source>. <year>1996</year>. p. <fpage>226</fpage>–<lpage>31</lpage>.</mixed-citation></ref>
<ref id="b22"><mixed-citation publication-type="web-page" specific-use="unparsed"><person-group person-group-type="author"><collab>R Core Team</collab></person-group>. R: A Language and Environment for Statistical Computing [Internet]. Vienna: R Foundation for Statistical Computing; <year>2014</year>. Available from: <ext-link ext-link-type="uri" xlink:href="http://www.r-project.org/">http://www.r-project.org/</ext-link></mixed-citation></ref>
<ref id="b23"><mixed-citation publication-type="web-page" specific-use="unparsed"><person-group person-group-type="author"><collab>WHO</collab></person-group>. World Health Organisation: The top 10 causes of death [Internet]. <year>2017</year> [<date-in-citation content-type="access-date">cited 2017 Oct 26</date-in-citation>]. Available from: <ext-link ext-link-type="uri" xlink:href="http://www.who.int/mediacentre/factsheets/fs310/en/">http://www.who.int/mediacentre/factsheets/fs310/en/</ext-link></mixed-citation></ref>
<ref id="b24"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Alinier</surname> <given-names>G</given-names></string-name>, <string-name><surname>Gordon</surname> <given-names>R</given-names></string-name>, <string-name><surname>Harwood</surname> <given-names>C</given-names></string-name>, <string-name><surname>Hunt</surname> <given-names>WB</given-names></string-name></person-group>. <article-title>12-lead ECG training: the way forward</article-title>. <source>Nurse Educ Today</source>. <year>2006</year> <month>Jan</month>;<volume>26</volume>(<issue>1</issue>):<fpage>87</fpage>–<lpage>92</lpage>. <pub-id pub-id-type="doi">10.1016/j.nedt.2005.08.004</pub-id><pub-id pub-id-type="pmid">16182413</pub-id><issn>0260-6917</issn></mixed-citation></ref>
<ref id="b25"><mixed-citation publication-type="unknown" specific-use="unparsed">Kadish AH, Buxton AE, Kennedy HL, Knight BP, Mason JW, Schuger CD, et al. ACC/AHA clinical competence statement on electrocardiography and ambulatory electrocardiography: A report of the ACC/AHA/ACP-ASIM task force on clinical competence (ACC/AHA Committee to develop a clinical competence statement on electrocardiography and ambulatory electrocardiography). Circulation. <year>2001</year> Dec 18;104(25):3169–78.</mixed-citation></ref>
<ref id="b26"><mixed-citation publication-type="book" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Davies</surname> <given-names>A</given-names></string-name></person-group>. <source>ECG Eye-tracking Experiment 1</source> [<comment>Internet</comment>]<publisher-loc>Manchester</publisher-loc>: <publisher-name>The University of Manchester</publisher-name>; <year>2016</year>., Available from <ext-link ext-link-type="uri" xlink:href="https://zenodo.org/record/996475#.WckterKGO70">https://zenodo.org/record/996475#.WckterKGO70</ext-link></mixed-citation></ref>
<ref id="b27"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Goldberg</surname> <given-names>JH</given-names></string-name>, <string-name><surname>Kotval</surname> <given-names>XP</given-names></string-name></person-group>. <article-title>Computer interface evaluation using eye movements: methods and constructs</article-title>. <source>Int J Ind Ergon</source>. <year>1999</year>;<volume>24</volume>(<issue>6</issue>):<fpage>631</fpage>–<lpage>45</lpage>. <pub-id pub-id-type="doi">10.1016/S0169-8141(98)00068-7</pub-id><issn>0169-8141</issn></mixed-citation></ref>
<ref id="b28"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Orquin</surname> <given-names>JL</given-names></string-name>, <string-name><surname>Ashby</surname> <given-names>NJ</given-names></string-name>, <string-name><surname>Clarke</surname> <given-names>AD</given-names></string-name></person-group>. <article-title>Areas of Interest as a Signal Detection Problem in Behavioral Eye-Tracking Research</article-title>. <source>J Behav Decis Making</source>. <year>2016</year>;<volume>29</volume>(<issue>2–3</issue>):<fpage>103</fpage>–<lpage>15</lpage>. <pub-id pub-id-type="doi">10.1002/bdm.1867</pub-id><issn>0894-3257</issn></mixed-citation></ref>
<ref id="b29"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Dayan</surname> <given-names>M</given-names></string-name>, <string-name><surname>Kreutzer</surname> <given-names>S</given-names></string-name>, <string-name><surname>Clark</surname> <given-names>CA</given-names></string-name></person-group>. <article-title>Tractography of the optic radiation: a repeatability and reproducibility study</article-title> <comment>[Internet]</comment>. <source>NMR Biomed</source>. <year>2015</year> <month>Apr</month>;<volume>28</volume>(<issue>4</issue>):<fpage>423</fpage>–<lpage>31</lpage>. Available from: <ext-link ext-link-type="uri" xlink:href="http://ovidsp.ovid.com/ovidweb.cgi?T=JS&amp;PAGE=reference&amp;D=medl&amp;NEWS=N&amp;AN=25703088">http://ovidsp.ovid.com/ovidweb.cgi?T=JS&amp;PAGE=reference&amp;D=medl&amp;NEWS=N&amp;AN=25703088</ext-link> <pub-id pub-id-type="doi">10.1002/nbm.3266</pub-id><pub-id pub-id-type="pmid">25703088</pub-id><issn>0952-3480</issn></mixed-citation></ref>
<ref id="b30"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><string-name><surname>Bond</surname> <given-names>RR</given-names></string-name>, <string-name><surname>Zhu</surname> <given-names>T</given-names></string-name>, <string-name><surname>Finlay</surname> <given-names>DD</given-names></string-name>, <string-name><surname>Drew</surname> <given-names>B</given-names></string-name>, <string-name><surname>Kligfield</surname> <given-names>PD</given-names></string-name>, <string-name><surname>Guldenring</surname> <given-names>D</given-names></string-name>, <etal>et al.</etal></person-group> <article-title>Assessing computerized eye tracking technology for gaining insight into expert interpretation of the 12-lead electrocardiogram: an objective quantitative approach</article-title>. <source>J Electrocardiol</source>. <year>2014</year> <month>Nov-Dec</month>;<volume>47</volume>(<issue>6</issue>):<fpage>895</fpage>–<lpage>906</lpage>. <pub-id pub-id-type="doi">10.1016/j.jelectrocard.2014.07.011</pub-id><pub-id pub-id-type="pmid">25110276</pub-id><issn>0022-0736</issn></mixed-citation></ref>
</ref-list>
 </back>  
</article>


