<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.0 20120330//EN" "JATS-journalpublishing1.dtd">

<article article-type="research-article" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:mml="http://www.w3.org/1998/Math/MathML">
 <front>
    <journal-meta>
	<journal-id journal-id-type="publisher-id">Jemr</journal-id>
      <journal-title-group>
        <journal-title>Journal of Eye Movement Research</journal-title>
      </journal-title-group>
      <issn pub-type="epub">1995-8692</issn>
	  <publisher>								
	  <publisher-name>Bern Open Publishing</publisher-name>
	  <publisher-loc>Bern, Switzerland</publisher-loc>
	</publisher>
    </journal-meta>
    <article-meta>
	<article-id pub-id-type="doi">10.16910/jemr.11.2.9</article-id> 
	  <article-categories>								
				<subj-group subj-group-type="heading">
					<subject>Research Article</subject>
				</subj-group>
		</article-categories>
      <title-group>
        <article-title>The rhythm of cognition – Effects of an
auditory beat on oculomotor control in
reading and sequential scanning</article-title>
      </title-group>
	   <contrib-group> 
				<contrib contrib-type="author">
					<name>
						<surname>Lange</surname>
						<given-names>Elke B.</given-names>
					</name>
					<xref ref-type="aff" rid="aff1">1</xref>
				</contrib>
				<contrib contrib-type="author">
					<name>
						<surname>Pieczykolan</surname>
						<given-names>Aleks</given-names>
					</name>
					<xref ref-type="aff" rid="aff2">2</xref>
				</contrib>
				<contrib contrib-type="author">
					<name>
						<surname>Trukenbrod</surname>
						<given-names>Hans A.</given-names>
					</name>
					<xref ref-type="aff" rid="aff3">3</xref>
				</contrib>
				<contrib contrib-type="author">
					<name>
						<surname>Huestegge</surname>
						<given-names>Lynn</given-names>
					</name>
					<xref ref-type="aff" rid="aff2">2</xref>
				</contrib>                				
        <aff id="aff1">
		<institution>Max Planck Institute for Empirical Aesthetics, Frankfurt</institution>,   <country>Germany</country>
        </aff>
        <aff id="aff2">
		<institution>University of Würzburg</institution>,   <country>Germany</country>
        </aff>
        <aff id="aff3">
		<institution>University of Potsdam</institution>,   <country>Germany</country>
        </aff>                
		</contrib-group>   

		
	  <pub-date date-type="pub" publication-format="electronic"> 
		<day>20</day>  
		<month>8</month>
        <year>2018</year>
      </pub-date>
	  <pub-date date-type="collection" publication-format="electronic"> 
	  <year>2018</year>
	</pub-date>
      <volume>11</volume>
      <issue>2</issue>
	 <elocation-id>10.16910/jemr.11.2.9</elocation-id> 
	<permissions> 
	<copyright-year>2018</copyright-year>
	<copyright-holder>Lange, E.B., Pieczykolan, A., Trukenbrod, H.A., &#x26; Huestegge, L.</copyright-holder>
	<license license-type="open-access">
  <license-p>This work is licensed under a Creative Commons Attribution 4.0 International License, 
  (<ext-link ext-link-type="uri" xlink:href="https://creativecommons.org/licenses/by/4.0/">
    https://creativecommons.org/licenses/by/4.0/</ext-link>), which permits unrestricted use and redistribution provided that the original author and source are credited.</license-p>
</license>
	</permissions>
      <abstract>
        <p>Eye-movement behavior is inherently rhythmic. Even without cognitive input, the eyes
never rest, as saccades are generated 3 to 4 times per second. Based on an embodied view
of cognition, we asked whether mental processing in visual cognitive tasks is also rhythmic
in nature by studying the effects of an external auditory beat (rhythmic background music)
on saccade generation in exemplary cognitive tasks (reading and sequential scanning).
While in applied settings background music has been demonstrated to impair reading
comprehension, the effect of musical tempo on eye-movement control during reading or
scanning has not been investigated so far. We implemented a tempo manipulation in four
steps as well as a silent baseline condition, while participants completed a text reading or a
sequential scanning task that differed from each other in terms of underlying cognitive
processing requirements. The results revealed that increased tempo of the musical beat
sped up fixations in text reading, while the presence (vs. absence) of the auditory stimulus
generally reduced overall reading time. In contrast, sequential scanning was unaffected by
the auditory pacemaker. These results were supported by additionally applying Bayesian
inference statistics. Our study provides evidence against a cognitive load account (i.e., that
spare resources during low-demand sequential scanning allow for enhanced processing of
the external beat). Instead, the data suggest an interpretation in favor of a modulation of the
oculomotor saccade timer by irrelevant background music in cases involving highly automatized
oculomotor control routines (here: in text reading).</p>
      </abstract>
      <kwd-group>
        <kwd>Reading</kwd>
        <kwd>visual sequential scanning</kwd>
        <kwd>background music</kwd>
      </kwd-group>
    </article-meta>
  </front>	
  <body>

    <sec id="S1">
      <title>Introduction</title>

<p>The human body moves in rhythmic (repetitive) patterns, not only when
dancing, but also during walking (step by step), breathing (in and out),
and when moving our eyes, the latter being a fundamental constituent of
visual cognition. The interplay between rhythm and cognition has mainly
been investigated in one direction: how we perceive and produce rhythm,
usually in the domain of music psychology (e.g., (<xref ref-type="bibr" rid="b31 b54 b78">31, 54, 78</xref>)). However,
we here address the more general idea that cognition itself may be
rhythmic in nature, especially when associated bodily systems can be
regarded as inherently rhythm-based (such as eye movements in the
context of visual cognition). The last decades have witnessed growing
evidence for an embodied view of cognition (e.g., (<xref ref-type="bibr" rid="b8">8</xref>)), a view that also
entails the idea that cognitive processes are essentially determined by
associated bodily systems (<xref ref-type="bibr" rid="b75">75</xref>).</p>

<p>The idea that rhythms can shape cognitive processing has been
discussed extensively in the domain of neuroscience. Brain activation
contains rhythmic oscillations. Such synchronized activation plays a
major role for cognitive processes involved in memory representations
and attentional selection (for a review see (<xref ref-type="bibr" rid="b14">14</xref>)). Moreover, intrinsic
brain rhythms shape visual (<xref ref-type="bibr" rid="b5 b24">5, 24</xref>), as well as auditory (<xref ref-type="bibr" rid="b28">28</xref>), perception
via changes in the excitability of local neuronal ensembles (<xref ref-type="bibr" rid="b44">44</xref>).
Interestingly, brain rhythms can also be affected by perceptual rhythms.
Neural oscillations can be shifted upon perceptual stimulation, an
effect called entrainment (<xref ref-type="bibr" rid="b44 b43">44, 43</xref>; for a review see (<xref ref-type="bibr" rid="b6">6</xref>)). In addition, a
single stimulation in one modality (e.g., auditory) can reset the
oscillatory phase for processing stimuli from another modality (e.g.,
visual), demonstrating cross-modal interactions in perception (<xref ref-type="bibr" rid="b16 b17 b48">16, 17, 48</xref>). Unlike entrainment, auditory-driven phase resets do not require a sequence of auditory stimuli.</p>

<p>At least two findings indicate the relevance of brain rhythms for the
saccade system: saccade generation has been shown to align with visual
perceptual oscillations (<xref ref-type="bibr" rid="b4 b30">4, 30</xref>), and saccadic reaction times to a visual
stimulus onset were reduced by a preceding sound that induced a
cross-modal phase reset (<xref ref-type="bibr" rid="b12">12</xref>). That is, attentional dynamics do affect
motor behavior.</p>

<p>In the field of visual cognition, classical theories focus on
stimulus-dependent processes. However, in recent years the idea of a
close coupling of perception and action has transferred to the visual
cognition domain, and an “active vision” perspective has been proposed
(<xref ref-type="bibr" rid="b18">18</xref>), which highlights the role of eye movements as an essential
component shaping the visual cognition machinery. Thus, typical everyday
visual cognition tasks like reading or searching/scanning for an object
have been successfully studied using eye tracking techniques (e.g., <xref ref-type="bibr" rid="b57 b58">57, 58</xref>).</p>

<p>In this context, two diverging views regarding eye movement control
have been proposed, namely a “sequential processing” view (e.g., in
reading, the E-Z Reader model, see (<xref ref-type="bibr" rid="b59 b60">59, 60</xref>)) and a “modulated pulse” view
(e.g., in reading, the SWIFT model, see (<xref ref-type="bibr" rid="b13">13</xref>); for scene viewing, the
CRISP model, see (<xref ref-type="bibr" rid="b53">53</xref>); for sequential scanning, see (<xref ref-type="bibr" rid="b71">71</xref>)). According to
the former, eye movements are driven by an essentially sequential
cognitive process aimed at processing stimuli, for example,
comprehending words (in the case of reading) or perceiving and
categorizing objects (in the case of visual search). Corresponding
models therefore assume that eye movements are triggered by a certain
stage/level of cognitive stimulus processing (e.g., whether a certain
word promises to be decoded successfully, see (<xref ref-type="bibr" rid="b60">60</xref>)). Thus, eye movements
are triggered only when a certain level of stimulus processing has been
reached (“direct control”, (<xref ref-type="bibr" rid="b51">51</xref>)).</p>

<p>However, sequential processing models do not adequately capture one
crucial aspect of the eye movement system, namely the fact that the eyes
“never rest”, but always move. In particular, saccades are continually
generated every 250 to 300 ms (<xref ref-type="bibr" rid="b57">57</xref>), regardless of any cognitive
processing demands. This rhythmic behavior is fundamentally different
from other effector systems (e.g., those involving arms, feet, vocal
utterances etc.), which usually rest when no movement is required. This
special characteristic of the eyes is captured more convincingly in
models assuming an autonomous pulse that triggers eye movements
(sometimes referred to as “indirect control”, see (<xref ref-type="bibr" rid="b32">32</xref>)). Here, cognitive
control is regarded as a potential modulator of this pulse (“mixed
control”, see (<xref ref-type="bibr" rid="b25 b71">25, 71</xref>)). For instance, the pulse is assumed to be slowed
down when a difficult word needs to be processed (see (<xref ref-type="bibr" rid="b77 b13">77, 13</xref>)). More
specifically, prolonged fixation durations are assumed to occur either
due to a decreased rate at which the saccade timer accumulates
activation until a “move” threshold is reached, or due to a cancellation
of an ongoing saccade program (<xref ref-type="bibr" rid="b13 b53">13, 53</xref>). There is evidence for “mixed
control” in a variety of tasks, e.g. reading (<xref ref-type="bibr" rid="b61 b62">61, 62</xref>), scene viewing
(<xref ref-type="bibr" rid="b27">27</xref>), and visual search (<xref ref-type="bibr" rid="b45">45</xref>). The “mixed control” account is also
supported by the observation of two distinct saccade populations, of
which only one is affected by task settings (<xref ref-type="bibr" rid="b52">52</xref>). Whether processing
facilitation can also yield a <italic>decrease</italic> of fixation
duration is currently under debate (<xref ref-type="bibr" rid="b26 b73">26, 73</xref>). Taken together, this
“modulated pulse” view of eye movement control seems especially suited
to capture the rhythmic nature of eye movement control, which – when
taking the embodiment perspective of cognition into account – in turn
should shape (visual) cognition. The “modulated pulse” view is also
highly compatible with the neuropsychological evidence discussed above:
a saccade timer might relate to ongoing brain oscillations (e.g., (<xref ref-type="bibr" rid="b4 b12 b30">4, 12, 30</xref>)).</p>

<p>Based on the assumption of rhythmic cognition, we sought a method to
test, in a simple and straightforward way, the idea of a pulse
underlying both visual cognition and oculomotor action. Drawing on previous
findings of inter-sensory crosstalk and cross-modal attention (e.g.,
(<xref ref-type="bibr" rid="b67">67</xref>)), we reasoned that one promising way to manipulate the putative
pulse of visual cognition would be to utilize an external (auditory)
pacemaker, that is, a very simple musical stimulus introducing a
rhythmic auditory beat. We assumed that rhythmic patterns on the
irrelevant (auditory) channel should modulate the pulse in the
task-relevant visual processing channel connected to the generation of
eye movements. For example, in “modulated pulse” models of saccade
generation, the saccade timer is characterized by a random walk to the
threshold of saccade initiation. The tempo of musical beats might affect
the transition rate of this random walk and alter fixation durations
accordingly.</p>
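The random-walk account sketched above can be made concrete with a small simulation. This is an illustrative sketch only, not the implementation of any published model (SWIFT, CRISP): the threshold, the two transition rates, and the idea that a faster beat maps onto a higher rate are all assumptions for demonstration.

```python
import random

def fixation_duration(rate, threshold=100, rng=None):
    """First-passage time (in ms) of a discrete random-walk saccade timer.

    Each millisecond the timer accumulates one unit of activation with
    probability `rate`; a saccade is triggered once `threshold` is reached.
    All parameter values here are hypothetical.
    """
    rng = rng or random
    activation, t = 0, 0
    while activation < threshold:
        t += 1
        if rng.random() < rate:
            activation += 1
    return t

rng = random.Random(1)
# Hypothetical mapping: slower beat -> lower transition rate, and vice versa.
slow_beat = [fixation_duration(0.35, rng=rng) for _ in range(500)]
fast_beat = [fixation_duration(0.45, rng=rng) for _ in range(500)]
# A higher transition rate yields shorter mean first-passage times,
# i.e., shorter simulated fixation durations.
```

The expected first-passage time is threshold/rate, so raising the rate from 0.35 to 0.45 shortens mean simulated fixations from roughly 286 ms to roughly 222 ms, which is the direction of modulation the "modulated pulse" view predicts.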

<p>Interestingly, previous studies in the domain of music psychology
have already used research designs in which music was presented in
addition to, for example, a relevant reading or search/scanning task,
but with a completely different research focus, namely to address the
question of whether background music (and which type of music) affects
performance in the primary task (e.g., scanning: (<xref ref-type="bibr" rid="b10 b37">10, 37</xref>); reading: (<xref ref-type="bibr" rid="b2 b22 b68">2, 22, 68</xref>)). For example, literature on reading performance showed impaired
comprehension in conjunction with task-irrelevant background music (<xref ref-type="bibr" rid="b2 b26 b38 b41 b68">2, 26, 38, 41, 68</xref>; but see (<xref ref-type="bibr" rid="b42">42</xref>), for a beneficial effect). It has been
assumed that processing music requires cognitive resources which
conflict with the activation of working memory representations (e.g.,
(<xref ref-type="bibr" rid="b65">65</xref>)). This might also relate to the finding that background music
increased fixation durations during scene viewing (<xref ref-type="bibr" rid="b66">66</xref>; but see (<xref ref-type="bibr" rid="b21">21</xref>)). In
line with this argument, background music also impairs performance in a
wide range of memory tasks (<xref ref-type="bibr" rid="b7">7</xref>). Interestingly, faster music impaired
reading comprehension even more than slower music (<xref ref-type="bibr" rid="b68">68</xref>). This effect
might be due to higher information load with faster music, because more
musical events per time unit must be processed. Alternatively, arousal
might contribute to the results. Faster music is associated with higher
perceived arousal, and high arousal music might impair reading
comprehension (<xref ref-type="bibr" rid="b11">11</xref>), as well as memory performance (<xref ref-type="bibr" rid="b7">7</xref>). There is one
study showing faster reading with fast, compared to slow, background
music, using a stopwatch to measure paragraph reading on mobile
computers in a cafeteria setting (<xref ref-type="bibr" rid="b40">40</xref>). This study indicates that indeed
eye-movement control might be affected by the tempo of background
music.</p>

<p>Corresponding studies using visual scanning tasks have provided mixed
results so far as well. Whereas instrumental and lyrical music can speed
up performance when presented simultaneously with a scanning task (<xref ref-type="bibr" rid="b10">10</xref>),
participants took longer to complete a visual search task after exposure
to slow versus fast music (<xref ref-type="bibr" rid="b37">37</xref>). Taken together, these previous studies
on the effects of background music on reading and visual search yield
evidence for the possibility of inter-modal (auditory on visual)
crosstalk.</p>

<p>In the present study, we focused on two types of visual cognition
tasks which are relevant for everyday behavior: (a) text reading, and
(b) sequential scanning. Those tasks differ in several aspects. For
example, cognitive processing load during reading for comprehension is
relatively high, since it involves many memory-based and linguistic
processing demands (from syntax to semantics) to extract the meaning of
the text (e.g., (<xref ref-type="bibr" rid="b33 b34 b35">33, 34, 35</xref>)). In contrast, cognitive processing load (in
terms of memory-based and linguistic processing) is particularly low in
our utilized sequential scanning task, since the decision of which item
should be processed next is predetermined in a simple, task-inherent
manner. Specifically, the search array consisted of Landolt-like rings,
and the side of the opening of each ring indicated the direction of
which object to scan next (“follow the gaps in the Landolt-rings”) until
participants found a closed target ring (see (<xref ref-type="bibr" rid="b70">70</xref>)).</p>

<p>The two tasks differ also in a second aspect: the underlying
oculomotor (instead of cognitive) control demands. Eye-movement control
during reading is known to be a highly-learned process characterized by
largely autonomous scanning strategies, as indicated by studies showing
similar oculomotor patterns when normal text is exchanged by z-strings
which lack any semantic content (<xref ref-type="bibr" rid="b29 b72">29, 72</xref>). In contrast, selection of the
next saccade target in our sequential scanning task is not highly
learned and automatized. Instead, it rather depends on moment-to-moment
decision-making that is based on meticulous visual attention processing
involving detection of the gap location within the stimulus in order to
program the appropriate next saccade (<xref ref-type="bibr" rid="b70">70</xref>). Taken together, the two
tasks thus differ on two levels of control demands, namely high-level
cognitive demands (text reading more demanding than sequential
scanning), and lower-level oculomotor demands (sequential scanning more
demanding than highly trained and automatized text reading). While,
during reading, the ongoing goal of the reader is text comprehension,
during scanning, participants should mainly intend to determine the
direction of the following saccade.</p>

<p>Our general hypothesis was as follows: if visual cognition is
inherently rhythmic in nature (as assumed on the basis of “modulated
pulse” accounts of eye movement control in reading and search), it
should be possible to influence these processes by employing an external
auditory beat of varying tempo. This influence on processing rhythms
should become observable in terms of corresponding shifts in temporal
eye movement parameters (i.e., a faster beat should yield shorter
temporal oculomotor parameters), which in turn reflect temporal
characteristics of the underlying cognitive processes.</p>

<p>Based on this general hypothesis, we reasoned that two outcomes are
conceivable, depending on how the external pulse is processed by
participants:</p>

<p>(a) Processing of the auditory beat might consume central
(comparatively high-level) processing resources. Adding this auditory
stimulus might then lead to a resource conflict resulting in general
slowing of the reading or scanning task. However, the scanning task is
more likely to be modulated by the auditory beat. Since text
reading for comprehension is assumed to be more cognitively demanding
than sequential scanning, the auditory beat should be more effective in
the latter, because central resources in reading are consumed by the
primary task (reading comprehension) and thus no resources are available
for processing of the irrelevant auditory stimulus. Additionally, if the
auditory beat influences oculomotor control through a more high-level
cognitive processing route, one should expect to see effects especially
in those oculomotor parameters that are known to be determined by more
high-level, cognitive processing. For example, in reading, the central
cognitively relevant unit is the word, that is, cognitive oculomotor
control (in terms of the decision “where” and “when” to go next) is
basically word centered (or object-centered in the case of visual object
search, see (<xref ref-type="bibr" rid="b58">58</xref>)). Therefore, the assumption of object-based processing
predicts effects on gaze durations or total reading times (which are
determined by cognitive decisions based on successful word/object
decoding), rather than on basic fixation durations. We will refer to
this reasoning as the <italic>high-level cognitive load
account</italic>.</p>

<p>(b) Processing of the auditory beat might operate on a lower (less
cognitive) control level more specifically devoted to basic oculomotor
control. On such a basic oculomotor control level, we reasoned that text
reading relies on largely automatized oculomotor control routines (i.e.,
there is relatively low oculomotor control demand), whereas the present
sequential scanning task is considerably less trained and associated
with oculomotor control decisions from stimulus to stimulus. This
greater demand on oculomotor control decisions might prevent any
influence of the auditory beat in the sequential scanning task. Instead,
oculomotor control during text reading should be affected. Given that
this presumable crosstalk operates on a relatively low level of basic
oculomotor control, one would expect more basic oculomotor control
parameters to be affected (i.e., basic fixation durations instead of
gaze durations). This is also plausible since the above cited “modulated
pulse” models of oculomotor control are devoted to explaining the
control of basic fixation durations. We will refer to this possibility
as the <italic>oculomotor control load account</italic>.</p>

<p>There is one important caveat regarding the predictions for temporal
parameters that needs to be considered. Previous literature on text
reading and sequential scanning has already provided a nearly exhaustive
picture of relevant variables determining oculomotor parameters, which
together explain a remarkable portion of oculomotor processing
variability (e.g., (<xref ref-type="bibr" rid="b58">58</xref>), for the case of reading). As a consequence,
there is little room left for remaining variables (including external
pacemakers) to affect these parameters. Thus, even though we consider
the present hypotheses as highly relevant on a theoretical level (as
outlined above), it is clear from the start that any potential effects
should be very small (i.e., in the range of milliseconds). To make the
observation of such small effects more likely, we decided to have
participants read through not just one, but many text passages (and to
scan through several displays, respectively) in order to maximize the
reliability of individual performance estimates. Additionally, we
utilized not just two but four different tempi for the auditory beat to
further minimize the probability of observing a Type I error, i.e. a
false positive result.</p>

<p>In order to implement a more natural experimental situation, we did
not use a basic metronome as an external auditory stimulus, but instead
composed a melodic-rhythmic pattern (resembling trance-like music with a
non-ambiguous, continuous pulse). In this way, our study is also open to
interpretation in the context of more applied research questions (i.e.,
regarding the influence of background music and its tempo on reading and
scanning performance, see above for a brief literature review).</p>

<p>The tempi differed strongly, ranging from 80 to 140 beats per minute
(bpm), which translates to inter-beat intervals of 750 to 429 ms, or
beat frequencies of 1.33 to 2.33 Hz (when regarding the quarter note as
the beat throughout, see Table 1). However, fixation durations are
usually about 225 ms in silent reading and about 330 ms in scene viewing (see
(<xref ref-type="bibr" rid="b57">57</xref>)), which translates to frequencies of 3–4 Hz. Importantly, then, the
pulse of saccade generation and the beats of our stimuli were on very
different time scales. Nevertheless, we expected some (albeit small)
modulation effects. Note that our study does not touch on entrainment
effects in the narrow sense, which in this case would be reflected in
saccades locking onto the beat. First, the range of tempi makes beat
entrainment unlikely: eye movement behavior is unlikely to be forced to
slow down to a rate of 1.33 to 2.33 Hz, as given by the quarter
beat. Entrainment to eighth or sixteenth notes is conceivable but
would require rather complex processing of these simple stimuli. Second,
the auditory beats presented in our study were not synchronized in any
way to the eye movement recording, rendering a direct relational
entrainment analysis impossible. Instead, we simply compared temporal
eye movement parameters (e.g., fixation durations) across four different
auditory beat conditions to uncover a systematic effect of an external
pulse.</p>

<table-wrap id="t01" position="float">
					<label>Table 1.</label>
					<caption>
						<p>Features of the four tempi applied in the current study.</p>
					</caption>
					<table frame="hsides" rules="groups" cellpadding="3">
						<thead>
      <tr>
        <th></th>
        <th colspan="3">Inter-Beat-Interval (ms)</th>
        <th colspan="3">Beat Frequency (Hz)</th>
      </tr>
    </thead>
    <tbody>
      <tr>
        <td>Beat (bpm)</td>
        <td>Eighth</td>
        <td>Quarter</td>
        <td>Halves</td>
        <td>Eighth</td>
        <td>Quarter</td>
        <td>Halves</td>
      </tr>
      <tr>
        <td>80</td>
        <td>375</td>
        <td>750</td>
        <td>1,500</td>
        <td>2.67</td>
        <td>1.33</td>
        <td>0.67</td>
      </tr>
      <tr>
        <td>100</td>
        <td>300</td>
        <td>600</td>
        <td>1,200</td>
        <td>3.33</td>
        <td>1.67</td>
        <td>0.83</td>
      </tr>
      <tr>
        <td>120</td>
        <td>250</td>
        <td>500</td>
        <td>1,000</td>
        <td>4</td>
        <td>2</td>
        <td>1</td>
      </tr>
      <tr>
        <td>140</td>
        <td>214</td>
        <td>429</td>
        <td>857</td>
        <td>4.67</td>
        <td>2.33</td>
        <td>1.17</td>
      </tr>
    </tbody>
  </table>
</table-wrap>

<p><italic>Note.</italic> Inter-beat intervals and beat frequencies are
presented at the eighth-, quarter-, and half-note level; the quarter
note would most likely be perceived as the beat at an intermediate
tempo. However, individual beat perception is difficult to predict: when
tempo decreases, the perceived beat reference might shift to the eighth
note (or to the half note when tempo increases).</p>
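The entries of Table 1 follow directly from the tempo in bpm. The short helper below reproduces the conversion; the function name and dictionary keys are illustrative, not taken from the study's materials.

```python
def beat_metrics(bpm):
    """Convert a quarter-note tempo (bpm) into inter-beat intervals (ms)
    and beat frequencies (Hz) at the eighth-, quarter-, and half-note level."""
    quarter_ms = 60000 / bpm  # one quarter-note inter-beat interval in ms
    return {
        "eighth_ms": quarter_ms / 2,
        "quarter_ms": quarter_ms,
        "half_ms": quarter_ms * 2,
        "eighth_hz": bpm * 2 / 60,
        "quarter_hz": bpm / 60,
        "half_hz": bpm / 120,
    }

for bpm in (80, 100, 120, 140):
    m = beat_metrics(bpm)
    print(bpm, round(m["quarter_ms"]), round(m["quarter_hz"], 2))
```

For example, 80 bpm yields a quarter-note interval of 750 ms (1.33 Hz), and 140 bpm yields 429 ms (2.33 Hz), matching the range stated in the text.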
    </sec>

    <sec id="S2">
      <title>Methods</title>
    <sec id="S2a">
      <title>Participants</title>

<p>Forty students of the University of Würzburg participated in the
experiment (31 female and 9 male) with a mean age of 25 years
(<italic>SD =</italic> 3, range = 18–31). All participants reported
normal or corrected-to-normal vision and were naïve about the purpose of
the experiment. They gave informed consent and received either course
credits or were financially compensated (15 €) for their participation.
The experimental session took about 2 hours. The study was performed in
accordance with the ethical standards described in the Declaration of
Helsinki.</p>
    </sec>

    <sec id="S2b">
      <title>Apparatus</title>

<p>Stimuli were presented on a 21-inch cathode ray tube (CRT) monitor (temporal
resolution: 100 Hz; spatial resolution: 1024 × 768 pixels). A head and
chin rest reduced head movements of the participants. The distance
between chin rest and the center of the monitor was 65 cm. Gaze location
was registered by measuring the right eye’s pupil using a
desktop-mounted infrared reflection system (EyeLink 1000, SR Research,
Ontario, Canada) with a temporal resolution of 1,000 Hz. The experiment
ran on Windows 7 on a PC. Stimulus presentation and eye movement
recording were controlled by Experiment Builder (SR Research, Ontario,
Canada).</p>
    </sec>

    <sec id="S2c">
      <title>Material</title>

<p><bold>Reading task.</bold> The text material was a subset of a German
text corpus created for a different study; a detailed description can be
found in (<xref ref-type="bibr" rid="b56">56</xref>). The text type was a non-fictional text about the Inuit
culture comprising 4,451 words in total. This report was split into 65
passages, with each passage containing six lines of text. Passages were
arranged in 5 blocks, with 13 passages per block. Blocks were of similar
length (942, 937, 881, 914, 877 words). For each passage, a question
regarding the semantic content was prepared, for which the participants
had to generate a verbal answer. This procedure ensured attentive
reading for comprehension. Letter size was 0.44° by 0.24° (height by
width) for uppercase letters and 0.24° by 0.24° for lowercase letters
(using an equidistant font). The reading task started with
a fixation cross (size: 0.53° by 0.53°) prior to the presentation of
each passage.</p>

<p><bold>Sequential scanning task.</bold> The task included 50 trials,
arranged in 5 blocks with 10 trials each. The number of trials differed
from the reading task to adjust for the processing time between tasks.
For every trial, a different sequential scanning display was generated
beforehand. The visual display consisted of an 18 x 18 grid containing
black Landolt Cs<sub>ij</sub> (i = horizontal position, j = vertical
position) on a white background, with a line width of 0.08°, and an
opening (gap size: 0.08°) at one of the four positions: left, right, top,
bottom (see Figure 1 for an example array). Symbols had a diameter of
0.88°. Average horizontal and vertical distance between stimulus
elements was 1.32° (measured center to center). The position of Landolt
Cs varied horizontally and vertically around the grid centers. This
spatial jitter corresponded to one eighth of the size of a Landolt C (±
0.11°). Line thickness of the <italic>start symbol</italic> was doubled.
The <italic>end symbol</italic> was defined by a closed ring without a
gap. Sequence length of the visual scan path was 50 to 60 symbols in
each scanning display.</p>
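A display of this kind can be sketched as a self-avoiding path on the grid, with each visited ring's gap pointing toward the next symbol. This is a hypothetical reconstruction under stated assumptions (self-avoiding path, restart on dead ends), not the authors' actual stimulus-generation code; all names are illustrative.

```python
import random

GRID = 18  # 18 x 18 grid of Landolt Cs, as in the study
MOVES = {"left": (-1, 0), "right": (1, 0), "up": (0, -1), "down": (0, 1)}

def make_scan_path(length, rng):
    """Random self-avoiding path of `length` cells; returns a list of (col, row).
    Restarts from scratch whenever the walk runs into a dead end."""
    while True:
        path = [(rng.randrange(GRID), rng.randrange(GRID))]
        visited = {path[0]}
        while len(path) < length:
            options = []
            for dx, dy in MOVES.values():
                nxt = (path[-1][0] + dx, path[-1][1] + dy)
                if 0 <= nxt[0] < GRID and 0 <= nxt[1] < GRID and nxt not in visited:
                    options.append(nxt)
            if not options:
                break  # dead end: restart the walk
            step = rng.choice(options)
            path.append(step)
            visited.add(step)
        if len(path) == length:
            return path

def gap_directions(path):
    """Each symbol's gap points toward the next symbol on the path;
    the final (end) symbol is a closed ring and gets no gap."""
    dirs = []
    for (c0, r0), (c1, r1) in zip(path, path[1:]):
        for name, (dx, dy) in MOVES.items():
            if (c1 - c0, r1 - r0) == (dx, dy):
                dirs.append(name)
    return dirs

rng = random.Random(0)
path = make_scan_path(55, rng)   # sequence length of 50-60 symbols, as in the study
gaps = gap_directions(path)
```

The remaining grid cells would then be filled with distractor rings whose gap directions are irrelevant to the task, and the spatial jitter (±0.11°) would be applied at rendering time.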

<fig id="fig01" fig-type="figure" position="float">
					<label>Figure 1.</label>
					<caption>
						<p>Example of a trial in the sequential
scanning task, consisting of 18-by-18 symbols. The start symbol (bold
line) is located in column 14, row 4, the end symbol (closed circle) in
column 16, row 14 (counting columns and rows from the upper left
corner). Participants had to follow a sequence of symbols from the start
symbol to the end symbol. A sequence was defined by the openings of
Landolt Cs, e.g. from the start symbol: one symbol to the left, followed
by one symbol up, one symbol to the left, etc.</p>
					</caption>
					<graphic id="graph01" xlink:href="jemr-11-02-i-figure-01.png"/>
				</fig>

<p><bold>Music</bold>. A professional sound engineer created the musical
piece using the software Ableton Live 8 (8.0.9) (audio files are
available online as supplementary material). The goal was to compose a “musical
metronome”, that is, a very basic musical stimulus without surprising
features. The stimulus comprised three major elements: bass drum,
bass-synthesizer, harmony-synthesizer. The bass drum was continuously
playing a very simple quarter note pulse, thus ensuring a maximally
unambiguous beat and hence tempo perception. The bass- and the
harmony-synthesizers played a simple harmonic sequence of tonic,
dominant, tonic, subdominant<xref ref-type="fn" rid="fn1">1</xref>,
changing the harmony on the first beat of each bar. The original
composition was presented at four different tempi (80, 100, 120, and
140 bpm), while all other aspects were kept constant. Sound
pressure level was kept constant within each session and between
participants. It was adjusted to approximately 55-60 dB. A fifth
condition without any music served as control condition. Musical stimuli
were presented via supra-aural headphones (AKG, model MK 271).</p>
    </sec>

    <sec id="S2d">
      <title>Procedure</title>

<p>The experimental sessions were conducted individually with one
participant at a time. The experimenter was present in order to operate
the eye tracking system (i.e., for calibration routines before each
experimental block), but seated out of the participant’s sight (to
prevent attentional distraction, see (<xref ref-type="bibr" rid="b76">76</xref>)). Participants received
instructions about the reading/scanning task. They were neither
instructed to attend to the music nor to ignore
it.</p>

<p><bold>Reading task.</bold> In the reading task, each block started
with a 9-point calibration routine. After successful calibration, the
trial began with a fixation cross presented in the upper left quadrant
of the screen (located 6.61° from the upper and 3.26° from the left edge
of the display). Participants had to fixate it and press the space bar
for confirmation. Upon successful fixation, the text was presented with
the first letter of the first word at the position of the previous
fixation cross. The participants’ task was to read the text and to press
the space bar again when they finished reading. This was followed by a
comprehension question that appeared centrally on the screen. The
experimenter manually coded whether the oral response was correct. To be
classified as correct, the response had to convey the intended meaning,
even if it did not match the exact wording. Participants initiated the
next trial by pressing the space bar; after a delay of 1,000 ms, the
next trial started.</p>

<p><bold>Sequential scanning task.</bold> At the beginning of each
block, a 13-point calibration routine was performed. We opted for this
more precise calibration because of the higher spatial resolution of the
display. Each trial began with the presentation of the start symbol. Its
position within the 18-by-18 grid was randomized. Participants had to
fixate on this symbol and press the space bar for confirmation. Upon
successful fixation, the complete 18 x 18 symbol array appeared.
Participants were instructed to find the end symbol, i.e., a closed
ring, by following a path through the array of Landolt Cs. The path was
defined by the openings of symbols. For example, a Landolt
C<sub>ij</sub> with an opening to the left indicated that the next
target symbol was located one symbol to the left (position: i-1, j),
which in turn indicated the next to-be-fixated symbol by its opening
(see Fig. 1; see (<xref ref-type="bibr" rid="b69 b70">69, 70</xref>), for an analogue scanning task). When
participants reached the <italic>end symbol</italic>, they indicated the
search end by pressing the space bar. The next trial started 200 ms
after the keypress with a new start symbol.</p>
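<p>The path rule can be made concrete with a short sketch. The encoding of openings and the function name are hypothetical; only the rule itself comes from the text: each opening points one symbol further, and the closed ring ends the path. Following the paper&#x2019;s convention, i is the horizontal and j the vertical index, with rows counted from the upper left corner, so &#x201C;up&#x201D; decreases j.</p>

```python
# Hypothetical encoding of the scan-path rule: each Landolt C's opening
# points to the next symbol; the end symbol (closed ring) has no opening.
STEPS = {"left": (-1, 0), "right": (1, 0), "up": (0, -1), "down": (0, 1)}

def follow_path(openings, start):
    """Follow openings from `start` until a closed ring (None) is reached.

    `openings` maps (i, j) grid positions to an opening direction or None.
    Returns the visited sequence of positions, ending at the end symbol.
    """
    path = [start]
    pos = start
    while openings[pos] is not None:
        di, dj = STEPS[openings[pos]]
        pos = (pos[0] + di, pos[1] + dj)
        path.append(pos)
    return path
```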
    </sec>

    <sec id="S2e">
      <title>Design</title>

<p>Auditory background condition (five levels: music in four different
tempi and one silent control) was manipulated as an independent variable
within participants in both the reading and the scanning task. The five
levels of the auditory condition were blocked, resulting in ten blocks
in total. The order of task (reading, scanning) was counterbalanced
across participants, with either five blocks reading task followed by
five blocks scanning task or vice versa. The auditory background
situation was kept constant within one block, but the serial order of
the five levels was counterbalanced between participants. The serial
order of the trial passages or scanning displays was kept constant. This
was necessary for the text reading task, because the text contained a
semantically meaningful story and could not be scrambled. The five
auditory conditions were chosen to make the following specific
comparisons: firstly, the analysis focused on the subset of blocks
involving music with four different tempi (effect of tempo), and,
secondly, all music conditions were compared with the silent control
trials (effect of music presence).</p>
    </sec>

    <sec id="S2f">
      <title>Data Analysis</title>

<p>One block of one participant was not recorded due to technical
failure in the scanning task, so we excluded this participant in the
scanning task. We used the automated event-detection procedure of SR
Research (see above) to distinguish fixations from saccades. We defined
the space covered by a word or Landolt C and its related blank space as
interest areas. The software determines interest areas such that the
empty space between symbols (words or Landolt Cs) is assigned halfway to
the adjacent areas. For instance, the blank space between two words is
split in half, with the left half belonging to the preceding word and
the right half to the following one. We recorded a total of 307,106
fixations in the reading task and 202,664 in the scanning task across
all participants. Only fixations located in interest areas were further
analyzed (exclusion of 0.84% of all fixations in the reading task, 0.02%
in the scanning task). We defined fixations with a duration shorter than
75 ms or longer than the mean plus three standard deviations as
outliers and excluded them (additional exclusion of 4.53% in reading,
3.28% in scanning). The reading task resulted in 290,617 valid fixations
and the scanning task resulted in 195,971 valid fixations. Taking the
resulting saccades into account, we recorded 63.56% forward saccades,
19.47% refixations, and 16.97% backward saccades in the reading task.
These proportions are in excellent agreement with the reading literature
(<xref ref-type="bibr" rid="b56 b57">56, 57</xref>). In the scanning task, we primarily observed forward saccades
to the next symbol and refixations within a symbol. The proportion of
refixations was 40.66%, which is higher than previously reported
(<xref ref-type="bibr" rid="b69 b70">69, 70</xref>), but this was likely due to the visually more
challenging display (unlike in earlier studies, the spatial positions of
the Landolt Cs were not located on a perfectly regular grid but deviated
slightly from that arrangement).</p>
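<p>The exclusion rule can be illustrated with a minimal sketch, assuming (as one plausible reading of the procedure) that the mean and standard deviation are computed after the sub-75-ms fixations have been removed; the function name and data layout are ours.</p>

```python
from statistics import mean, stdev

def exclude_outliers(durations_ms):
    """Two-step exclusion: drop fixations shorter than 75 ms, then drop
    those longer than the mean plus three standard deviations."""
    kept = [d for d in durations_ms if d >= 75]
    cutoff = mean(kept) + 3 * stdev(kept)
    return [d for d in kept if d <= cutoff]
```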

<p>For the detected fixations, we then analyzed fixation durations, gaze
durations, and total reading times. Gaze duration reflects the duration
of the cognitive process underlying the decision of when to move towards
the next word. It is a word-based measure, defined as the summed
durations of all fixations on one word during first-pass reading
(i.e., excluding regressions). Our data set comprised 138,923
valid gaze durations (outliers were excluded in a similar manner as for
fixation durations). Total reading time is defined as the sum of all
fixation durations on one word, including those when a word was fixated
multiple times or passages were re-read. As such, it is a measure of
overall word processing time. Correspondingly, in the scanning task, we
summed all fixation durations on a symbol in the 18-by-18 grid. We also
analyzed the mean task completion time, which was the time between start
of reading a passage or scanning a visual display and the key press of
participants signaling the end of the task in each trial.</p>
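<p>The difference between the two word-based measures can be sketched as follows; the data layout (a chronological list of fixated word indices with durations) and the function names are hypothetical, but the definitions match those given above.</p>

```python
def gaze_duration(fixations, word):
    """Sum first-pass fixation durations on `word`.

    `fixations` is a chronological list of (word_index, duration_ms).
    First pass ends once the eyes leave the word; later revisits
    (regressions) do not count toward gaze duration.
    """
    total = 0
    entered = False
    for w, dur in fixations:
        if w == word:
            entered = True
            total += dur
        elif entered:
            break  # first pass is over; ignore later revisits
    return total

def total_reading_time(fixations, word):
    """Sum ALL fixation durations on `word`, including revisits."""
    return sum(dur for w, dur in fixations if w == word)
```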

<p>We deliberately decided not to analyze both tasks within a single
statistical model, because the tasks differ in too many respects to
allow for a meaningful direct comparison. For example, as outlined in
the introduction, word processing in text reading is guided by
linguistic control demands and predictive processes based on semantic
context, while the scanning task requires moment-to-moment decisions
about where to move next in a two-dimensional, highly structured array.
In addition, reading results in mostly horizontal saccades, while
scanning requires vertical saccades as well, which are known to have a
different timing profile than horizontal ones. Finally, making incorrect
saccades in the scanning task pushes the eyes onto the wrong track
(e.g., into dead ends), whereas the penalty for an incorrect saccade in
the reading task is less severe. These differences (apart from the general
difference between words and Landolt rings) are known to strongly
determine oculomotor control, and thus prohibit direct statistical
comparisons of both temporal and spatial eye movement parameters across
the two tasks. Therefore, we used Bayesian procedures to be able to
qualitatively compare result patterns across the two tasks (see
below).</p>

<p>Our analysis for each dependent variable was two-fold: first, we
incorporated the four music conditions (80, 100, 120, 140 bpm) in a
General Linear Model/ANOVA and tested whether a linear trend across
increasing tempo conditions emerged. Second, to test for an effect of
music presence, we additionally compared performance averaged across all
music conditions with the silent control condition.</p>
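<p>The linear-trend test can be sketched as a one-sample t-test on per-participant linear-contrast scores; this is the standard textbook construction (with the equivalence F(1, df) = t&#xB2;), not the authors&#x2019; analysis code, and the function name and data layout are assumptions.</p>

```python
from math import sqrt
from statistics import mean, stdev

WEIGHTS = [-3, -1, 1, 3]  # linear contrast weights for 80, 100, 120, 140 bpm

def linear_trend_t(per_subject_means):
    """One-sample t-test on per-participant linear-contrast scores.

    `per_subject_means` is a list of [m80, m100, m120, m140] rows,
    one per participant. Returns (t, df); F(1, df) equals t ** 2.
    """
    scores = [sum(w * m for w, m in zip(WEIGHTS, row))
              for row in per_subject_means]
    n = len(scores)
    t = mean(scores) / (stdev(scores) / sqrt(n))
    return t, n - 1
```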

<p>As a complement to classical ANOVAs, we computed Bayes factors (<xref ref-type="bibr" rid="b63 b64">63, 64</xref>). Unlike conventional significance testing, the Bayes factor can be
interpreted continuously and quantifies the evidence in favor of one
hypothesis over another. Importantly, it also allows us to argue in
favor of the null hypothesis. For an interpretation of the magnitude of
the Bayes factor, we used Jeffreys’ scale of evidence (<xref ref-type="bibr" rid="b46">46</xref>). Bayes
factors <italic>BF<sub>10</sub></italic> larger than 1 generally support
hypothesis H<sub>1</sub> over hypothesis H<sub>0</sub>. Values in the
range of 1–3.16, 3.16–10, 10–100, and &#x3E;100 constitute weak,
substantial, strong, and decisive evidence,
respectively.<xref ref-type="fn" rid="fn2">2</xref> Since Bayes factors
are computed as likelihood ratios, Bayes factors BF<sub>10</sub> smaller
than 1 support hypothesis H<sub>0</sub> over hypothesis H<sub>1</sub>.
Accordingly, values in the range 1–0.31, 0.31–0.10, 0.10–0.01, and &#x3C;
.01 constitute weak, substantial, strong, and decisive evidence in
support of hypothesis H<sub>0</sub>. In our case we computed Bayes
factors to quantify the evidence for a linear effect of tempo and a
general effect of music presence, versus the absence of these effects
(H<sub>0</sub>). All Bayes factors were computed in R (<xref ref-type="bibr" rid="b55">55</xref>) using the
<italic>BayesFactor</italic> package (version 0.9.12-2; (<xref ref-type="bibr" rid="b50">50</xref>)).</p>
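<p>The evidence categories described above can be captured in a small helper; this is our own illustration of the stated cutoffs, exploiting the symmetry of the scale around 1 (BF<sub>01</sub> = 1 / BF<sub>10</sub>).</p>

```python
def jeffreys_label(bf10):
    """Classify a Bayes factor BF10 on Jeffreys' scale of evidence.

    Values > 1 favor H1; values < 1 favor H0 (via BF01 = 1 / BF10).
    Returns (strength label, favored hypothesis).
    """
    favored = "H1" if bf10 >= 1 else "H0"
    strength = bf10 if bf10 >= 1 else 1.0 / bf10
    if strength > 100:
        label = "decisive"
    elif strength > 10:
        label = "strong"
    elif strength > 3.16:
        label = "substantial"
    else:
        label = "weak"
    return label, favored
```

<p>Applied to the reading-task results reported below, BF<sub>10</sub> = 4.43 and BF<sub>10</sub> = 0.23 both classify as substantial evidence (for H<sub>1</sub> and H<sub>0</sub>, respectively), and BF<sub>10</sub> = 12.53 as strong evidence for H<sub>1</sub>.</p>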
    </sec>
    </sec>

    <sec id="S3">
      <title>Results</title>

<p>We first compared fixation durations between tasks in the silent
condition. Mean fixation durations in the reading task, <italic>M
=</italic> 211 ms (<italic>SD =</italic> 23), matched what is known for
typical silent reading (<xref ref-type="bibr" rid="b57">57</xref>). In the scanning task, mean fixation
durations were somewhat longer, <italic>M =</italic> 308 ms (<italic>SD
=</italic> 34). These strong baseline differences,
<italic>t</italic>(38) = 17.38, <italic>p</italic> &#x3C; .001,
η<sup>2</sup> = .888, in addition to the very different underlying task
demands discussed above, further strengthen the validity of our decision
to separately analyze the reading and the scanning tasks. Results for
all dependent variables can be found in Table 2 (reading) and 3
(scanning).</p>

    <sec id="S3a">
      <title>Reading Task</title>

<p><bold>Mean fixation duration.</bold> We analyzed all trials, since
performance in the text comprehension task was sufficiently high; the
mean number of errors ranged between 1.45 and 1.90 out of 13 trials
for each of the five conditions, and an ANOVA revealed no significant
difference in the numbers of errors between the five conditions,
<italic>F</italic>(4, 156) = 1.19, <italic>p</italic> = .318,
η<sup>2</sup> = .030. Thus, participants were indeed reading for
comprehension as intended.</p>

<p>Importantly, the ANOVA including the four music tempo conditions
showed a clear linear trend, <italic>F</italic>(1, 39) = 5.61,
<italic>p</italic> = .023, η<sup>2</sup> = .126, which is also depicted
in Figure 2. The ANOVA for evaluating the effect of music presence on
fixation durations did not yield a significant effect of music versus
silent background, <italic>F</italic> &#x3C; 1, <italic>p</italic> = .904,
η<sup>2</sup> &#x3C; .001. In line with these results, Bayes factors
revealed substantial support for the presence of a linear trend of tempo
(<italic>BF<sub>10</sub></italic> = 4.43), as well as substantial
support for the absence of an effect of music presence
(<italic>BF<sub>10</sub></italic> = .23). That is, the mere presence of
music did not change mean fixation durations during reading in general,
but the tempo of music affected control such that a faster beat reduced
fixation durations.</p>

<table-wrap id="t02" position="float">
					<label>Table 2.</label>
					<caption>
						<p>Summary of ANOVA results and Bayes factors for the reading task.</p>
					</caption>
					<table frame="hsides" rules="groups" cellpadding="3">

    <tbody>
      <tr>
        <td>Analysis</td>
        <td><italic>F</italic>(1, 39)</td>
        <td><italic>p</italic></td>
        <td>η<sub>p</sub><sup>2</sup></td>
        <td><italic>BF<sub>10</sub></italic></td>
      </tr>
      <tr>
        <td><bold>Mean fixation duration</bold></td>
        <td></td>
        <td></td>
        <td></td>
        <td></td>
      </tr>
      <tr>
        <td>Linear contrast for tempo</td>
        <td><italic>F</italic> = 5.61</td>
        <td>.023</td>
        <td>.126</td>
        <td>4.43<sup>ii</sup></td>
      </tr>
      <tr>
        <td>Effect of music presence</td>
        <td><italic>F</italic> &#x3C; 1</td>
        <td>.904</td>
        <td>.000</td>
        <td>.23<sup>ii</sup></td>
      </tr>
      <tr>
        <td><bold>Mean gaze duration</bold></td>
        <td></td>
        <td></td>
        <td></td>
        <td></td>
      </tr>
      <tr>
        <td>Linear contrast for tempo</td>
        <td><italic>F</italic> &#x3C; 1</td>
        <td>.584</td>
        <td>.008</td>
        <td>.21<sup>ii</sup></td>
      </tr>
      <tr>
        <td>Effect of music presence</td>
        <td><italic>F</italic> &#x3C; 1</td>
        <td>.792</td>
        <td>.002</td>
        <td>.24<sup>ii</sup></td>
      </tr>
      <tr>
        <td><bold>Mean total reading time</bold></td>
        <td></td>
        <td></td>
        <td></td>
        <td></td>
      </tr>
      <tr>
        <td>Linear contrast for tempo</td>
        <td><italic>F</italic> &#x3C; 1</td>
        <td>.390</td>
        <td>.019</td>
        <td>.27<sup>ii</sup></td>
      </tr>
      <tr>
        <td>Effect of music presence</td>
        <td><italic>F</italic> = 7.96</td>
        <td>.007</td>
        <td>.170</td>
        <td>5.38<sup>ii</sup></td>
      </tr>
      <tr>
        <td><bold>Mean task completion time</bold></td>
        <td></td>
        <td></td>
        <td></td>
        <td></td>
      </tr>
      <tr>
        <td>Linear contrast for tempo</td>
        <td><italic>F</italic> = 1.74</td>
        <td>.195</td>
        <td>.043</td>
        <td>.51<sup>i</sup></td>
      </tr>
      <tr>
        <td>Effect of music presence</td>
        <td><italic>F</italic> = 10.31</td>
        <td>.003</td>
        <td>.209</td>
        <td>12.53<sup>iii</sup></td>
      </tr>
    </tbody>
  </table>
					<table-wrap-foot>
						<fn id="FN3">
						<p>Note. In our case, Bayes factors BF<sub>10</sub> &#x3E; 1 provide
evidence for the presence of an effect while Bayes factors
BF<sub>10</sub> &#x3C; 1 provide evidence for the absence of an effect.
Magnitude of the Bayes factor is classified as (i) weak, (ii)
substantial, (iii) strong, and (iv) decisive evidence in support of a
hypothesis.</p>
						</fn>
					</table-wrap-foot>  
</table-wrap>

<fig id="fig02" fig-type="figure" position="float">
					<label>Figure 2.</label>
					<caption>
						<p>Linear effect of the tempo of the auditory
beat on mean fixation duration in the reading task. Error bars show
confidence intervals based on (<xref ref-type="bibr" rid="b9">9</xref>) and (<xref ref-type="bibr" rid="b49">49</xref>).</p>
					</caption>
					<graphic id="graph02" xlink:href="jemr-11-02-i-figure-02.png"/>
				</fig>

<p><bold>Mean gaze duration.</bold> There was no significant linear
trend for tempo, <italic>F</italic> &#x3C; 1, <italic>p</italic> = .584,
η<sup>2</sup> = .008. Also, music presence did not affect gaze
durations, <italic>F</italic> &#x3C; 1, <italic>p</italic> = .792,
η<sup>2</sup> = .002. Bayes factors showed substantial support for the
absence of a linear trend of tempo (<italic>BF<sub>10</sub></italic> =
.21) and for the absence of an effect of music presence
(<italic>BF<sub>10</sub></italic> = .24).</p>

<p><bold>Mean total reading time.</bold> The results showed a somewhat
different pattern from that for mean fixation durations. Whereas the linear
contrast for tempo was far from significant, <italic>F</italic> &#x3C; 1,
<italic>p</italic> = .390, η<sup>2</sup> = .019, the main effect of
music presence showed reduced total reading times in blocks with music
(<italic>M =</italic> 426.68, <italic>SD =</italic> 17.07), in
comparison to silent blocks (<italic>M =</italic> 442.66, <italic>SD
=</italic> 18.86), <italic>F</italic>(1, 39) = 7.963, <italic>p</italic>
= .007, η<sup>2</sup> = .170. Accordingly, Bayes factors revealed
substantial support against an effect of a linear trend of tempo
(<italic>BF<sub>10</sub></italic> = .27) and substantial support for an
effect of music presence (<italic>BF<sub>10</sub></italic> = 5.38).</p>

<p><bold>Mean task completion time.</bold> There was no significant
linear trend for tempo, <italic>F</italic>(1, 39) = 1.74,
<italic>p</italic> = .195, η<sup>2</sup> = .043. Again, and analogous to
total reading time for words, the main effect of music presence was
significant, <italic>F</italic>(1, 39) = 10.312, <italic>p</italic> =
.003, η<sup>2</sup> = .209. Participants were faster when the reading
display was accompanied by music (<italic>M =</italic> 29,505 ms,
<italic>SD =</italic> 1,532), in comparison to silence (<italic>M
=</italic> 30,702 ms, <italic>SD =</italic> 1,668). While the Bayes
factor showed only weak support against the absence of a linear trend
(<italic>BF<sub>10</sub></italic> = .51), we found strong support for
the effect of music presence on mean task completion time
(<italic>BF<sub>10</sub></italic> = 12.53).</p>
    </sec>

    <sec id="S3b">
      <title>Scanning Task</title>

<p>For a summary of the detailed statistics of the scanning task
analyses, please see Table 3. All ANOVAs were far from significant, that
is, tempo did not have an obvious effect on fixation durations, gaze
durations, total scanning times, or task completion times. In addition,
music presence did not show any effect either. In line with this,
Bayes factor analyses provided weak or substantial evidence in support
of the null hypotheses (absence of a linear trend of tempo and no effect
of music presence; all <italic>BF<sub>10</sub></italic> &#x3C; 1). Most
importantly, the Bayes factor for mean fixation durations provided
substantial evidence against a linear trend
(<italic>BF<sub>10</sub></italic> = .18) (see Figure 3).</p>

<table-wrap id="t03" position="float">
					<label>Table 3.</label>
					<caption>
						<p>Summary of ANOVA results and Bayes factors for the
sequential scanning task.</p>
					</caption>
					<table frame="hsides" rules="groups" cellpadding="3">

    <tbody>
      <tr>
        <td>Analysis</td>
        <td><italic>F</italic>(1, 38)</td>
        <td><italic>p</italic></td>
        <td>η<sub>p</sub><sup>2</sup></td>
        <td><italic>BF<sub>10</sub></italic></td>
      </tr>
      <tr>
        <td><bold>Mean fixation duration</bold></td>
        <td></td>
        <td></td>
        <td></td>
        <td></td>
      </tr>
      <tr>
        <td>Linear contrast for tempo</td>
        <td><italic>F</italic> &#x3C; 1</td>
        <td>.940</td>
        <td>.000</td>
        <td>.18<sup>ii</sup></td>
      </tr>
      <tr>
        <td>Effect of music presence</td>
        <td><italic>F</italic> &#x3C; 1</td>
        <td>.703</td>
        <td>.004</td>
        <td>.25<sup>ii</sup></td>
      </tr>
      <tr>
        <td><bold>Mean gaze duration</bold></td>
        <td></td>
        <td></td>
        <td></td>
        <td></td>
      </tr>
      <tr>
        <td>Linear contrast for tempo</td>
        <td><italic>F</italic> &#x3C; 1</td>
        <td>.910</td>
        <td>.000</td>
        <td>.17<sup>ii</sup></td>
      </tr>
      <tr>
        <td>Effect of music presence</td>
        <td><italic>F</italic> &#x3C; 1</td>
        <td>.374</td>
        <td>.021</td>
        <td>.33<sup>i</sup></td>
      </tr>
      <tr>
        <td><bold>Mean total scanning time</bold></td>
        <td></td>
        <td></td>
        <td></td>
        <td></td>
      </tr>
      <tr>
        <td>Linear contrast for tempo</td>
        <td><italic>F</italic> &#x3C; 1</td>
        <td>.706</td>
        <td>.004</td>
        <td>.19<sup>ii</sup></td>
      </tr>
      <tr>
        <td>Effect of music presence</td>
        <td><italic>F</italic> &#x3C; 1</td>
        <td>.546</td>
        <td>.010</td>
        <td>.27<sup>ii</sup></td>
      </tr>
      <tr>
        <td><bold>Mean task completion time</bold></td>
        <td></td>
        <td></td>
        <td></td>
        <td></td>
      </tr>
      <tr>
        <td>Linear contrast for tempo</td>
        <td><italic>F</italic> &#x3C; 1</td>
        <td>.363</td>
        <td>.022</td>
        <td>.32<sup>i</sup></td>
      </tr>
      <tr>
        <td>Effect of music presence</td>
        <td><italic>F</italic> = 2.44</td>
        <td>.127</td>
        <td>.060</td>
        <td>.64<sup>i</sup></td>
      </tr>
    </tbody>
  </table>
					<table-wrap-foot>
						<fn id="FN4">
						<p>Note. See the note to Table 2.</p>
						</fn>
					</table-wrap-foot>  
</table-wrap>

<fig id="fig03" fig-type="figure" position="float">
					<label>Figure 3.</label>
					<caption>
						<p>Mean fixation duration as a function of
the tempo of the auditory beat. No significant effect was observed.
Error bars show confidence intervals based on (<xref ref-type="bibr" rid="b9">9</xref>) and (<xref ref-type="bibr" rid="b49">49</xref>).</p>
					</caption>
					<graphic id="graph03" xlink:href="jemr-11-02-i-figure-03.png"/>
				</fig>

    </sec>
    </sec>

    <sec id="S4">
      <title>Discussion</title>

<p>The present study addresses the general question of whether visual
cognition is inherently rhythmic in nature by measuring the potential
influence of an external auditory beat, with varying tempo, on temporal
eye movement parameters. Specifically, we reasoned that a faster beat
should yield shorter temporal oculomotor parameters, which in turn
should reflect temporal characteristics of the underlying cognitive
processes. We measured eye movements in two exemplary visual cognition
tasks: text reading and sequential scanning. These tasks were either
completed in silence or with an auditory beat (simple, electronic music)
at four different tempi.</p>

<p>The most important result is that the tempo of the beat significantly
affected basic fixation durations in the reading task. Higher musical
tempo resulted in shorter mean fixation durations. While the effect was
notably small, several aspects strengthen the reliability of the effect.
First, it should be noted that we already expected any effect to be
small, since eye movements in reading are known to be strongly
determined by automatized routines and linguistic processing, leaving
only little room for further external determinants. Second, the effect
consists of a highly systematic (monotonic) pattern in the expected
direction across all four tempo conditions. The implementation of four
(instead of only two) tempo conditions makes it unlikely that such an
effect represents a random false positive. Finally, the effect was
successfully replicated in an independent parallel study involving
comparable demands ((<xref ref-type="bibr" rid="b47">47</xref>); see below for further details). Against the
backdrop of our theoretical reasoning outlined in the introduction, this
central result is of high theoretical significance.</p>

<p>Specifically, the fact that the Bayes factor analyses provided
substantial evidence against an influence of the auditory beat on
oculomotor control (mean fixation durations) in sequential scanning but
substantial evidence for a reliable effect during text reading can be
regarded as evidence for an <italic>oculomotor control load
account</italic> and against a <italic>higher-level</italic>
<italic>cognitive load account</italic> regarding the underlying level
of processing for the auditory stimulus. While oculomotor control in the
present sequential scanning task necessitates moment-by-moment decisions
on where to move next, oculomotor control in reading relies on highly
automatized motor routines, thus leaving more room (in terms of motor
control resources) for the processing of the auditory stimulus. If
processing of the beat had occurred on a higher cognitive level, one
would have expected a strong influence of the beat tempo in the scanning
task, but no substantial influence in reading (a pattern we clearly did
not observe).</p>

<p>Our present interpretation in favor of the <italic>oculomotor control
load account</italic> is further corroborated by the observation that
gaze durations (as opposed to basic fixation durations) were not
affected by beat tempo during reading. This is in line with “modulated
pulse” models of temporal oculomotor control (e.g., (<xref ref-type="bibr" rid="b13">13</xref>)), which also
assume that an autonomous rhythmic timer determines basic fixation
durations to a large degree. Most likely, the external auditory pulse
affected the speed of the autonomous saccade timer (rhythmic crosstalk
between auditory input and oculomotor output). In contrast, a cognitive
account of the effects of the auditory stimulus would have predicted an
effect on temporal parameters that reflect more cognitively driven
decisions (e.g., gaze durations reflecting the decision of when to
fixate the next word, see (<xref ref-type="bibr" rid="b57">57</xref>)). However, we did not find any
significant effect on gaze durations, which speaks against a
<italic>higher-level cognitive load</italic> account of the present
data.</p>

<p>Furthermore, the <italic>higher-level cognitive load account</italic>
would have predicted a general detrimental effect of the processing
demand of the additional auditory stimulus on primary task performance.
However, this was not observed in either of the two tasks. While
sequential scanning was completely unaffected by the presence of music,
total reading times and reading completion times even decreased when the
auditory stimulus was present, and text comprehension was not
hampered. Several explanations for the increased reading speed in the
presence of music appear conceivable. For example, music might have
increased motivation to complete the task based on positive emotion
induction. Or, given its rhythmic nature, the music might have helped
participants to stay focused. In addition, the music conditions likely
increased arousal in comparison to the silent control, and increased
emotional arousal is known to be generally associated with speed (e.g.,
(<xref ref-type="bibr" rid="b3 b15 b23">3, 15, 23</xref>)). Higher arousal might have produced the overall
speed-up in the mere presence of music; however, it is unclear
why this would affect reading but not scanning.</p>

<p>One difference between tasks is the involvement of articulatory
processes in reading (<xref ref-type="bibr" rid="b33">33</xref>), as opposed to sequential scanning (where
participants might only articulate self-instructions such as
“up”/”right” etc.). One might suspect that, when articulating the text
silently, participants match the tempo of articulation
(stressed/unstressed syllables) to the auditory beat. In that case,
however, the observed effect should have been much more pronounced
(similar to an entrainment hypothesis, see Table 1), which renders the
assumption of a strategy to match articulation speed with the auditory
beat unlikely.</p>

<p>It is important to note that the two tasks, sequential scanning and
reading, differed substantially regarding both the underlying processing
requirements and the actual oculomotor characteristics, which prohibits
any direct statistical comparison between tasks. For example, fixation
durations in the scanning task were much longer, and thus the frequency
of the oculomotor pulse was actually closer to the chosen frequencies of
the auditory beat, but still differed strongly. Future studies on the
influence of beat tempo on reading or scanning, particularly when
testing entrainment in the sense of typical corresponding studies (see
Introduction), should test the effect of pacemaker tempi higher than
those utilized in the present study, which were clearly slower than the
oculomotor rhythms. Here, we were less interested in such direct
entrainment, but rather in the ecologically more prominent situation of
background music during visual cognition. Hence, the chosen tempi ranged
from very slow to rather fast music.</p>

<p>On a more theoretical level, the fact that our results favor an
<italic>oculomotor control load account</italic> rather than a
<italic>higher-level cognitive load account</italic> also has
implications for the underlying cognitive architecture. Previous
theoretical frameworks regarding the structural layout of the cognitive
system can be divided into models assuming one common central resource
for cognitive processing (e.g., (<xref ref-type="bibr" rid="b39">39</xref>)) and models assuming a more modular
layout (<xref ref-type="bibr" rid="b19">19</xref>) involving distinct processing resource pools for the
different processing modules (e.g., (<xref ref-type="bibr" rid="b74">74</xref>)). A single central module or
resource account would have predicted costs associated with additional
processing demands, which we did not observe. Thus, our results rather
indicate the presence of separate processing modules with largely
independent resource pools, one for central, higher-level cognitive
processing (linguistic processing, comprehension etc.), and one for more
peripheral (and more low-level) oculomotor control, a level on which
auditory beat processing appears to operate. However, it appears likely
that, despite some degree of modularity, there is still room for
inter-modular crosstalk to occur (see (<xref ref-type="bibr" rid="b36">36</xref>)), which explains the
possibility of an influence of the auditory beat on the oculomotor
control system.</p>

<p>Interestingly, the reliability of our main finding is further
corroborated by a similarly small but significant effect of musical
tempo on fixation durations in free scene viewing as reported recently
(<xref ref-type="bibr" rid="b47">47</xref>). Note, however, that this study had a somewhat different
theoretical focus, comparing the effect of auditory beats between
musicians and non-musicians, as well as between two different musical styles
(funk, techno). Only two different tempi were chosen (about 102 and 144
bpm) and task affordances (e.g., differences in involvement of cognitive
or oculomotor control processes) were not manipulated. While tempo had
an effect similar to the one we observed in our reading task, musical
expertise and musical style did not. Despite the differences in
theoretical focus, this study therefore nicely confirms our present
results and interpretation when assuming that free scene viewing can
also rely more on automatized oculomotor scanning routines than the
scanning task used in our present study, which necessitates
moment-to-moment decisions regarding saccade targets based on the
identity of the currently fixated object.</p>

<p>From a more practical viewpoint, our study also speaks to the
question of how background music affects cognitive task processing. In contrast
to other studies (<xref ref-type="bibr" rid="b2 b20 b68">2, 20, 68</xref>), we did not find a detrimental effect of
music on reading comprehension. Our participants’ comprehension
performance was unaffected by the auditory stimulus. In this context, it
is important to note that we used very simple musical stimuli,
specifically composed for our study. Other studies applied classical or
popular music. Such music differs from our stimuli in terms of its
complexity and familiarity. For example, music samples taken from
top-ten lists might trigger individual memories, associations, and
stronger emotional states, potentially conflicting with the reading
comprehension task at hand.</p>

<p>Our main finding motivates further research on the relation between
auditory rhythms and saccade generation. Even though effects are small
and sometimes cannot be observed (see (<xref ref-type="bibr" rid="b21">21</xref>), in this special issue of
“Music and Eye-Tracking”), they are of high theoretical relevance for
understanding crossmodal perception. There are several options for
future studies, for example, using experimental designs similar to
studies on crossmodal resets in neuroscience (e.g., (<xref ref-type="bibr" rid="b12">12</xref>)) and analyzing
periodicities for saccade generation (see for example (<xref ref-type="bibr" rid="b1">1</xref>)), or using
tasks that are suited to uncover different saccade populations (with or
without direct control), such as the stimulus-onset delay task (see
(<xref ref-type="bibr" rid="b52">52</xref>)).</p>

<p>In sum, we demonstrated a theoretically important tempo effect of an
external auditory beat on basic eye-movement control in reading. The
results are interpreted in favor of modulatory processes that affect the
speed parameter of the saccade timer, supporting the assumption of
inherently rhythmic underlying processing in the visual control
tasks examined here. The present results add to the growing evidence for an embodied
view of cognition (e.g., (<xref ref-type="bibr" rid="b8">8</xref>)), a view that also entails the idea that
cognitive processes are essentially determined by associated bodily
systems (<xref ref-type="bibr" rid="b75">75</xref>). Thus, tasks requiring rhythmic oculomotor behavior appear
to rely on corresponding rhythmic processing strategies, as evidenced by
the influence of the task-irrelevant, external pacemaker tempo.</p>
    </sec>

    <sec id="S4a" sec-type="COI-statement">
      <title>Ethics and Conflict of Interest</title>

<p>The author(s) declare(s) that the contents of the article are in
agreement with the ethics described in
<ext-link ext-link-type="uri" xlink:href="http://biblio.unibe.ch/portale/elibrary/BOP/jemr/ethics.html" xlink:show="new">http://biblio.unibe.ch/portale/elibrary/BOP/jemr/ethics.html</ext-link>
and that there is no conflict of interest regarding the publication of
this paper.</p>
    </sec>

    <sec id="S4b">
      <title>Acknowledgements</title>

<p>We thank Lena Plückebaum and Ruth Julier for data collection. Special
thanks to Maria Heuring for her support with the data preprocessing and
to Fabian Greb for generating the musical stimuli. This research did not
receive any specific grant from funding
agencies in the public, commercial, or not-for-profit sectors.
Correspondence concerning this article should be addressed to
<email>Elke.Lange@aesthetics.mpg.de</email></p>
    </sec>


  </body>    
<back>
<ref-list>
<ref id="b1"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Amit</surname> <given-names>R</given-names></name>, <name><surname>Abeles</surname> <given-names>D</given-names></name>, <name><surname>Bar-Gad</surname> <given-names>I</given-names></name>, <name><surname>Yuval-Greenberg</surname> <given-names>S</given-names></name></person-group>. <article-title>Temporal dynamics of saccades explained by a self-paced process.</article-title> Sci Rep-Uk. <year>2017</year>;7. doi: <pub-id pub-id-type="doi" specific-use="author">10.1038/s41598-017-00881-7</pub-id>. PubMed PMID: WOS:000425897000001.</mixed-citation></ref>
<ref id="b2"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Avila</surname> <given-names>C</given-names></name>, <name><surname>Furnham</surname> <given-names>A</given-names></name>, <name><surname>McClelland</surname> <given-names>A</given-names></name></person-group>. <article-title>The influence of distracting familiar vocal music on cognitive performance of introverts and extraverts.</article-title> Psychol Music. <year>2012</year>;40(1):84-93. doi: <pub-id pub-id-type="doi" specific-use="author">10.1177/0305735611422672</pub-id>. PubMed PMID: WOS:000299486900006.</mixed-citation></ref>
<ref id="b3"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Balch</surname> <given-names>WR</given-names></name>, <name><surname>Lewis</surname> <given-names>BS</given-names></name></person-group>. Music-dependent memory: The roles of tempo change and mood mediation. J Exp Psychol Learn. <year>1996</year>;22(6):1354-63. doi: <pub-id pub-id-type="doi" specific-use="author">10.1037//0278-7393.22.6.1354</pub-id>. PubMed PMID: WOS:A1996VR35200003.</mixed-citation></ref>
<ref id="b4"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Benedetto</surname> <given-names>A</given-names></name>, <name><surname>Morrone</surname> <given-names>MC</given-names></name></person-group>. <article-title>Saccadic Suppression Is Embedded Within Extended Oscillatory Modulation of Sensitivity.</article-title> Journal of Neuroscience. <year>2017</year>;37(13):3661-70. doi: <pub-id pub-id-type="doi" specific-use="author">10.1523/Jneurosci.2390-16.2016</pub-id>. PubMed PMID: WOS:000397823700020. <pub-id pub-id-type="doi">10.1523/JNEUROSCI.2390-16.2016</pub-id></mixed-citation></ref>
<ref id="b5"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Busch</surname> <given-names>NA</given-names></name>, <name><surname>Dubois</surname> <given-names>J</given-names></name>, <name><surname>VanRullen</surname> <given-names>R</given-names></name></person-group>. <article-title>The Phase of Ongoing EEG Oscillations Predicts Visual Perception.</article-title> Journal of Neuroscience. <year>2009</year>;29(24):7869-76. doi: <pub-id pub-id-type="doi" specific-use="author">10.1523/Jneurosci.0113-09.2009</pub-id>. PubMed PMID: WOS:000267131000026. <pub-id pub-id-type="doi">10.1523/JNEUROSCI.0113-09.2009</pub-id></mixed-citation></ref>
<ref id="b6"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Calderone</surname> <given-names>DJ</given-names></name>, <name><surname>Lakatos</surname> <given-names>P</given-names></name>, <name><surname>Butler</surname> <given-names>PD</given-names></name>, <name><surname>Castellanos</surname> <given-names>FX</given-names></name></person-group>. <article-title>Entrainment of neural oscillations as a modifiable substrate of attention.</article-title> Trends Cogn Sci. <year>2014</year>;18(6):300-9. doi: <pub-id pub-id-type="doi" specific-use="author">10.1016/j.tics.2014.02.005</pub-id>. PubMed PMID: WOS:000337214600008.</mixed-citation></ref>
<ref id="b7"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Cassidy</surname>, <given-names>G.</given-names></name>, &#x26; <name><surname>MacDonald</surname>, <given-names>R. A. R.</given-names></name></person-group> <article-title>The effect of background music and background noise on the task performance of introverts and extraverts.</article-title> Psychol Music. <year>2007</year>;35(3):517-37. doi: <pub-id pub-id-type="doi" specific-use="author">10.1177/0305735607076444</pub-id></mixed-citation></ref>
<ref id="b8"><mixed-citation publication-type="book" specific-use="restruct"><person-group person-group-type="author"><name><surname>Clark</surname>, <given-names>A.</given-names></name></person-group> (<year>2008</year>). <source>Supersizing the mind: embodiment, action, and cognitive extension</source>. <publisher-name>Oxford University Press</publisher-name>. <pub-id pub-id-type="doi">10.1093/acprof:oso/9780195333213.001.0001</pub-id></mixed-citation></ref>
<ref id="b9"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Cousineau</surname> <given-names>D.</given-names></name></person-group> <article-title>Confidence intervals in within-subject designs: A simpler solution to Loftus and Masson's method.</article-title> TQMP. <year>2005</year>;1(1):42-5. doi: <pub-id pub-id-type="doi" specific-use="author">10.20982/tqmp.01.1.p042</pub-id></mixed-citation></ref>
<ref id="b10"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Crust</surname> <given-names>L</given-names></name>, <name><surname>Clough</surname> <given-names>PJ</given-names></name>, <name><surname>Robertson</surname> <given-names>C</given-names></name></person-group>. <article-title>Influence of music and distraction on visual search performance of participants with high and low affect intensity.</article-title> Perceptual and motor skills. <year>2004</year>;98(3):888-96. PubMed PMID: WOS:000221627100019. <pub-id pub-id-type="doi">10.2466/pms.98.3.888-896</pub-id></mixed-citation></ref>
<ref id="b11"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Daoussis</surname> <given-names>L</given-names></name>, <name><surname>McKelvie</surname> <given-names>SJ</given-names></name></person-group>. <article-title>Musical Preferences and Effects of Music on a Reading-Comprehension Test for Extroverts and Introverts.</article-title> Perceptual and motor skills. <year>1986</year>;62(1):283-9. doi: DOI <pub-id pub-id-type="doi" specific-use="author">10.2466/pms.1986.62.1.283</pub-id>. PubMed PMID: WOS:A1986A264400055.</mixed-citation></ref>
<ref id="b12"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Diederich</surname> <given-names>A</given-names></name>, <name><surname>Schomburg</surname> <given-names>A</given-names></name>, <name><surname>Colonius</surname> <given-names>H.</given-names></name></person-group> <article-title>Saccadic Reaction Times to Audiovisual Stimuli Show Effects of Oscillatory Phase Reset.</article-title> PloS one. <year>2012</year>;7(10). doi: ARTN e44910 <pub-id pub-id-type="doi" specific-use="author">10.1371/journal.pone.0044910</pub-id>. PubMed PMID: WOS:000309454000005.</mixed-citation></ref>
<ref id="b13"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Engbert</surname> <given-names>R</given-names></name>, <name><surname>Nuthmann</surname> <given-names>A</given-names></name>, <name><surname>Richter</surname> <given-names>EM</given-names></name>, <name><surname>Kliegl</surname> <given-names>R</given-names></name></person-group>. <article-title>SWIFT: A dynamical model of saccade generation during reading.</article-title> Psychol Rev. <year>2005</year>;112(4):777-813. doi: <pub-id pub-id-type="doi" specific-use="author">10.1037/0033-295x.112.4.777</pub-id>. PubMed PMID: WOS:000233222000003. <pub-id pub-id-type="doi">10.1037/0033-295X.112.4.777</pub-id></mixed-citation></ref>
<ref id="b14"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Engel</surname> <given-names>AK</given-names></name>, <name><surname>Fries</surname> <given-names>P</given-names></name>, <name><surname>Singer</surname> <given-names>W</given-names></name></person-group>. Dynamic predictions: Oscillations and synchrony in top-down processing. Nature Reviews Neuroscience. <year>2001</year>;2(10):704-16. doi: <pub-id pub-id-type="doi" specific-use="author">10.1038/35094565</pub-id>. PubMed PMID: WOS:000171454000017.</mixed-citation></ref>
<ref id="b15"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Etzel</surname> <given-names>JA</given-names></name>, <name><surname>Johnsen</surname> <given-names>EL</given-names></name>, <name><surname>Dickerson</surname> <given-names>J</given-names></name>, <name><surname>Tranel</surname> <given-names>D</given-names></name>, <name><surname>Adolphs</surname> <given-names>R</given-names></name></person-group>. <article-title>Cardiovascular and respiratory responses during musical mood induction.</article-title> International Journal of Psychophysiology. <year>2006</year>;61(1):57-69. doi: <pub-id pub-id-type="doi" specific-use="author">10.1016/j.ijpsycho.2005.10.025</pub-id>. PubMed PMID: WOS:000239162500007.</mixed-citation></ref>
<ref id="b16"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Fiebelkorn</surname> <given-names>IC</given-names></name>, <name><surname>Foxe</surname> <given-names>JJ</given-names></name>, <name><surname>Butler</surname> <given-names>JS</given-names></name>, <name><surname>Mercier</surname> <given-names>MR</given-names></name>, <name><surname>Snyder</surname> <given-names>AC</given-names></name>, <name><surname>Molholm</surname> <given-names>S</given-names></name></person-group>. Ready, Set, Reset: Stimulus-Locked Periodicity in Behavioral Performance Demonstrates the Consequences of Cross-Sensory Phase Reset. Journal of Neuroscience. <year>2011</year>;31(27):9971-81. doi: <pub-id pub-id-type="doi" specific-use="author">10.1523/Jneurosci.1338-11.2011</pub-id>. PubMed PMID: WOS:000292524400020. <pub-id pub-id-type="doi">10.1523/JNEUROSCI.1338-11.2011</pub-id></mixed-citation></ref>
<ref id="b17"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Fiebelkorn</surname> <given-names>IC</given-names></name>, <name><surname>Snyder</surname> <given-names>AC</given-names></name>, <name><surname>Mercier</surname> <given-names>MR</given-names></name>, <name><surname>Butler</surname> <given-names>JS</given-names></name>, <name><surname>Molholm</surname> <given-names>S</given-names></name>, <name><surname>Foxe</surname> <given-names>JJ</given-names></name></person-group>. <article-title>Cortical cross-frequency coupling predicts perceptual outcomes.</article-title> NeuroImage. <year>2013</year>;69:126-37. doi: <pub-id pub-id-type="doi" specific-use="author">10.1016/j.neuroimage.2012.11.021</pub-id>. PubMed PMID: WOS:000314627800014.</mixed-citation></ref>
<ref id="b18"><mixed-citation publication-type="book" specific-use="restruct"><person-group person-group-type="author"><name><surname>Findlay</surname>, <given-names>J. M.</given-names></name>, &#x26; <name><surname>Gilchrist</surname>, <given-names>I. D.</given-names></name></person-group> (<year>2003</year>). <source>Active vision: the psychology of looking and seeing</source>. <publisher-name>Oxford University Press</publisher-name>. <pub-id pub-id-type="doi">10.1093/acprof:oso/9780198524793.001.0001</pub-id></mixed-citation></ref>
<ref id="b19"><mixed-citation publication-type="book" specific-use="restruct"><person-group person-group-type="author"><name><surname>Fodor</surname>, <given-names>J. A.</given-names></name></person-group> (<year>1983</year>). <source>The modularity of mind: an essay on faculty psychology</source>. <publisher-name>MIT Press</publisher-name>. <pub-id pub-id-type="doi">10.7551/mitpress/4737.001.0001</pub-id></mixed-citation></ref>
<ref id="b20"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Fogelson</surname> <given-names>S.</given-names></name></person-group> <article-title>Music as a Distractor on Reading-Test Performance of Eighth Grade Students.</article-title> Perceptual and motor skills. <year>1973</year>;36(3):1265-6. doi: <pub-id pub-id-type="doi" specific-use="author">10.2466/pms.1973.36.3c.1265</pub-id>. PubMed PMID: WOS:A1973Q080400049.</mixed-citation></ref>
<ref id="b21"><mixed-citation publication-type="unknown" specific-use="unparsed"><person-group person-group-type="author"><name><surname>Franěk</surname> <given-names>M</given-names></name>, <name><surname>Šefara</surname> <given-names>D</given-names></name>, <name><surname>Petružálek</surname> <given-names>J</given-names></name>, <name><surname>Mlejnek</surname> <given-names>R</given-names></name>, <name><surname>Van Noorden</surname> <given-names>L</given-names></name></person-group>. <article-title>Eye movements in scene perception while listening to slow and fast music.</article-title> Journal of Eye Movement Research. <year>2018</year>;11(2):8. doi: <pub-id pub-id-type="doi" specific-use="author">10.16910/jemr.11.2.8</pub-id></mixed-citation></ref>
<ref id="b22"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Furnham</surname> <given-names>A</given-names></name>, <name><surname>Bradley</surname> <given-names>A</given-names></name></person-group>. Music while you work: The differential distraction of background music on the cognitive test performance of introverts and extraverts. Appl Cognitive Psych. <year>1997</year>;11(5):445-55. doi: <pub-id pub-id-type="doi" specific-use="author">10.1002/(SICI)1099-0720(199710)11:5&#x3C;445::AID-ACP472&#x3E;3.0.CO;2-R</pub-id>. PubMed PMID: WOS:A1997YC21200005.</mixed-citation></ref>
<ref id="b23"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Gomez</surname> <given-names>P</given-names></name>, <name><surname>Danuser</surname> <given-names>B</given-names></name></person-group>. <article-title>Relationships between musical structure and psychophysiological measures of emotion.</article-title> Emotion. <year>2007</year>;7(2):377-87. doi: <pub-id pub-id-type="doi" specific-use="author">10.1037/1528-3542.7.2.377</pub-id>. PubMed PMID: WOS:000246412200014.</mixed-citation></ref>
<ref id="b24"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Hanslmayr</surname>, <given-names>S.</given-names></name>, <name><surname>Aslan</surname>, <given-names>A.</given-names></name>, <name><surname>Staudigl</surname>, <given-names>T.</given-names></name>, <name><surname>Klimesch</surname>, <given-names>W.</given-names></name>, <name><surname>Herrmann</surname>, <given-names>C. S.</given-names></name>, &#x26; <name><surname>Bäuml</surname>, <given-names>K. H.</given-names></name></person-group> (<year>2007</year>). <article-title>Prestimulus oscillations predict visual perception performance between and within subjects.</article-title> <source>NeuroImage</source>, <volume>37</volume>(<issue>4</issue>), <fpage>1465</fpage>–<lpage>1473</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.1016/j.neuroimage.2007.07.011</pub-id><pub-id pub-id-type="pmid">17706433</pub-id><issn>1053-8119</issn></mixed-citation></ref>
<ref id="b25"><mixed-citation publication-type="book-chapter" specific-use="restruct"><person-group person-group-type="author"><name><surname>Henderson</surname>, <given-names>J. M.</given-names></name></person-group> (<year>1992</year>). <chapter-title>Visual attention and eye movement control during reading and picture viewing</chapter-title>. In <person-group person-group-type="editor"><name><given-names>K.</given-names> <surname>Rayner</surname></name> (<role>Ed.</role>),</person-group> <source>Eye movements and visual cognition: Scene perception and reading</source> (pp. <fpage>260</fpage>–<lpage>283</lpage>). <publisher-name>Springer</publisher-name>. <pub-id pub-id-type="doi">10.1007/978-1-4612-2852-3_15</pub-id></mixed-citation></ref>
<ref id="b26"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Henderson</surname> <given-names>JM</given-names></name>, <name><surname>Olejarczyk</surname> <given-names>J</given-names></name>, <name><surname>Luke</surname> <given-names>SG</given-names></name>, <name><surname>Schmidt</surname> <given-names>J</given-names></name></person-group>. Eye movement control during scene viewing: Immediate degradation and enhancement effects of spatial frequency filtering. Vis Cogn. <year>2014</year>;22(3-4):486-502. doi: <pub-id pub-id-type="doi" specific-use="author">10.1080/13506285.2014.897662</pub-id>. PubMed PMID: WOS:000334070000013.</mixed-citation></ref>
<ref id="b27"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Henderson</surname> <given-names>JM</given-names></name>, <name><surname>Smith</surname> <given-names>TJ</given-names></name></person-group>. <article-title>How are eye fixation durations controlled during scene viewing? Further evidence from a scene onset delay paradigm.</article-title> Vis Cogn. <year>2009</year>;17(6-7):1055-82. doi: <pub-id pub-id-type="doi" specific-use="author">10.1080/13506280802685552</pub-id>. PubMed PMID: WOS:000268719400013.</mixed-citation></ref>
<ref id="b28"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Henry</surname> <given-names>MJ</given-names></name>, <name><surname>Obleser</surname> <given-names>J</given-names></name></person-group>. Frequency modulation entrains slow neural oscillations and optimizes human listening behavior. P Natl Acad Sci USA. <year>2012</year>;109(49):20095-100. doi: <pub-id pub-id-type="doi" specific-use="author">10.1073/pnas.1213390109</pub-id>. PubMed PMID: WOS:000312347200051.</mixed-citation></ref>
<ref id="b29"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Hillen</surname> <given-names>R</given-names></name>, <name><surname>Günther</surname> <given-names>T</given-names></name>, <name><surname>Kohlen</surname> <given-names>C</given-names></name>, <name><surname>Eckers</surname> <given-names>C</given-names></name>, <name><surname>van Ermingen-Marbach</surname> <given-names>M</given-names></name>, <name><surname>Sass</surname> <given-names>K</given-names></name>, <etal>et al.</etal></person-group> Identifying brain systems for gaze orienting during reading: fMRI investigation of the Landolt paradigm. Frontiers in human neuroscience. <year>2013</year>;7. doi: <pub-id pub-id-type="doi" specific-use="author">10.3389/fnhum.2013.00384</pub-id>. PubMed PMID: WOS:000322360400001.</mixed-citation></ref>
<ref id="b30"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Hogendoorn</surname> <given-names>H.</given-names></name></person-group> Voluntary Saccadic Eye Movements Ride the Attentional Rhythm. J Cognitive Neurosci. <year>2016</year>;28(10):1625-35. doi: <pub-id pub-id-type="doi" specific-use="author">10.1162/jocn_a_00986</pub-id>. PubMed PMID: WOS:000385498100013.</mixed-citation></ref>
<ref id="b31"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Honing</surname> <given-names>H.</given-names></name></person-group> <article-title>Without it no music: beat induction as a fundamental musical trait.</article-title> Ann Ny Acad Sci. <year>2012</year>;1252:85-91. doi: <pub-id pub-id-type="doi" specific-use="author">10.1111/j.1749-6632.2011.06402.x</pub-id>. PubMed PMID: WOS:000305518900011.</mixed-citation></ref>
<ref id="b32"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Hooge</surname> <given-names>ITC</given-names></name>, <name><surname>Erkelens</surname> <given-names>CJ</given-names></name></person-group>. <article-title>Adjustment of fixation duration in visual search.</article-title> Vision Research. <year>1998</year>;38(9):1295-302. doi: <pub-id pub-id-type="doi" specific-use="author">10.1016/S0042-6989(97)00287-3</pub-id>. PubMed PMID: WOS:000073529000011.</mixed-citation></ref>
<ref id="b33"><mixed-citation publication-type="unknown" specific-use="unparsed"><person-group person-group-type="author"><name><surname>Huestegge</surname> <given-names>L.</given-names></name></person-group> Effects of Vowel Length on Gaze Durations in Silent and Oral Reading. J Eye Movement Res. <year>2010</year>;3(5). PubMed PMID: WOS:000208817100005.</mixed-citation></ref>
<ref id="b34"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Huestegge</surname> <given-names>L</given-names></name>, <name><surname>Bocianski</surname> <given-names>D</given-names></name></person-group>. <article-title>Effects of syntactic context on eye movements during reading.</article-title> Adv Cogn Psychol. <year>2010</year>;6(6):79-87. doi: <pub-id pub-id-type="doi" specific-use="author">10.2478/v10053-008-0078-0</pub-id>. PubMed PMID: WOS:000209818400007.</mixed-citation></ref>
<ref id="b35"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Huestegge</surname> <given-names>L</given-names></name>, <name><surname>Heim</surname> <given-names>S</given-names></name>, <name><surname>Zettelmeyer</surname> <given-names>E</given-names></name>, <name><surname>Lange-Kuttner</surname> <given-names>C</given-names></name></person-group>. <article-title>Gender-specific contribution of a visual cognition network to reading abilities.</article-title> British journal of psychology. <year>2012</year>;103:117-28. doi: <pub-id pub-id-type="doi" specific-use="author">10.1111/j.2044-8295.2011.02050.x</pub-id>. PubMed PMID: WOS:000298917800011.</mixed-citation></ref>
<ref id="b36"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Huestegge</surname> <given-names>L</given-names></name>, <name><surname>Pieczykolan</surname> <given-names>A</given-names></name>, <name><surname>Koch</surname> <given-names>I.</given-names></name></person-group> Talking while looking: On the encapsulation of output system representations. Cognitive Psychol. <year>2014</year>;73:72-91. doi: <pub-id pub-id-type="doi" specific-use="author">10.1016/j.cogpsych.2014.06.001</pub-id>. PubMed PMID: WOS:000340975500003.</mixed-citation></ref>
<ref id="b37"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Ilie</surname> <given-names>G</given-names></name>, <name><surname>Thompson</surname> <given-names>WF</given-names></name></person-group>. <article-title>Experiential and cognitive changes following seven minutes exposure to music and speech.</article-title> Music Percept. <year>2011</year>;28(3):247-64. doi: <pub-id pub-id-type="doi" specific-use="author">10.1525/Mp.2011.28.3.247</pub-id>. PubMed PMID: WOS:000287778400002. <pub-id pub-id-type="doi">10.1525/mp.2011.28.3.247</pub-id></mixed-citation></ref>
<ref id="b38"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Johansson</surname> <given-names>R</given-names></name>, <name><surname>Holmqvist</surname> <given-names>K</given-names></name>, <name><surname>Mossberg</surname> <given-names>F</given-names></name>, <name><surname>Lindgren</surname> <given-names>M</given-names></name></person-group>. <article-title>Eye movements and reading comprehension while listening to preferred and non-preferred study music.</article-title> Psychol Music. <year>2012</year>;40(3):339-56. doi: <pub-id pub-id-type="doi" specific-use="author">10.1177/0305735610387777</pub-id>. PubMed PMID: WOS:000304708100005.</mixed-citation></ref>
<ref id="b39"><mixed-citation publication-type="book" specific-use="restruct"><person-group person-group-type="author"><name><surname>Kahneman</surname>, <given-names>D.</given-names></name></person-group> (<year>1973</year>). <source>Attention and effort</source>. <publisher-name>Prentice-Hall</publisher-name>.</mixed-citation></ref>
<ref id="b40"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Kallinen</surname> <given-names>K.</given-names></name></person-group> <article-title>Reading news from a pocket computer in a distracting environment: effects of the tempo of background music.</article-title> Comput Hum Behav. <year>2002</year>;18(5):537-51. doi: <pub-id pub-id-type="doi" specific-use="author">10.1016/S0747-5632(02)00005-5</pub-id>. PubMed PMID: WOS:000178236900005.</mixed-citation></ref>
<ref id="b41"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Kämpfe</surname> <given-names>J</given-names></name>, <name><surname>Sedlmeier</surname> <given-names>P</given-names></name>, <name><surname>Renkewitz</surname> <given-names>F</given-names></name></person-group>. <article-title>The impact of background music on adult listeners: A meta-analysis.</article-title> Psychol Music. <year>2011</year>;39(4):424-48. doi: <pub-id pub-id-type="doi" specific-use="author">10.1177/0305735610376261</pub-id>. PubMed PMID: WOS:000296228500002.</mixed-citation></ref>
<ref id="b42"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Kiger</surname> <given-names>DM</given-names></name></person-group>. <article-title>Effects of music information load on a reading comprehension task.</article-title> Perceptual and motor skills. <year>1989</year>;69(2):531-4. doi: <pub-id pub-id-type="doi" specific-use="author">10.2466/pms.1989.69.2.531</pub-id>. PubMed PMID: WOS:A1989AX46500038.</mixed-citation></ref>
<ref id="b43"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Lakatos</surname> <given-names>P</given-names></name>, <name><surname>Karmos</surname> <given-names>G</given-names></name>, <name><surname>Mehta</surname> <given-names>AD</given-names></name>, <name><surname>Ulbert</surname> <given-names>I</given-names></name>, <name><surname>Schroeder</surname> <given-names>CE</given-names></name></person-group>. <article-title>Entrainment of neuronal oscillations as a mechanism of attentional selection.</article-title> Science. <year>2008</year>;320(5872):110-3. doi: <pub-id pub-id-type="doi" specific-use="author">10.1126/science.1154735</pub-id>.</mixed-citation></ref>
<ref id="b44"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Lakatos</surname> <given-names>P</given-names></name>, <name><surname>Shah</surname> <given-names>AS</given-names></name>, <name><surname>Knuth</surname> <given-names>KH</given-names></name>, <name><surname>Ulbert</surname> <given-names>I</given-names></name>, <name><surname>Karmos</surname> <given-names>G</given-names></name>, <name><surname>Schroeder</surname> <given-names>CE</given-names></name></person-group>. <article-title>An oscillatory hierarchy controlling neuronal excitability and stimulus processing in the auditory cortex.</article-title> Journal of Neurophysiology. <year>2005</year>;94(3):1904-11. doi: <pub-id pub-id-type="doi" specific-use="author">10.1152/jn.00263.2005</pub-id>.</mixed-citation></ref>
<ref id="b45"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Luke</surname> <given-names>SG</given-names></name>, <name><surname>Nuthmann</surname> <given-names>A</given-names></name>, <name><surname>Henderson</surname> <given-names>JM</given-names></name></person-group>. <article-title>Eye Movement Control in Scene Viewing and Reading: Evidence From the Stimulus Onset Delay Paradigm.</article-title> J Exp Psychol Human. <year>2013</year>;39(1):10-5. doi: <pub-id pub-id-type="doi" specific-use="author">10.1037/a0030392</pub-id>.</mixed-citation></ref>
<ref id="b46"><mixed-citation publication-type="book" specific-use="restruct"><person-group person-group-type="author"><name><surname>Marin</surname>, <given-names>J. M.</given-names></name>, &#x26; <name><surname>Robert</surname>, <given-names>C.</given-names></name></person-group> (<year>2007</year>). <source>Bayesian core: a practical approach to computational Bayesian statistics</source>. <publisher-name>Springer Science and Business Media</publisher-name>.</mixed-citation></ref>
<ref id="b47"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Maróti</surname> <given-names>E</given-names></name>, <name><surname>Knakker</surname> <given-names>B</given-names></name>, <name><surname>Vidnyánszky</surname> <given-names>Z</given-names></name>, <name><surname>Weiss</surname> <given-names>B</given-names></name></person-group>. <article-title>The effect of beat frequency on eye movements during free viewing.</article-title> Vision Research. <year>2017</year>;131:57-66. doi: <pub-id pub-id-type="doi" specific-use="author">10.1016/j.visres.2016.12.009</pub-id>.</mixed-citation></ref>
<ref id="b48"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Mercier</surname> <given-names>MR</given-names></name>, <name><surname>Foxe</surname> <given-names>JJ</given-names></name>, <name><surname>Fiebelkorn</surname> <given-names>IC</given-names></name>, <name><surname>Butler</surname> <given-names>JS</given-names></name>, <name><surname>Schwartz</surname> <given-names>TH</given-names></name>, <name><surname>Molholm</surname> <given-names>S</given-names></name></person-group>. <article-title>Auditory-driven phase reset in visual cortex: Human electrocorticography reveals mechanisms of early multisensory integration.</article-title> NeuroImage. <year>2013</year>;79:19-29. doi: <pub-id pub-id-type="doi" specific-use="author">10.1016/j.neuroimage.2013.04.060</pub-id>.</mixed-citation></ref>
<ref id="b49"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Morey</surname>, <given-names>R. D.</given-names></name></person-group> (<year>2008</year>). <article-title>Confidence Intervals from Normalized Data: A correction to Cousineau (2005).</article-title> <source>TQMP</source>, <volume>4</volume>(<issue>2</issue>), <fpage>61</fpage>–<lpage>64</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.20982/tqmp.04.2.p061</pub-id></mixed-citation></ref>
<ref id="b50"><mixed-citation publication-type="web-page" specific-use="unparsed"><person-group person-group-type="author"><name><surname>Morey</surname>, <given-names>R. D.</given-names></name>, &#x26; <name><surname>Rouder</surname>, <given-names>J. N.</given-names></name></person-group> BayesFactor: Computation of Bayes factors for common designs. R package version 0.9.12-2; <year>2015</year>. Available from: <ext-link ext-link-type="uri" xlink:href="https://CRAN.R-project.org/package=BayesFactor">https://CRAN.R-project.org/package=BayesFactor</ext-link></mixed-citation></ref>
<ref id="b51"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Morrison</surname> <given-names>RE</given-names></name></person-group>. <article-title>Manipulation of Stimulus Onset Delay in Reading - Evidence for Parallel Programming of Saccades.</article-title> J Exp Psychol Human. <year>1984</year>;10(5):667-82. doi: <pub-id pub-id-type="doi" specific-use="author">10.1037/0096-1523.10.5.667</pub-id>.</mixed-citation></ref>
<ref id="b52"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Nuthmann</surname> <given-names>A</given-names></name>, <name><surname>Henderson</surname> <given-names>JM</given-names></name></person-group>. <article-title>Using CRISP to model global characteristics of fixation durations in scene viewing and reading with a common mechanism.</article-title> Vis Cogn. <year>2012</year>;20(4-5):457-94. doi: <pub-id pub-id-type="doi" specific-use="author">10.1080/13506285.2012.670142</pub-id>.</mixed-citation></ref>
<ref id="b53"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Nuthmann</surname> <given-names>A</given-names></name>, <name><surname>Smith</surname> <given-names>TJ</given-names></name>, <name><surname>Engbert</surname> <given-names>R</given-names></name>, <name><surname>Henderson</surname> <given-names>JM</given-names></name></person-group>. <article-title>CRISP: A Computational Model of Fixation Durations in Scene Viewing.</article-title> Psychol Rev. <year>2010</year>;117(2):382-405. doi: <pub-id pub-id-type="doi" specific-use="author">10.1037/a0018924</pub-id>.</mixed-citation></ref>
<ref id="b54"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Phillips-Silver</surname> <given-names>J</given-names></name>, <name><surname>Trainor</surname> <given-names>LJ</given-names></name></person-group>. <article-title>Feeling the beat: Movement influences infant rhythm perception.</article-title> Science. <year>2005</year>;308(5727):1430. doi: <pub-id pub-id-type="doi" specific-use="author">10.1126/science.1110922</pub-id>.</mixed-citation></ref>
<ref id="b55"><mixed-citation publication-type="web-page" specific-use="unparsed"><person-group person-group-type="author"><collab>R Development Core Team</collab></person-group>. R: A language and environment for statistical computing. Vienna, Austria: R Foundation for Statistical Computing; <year>2017</year>. Available from: <ext-link ext-link-type="uri" xlink:href="https://www.R-project.org/">https://www.R-project.org/</ext-link></mixed-citation></ref>
<ref id="b56"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Radach</surname> <given-names>R</given-names></name>, <name><surname>Huestegge</surname> <given-names>L</given-names></name>, <name><surname>Reilly</surname> <given-names>R</given-names></name></person-group>. <article-title>The role of global top-down factors in local eye-movement control in reading.</article-title> Psychol Res-Psych Fo. <year>2008</year>;72(6):675-88. doi: <pub-id pub-id-type="doi" specific-use="author">10.1007/s00426-008-0173-3</pub-id>.</mixed-citation></ref>
<ref id="b57"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Rayner</surname> <given-names>K.</given-names></name></person-group> <article-title>Eye movements in reading and information processing: 20 years of research.</article-title> Psychol Bull. <year>1998</year>;124(3):372-422. doi: <pub-id pub-id-type="doi" specific-use="author">10.1037/0033-2909.124.3.372</pub-id>.</mixed-citation></ref>
<ref id="b58"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Rayner</surname> <given-names>K.</given-names></name></person-group> <article-title>Eye movements and attention in reading, scene perception, and visual search.</article-title> Quarterly Journal of Experimental Psychology. <year>2009</year>;62(8):1457-506. doi: <pub-id pub-id-type="doi" specific-use="author">10.1080/17470210902816461</pub-id>.</mixed-citation></ref>
<ref id="b59"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Reichle</surname> <given-names>ED</given-names></name>, <name><surname>Pollatsek</surname> <given-names>A</given-names></name>, <name><surname>Fisher</surname> <given-names>DL</given-names></name>, <name><surname>Rayner</surname> <given-names>K</given-names></name></person-group>. <article-title>Toward a model of eye movement control in reading.</article-title> Psychol Rev. <year>1998</year>;105(1):125-57. doi: <pub-id pub-id-type="doi" specific-use="author">10.1037/0033-295X.105.1.125</pub-id>.</mixed-citation></ref>
<ref id="b60"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Reichle</surname> <given-names>ED</given-names></name>, <name><surname>Rayner</surname> <given-names>K</given-names></name>, <name><surname>Pollatsek</surname> <given-names>A</given-names></name></person-group>. <article-title>The E-Z Reader model of eye-movement control in reading: Comparisons to other models.</article-title> Behav Brain Sci. <year>2003</year>;26(4):445-+. doi: <pub-id pub-id-type="doi" specific-use="author">10.1017/S0140525X03000104</pub-id>.</mixed-citation></ref>
<ref id="b61"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Reingold</surname> <given-names>EM</given-names></name>, <name><surname>Stampe</surname> <given-names>DM</given-names></name></person-group>. <article-title>Using the saccadic inhibition paradigm to investigate saccadic control in reading.</article-title> Mind's Eye: Cognitive and Applied Aspects of Eye Movement Research. <year>2003</year>:347-60. doi: <pub-id pub-id-type="doi" specific-use="author">10.1016/B978-044451020-4/50020-7</pub-id>.</mixed-citation></ref>
<ref id="b62"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Reingold</surname> <given-names>EM</given-names></name>, <name><surname>Stampe</surname> <given-names>DM</given-names></name></person-group>. <article-title>Saccadic inhibition in reading.</article-title> J Exp Psychol Human. <year>2004</year>;30(1):194-211. doi: <pub-id pub-id-type="doi" specific-use="author">10.1037/0096-1523.30.1.194</pub-id>.</mixed-citation></ref>
<ref id="b63"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Rouder</surname> <given-names>JN</given-names></name>, <name><surname>Morey</surname> <given-names>RD</given-names></name></person-group>. <article-title>Default Bayes Factors for Model Selection in Regression.</article-title> Multivar Behav Res. <year>2012</year>;47(6):877-903. doi: <pub-id pub-id-type="doi" specific-use="author">10.1080/00273171.2012.734737</pub-id>.</mixed-citation></ref>
<ref id="b64"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Rouder</surname> <given-names>JN</given-names></name>, <name><surname>Morey</surname> <given-names>RD</given-names></name>, <name><surname>Speckman</surname> <given-names>PL</given-names></name>, <name><surname>Province</surname> <given-names>JM</given-names></name></person-group>. <article-title>Default Bayes factors for ANOVA designs.</article-title> J Math Psychol. <year>2012</year>;56(5):356-74. doi: <pub-id pub-id-type="doi" specific-use="author">10.1016/j.jmp.2012.08.001</pub-id>.</mixed-citation></ref>
<ref id="b65"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Salamé</surname> <given-names>P</given-names></name>, <name><surname>Baddeley</surname> <given-names>A.</given-names></name></person-group> <article-title>Effects of Background Music on Phonological Short-Term-Memory.</article-title> Q J Exp Psychol-A. <year>1989</year>;41(1):107-22. doi: <pub-id pub-id-type="doi" specific-use="author">10.1080/14640748908402355</pub-id>.</mixed-citation></ref>
<ref id="b66"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Schäfer</surname> <given-names>T</given-names></name>, <name><surname>Fachner</surname> <given-names>J</given-names></name></person-group>. <article-title>Listening to music reduces eye movements.</article-title> Atten Percept Psycho. <year>2015</year>;77(2):551-9. doi: <pub-id pub-id-type="doi" specific-use="author">10.3758/s13414-014-0777-1</pub-id>.</mixed-citation></ref>
<ref id="b67"><mixed-citation publication-type="book" specific-use="restruct"><person-group person-group-type="author"><name><surname>Spence</surname>, <given-names>C.</given-names></name>, &#x26; <name><surname>Driver</surname>, <given-names>J.</given-names></name></person-group> (<year>2004</year>). <source>Crossmodal space and crossmodal attention</source>. <publisher-name>Oxford University Press</publisher-name>.</mixed-citation></ref>
<ref id="b68"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Thompson</surname> <given-names>WF</given-names></name>, <name><surname>Schellenberg</surname> <given-names>EG</given-names></name>, <name><surname>Letnic</surname> <given-names>AK</given-names></name></person-group>. <article-title>Fast and loud background music disrupts reading comprehension.</article-title> Psychol Music. <year>2012</year>;40(6):700-8. doi: <pub-id pub-id-type="doi" specific-use="author">10.1177/0305735611400173</pub-id>.</mixed-citation></ref>
<ref id="b69"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Trukenbrod</surname> <given-names>HA</given-names></name>, <name><surname>Engbert</surname> <given-names>R</given-names></name></person-group>. <article-title>Oculomotor control in a sequential search task.</article-title> Vision Research. <year>2007</year>;47(18):2426-43. doi: <pub-id pub-id-type="doi" specific-use="author">10.1016/j.visres.2007.05.010</pub-id>.</mixed-citation></ref>
<ref id="b70"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Trukenbrod</surname> <given-names>HA</given-names></name>, <name><surname>Engbert</surname> <given-names>R</given-names></name></person-group>. <article-title>Eye movements in a sequential scanning task: Evidence for distributed processing.</article-title> J Vision. <year>2012</year>;12(1). doi: <pub-id pub-id-type="doi" specific-use="author">10.1167/12.1.5</pub-id>.</mixed-citation></ref>
<ref id="b71"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Trukenbrod</surname> <given-names>HA</given-names></name>, <name><surname>Engbert</surname> <given-names>R</given-names></name></person-group>. <article-title>ICAT: a computational model for the adaptive control of fixation durations.</article-title> Psychon B Rev. <year>2014</year>;21(4):907-34. doi: <pub-id pub-id-type="doi" specific-use="author">10.3758/s13423-013-0575-0</pub-id>.</mixed-citation></ref>
<ref id="b72"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Vitu</surname> <given-names>F</given-names></name>, <name><surname>O'Regan</surname> <given-names>JK</given-names></name>, <name><surname>Inhoff</surname> <given-names>AW</given-names></name>, <name><surname>Topolski</surname> <given-names>R</given-names></name></person-group>. <article-title>Mindless Reading - Eye-Movement Characteristics Are Similar in Scanning Letter Strings and Reading Texts.</article-title> Percept Psychophys. <year>1995</year>;57(3):352-64. doi: <pub-id pub-id-type="doi" specific-use="author">10.3758/BF03213060</pub-id>.</mixed-citation></ref>
<ref id="b73"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Walshe</surname> <given-names>RC</given-names></name>, <name><surname>Nuthmann</surname> <given-names>A</given-names></name></person-group>. <article-title>Asymmetrical control of fixation durations in scene viewing.</article-title> Vision Research. <year>2014</year>;100:38-46. doi: <pub-id pub-id-type="doi" specific-use="author">10.1016/j.visres.2014.03.012</pub-id>.</mixed-citation></ref>
<ref id="b74"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Wickens</surname> <given-names>CD</given-names></name></person-group>. <article-title>Multiple resources and mental workload.</article-title> Hum Factors. <year>2008</year>;50(3):449-55. doi: <pub-id pub-id-type="doi" specific-use="author">10.1518/001872008X288394</pub-id>.</mixed-citation></ref>
<ref id="b75"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Wilson</surname> <given-names>M.</given-names></name></person-group> <article-title>Six views of embodied cognition.</article-title> Psychon B Rev. <year>2002</year>;9(4):625-36. doi: <pub-id pub-id-type="doi" specific-use="author">10.3758/BF03196322</pub-id>.</mixed-citation></ref>
<ref id="b76"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Wühr</surname> <given-names>P</given-names></name>, <name><surname>Huestegge</surname> <given-names>L.</given-names></name></person-group> <article-title>The Impact of Social Presence on Voluntary and Involuntary Control of Spatial Attention.</article-title> Soc Cognition. <year>2010</year>;28(2):145-60. doi: <pub-id pub-id-type="doi" specific-use="author">10.1521/soco.2010.28.2.145</pub-id>.</mixed-citation></ref>
<ref id="b77"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Yang</surname> <given-names>SN</given-names></name>, <name><surname>McConkie</surname> <given-names>GW</given-names></name></person-group>. <article-title>Eye movements during reading: a theory of saccade initiation times.</article-title> Vision Research. <year>2001</year>;41(25-26):3567-85. doi: <pub-id pub-id-type="doi" specific-use="author">10.1016/S0042-6989(01)00025-6</pub-id>.</mixed-citation></ref>
<ref id="b78"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Zatorre</surname> <given-names>RJ</given-names></name>, <name><surname>Chen</surname> <given-names>JL</given-names></name>, <name><surname>Penhune</surname> <given-names>VB</given-names></name></person-group>. <article-title>When the brain plays music: auditory-motor interactions in music perception and production.</article-title> Nature Reviews Neuroscience. <year>2007</year>;8(7):547-58. doi: <pub-id pub-id-type="doi" specific-use="author">10.1038/nrn2152</pub-id>.</mixed-citation></ref>
</ref-list>
<fn-group>
  <fn id="fn1">
    <p>In music theory, the tonic represents the fundamental key of a
    musical piece. The dominant and subdominant are the fifth and fourth
    degrees of a diatonic scale, respectively. Together, this sequence
    represents the very simple harmonic progression of a conventional
    tonal piece of music, e.g., a pop song.</p>
  </fn>
  <fn id="fn2">
    <p>Note that exact ranges correspond to
    10<sup>0</sup>–10<sup>1/2</sup>, 10<sup>1/2</sup>–10<sup>1</sup>,
    10<sup>1</sup>–10<sup>2</sup>, and &#x3E;10<sup>2</sup>.</p>
  </fn>
</fn-group>
</back>
</article>
