<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.0 20120330//EN" "JATS-journalpublishing1.dtd">

<article article-type="research-article" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:mml="http://www.w3.org/1998/Math/MathML">
 <front>
    <journal-meta>
	<journal-id journal-id-type="publisher-id">Jemr</journal-id>
      <journal-title-group>
        <journal-title>Journal of Eye Movement Research</journal-title>
      </journal-title-group>
      <issn pub-type="epub">1995-8692</issn>
	  <publisher>								
	  <publisher-name>Bern Open Publishing</publisher-name>
	  <publisher-loc>Bern, Switzerland</publisher-loc>
	</publisher>
    </journal-meta>
    <article-meta>
	<article-id pub-id-type="doi">10.16910/jemr.11.2.8</article-id> 
	  <article-categories>								
				<subj-group subj-group-type="heading">
					<subject>Research Article</subject>
				</subj-group>
		</article-categories>
      <title-group>
        <article-title>Eye movements in scene perception while listening to slow and fast music</article-title>
      </title-group>
	   <contrib-group> 
				<contrib contrib-type="author">
					<name>
						<surname>Franěk</surname>
						<given-names>Marek</given-names>
					</name>
					<xref ref-type="aff" rid="aff1">1</xref>
				</contrib>
				<contrib contrib-type="author">
					<name>
						<surname>Šefara</surname>
						<given-names>Denis</given-names>
					</name>
					<xref ref-type="aff" rid="aff1">1</xref>
				</contrib>	
				<contrib contrib-type="author">
					<name>
						<surname>Petružálek</surname>
						<given-names>Jan</given-names>
					</name>
					<xref ref-type="aff" rid="aff1">1</xref>
				</contrib>	
				<contrib contrib-type="author">
					<name>
						<surname>Mlejnek</surname>
						<given-names>Roman</given-names>
					</name>
					<xref ref-type="aff" rid="aff2">2</xref>
				</contrib>
 				<contrib contrib-type="author">
					<name>
						<surname>van Noorden</surname>
						<given-names>Leon</given-names>
					</name>
					<xref ref-type="aff" rid="aff3">3</xref>
				</contrib>	       	               			
        <aff id="aff1">
		<institution>University of Hradec Králové, Hradec Králové</institution>,   <country>Czech Republic</country>
        </aff>       	               			
        <aff id="aff2">
		<institution>The Prague Conservatoire, Prague</institution>,   <country>Czech Republic</country>
        </aff>     	               			
        <aff id="aff3">
		<institution>Ghent University, Ghent</institution>,   <country>Belgium</country>
        </aff>                
		</contrib-group>   

		
	  <pub-date date-type="pub" publication-format="electronic"> 
		<day>11</day>  
		<month>8</month>
        <year>2018</year>
      </pub-date>
	  <pub-date date-type="collection" publication-format="electronic"> 
	  <year>2018</year>
	</pub-date>
      <volume>11</volume>
      <issue>2</issue>
	 <elocation-id>10.16910/jemr.11.2.8</elocation-id> 
	<permissions> 
	<copyright-year>2018</copyright-year>
	<copyright-holder>Franěk, M., Šefara, D., Petružálek, J., Mlejnek, R., &#x26; van Noorden, L.</copyright-holder>
	<license license-type="open-access">
  <license-p>This work is licensed under a Creative Commons Attribution 4.0 International License, 
  (<ext-link ext-link-type="uri" xlink:href="https://creativecommons.org/licenses/by/4.0/">
    https://creativecommons.org/licenses/by/4.0/</ext-link>), which permits unrestricted use and redistribution provided that the original author and source are credited.</license-p>
</license>
	</permissions>
      <abstract>
        <p>To date, there is insufficient knowledge of how visual exploration of outdoor scenes may be influenced by the simultaneous processing of music. We measured eye movements while participants viewed various outdoor scenes and listened to music at either a slow or a fast tempo, or sat in silence. Significantly shorter fixations were found for viewing urban scenes compared with natural scenes, but there was no interaction between the type of scene and the acoustic conditions. The results revealed fixation durations in the silent control condition that were shorter, in the range of 30 ms, than in both music conditions but, in contrast to previous studies, these differences were non-significant. Moreover, we did not find differences in eye movements between the slow-tempo and fast-tempo music conditions. We suppose that the type of musical stimuli, the specific tempo, the specific experimental procedure, and the engagement of participants in listening to background music while processing visual information may be important factors that influence attentional processes, which are manifested in eye-movement behavior.</p>
      </abstract>
      <kwd-group>
        <kwd>Eye movement</kwd>
        <kwd>eye tracking</kwd>
        <kwd>attention</kwd>
        <kwd>music listening</kwd>
        <kwd>music tempo</kwd>
        <kwd>scene perception</kwd> 
        <kwd>background music</kwd>              
      </kwd-group>
    </article-meta>
  </front>	
  <body>

    <sec id="S1">
      <title>Introduction</title>


<p>Music can be listened to during many everyday activities. North,
Hargreaves, and Hargreaves [<xref ref-type="bibr" rid="b1">1</xref>] documented that roughly half of
participants’ musical experiences occurred within the home, although
approximately 18% of musical experiences occurred in public spaces. Some
people listen to music all the time while walking, biking, or driving.
In these situations, people must combine visual exploration of outdoor
scenes with the processing of auditory stimuli (i.e., music). Wearing large
studio headphones or earbuds that effectively silence all surrounding
noise may create various dangerous situations: people need to be able to
process information from the surrounding environment to avoid collisions
with other people, cars, and various objects, or simply to stay oriented
on a route. To date, there is insufficient knowledge of how
this process works in real situations and to what extent the processing
of acoustic information may restrict perception of visual information
from the surrounding environment.</p>

<p>Heye and Lamont [<xref ref-type="bibr" rid="b2">2</xref>], in their study based on subjective reports of
participants, propose that listeners create an “auditory bubble” while
walking or travelling and listening to music and that they readily
switch between inside (music) and outside (surrounding) worlds. In these
listening situations, instead of being directed to the surrounding
world, attention might be directed to thoughts, memories, and emotions
elicited by the music. Music turns people’s attention away from the
environment toward inward experiences [<xref ref-type="bibr" rid="b3 b4 b5">3, 4, 5</xref>]. In this context, our
previous study [<xref ref-type="bibr" rid="b6">6</xref>] investigated whether listening to music might mask the
effects of the visual characteristics of a walking route on walking speed.
Previous investigations revealed that people walking outdoors
spontaneously react to various features of the surrounding environment,
which results in a fluctuation of their actual walking speed [<xref ref-type="bibr" rid="b7 b8">7, 8</xref>]. It
was suggested that the absence of typical changes in walking speed at
specific points on the route while listening to music reflects
individuals not paying adequate attention to the environmental features
of a location. The results revealed that music masked the influence of
the surrounding environment only to some extent. Fluctuations in walking
speed at specific locations on the route still appeared but were smaller
compared to no-music conditions. Thus, the study demonstrated that
visual exploration of the surrounding environment may be affected by
simultaneously listening to music.</p>

<p>Gaining deeper insight into these processes will require more precise
knowledge of how visual exploration of outdoor scenes may be influenced
by the simultaneous processing of music. Clearly, analysis of eye
movements can provide a better understanding of this problem because
research in cognitive neuroscience shows that eye movements are closely
linked to visual attention processes [<xref ref-type="bibr" rid="b9 b10 b11">9, 10, 11</xref>]. Although there are numerous
studies of eye movements and various musical activities in the field of
sight-reading research [e.g.,<xref ref-type="bibr" rid="b12 b13 b14 b15">12, 13, 14, 15</xref>], analysis of eye movements in the
perception of outdoor scenes while listening to music has started to be
investigated only recently. To date, there have been only two relevant
eye-tracking studies [<xref ref-type="bibr" rid="b16 b17">16, 17</xref>]; however, they investigated different
facets of this problem. Schäfer and Fachner [<xref ref-type="bibr" rid="b17">17</xref>] were interested in the
attentional shift from visual perception of outdoor scenes to absorption
in music, while Maróti and her colleagues [<xref ref-type="bibr" rid="b16">16</xref>] investigated changes in
eye movement parameters (such as fixation duration, saccade duration,
saccade amplitude, and saccade number) in relation to musical tempo.</p>

<p>The study by Schäfer and Fachner [<xref ref-type="bibr" rid="b17">17</xref>] provided new information on how
eye movements can be influenced by participants simultaneously listening
to music while observing outdoor scenes. They examined eye movements of
participants viewing a picture (a house by the sea) or a film clip (a
videotaped road trip on an empty road through an open landscape) while
listening to familiar music, unknown music, or in silence. Popular
music, with a tempo of approximately 120 beats per min (bpm), was used
as the acoustic stimulus. The authors proposed that listening to music
elicits an attentional shift from the outer to the inner world (provoked
by absorption in music and the emotions and memories it evoked),
resulting in lower eye activity. Their data indicated that music
significantly reduces eye-movement activity, with participants
exhibiting longer fixations, fewer saccades, and more blinks when they
listened to music than when they sat in silence. As a possible
explanation, they discussed either an attentional shift from the outer
to the inner world or simply a shift of attention from visual to
auditory input.</p>

<p>Studies investigating eye movements with different types of auditory
stimuli show various results. Coutrot, Guyader, Ionescu, &#x26; Caplier
[<xref ref-type="bibr" rid="b18">18</xref>] investigated the viewing of videos with and without their original
soundtracks and found longer fixation durations and longer saccade
amplitudes in an audio–visual condition than in a visual one. In the
study by Maróti et al. [<xref ref-type="bibr" rid="b16">16</xref>], participants listened to drum sequences
while viewing various natural scenes. It was found that fixation
durations and average intersaccade intervals were longer for the drum
sequence conditions compared with the quiet condition. On the contrary,
Song, Pellerin, and Granjon [<xref ref-type="bibr" rid="b19">19</xref>] reported shorter fixation durations in
an audio–visual (sounds, music) condition compared to a visual condition
while observing video excerpts. The most recent study by Lange,
Pieczykolan, Trukenbrod, and Huestegge [<xref ref-type="bibr" rid="b20">20</xref>] showed faster total reading
time and faster reading completion time in the music condition compared
with the silence condition. In contrast, some eye-tracking studies that investigated
reading with presentation of auditory stimuli revealed no effect of
background music. The study by Cauchard, Cane, and Weger [<xref ref-type="bibr" rid="b21">21</xref>] examined
the influence of background speech and music on overall reading time
(the summed fixation durations) using an eye-movement paradigm and found
that background music had no effect on reading process or on eye
movements while reading. Similarly, Johansson, Holmqvist, Mossberg, and
Lindgren [<xref ref-type="bibr" rid="b22">22</xref>] found no significant differences between acoustic
conditions (preferred or non-preferred music, noise from a café,
silence) for the eye movement measures (fixation duration, saccadic
amplitude) during reading.</p>

<p>In investigating eye movements while viewing outdoor scenes together
with listening to music, we must take into account that recent research
in environmental psychology has revealed that certain environmental
features have different effects on eye-fixation behavior. Dupont,
Antrop, and Van Eetvelde [<xref ref-type="bibr" rid="b23">23</xref>] investigated eye movements while viewing
photographs of various landscape types in Belgium that differed in
degree of openness and heterogeneity. They found a greater number of
fixations in enclosed compared to open landscapes and a greater number
of fixations in heterogeneous than in homogeneous landscapes. More
importantly, eye-tracking studies by Berto, Massaccesi, and Pasini [<xref ref-type="bibr" rid="b24">24</xref>]
and Valtchanov and Ellard [<xref ref-type="bibr" rid="b25">25</xref>] reported reduced eye-movement activity in
terms of number of fixations and eye-travel distance while viewing
photographs of natural scenes in contrast to urban scenes. Similar
results were found in our recent study [<xref ref-type="bibr" rid="b26">26</xref>]. Although the mentioned
studies were conducted in a laboratory, similar effects have been shown
for viewing real natural environments and various forms of surrogate
nature [<xref ref-type="bibr" rid="b27">27</xref>]. The difference between viewing natural and built scenes was
explained in terms of better perceptual fluency while processing natural
scenes compared to urban scenes [<xref ref-type="bibr" rid="b28">28</xref>]. Evidence is emerging that fractal
complexity, typical of natural scenes, may be a source of perceptual
fluency [<xref ref-type="bibr" rid="b29">29</xref>]. Fractals capture order and structure by the repetition of
similar visual information across multiple scale levels. Thus, walking
with music in an urban environment might require more attentional
resources, not only because of the necessity to observe traffic and the
movement of other pedestrians, both of which are more common in urban
than natural environments, but also because a built environment is more
difficult to visually process than a natural one [e.g., <xref ref-type="bibr" rid="b30">30</xref>].</p>

<p>To sum up, the previous studies [<xref ref-type="bibr" rid="b24 b25">24, 25</xref>] found greater visual
exploration while observing urban scenes compared to natural scenes,
which is manifested in shorter fixation durations, longer eye travel
distance, and a higher number of fixations. It is suggested that
processing of urban scenes requires greater visual exploration, compared
to natural scenes, because they are more difficult to process. The
investigations of processing outdoor scenes while listening to music
[<xref ref-type="bibr" rid="b16 b17">16, 17</xref>] observed significantly longer fixations while listening to music
compared to the no-music condition, which suggests that listening to
music while viewing visual images requires additional attentional
resources. Thus, there is a question of how these two processes interact
with each other, and whether the effect of background music on
eye-movement behavior while viewing a scene in comparison to a silent
control may be modulated by the type of scene (urban vs. natural).</p>

<p>A further question is whether music tempo might affect the motor
system of gaze control. Our study was designed to explore the effect of
fast-tempo music in contrast to slow-tempo music and a no-music
condition on eye movements. There is a large body of research
documenting how certain temporal properties of music (such as rhythm or
tempo) induce motor processes in a listener [for review, <xref ref-type="bibr" rid="b31 b32">31, 32</xref>]. The
tempo of music can influence the speed of movements in various
behavioral domains. For instance, the tempo of music can influence
walking speed [e.g., <xref ref-type="bibr" rid="b6 b33 b34">6, 33, 34</xref>] and the speed of various sports activities
[<xref ref-type="bibr" rid="b35 b36 b37 b38">35, 36, 37, 38</xref>]. Studies from the field of consumer psychology suggest that the
tempo of in-store music might influence visual exploration and the
consequent process of consumer decision-making [e.g., <xref ref-type="bibr" rid="b39 b40">39, 40</xref>]. Music can
also influence mood and arousal level before/during sports performance
[<xref ref-type="bibr" rid="b41 b42 b43 b44">41, 42, 43, 44</xref>], resulting in an improved performance. Karageorghis, Terry, and
Lane [<xref ref-type="bibr" rid="b45">45</xref>] developed a conceptual framework for predicting the
motivational qualities of music in exercise and sports environments. The
authors argue that some types of music motivate bodily movements
(namely, in sports), while others do not. To strengthen the effect of
tempo, in selecting musical stimuli, we employed Karageorghis’s concept
of motivational and non-motivational music [<xref ref-type="bibr" rid="b45">45</xref>]. Both types of music
were used in our previous study [<xref ref-type="bibr" rid="b6">6</xref>], in which participants were asked to
walk with music along an urban route. The study showed that fast
motivational music accelerated walking speed, whereas slow music
decreased walking speed, in contrast to common popular music of diverse
tempi.</p>

<p>Because eye-movement behavior is inherently rhythmic, an important
question is the possible synchronization between eye movements and a
musical beat, which may also have some implications for visual
processing of perceived scenes. This line of research is based on the
Dynamic Attending Theory [<xref ref-type="bibr" rid="b46">46</xref>]. According to this theory, the brain works
on the basis of internal neural oscillations that are capable of
entraining to external events and targeting attentional energy to
expected points in time. In accordance with this theory, it can be
expected that if visual attention synchronizes to the rhythm of musical
beats, the rhythm of eye movements would align with the musical beats as
well [<xref ref-type="bibr" rid="b16">16</xref>]. Maróti and her colleagues [<xref ref-type="bibr" rid="b16">16</xref>] investigated the effects of
music tempo (drum grooves) on the eye movements of participants viewing
various natural scenes. They used drum grooves with either 102 or 144
bpm. The results revealed that slow musical beats retarded sampling of
visual information. Fixation durations significantly increased at the
lower beat frequency compared to the higher beat frequency and to the
no-music condition. Although the study revealed modulation of eye
movements by a musical beat, it did not find evidence for entrainment.
Consistent with Maróti et al.’s [<xref ref-type="bibr" rid="b16">16</xref>] findings, Lange et al. [<xref ref-type="bibr" rid="b20">20</xref>] used
basic musical stimuli (bass drum, synthesized sequence of chords) and
reported that increased tempo of the musical stimuli (80–140 bpm)
accelerated eye fixations during text reading (but did not affect a
visual scanning task). However, the most recent study by Plöchl and
Obleser [<xref ref-type="bibr" rid="b47">47</xref>] provides contradictory findings. Their participants viewed
either a blank gray screen or a scene from a picture book while
listening to either isochronous or irregular auditory clicks at
different temporal frequencies between 180 and 300 bpm. Auditory
stimulation, however, had no significant impact on saccade frequency or
timing, either under rhythmic or arrhythmic conditions. Although these
mentioned studies used different visual stimuli and auditory stimuli of
different tempi and complexity, it can be concluded that the effect of
speed of auditory beats on the gaze control motor system needs further
clarification.</p>

<p>The aim of the present study was to investigate eye movements while
viewing urban or natural scenes while listening to two different types
of music, namely fast music that motivates bodily movements and slow
music. In view of the above literature, we expect that listening to
music while viewing outdoor scenes will increase fixation durations
compared to observing the scenes without music. Further, the effect of
the environmental features of the observed scene and the interaction
between the type of scene and music condition will be examined. We
expect fixation durations to decrease while viewing urban vs. natural
scenes because of the lower perceptual fluency of urban scenes
compared to natural scenes. We also expect that fixation
durations will increase when music is presented. Finally, there is a
question of whether music tempo will modulate the speed of eye
movements. We predict that increased musical tempo will decrease the
duration of fixations.</p>
    </sec>

    <sec id="S2">
      <title>Methods</title>
    <sec id="S2a">
      <title>Participants</title>

<p>Ninety-eight undergraduates participated in the study. The students
were young adults aged 18–25 years (<italic>M</italic> age = 21.02 yr.,
<italic>SD</italic> = 1.30) and included 50 men and 48 women. They were
recruited from a range of fields of study (informatics, financial
management, tourism) at the University of Hradec Králové. None of the
participants had formal musical training. They were compensated by
partial course credit. Ethical approval for the experiment was obtained
from the Department of Management at the University of Hradec Králové.
None of the participants had a self-reported visual disorder or problem.
The participants provided written informed consent in which they
declared that they were voluntarily participating in the experiment and
that they were informed about the experimental procedure.</p>
        </sec>

    <sec id="S2b">
      <title>Design</title>

<p>A between-subjects design was employed. Participants viewed pictures
under three conditions: no music, fast music, and slow music. Two
measures of eye movements were selected as the dependent variables: (a)
the mean duration of all fixations (in ms) within an image and (b) the
mean number of fixations within an image. Although the second measure is
redundant, it might be interesting for comparison with other
studies.</p>
        </sec>

    <sec id="S2c">
      <title>Eye-Tracking Equipment and Measures</title>


<p>The experiment was controlled by a computer with a 1366 × 768
pixel resolution screen and a 38 cm diagonal. Eye movements were
recorded binocularly using a Tobii X2-60 eye tracker with a sampling
rate of 60 Hz. The apparatus tracks both eyes simultaneously and
automatically determines which eye is left and which is right regardless
of head pose and blinking. The binocular data were averaged across eyes.
The eye-tracking device was attached under the monitor of a laptop. The
device and presentation of stimuli, as well as the data processing, were
controlled by the Tobii Studio Version 3.2 software. Two measures of eye
movements were used: the mean duration of fixations and the mean number
of fixations.</p>
        </sec>

    <sec id="S2d">
      <title>Materials</title>

<p><bold>Images.</bold> All images used in this study were taken by
one of the authors using a digital camera (Nikon D90) with a wide-angle
zoom lens. They were composed of images of cities or natural scenery
(see Figure 1). All images were transformed to 1152 × 768 pixel
resolution using the Adobe Photoshop CS 6 software. Each image was
individually optimized. The images had their brightness levels and
contrast balanced using the “Auto Levels”, “Auto Contrast”, and “Auto
Colors” options in Adobe Photoshop. Twelve images were photographs from
cities in the Czech Republic (Beroun, Chlumec nad Cidlinou, Prague,
Uherský Brod), Belgium (Brussels) and the United States (Seattle). The
urban scenes had a diverse character. Some of them contained high-rise
buildings (Seattle), whereas others were streets with typical urban
buildings from the 19th century (Brussels, Prague) or streets with
low-rise buildings typical of small towns (Beroun, Chlumec nad
Cidlinou). The next 12 images with natural scenes were taken in the
Czech Republic. They consisted of conifer or deciduous forests, meadows,
or ponds. There were no people in any of these images.</p>

<fig id="fig01" fig-type="figure" position="float">
					<label>Figure 1.</label>
					<caption>
						<p>Examples of stimulus material. A, B - urban scenes, C, D -
natural scenes.</p>
					</caption>
					<graphic id="graph01" xlink:href="jemr-11-02-h-figure-01.png"/>
				</fig>

<p><bold>Music.</bold> To prevent the participants from being influenced by the lyrics of the
songs, we chose songs in English: although all participants knew some
English, as non-native speakers they found the words of the songs
difficult to follow. The musical stimuli were borrowed from our previous
study [<xref ref-type="bibr" rid="b6">6</xref>]. In this study, participants were asked to select and submit
two files of different types of music that they liked. The first type of
music, motivational music, was characterized as “Music that gives me a
strong urge to move in one way or the other”, while the second type of
music, non-motivational music, was described as “Nice music, but with no
strong urge to move”. Then, they were asked to evaluate the motivational
characteristics of these collected musical files, which were made
available on a network disk using the Czech version of the Brunel Music
Rating Inventory-2 [<xref ref-type="bibr" rid="b48">48</xref>]. Based on evaluation using the Brunel Music
Rating Inventory-2, musical pieces with the highest motivational
character (fast motivational music) and additional pieces with the
lowest motivational character (slow, non-motivational music) were
selected. For the purposes of the present study, we chose two songs from
this selection. The song “One Fine Day” (The Offspring, album Conspiracy
of One, 2000) with a tempo of 187 bpm was used as the motivational
music. The piece lasted 2 min and 45 s. There was a short half time
section (about 20 s) in the song, which repeated three times during the
experimental session. The song “Deadmen’s Gun” (Ashtar Command, video
game Red Dead Redemption, 2010) with a tempo of 69 bpm was used as the
non-motivational music. The piece lasted 3 min and 1 s. Using the
VirtualDJ software, a long seamless loop was created from each song in
order to achieve as homogeneous a musical accompaniment as possible. The
software automatically calculated the bpm; a multiple-beat loop
(more than a hundred beats long) was then created and shifted to a
suitable position in the track’s mid-section to make the transition from
the loop end to the loop start as seamless as possible. The regularity of
musical beat was preserved. Participants heard only three transitions
between repetitions of the song. The music was presented via headphones
and its loudness was adjusted to a comfortable level.</p>
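<p>For reference, the two tempi reported above imply very different inter-beat intervals; a minimal arithmetic sketch (variable names are ours, not part of the study):</p>

```python
# Inter-beat intervals implied by the two musical stimuli described above.
# One beat lasts 60,000 ms divided by the tempo in beats per minute.
tempi_bpm = {"fast motivational (One Fine Day)": 187,
             "slow non-motivational (Deadmen's Gun)": 69}

intervals_ms = {name: 60_000 / bpm for name, bpm in tempi_bpm.items()}
for name, ibi in intervals_ms.items():
    print(f"{name}: {ibi:.0f} ms per beat")
# → fast motivational (One Fine Day): 321 ms per beat
# → slow non-motivational (Deadmen's Gun): 870 ms per beat
```

The fast stimulus thus places beats roughly as often as typical fixations occur, while the slow stimulus spaces them almost three times further apart.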
        </sec>

    <sec id="S2e">
      <title>Procedure</title>

<p>The participants were randomly assigned to a specific condition.
There were 17 males and 17 females in the no-music condition. In the
fast-music condition, there were 17 males and 17 females, and in the
slow-music condition, there were 16 males and 14 females. The following
instruction was given to the participants: “You will take part in a
study in which you will successively examine a series of images
presented on the computer screen. While doing so, you will listen to
music on headphones. View each image carefully.” Next, every subject
underwent an eye-tracking calibration. Each participant was calibrated
once using nine calibration points. The precision of fixations during
calibration was evaluated, and participants were recalibrated if
necessary to reach sufficient accuracy. The participants sat approximately 70 cm from the
display monitor. The visual angle was 17–22 degrees.</p>

<p>There were 24 images within the experimental session. Twelve images
represented urban scenes and another twelve represented natural scenes.
The images were presented in a random order. Every trial started with a
fixation cross situated in the center of the screen on a light gray
background. The initial fixation cross served as a fixation check. The
participants had to fixate on the fixation cross for 2 s before the
image appeared. The first fixation initiated on the cross was not included in the analysis. Each
image was displayed for 15 s. The whole experimental session lasted 6
min and 48 s. To simulate real situations when people listen to
background music while doing different tasks, in both music conditions
the music loop began to play the moment the first slide with the
experimental instructions appeared on the computer screen and continued
without interruption until the moment when all required tasks were
completed. Presentation of stimuli was controlled by Tobii Studio
Version 3.2 software.</p>
    </sec>
    </sec>

     <sec id="S3">
      <title>Results</title>       
     <sec id="S3a">
      <title>Fixation Durations</title> 


<p>The mean fixation durations were calculated for each image and then
averaged for type of scene (nature vs. urban) and then averaged across
participants (Table 1). There was homogeneity of variances
(<italic>p</italic> &#x3E; 0.05) and covariances (<italic>p</italic> &#x3E;
0.05), as assessed by Levene's test of homogeneity of variances and
Box's M test, respectively. A mixed analysis of variance (ANOVA) was
conducted to assess the effects of the type of scene and music condition
on the mean fixation durations. The type of scene (nature, urban) was
chosen as the within-subject factor, music condition (fast music, slow
music, no music) was chosen as the between-subject factor, and the mean
fixation duration was the dependent variable. The ANOVA indicated a
statistically significant effect of the type of scene,
<italic>F</italic>(1, 95) = 28.970, <italic>p</italic> &#x3C; .001,
η<sup>2</sup> = 0.243, but not a statistically significant effect of
music condition (fast music, slow music, no music),
<italic>F</italic>(2, 95) = 1.539, <italic>p</italic> = 0.220,
η<sup>2</sup> = 0.031. There was no statistically significant
interaction between type of scene and music condition,
<italic>F</italic>(2, 95) = 0.258, <italic>p</italic> = 0.773,
η<sup>2</sup> = 0.005. The results showed that the mean fixation
duration was significantly longer in the natural scenes than in the
urban scenes but did not reveal significant differences among music
conditions.</p>
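<p>As an illustration of the analysis logic only (not the authors’ actual pipeline, which used a full 3 × 2 mixed ANOVA), the two main effects can be approximated on simulated data with simple tests; all numbers and names below are hypothetical:</p>

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical per-participant mean fixation durations (ms), mimicking the
# design: 3 music conditions (between-subjects), 2 scene types (within).
group_sizes = {"no_music": 34, "fast": 34, "slow": 30}
data = {}
for cond, size in group_sizes.items():
    urban = rng.normal(365, 60, size)           # urban scenes
    nature = urban + rng.normal(45, 40, size)   # natural scenes: longer fixations
    data[cond] = np.column_stack([urban, nature])

# Within-subject effect (scene type): paired t-test, pooled over conditions.
urban_all = np.concatenate([d[:, 0] for d in data.values()])
nature_all = np.concatenate([d[:, 1] for d in data.values()])
t, p_scene = stats.ttest_rel(nature_all, urban_all)

# Between-subject effect (music condition): one-way ANOVA on subject means.
subject_means = [d.mean(axis=1) for d in data.values()]
F, p_music = stats.f_oneway(*subject_means)

print(f"scene effect: t = {t:.2f}, p = {p_scene:.4g}")
print(f"music effect: F = {F:.2f}, p = {p_music:.4g}")
```

With the simulated within-subject effect of about 45 ms, the paired test on the scene factor is reliably significant, while the three identically generated groups yield no music-condition effect, mirroring the pattern of results reported above.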
        </sec>

     <sec id="S3b">
      <title>Number of fixations</title> 

<p>The mean number of fixations was calculated for each image, and then
averaged for type of scene (nature vs. urban) and then averaged across
participants (Table 1). There was homogeneity of variances
(<italic>p</italic> &#x3E; .05) and covariances (<italic>p</italic> &#x3E;
.05), as assessed by Levene's test of homogeneity of variances and Box's
M test, respectively. A mixed ANOVA was conducted to assess the effects
of the type of scene and music condition on the mean number of
fixations. The type of scene (nature, urban) was chosen as the
within-subject factor, music condition (fast music, slow music, no
music) was chosen as the between-subject factor, and the mean number of
fixations was the dependent variable. The ANOVA indicated a
statistically significant effect of the type of scene,
<italic>F</italic>(1, 95) = 58.761, <italic>p</italic> &#x3C; .001,
η<sup>2</sup> = 0.382, but not a statistically significant effect of
music condition (fast music, slow music, no music),
<italic>F</italic>(2, 95) = 1.973, <italic>p</italic> = 0.145,
η<sup>2</sup> = 0.040. There was no statistically significant
interaction between type of scene and music condition,
<italic>F</italic>(2, 95) = 0.153, <italic>p</italic> = 0.858,
η<sup>2</sup> = 0.003. The results showed that the number of fixations
was significantly greater in urban scenes than in natural scenes but did
not reveal significant differences among music conditions.</p>


<table-wrap id="t01" position="float">
					<label>Table 1.</label>
					<caption>
						<p>Mean scores for the fixation durations in milliseconds and
the number of fixations for each experimental condition (no music, slow
music, fast music) and the type of scene (urban, nature).</p>
					</caption>
					<table frame="hsides" rules="groups" cellpadding="3">

    <thead>
      <tr>
        <th></th>
        <th colspan="2">Urban scenes</th>
        <th colspan="2">Natural scenes</th>

      </tr>
    </thead>
    <tbody>
      <tr>
        <td></td>
        <td>Mean</td>
        <td>SD</td>
        <td>Mean</td>
        <td>SD</td>
      </tr>
      <tr>
        <td></td>
        <td colspan="4">Mean fixation durations</td>

      </tr>
      <tr>
        <td>No music</td>
        <td>348</td>
        <td>59.65</td>
        <td>385</td>
        <td>99.16</td>
      </tr>
      <tr>
        <td>Fast music</td>
        <td>369</td>
        <td>63.77</td>
        <td>420</td>
        <td>131.77</td>
      </tr>
      <tr>
        <td>Slow music</td>
        <td>378</td>
        <td>63.70</td>
        <td>426</td>
        <td>137.09</td>
      </tr>
      <tr>
        <td></td>
        <td colspan="4">Mean number of fixations</td>

      </tr>
      <tr>
        <td>No music</td>
        <td>41.85</td>
        <td>5.13</td>
        <td>38.99</td>
        <td>6.83</td>
      </tr>
      <tr>
        <td>Fast music</td>
        <td>39.75</td>
        <td>5.33</td>
        <td>36.71</td>
        <td>7.11</td>
      </tr>
      <tr>
        <td>Slow music</td>
        <td>38.20</td>
        <td>5.40</td>
        <td>36.38</td>
        <td>7.44</td>
      </tr>
    </tbody>
  </table>
</table-wrap>
        </sec>

     <sec id="S3c">
      <title>Temporal evolution of mean fixation durations during stimulus
    presentation</title>

<p>In the next step, the temporal evolution of mean fixation durations
was analyzed to examine whether the effect of music on fixation
durations might change over time during stimulus presentation. We
separately calculated results in the three time windows over the course
of the visual stimulus presentation: 0–5 s, 5–10 s, and 10–15 s. The
mean fixation durations are listed in Table 2 for all time windows
separately.</p>

<p>A three-way mixed ANOVA was conducted to assess the effects of the
music condition (fast music, slow music, no music), the type of scene
(urban, nature) and the time interval (time window) of visual stimulus
presentation (0–5 s, 5–10 s, 10–15 s) on fixation durations. There was
homogeneity of variances, as assessed by Levene's test for equality of
variances (<italic>p</italic> &#x3E; .05). The Greenhouse-Geisser
correction was applied where the assumption of sphericity was violated,
as assessed by Mauchly’s test of sphericity.</p>
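<p>As an illustration, the Greenhouse-Geisser correction simply scales both degrees of freedom by the sphericity estimate ε. In the sketch below, the uncorrected within-subject dfs (2, 190) and ε ≈ 0.632 are our inference from the corrected dfs reported for the time-interval effect; they are not stated explicitly in the paper:</p>

```python
def gg_corrected_dfs(df_effect, df_error, epsilon):
    """Greenhouse-Geisser correction: scale both the effect and error
    degrees of freedom by the sphericity estimate epsilon."""
    return df_effect * epsilon, df_error * epsilon

# With 3 time windows, the uncorrected within-subject dfs are (2, 190);
# epsilon of about 0.632 reproduces the reported corrected dfs
# F(1.264, 120.08).
df1, df2 = gg_corrected_dfs(2, 190, 0.632)
```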

<p>The ANOVA indicated a statistically significant within-subjects main
effect of the type of scene, <italic>F</italic>(1, 95) = 28.977,
<italic>p</italic> &#x3C; .001, η<sup>2</sup> = 0.234, and a
statistically significant within-subjects main effect of the time
interval, <italic>F</italic>(1.264, 120.077) = 43.809,
<italic>p</italic> &#x3C; .001, η<sup>2</sup> = 0.316, but a
non-significant between-subjects main effect of music condition,
<italic>F</italic>(2, 95) = 0.855, <italic>p</italic> = .429,
η<sup>2</sup> = 0.018. There was a statistically significant two-way
interaction between the type of scene and the time interval,
<italic>F</italic>(1.406, 133.606) = 12.808, <italic>p</italic> &#x3C;
.001, η<sup>2</sup> = 0.119. There was no statistically significant
three-way interaction between the type of scene, music condition, and
the time interval, <italic>F</italic>(2.813, 133.606) = 0.440,
<italic>p</italic> = .712, η<sup>2</sup> = 0.009. A post hoc analysis
with a Bonferroni adjustment
showed that the fixation durations were significantly shorter in the 0–5
s interval than in the 5–10 s interval (<italic>p</italic> &#x3C; .001),
in the 5–10 s interval than the 10–15 s interval (<italic>p</italic>
&#x3C; .05), and in the 0–5 s interval than in 10–15 s interval
(<italic>p</italic> &#x3C; .001).</p>
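<p>The Bonferroni adjustment used for these pairwise comparisons multiplies each uncorrected p-value by the number of comparisons (here, three pairs of time windows), capping the result at 1. A minimal sketch with hypothetical p-values, since the uncorrected values are not reported:</p>

```python
def bonferroni(p_values):
    """Bonferroni adjustment: multiply each p-value by the number of
    comparisons, capped at 1.0."""
    k = len(p_values)
    return [min(1.0, p * k) for p in p_values]

# Three pairwise comparisons of time windows (0-5 vs 5-10 s,
# 5-10 vs 10-15 s, 0-5 vs 10-15 s); these p-values are illustrative
# placeholders, not the study's actual values.
adjusted = bonferroni([0.0004, 0.0150, 0.0002])
```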

<p>The results showed that the effect of music condition on fixation
durations did not change over the course of stimulus presentation. Music
condition (fast music, slow music, no music) had no significant effect
on fixation durations in any of the three selected time windows. However,
fixation durations gradually lengthened over the course of stimulus
presentation under all experimental conditions. The statistically
significant two-way interaction between the type of scene and the time
interval shows that the significant effect of the type of scene emerged
at the later stages of the stimulus presentation.</p>


<table-wrap id="t02" position="float">
					<label>Table 2.</label>
					<caption>
						<p>Mean scores for the fixation durations in milliseconds for
particular experimental conditions measured in the three time windows
over the course of visual stimulus presentation: 0–5 s, 5–10 s, and
10–15 s.</p>
					</caption>
					<table frame="hsides" rules="groups" cellpadding="3">
    <thead>
      <tr>
        <th> </th>
        <th colspan="2">Urban scenes</th>
        <th colspan="2">Natural scenes</th>

      </tr>
    </thead>
    <tbody>
      <tr>
        <td></td>
        <td>Mean</td>
        <td>SD</td>
        <td>Mean</td>
        <td>SD</td>
      </tr>
      <tr>
         <td></td>     
        <td colspan="4">time window 0–5 s</td>

      </tr>
      <tr>
        <td>No music</td>
        <td>344</td>
        <td>117.38</td>
        <td>334</td>
        <td>66.76</td>
      </tr>
      <tr>
        <td>Slow music</td>
        <td>339</td>
        <td>56.92</td>
        <td>357</td>
        <td>93.39</td>
      </tr>
      <tr>
        <td>Fast music</td>
        <td>358</td>
        <td>90.71</td>
        <td>393</td>
        <td>211.21</td>
      </tr>
      <tr>
        <td>Total</td>
        <td>347</td>
        <td>91.74</td>
        <td>361</td>
        <td>138.36</td>
      </tr>
      <tr>
        <td></td>     
        <td colspan="4">time window 5–10 s</td>

      </tr>
      <tr>
        <td>No music</td>
        <td>360</td>
        <td>72.75</td>
        <td>450</td>
        <td>229.74</td>
      </tr>
      <tr>
        <td>Slow music</td>
        <td>389</td>
        <td>82.56</td>
        <td>497</td>
        <td>246.79</td>
      </tr>
      <tr>
        <td>Fast music</td>
        <td>399</td>
        <td>84.37</td>
        <td>491</td>
        <td>204.05</td>
      </tr>
      <tr>
        <td>Total</td>
        <td>382</td>
        <td>80.85</td>
        <td>479</td>
        <td>226.17</td>
      </tr>
      <tr>
        <td></td>      
        <td colspan="4">time window 10–15 s</td>

      </tr>
      <tr>
        <td>No music</td>
        <td>376</td>
        <td>103.61</td>
        <td>481</td>
        <td>252.08</td>
      </tr>
      <tr>
        <td>Slow music</td>
        <td>420</td>
        <td>142.81</td>
        <td>538</td>
        <td>364.52</td>
      </tr>
      <tr>
        <td>Fast music</td>
        <td>432</td>
        <td>142.27</td>
        <td>521</td>
        <td>207.68</td>
      </tr>
      <tr>
        <td>Total</td>
        <td>409</td>
        <td>131.25</td>
        <td>513</td>
        <td>280.12</td>
      </tr>
    </tbody>
  </table>
</table-wrap>

    </sec>
    </sec>    

     <sec id="S4">
      <title>Discussion</title> 

<p>This study analyzed eye movements during the viewing of urban or
natural scenes while listening to two different types of music – fast
music that motivates bodily movements or slow, non-motivational music –
or in silence. Significantly shorter fixations were found for viewing
urban scenes compared with natural scenes, but we did not find a
significant interaction between the type of scene and music condition.
The results revealed fixation durations that were shorter, in the range
of 30 ms, in the no-music condition compared to both music conditions,
but these differences were not significant. Moreover, we did not find
differences in eye movements between the fast-tempo and slow-tempo
music conditions.</p>

<p>Previous studies [<xref ref-type="bibr" rid="b16 b17">16, 17</xref>] observed significantly longer
fixations while listening to music compared to a no-music condition,
which suggests that listening to music while viewing visual images
requires attentional resources; we did not fully replicate these
results. Although we did not find significant differences between the
music and no-music conditions, we observed a similar trend in both
fixation durations and the number of fixations, for both the entire
viewing time and the temporal evolution, that is consistent with the
previous studies [<xref ref-type="bibr" rid="b16 b17">16, 17</xref>].</p>

<p>It should be mentioned that the 60 Hz sampling rate of our apparatus
may be one limitation of this study. While many significant effects on
fixation durations are in the range of 30 ms [e.g., 16-19], we did not
find significant differences between the no-music and music conditions
in this range. One possible explanation is that the lower sampling rate
of our device might cause higher variability in the data. However,
other explanations are also possible.</p>

<p>Schäfer and Fachner [<xref ref-type="bibr" rid="b17">17</xref>] used two types of music: self-selected music
with the predicted effect of absorption and music from the
experimenters, for which absorption was not predicted. However,
listening to background music does not necessarily mean that listeners
are fully engaged in music listening while performing other activities.
Some studies conducted in service environments (restaurants, shops)
showed that people are often not aware of the presence of background
music [e.g., <xref ref-type="bibr" rid="b49">49</xref>], particularly if they like the type of music being
played. Clearly, listening to background music while viewing scenes does
not necessarily recruit a considerable amount of attentional resources
to have a conspicuous effect on eye movements. An interest and
engagement in the music being played may be an important factor. In
Schäfer and Fachner’s [<xref ref-type="bibr" rid="b17">17</xref>] study, the participants were asked in advance
to bring their favorite music, which may have drawn their attention to
the music during the experiment, because they may have suspected that
the experiment had something to do with music perception. Similarly, in
Maróti et al.’s [<xref ref-type="bibr" rid="b16">16</xref>] study, participants had
to perform a tempo discrimination task, which may also have drawn their
attention to the drum sequences while viewing outdoor scenes. In contrast, the
experimental procedure used in our study did not necessarily imply that
music would be an important part of the experiment.</p>

<p>It should be noted that individual variables might also have an
effect on the influence of music on eye-tracking behavior. Although
people have individual musical preferences depending on diverse factors
[e.g., personality, see <xref ref-type="bibr" rid="b50">50</xref>], we do not expect a confounding effect of
musical preference because the musical excerpts used in the experiment
were examples of easy-listening popular music. However, individual
differences in everyday use of music might potentially have some
influence. Chamorro‐Premuzic and Furnham [<xref ref-type="bibr" rid="b51">51</xref>] noted that there are three
different major uses of music. For some people, music serves mainly for
emotional regulation and mood manipulation. Other individuals are
characterized by a cognitive approach, which means rational or
intellectual processing of music. Finally, background use of music is
typical for people who use music as a background for social events,
work, or interpersonal interaction. There are also links between these
diverse types of music uses and certain personality traits, as found in
Chamorro‐Premuzic and Furnham [<xref ref-type="bibr" rid="b51">51</xref>]. Clearly, further research should
also control for the potential effect of these differences.</p>

<p>Another considerable factor is participants’ musical experience,
which may affect their interest in musical or acoustic stimuli while
viewing scenes. There is evidence that musicians perceive auditory
differences more finely than non-musicians and that they are slightly
better at sustained auditory attention than non-musicians [e.g., <xref ref-type="bibr" rid="b52">52</xref>].
Musicians are also better than non-musicians at pre-attentively
extracting information out of musically relevant stimuli [<xref ref-type="bibr" rid="b53">53</xref>]. Our
participants had no formal musical training, and therefore it is
unlikely that musical expertise played a confounding role in our study.
However, it is worth noting that Schäfer and Fachner’s [<xref ref-type="bibr" rid="b17">17</xref>] study did
not describe their participants’ musical expertise. In Maróti et al.’s
[<xref ref-type="bibr" rid="b16">16</xref>] study, some participants were musicians, but the authors did not
find any effect of music training. On the other hand, the participants
in that experiment listened to simple musical structures, namely, drum
sequences. Future research should consider musical expertise as a
factor that may play some role.</p>

<p>Although it is usually stated that the average fixation duration for
scene perception is between 260 and 330 milliseconds [e.g., <xref ref-type="bibr" rid="b54">54</xref>], the
fixation durations found in our experiment were longer. However, it is
known that fixation durations for scene perception vary as a function of
the task and the characteristics of the scene. For instance, fixation
duration is longer for full color photographs than for black-and-white
line drawings [<xref ref-type="bibr" rid="b55">55</xref>]. They may also be affected by scene luminance [<xref ref-type="bibr" rid="b56">56</xref>]
and contrast [<xref ref-type="bibr" rid="b57">57</xref>]. For instance, in Schäfer and Fachner’s [<xref ref-type="bibr" rid="b17">17</xref>] study,
average fixation duration values in the no-music condition ranged from
340 to 391 ms.</p>

<p>It is also worth noting that the duration of visual stimulus
presentation might affect the findings, because there might be
differences in eye-movement activity between early and late phases
within the trial. While in Maróti et al.’s [<xref ref-type="bibr" rid="b16">16</xref>] study the participants
watched the image for 6 s, in Schäfer and Fachner’s [<xref ref-type="bibr" rid="b17">17</xref>] study it was
for 45 s, and in our study it was 15 s. However, our analysis of
temporal evolution of eye-movement measures within the trial did not
reveal significant changes in relation to music condition, though there
was an effect of increasing fixation duration over the time the image
remained on screen. This suggests that the duration of stimulus
presentation may also affect the results.</p>

<p>One further question was whether the effect of background music on
eye-movement behavior during scene viewing, in comparison to a silent
control, may be modulated by the type of scene (urban vs. natural). In
accord with previous studies [<xref ref-type="bibr" rid="b24 b25 b26">24, 25, 26</xref>], the results showed significantly
shorter fixations for viewing urban scenes compared with natural scenes,
which is explained in terms of a higher perceptual fluency of natural
scenes with respect to ordinary urban scenes. However, our analysis did
not reveal any interaction between music and type of scene. This
suggests that music processing did not interfere with scene processing
in our task.</p>

<p>The final question was whether music tempo would modulate the speed
of eye movements. In our study, we did not find differences between the
effects of fast and slow music on eye movements. This finding is in
contrast with Maróti et al. [<xref ref-type="bibr" rid="b16">16</xref>], who reported that the beat frequency
of the drum grooves modulated the rate of eye movements, specifically
fixation durations, which increased at a lower beat frequency rather
than at a higher beat frequency. As we already mentioned, a limitation
of our study is that the sampling rate of the eye tracker used did not
enable us to measure small differences of approximately 10 ms precisely,
as would be expected for beat entrainment. Moreover, the effect of
beat frequency on eye movements may also be influenced by the type of
musical stimuli and the experimental
procedure. In Maróti et al.’s [<xref ref-type="bibr" rid="b16">16</xref>] study, the participants listened to
the isolated sound of drums; drum grooves might strengthen beat
perception. Our stimuli involved ordinary music without a stressed beat.
On the other hand, the harmonic structure of the pop songs used in our
experiment may reinforce the metric structure, and therefore a beat may
be more easily perceived.</p>

<p>However, an alternative explanation is also possible. While Maróti et
al. [<xref ref-type="bibr" rid="b16">16</xref>] used music at 102 and 144 bpm, we used tempi with a more
salient difference in our experiment to make the effect of musical
tempo on eye movements more distinctive. It is possible that at such a
very fast tempo of 187 bpm, participants spontaneously extracted the
beat at half the speed of the music, that is, 93 bpm. Moreover, there
is a short half-time section in the song "One Fine Day", which can
reinforce the 93 bpm feel. If so, the difference between slow music at
69 bpm and fast music at 93 bpm would not be so strong.
Interestingly, in our previous experiment [<xref ref-type="bibr" rid="b6">6</xref>], in which we explored
synchronization between music tempo and walking speed, we found that
only one participant synchronized her steps with the beat of the music
while listening to a musical piece at a tempo of 187 bpm. Curiously,
she synchronized by running rather than walking. However, in this
experiment we did not control for bodily synchronization with music or
for the beat extraction process. Clearly, further research should
specify the
circumstances in which the musical beat may affect eye-movement
velocity.</p>

<p>To conclude, the effect of music on eye movements while freely
observing outdoor scenes is still not entirely clear. We suggest that
the type of stimuli, the specific experimental procedure, and the
interest and engagement of participants in listening to background music
while processing visual information are important factors that influence
attentional processes and the attentional shift from visual to acoustic
input, which is manifested in eye-movement behavior.</p>

     <sec id="S4a" sec-type="COI-statement">
      <title>Ethics and Conflict of Interest</title> 

<p>The authors declare that the contents of the article are in agreement
with the ethics described in
<ext-link ext-link-type="uri" xlink:href="http://biblio.unibe.ch/portale/elibrary/BOP/jemr/ethics.html" xlink:show="new">http://biblio.unibe.ch/portale/elibrary/BOP/jemr/ethics.html</ext-link>
and that there is no conflict of interest regarding the publication of
this paper.</p>
        </sec>

     <sec id="S4b">
      <title>Acknowledgements</title> 

<p>This research was supported by the Student Specific Research Grant
1/2017 from the Faculty of Informatics and Management at the University
of Hradec Králové.</p>

    </sec>
    </sec>    
  </body>

<back>
<ref-list>
<ref id="b9"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Awh</surname>, <given-names>E.</given-names></name>, <name><surname>Armstrong</surname>, <given-names>K. M.</given-names></name>, &#x26; <name><surname>Moore</surname>, <given-names>T.</given-names></name></person-group> (<year>2006</year>, <month>March</month>). <article-title>Visual and oculomotor selection: Links, causes and implications for spatial attention.</article-title> <source>Trends in Cognitive Sciences</source>, <volume>10</volume>(<issue>3</issue>), <fpage>124</fpage>-<lpage>130</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.1016/j.tics.2006.01.001</pub-id><pub-id pub-id-type="pmid">16469523</pub-id><issn>1364-6613</issn></mixed-citation></ref>
<ref id="b24"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Berto</surname>, <given-names>R.</given-names></name>, <name><surname>Massaccesi</surname>, <given-names>S.</given-names></name>, &#x26; <name><surname>Pasini</surname>, <given-names>M.</given-names></name></person-group> (<year>2008</year>, <month>June</month>). <article-title>Do eye movements measured across high and low fascination photographs differ? Addressing Kaplan's fascination hypothesis.</article-title> <source>Journal of Environmental Psychology</source>, <volume>28</volume>(<issue>2</issue>), <fpage>185</fpage>-<lpage>191</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.1016/j.jenvp.2007.11.004</pub-id><issn>0272-4944</issn></mixed-citation></ref>
<ref id="b33"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Buhmann</surname>, <given-names>J.</given-names></name>, <name><surname>Desmet</surname>, <given-names>F.</given-names></name>, <name><surname>Moens</surname>, <given-names>B.</given-names></name>, <name><surname>Van Dyck</surname>, <given-names>E.</given-names></name>, &#x26; <name><surname>Leman</surname>, <given-names>M.</given-names></name></person-group> (<year>2016</year>). <article-title>Spontaneous velocity effect of musical expression on self-paced walking.</article-title> <source>PLoS One</source>, <volume>11</volume>(<issue>5</issue>), <fpage>e0154414</fpage>. <pub-id specific-use="author" pub-id-type="doi">10.1371/journal.pone.0154414</pub-id><pub-id pub-id-type="pmid">27167064</pub-id><issn>1932-6203</issn></mixed-citation></ref>
<ref id="b52"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Carey</surname>, <given-names>D.</given-names></name>, <name><surname>Rosen</surname>, <given-names>S.</given-names></name>, <name><surname>Krishnan</surname>, <given-names>S.</given-names></name>, <name><surname>Pearce</surname>, <given-names>M. T.</given-names></name>, <name><surname>Shepherd</surname>, <given-names>A.</given-names></name>, <name><surname>Aydelott</surname>, <given-names>J.</given-names></name>, &#x26; <name><surname>Dick</surname>, <given-names>F.</given-names></name></person-group> (<year>2015</year>, <month>April</month>). <article-title>Generality and specificity in the effects of musical expertise on perception and cognition.</article-title> <source>Cognition</source>, <volume>137</volume>, <fpage>81</fpage>-<lpage>105</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.1016/j.cognition.2014.12.005</pub-id><pub-id pub-id-type="pmid">25618010</pub-id><issn>0010-0277</issn></mixed-citation></ref>
<ref id="b21"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Cauchard</surname>, <given-names>F.</given-names></name>, <name><surname>Cane</surname>, <given-names>J. E.</given-names></name>, &#x26; <name><surname>Weger</surname>, <given-names>U. W.</given-names></name></person-group> (<year>2012</year>, <month>May-June</month>). <article-title>Influence of background speech and music in interrupted reading: An eye-tracking study.</article-title> <source>Applied Cognitive Psychology</source>, <volume>26</volume>(<issue>3</issue>), <fpage>381</fpage>-<lpage>390</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.1002/acp.1837</pub-id><issn>0888-4080</issn></mixed-citation></ref>
<ref id="b51"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Chamorro-Premuzic</surname>, <given-names>T.</given-names></name>, &#x26; <name><surname>Furnham</surname>, <given-names>A.</given-names></name></person-group> (<year>2007</year>, <month>May</month>). <article-title>Personality and music: Can traits explain how people use music in everyday life?</article-title> <source>British Journal of Psychology</source>, <volume>98</volume>(<issue>Pt 2</issue>), <fpage>175</fpage>-<lpage>185</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.1348/000712606X111177</pub-id><pub-id pub-id-type="pmid">17456267</pub-id><issn>0007-1269</issn></mixed-citation></ref>
<ref id="b18"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Coutrot</surname>, <given-names>A.</given-names></name>, <name><surname>Guyader</surname>, <given-names>N.</given-names></name>, <name><surname>Ionescu</surname>, <given-names>G.</given-names></name>, &#x26; <name><surname>Caplier</surname>, <given-names>A.</given-names></name></person-group> (<year>2012</year>). <article-title>Influence of soundtrack on eye movements during video exploration.</article-title> <source>Journal of Eye Movement Research</source>, <volume>5</volume>(<issue>2</issue>), <fpage>1</fpage>-<lpage>10</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.16910/jemr.5.4.2</pub-id><issn>1995-8692</issn></mixed-citation></ref>
<ref id="b10"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Deubel</surname>, <given-names>H.</given-names></name>, &#x26; <name><surname>Schneider</surname>, <given-names>W. X.</given-names></name></person-group> (<year>1996</year>, <month>June</month>). <article-title>Saccade target selection and object recognition: Evidence for a common attentional mechanism.</article-title> <source>Vision Research</source>, <volume>36</volume>(<issue>12</issue>), <fpage>1827</fpage>-<lpage>1837</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.1016/0042-6989(95)00294-4</pub-id><pub-id pub-id-type="pmid">8759451</pub-id><issn>0042-6989</issn></mixed-citation></ref>
<ref id="b12"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Drai-Zerbib</surname>, <given-names>V.</given-names></name>, <name><surname>Baccino</surname>, <given-names>T.</given-names></name>, &#x26; <name><surname>Bigand</surname>, <given-names>E.</given-names></name></person-group> (<year>2012</year>, <month>March</month>). <article-title>Sight-reading expertise: Cross-modality integration investigated using eye tracking.</article-title> <source>Psychology of Music</source>, <volume>40</volume>(<issue>2</issue>), <fpage>216</fpage>-<lpage>235</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.1177/0305735610394710</pub-id><issn>0305-7356</issn></mixed-citation></ref>
<ref id="b23"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Dupont</surname>, <given-names>L.</given-names></name>, <name><surname>Antrop</surname>, <given-names>M.</given-names></name>, &#x26; <name><surname>Van Eetvelde</surname>, <given-names>V.</given-names></name></person-group> (<year>2013</year>, <month>August</month>). <article-title>Eye-tracking analysis in landscape perception research: Influence of photograph properties and landscape characteristics.</article-title> <source>Landscape Research</source>, <volume>39</volume>(<issue>4</issue>), <fpage>417</fpage>-<lpage>432</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.1080/01426397.2013.773966</pub-id><issn>0142-6397</issn></mixed-citation></ref>
<ref id="b3"><mixed-citation specific-use="unparsed" publication-type="book-chapter"><person-group person-group-type="author"><name><surname>Fachner</surname>, <given-names>J.</given-names></name></person-group> Time is the key - music and ASC. In: Cardenas E, Winkelmann M, Tart C, Krippner S, editors. Altering consciousness: A multidisciplinary perspective. Vol. 1: History, culture and the humanities. Santa Barbara, CA: Praeger; <year>2011</year>. p. 355-376.</mixed-citation></ref>
<ref id="b7"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Franek</surname>, <given-names>M.</given-names></name></person-group> (<year>2013</year>, <month>June</month>). <article-title>Environmental factors influencing pedestrian walking speed.</article-title> <source>Perceptual and Motor Skills</source>, <volume>116</volume>(<issue>3</issue>), <fpage>992</fpage>-<lpage>1019</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.2466/06.50.PMS.116.3.992-1019</pub-id><pub-id pub-id-type="pmid">24175468</pub-id><issn>0031-5125</issn></mixed-citation></ref>
<ref id="b8"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Franek</surname>, <given-names>M.</given-names></name>, &#x26; <name><surname>Rezny</surname>, <given-names>L.</given-names></name></person-group> (<year>2014</year>). <article-title>Analyza faktoru ovlivnujicich kolisani rychlosti chuze v mestskem prostredi s prirodnimi prvky</article-title> <comment>[Analysis of factors affecting variations in walking speed in urban environment with natural elements]</comment>. <source>Ceskoslovenska Psychologie</source>, <volume>58</volume>(<issue>1</issue>), <fpage>14</fpage>-<lpage>30</lpage>.<issn>0009-062X</issn></mixed-citation></ref>
<ref id="b6"><mixed-citation specific-use="unparsed" publication-type="unknown"><person-group person-group-type="author"><name><surname>Franek</surname> <given-names>M</given-names></name>, <name><surname>van Noorden</surname> <given-names>L</given-names></name>, <name><surname>Rezny</surname> <given-names>L</given-names></name></person-group>. <article-title>Tempo and walking speed with music in the urban context.</article-title> Front Psychol. <year>2014</year>;5:1361. <pub-id specific-use="author" pub-id-type="doi">10.3389/fpsyg.2014.01361</pub-id></mixed-citation></ref>
<ref id="b26"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Franek</surname>, <given-names>M.</given-names></name>, <name><surname>Sefara</surname>, <given-names>D.</given-names></name>, <name><surname>Petruzalek</surname>, <given-names>J.</given-names></name>, <name><surname>Cabal</surname>, <given-names>J.</given-names></name>, &#x26; <name><surname>Myska</surname>, <given-names>K.</given-names></name></person-group> (<year>2018</year>, <month>June</month>). <article-title>Differences in eye movements while viewing images with various levels of restorativeness</article-title>. <source>Journal of Environmental Psychology</source>, <volume>57</volume>, <fpage>10</fpage>-<lpage>16</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.1016/j.jenvp.2018.05.001</pub-id><issn>0272-4944</issn></mixed-citation></ref>
<ref id="b13"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Goolsby</surname>, <given-names>T. W.</given-names></name></person-group> (<year>1994</year>). <article-title>Profiles of processing: Eye movements during sightreading</article-title> <source>Music Perception</source>, <volume>12</volume>(<issue>1</issue>), <fpage>97</fpage>-<lpage>123</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.2307/40285757</pub-id></mixed-citation></ref>
<ref id="b55"><mixed-citation specific-use="restruct" publication-type="book-chapter"><person-group person-group-type="author"><name><surname>Henderson</surname>, <given-names>J. M.</given-names></name>, &#x26; <name><surname>Hollingworth</surname>, <given-names>A.</given-names></name></person-group> (<year>1998</year>). <chapter-title>Eye movements during scene viewing: An overview</chapter-title>. In <person-group person-group-type="editor"><name><given-names>G.</given-names> <surname>Underwood</surname></name> (<role>Ed.</role>),</person-group> <source>Eye guidance in reading and scene perception</source> (pp. <fpage>269</fpage>-<lpage>293</lpage>). <publisher-name>Elsevier Science Ltd.</publisher-name> <pub-id pub-id-type="doi">10.1016/B978-008043361-5/50013-4</pub-id></mixed-citation></ref>
<ref id="b4"><mixed-citation specific-use="restruct" publication-type="book"><person-group person-group-type="author"><name><surname>Herbert</surname>, <given-names>R.</given-names></name></person-group> (<year>2011</year>). <source>Music listening: Absorption, dissociation and trancing</source>. <publisher-name>Ashgate</publisher-name>.</mixed-citation></ref>
<ref id="b5"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Herbert</surname>, <given-names>R.</given-names></name></person-group> (<year>2012</year>). <article-title>Musical and non-musical involvement in daily life: The case of absorption</article-title> <source>Musicae Scientiae</source>, <volume>16</volume>(<issue>1</issue>), <fpage>41</fpage>-<lpage>66</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.1177/1029864911423161</pub-id></mixed-citation></ref>
<ref id="b2"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Heye</surname>, <given-names>A.</given-names></name>, &#x26; <name><surname>Lamont</surname>, <given-names>A.</given-names></name></person-group> (<year>2010</year>). <article-title>Mobile listening situations in everyday life: The use of MP3 players while travelling.</article-title> <source>Musicae Scientiae</source>, <volume>14</volume>(<issue>1</issue>), <fpage>95</fpage>-<lpage>120</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.1177/102986491001400104</pub-id></mixed-citation></ref>
<ref id="b22"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Johansson</surname>, <given-names>R.</given-names></name>, <name><surname>Holmqvist</surname>, <given-names>K.</given-names></name>, <name><surname>Mossberg</surname>, <given-names>F.</given-names></name>, &#x26; <name><surname>Lindgren</surname>, <given-names>M.</given-names></name></person-group> (<year>2012</year>, <month>May</month>). <article-title>Eye movements and reading comprehension while listening to preferred and non-preferred study music.</article-title> <source>Psychology of Music</source>, <volume>40</volume>(<issue>3</issue>), <fpage>339</fpage>-<lpage>356</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.1177/0305735610387777</pub-id><issn>0305-7356</issn></mixed-citation></ref>
<ref id="b28"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Joye</surname>, <given-names>Y.</given-names></name>, &#x26; <name><surname>van den Berg</surname>, <given-names>A.</given-names></name></person-group> (<year>2011</year>). <article-title>Is love for green in our genes? A critical analysis of evolutionary assumptions in restorative environments research.</article-title> <source>Urban Forestry &#x26; Urban Greening</source>, <volume>10</volume>(<issue>4</issue>), <fpage>261</fpage>-<lpage>268</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.1016/j.ufug.2011.07.004</pub-id><issn>1618-8667</issn></mixed-citation></ref>
<ref id="b30"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Joye</surname>, <given-names>Y.</given-names></name>, <name><surname>Pals</surname>, <given-names>R.</given-names></name>, <name><surname>Steg</surname>, <given-names>L.</given-names></name>, &#x26; <name><surname>Evans</surname>, <given-names>B. L.</given-names></name></person-group> (<year>2013</year>). <article-title>New methods for assessing the fascinating nature of nature experiences.</article-title> <source>PLoS One</source>, <volume>8</volume>(<issue>7</issue>), <fpage>e65332</fpage>. <pub-id specific-use="author" pub-id-type="doi">10.1371/journal.pone.0065332</pub-id><pub-id pub-id-type="pmid">23922645</pub-id><issn>1932-6203</issn></mixed-citation></ref>
<ref id="b29"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Joye</surname>, <given-names>Y.</given-names></name>, <name><surname>Steg</surname>, <given-names>L.</given-names></name>, <name><surname>Unal</surname>, <given-names>A. B.</given-names></name>, &#x26; <name><surname>Pals</surname>, <given-names>R.</given-names></name></person-group> (<year>2016</year>). <article-title>When complex is easy on the mind: Internal repetition of visual information in complex objects is a source of perceptual fluency.</article-title> <source>J Exp Psychol Human.</source>, <volume>42</volume>(<issue>1</issue>), <fpage>103</fpage>-<lpage>114</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.1037/xhp0000105</pub-id><pub-id pub-id-type="pmid">26322692</pub-id><issn>1939-1277</issn></mixed-citation></ref>
<ref id="b45"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Karageorghis</surname>, <given-names>C. I.</given-names></name>, <name><surname>Terry</surname>, <given-names>P. C.</given-names></name>, &#x26; <name><surname>Lane</surname>, <given-names>A. M.</given-names></name></person-group> (<year>1999</year>, <month>September</month>). <article-title>Development and initial validation of an instrument to assess the motivational qualities of music in exercise and sport: The Brunel Music Rating Inventory.</article-title> <source>Journal of Sports Sciences</source>, <volume>17</volume>(<issue>9</issue>), <fpage>713</fpage>-<lpage>724</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.1080/026404199365579</pub-id><pub-id pub-id-type="pmid">10521002</pub-id><issn>0264-0414</issn></mixed-citation></ref>
<ref id="b48"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Karageorghis</surname>, <given-names>C. I.</given-names></name>, <name><surname>Priest</surname>, <given-names>D. L.</given-names></name>, <name><surname>Terry</surname>, <given-names>P. C.</given-names></name>, <name><surname>Chatzisarantis</surname>, <given-names>N. L.</given-names></name>, &#x26; <name><surname>Lane</surname>, <given-names>A. M.</given-names></name></person-group> (<year>2006</year>, <month>August</month>). <article-title>Redesign and initial validation of an instrument to assess the motivational qualities of music in exercise: The Brunel Music Rating Inventory-2.</article-title> <source>Journal of Sports Sciences</source>, <volume>24</volume>(<issue>8</issue>), <fpage>899</fpage>-<lpage>909</lpage>.  <pub-id pub-id-type="doi">10.1080/02640410500298107</pub-id><pub-id pub-id-type="pmid">16815785</pub-id><issn>0264-0414</issn></mixed-citation></ref>
<ref id="b41"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Karageorghis</surname>, <given-names>C. I.</given-names></name>, &#x26; <name><surname>Priest</surname>, <given-names>D. L.</given-names></name></person-group> (<year>2012</year>). <article-title>Music in the exercise domain: A review and synthesis (Part I).</article-title> <source>International Review of Sport and Exercise Psychology</source>, <volume>5</volume>(<issue>1</issue>), <fpage>44</fpage>-<lpage>66</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.1080/1750984X.2011.631026</pub-id><pub-id pub-id-type="pmid">22577472</pub-id><issn>1750-984X</issn></mixed-citation></ref>
<ref id="b53"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Koelsch</surname>, <given-names>S.</given-names></name>, <name><surname>Schroger</surname>, <given-names>E.</given-names></name>, &#x26; <name><surname>Tervaniemi</surname>, <given-names>M.</given-names></name></person-group> (<year>1999</year>, <month>April</month>). <article-title>Superior pre-attentive auditory processing in musicians.</article-title> <source>Neuroreport</source>, <volume>10</volume>(<issue>6</issue>), <fpage>1309</fpage>-<lpage>1313</lpage>. <pub-id pub-id-type="doi">10.1097/00001756-199904260-00029</pub-id><pub-id pub-id-type="pmid">10363945</pub-id><issn>0959-4965</issn></mixed-citation></ref>
<ref id="b11"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Kowler</surname>, <given-names>E.</given-names></name>, <name><surname>Anderson</surname>, <given-names>E.</given-names></name>, <name><surname>Dosher</surname>, <given-names>B.</given-names></name>, &#x26; <name><surname>Blaser</surname>, <given-names>E.</given-names></name></person-group> (<year>1995</year>, <month>July</month>). <article-title>The role of attention in the programming of saccades.</article-title> <source>Vision Research</source>, <volume>35</volume>(<issue>13</issue>), <fpage>1897</fpage>-<lpage>1916</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.1016/0042-6989(94)00279-U</pub-id><pub-id pub-id-type="pmid">7660596</pub-id><issn>0042-6989</issn></mixed-citation></ref>
<ref id="b43"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Lane</surname>, <given-names>A. M.</given-names></name>, <name><surname>Davis</surname>, <given-names>P. A.</given-names></name>, &#x26; <name><surname>Devonport</surname>, <given-names>T. J.</given-names></name></person-group> (<year>2011</year>, <month>June</month>). <article-title>Effects of music interventions on emotional States and running performance.</article-title> <source>Journal of Sports Science &#x26; Medicine</source>, <volume>10</volume>(<issue>2</issue>), <fpage>400</fpage>-<lpage>407</lpage>.<pub-id pub-id-type="pmid">24149889</pub-id><issn>1303-2968</issn></mixed-citation></ref>
<ref id="b20"><mixed-citation specific-use="parsed" publication-type="conference"><person-group person-group-type="author"><name><surname>Lange</surname>, <given-names>E. B.</given-names></name>, <name><surname>Pieczykolan</surname>, <given-names>A.</given-names></name>, <name><surname>Trukenbrod</surname>, <given-names>H.</given-names></name>, &#x26; <name><surname>Huestegge</surname>, <given-names>L.</given-names></name></person-group> <article-title>The rhythm of cognition - Effects of an external auditory pacemaker on oculomotor control in exemplary cognitive tasks (reading and visual search).</article-title> Paper presented at the <source>Conference on Music &#x26; Eye Tracking</source>, <conf-loc>Frankfurt, Germany</conf-loc>, <conf-date>August 17th-18th, 2017</conf-date>.</mixed-citation></ref>
<ref id="b46"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Large</surname>, <given-names>E. W.</given-names></name>, &#x26; <name><surname>Jones</surname>, <given-names>M. R.</given-names></name></person-group> (<year>1999</year>, <month>January</month>). <article-title>The dynamics of attending: How people track time-varying events.</article-title> <source>Psychological Review</source>, <volume>106</volume>(<issue>1</issue>), <fpage>119</fpage>-<lpage>159</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.1037/0033-295X.106.1.119</pub-id><issn>0033-295X</issn></mixed-citation></ref>
<ref id="b42"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Laukka</surname>, <given-names>P.</given-names></name>, &#x26; <name><surname>Quick</surname>, <given-names>L.</given-names></name></person-group> (<year>2013</year>, <month>March</month>). <article-title>Emotional and motivational uses of music in sports and exercise: A questionnaire study among athletes.</article-title> <source>Psychology of Music</source>, <volume>41</volume>(<issue>2</issue>), <fpage>198</fpage>-<lpage>215</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.1177/0305735611422507</pub-id><issn>0305-7356</issn></mixed-citation></ref>
<ref id="b56"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Loftus</surname>, <given-names>G. R.</given-names></name></person-group> (<year>1985</year>). <article-title>Picture perception: Effects of luminance on available information and information-extraction rate.</article-title> <source>Journal of Experimental Psychology. General</source>, <volume>114</volume>(<issue>3</issue>), <fpage>342</fpage>-<lpage>356</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.1037/0096-3445.114.3.342</pub-id><pub-id pub-id-type="pmid">3161980</pub-id><issn>0096-3445</issn></mixed-citation></ref>
<ref id="b57"><mixed-citation specific-use="restruct" publication-type="book-chapter"><person-group person-group-type="author"><name><surname>Loftus</surname>, <given-names>G. R.</given-names></name>, <name><surname>Kaufman</surname>, <given-names>L.</given-names></name>, <name><surname>Nishimoto</surname>, <given-names>T.</given-names></name>, &#x26; <name><surname>Ruthruff</surname>, <given-names>E.</given-names></name></person-group> (<year>1992</year>). <chapter-title>Effects of visual degradation on eye-fixation duration, perceptual processing, and long-term visual memory</chapter-title>. In <person-group person-group-type="editor"><name><given-names>K.</given-names> <surname>Rayner</surname></name> (<role>Ed.</role>),</person-group> <source>Eye movements and visual cognition. Springer series in neuropsychology</source> (pp. <fpage>203</fpage>-<lpage>226</lpage>). <publisher-name>Springer</publisher-name>. <pub-id pub-id-type="doi">10.1007/978-1-4612-2852-3_12</pub-id></mixed-citation></ref>
<ref id="b14"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Madell</surname>, <given-names>J.</given-names></name>, &#x26; <name><surname>Heebert</surname>, <given-names>S.</given-names></name></person-group> (<year>2008</year>, <month>December</month>). <article-title>Eye movements and music reading: Where do we look next?</article-title> <source>Music Perception</source>, <volume>26</volume>(<issue>2</issue>), <fpage>157</fpage>-<lpage>170</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.1525/mp.2008.26.2.157</pub-id><issn>0730-7829</issn></mixed-citation></ref>
<ref id="b16"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Maroti</surname>, <given-names>E.</given-names></name>, <name><surname>Knakker</surname>, <given-names>B.</given-names></name>, <name><surname>Vidnyanszky</surname>, <given-names>Z.</given-names></name>, &#x26; <name><surname>Weiss</surname>, <given-names>B.</given-names></name></person-group> (<year>2017</year>, <month>February</month>). <article-title>The effect of beat frequency on eye movements during free viewing.</article-title> <source>Vision Research</source>, <volume>131</volume>, <fpage>57</fpage>-<lpage>66</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.1016/j.visres.2016.12.009</pub-id><pub-id pub-id-type="pmid">28057578</pub-id><issn>0042-6989</issn></mixed-citation></ref>
<ref id="b39"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Milliman</surname>, <given-names>R. E.</given-names></name></person-group> (<year>1982</year>). <article-title>Using background music to affect the behavior of supermarket shoppers.</article-title> <source>Journal of Marketing</source>, <volume>46</volume>(<issue>3</issue>), <fpage>86</fpage>-<lpage>91</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.2307/1251706</pub-id> <pub-id pub-id-type="doi">10.1177/002224298204600313</pub-id><issn>0022-2429</issn></mixed-citation></ref>
<ref id="b44"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Nakamura</surname>, <given-names>P. M.</given-names></name>, <name><surname>Pereira</surname>, <given-names>G.</given-names></name>, <name><surname>Papini</surname>, <given-names>C. B.</given-names></name>, <name><surname>Nakamura</surname>, <given-names>F. Y.</given-names></name>, &#x26; <name><surname>Kokubun</surname>, <given-names>E.</given-names></name></person-group> (<year>2010</year>, <month>February</month>). <article-title>Effects of preferred and nonpreferred music on continuous cycling exercise performance.</article-title> <source>Perceptual and Motor Skills</source>, <volume>110</volume>(<issue>1</issue>), <fpage>257</fpage>-<lpage>264</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.2466/pms.110.1.257-264</pub-id><pub-id pub-id-type="pmid">20391890</pub-id><issn>0031-5125</issn></mixed-citation></ref>
<ref id="b31"><mixed-citation specific-use="restruct" publication-type="book-chapter"><person-group person-group-type="author"><name><surname>Nguyen</surname>, <given-names>T.</given-names></name>, <name><surname>Gibbings</surname>, <given-names>A.</given-names></name>, &#x26; <name><surname>Grahn</surname>, <given-names>J.</given-names></name></person-group> (<year>2018</year>). <chapter-title>Rhythm and beat perception</chapter-title>. In <person-group person-group-type="editor"><name><given-names>R.</given-names> <surname>Bader</surname></name> (<role>Ed.</role>),</person-group> <source>Springer handbook of systematic musicology</source> (pp. <fpage>507</fpage>-<lpage>521</lpage>). <publisher-name>Springer</publisher-name>., <pub-id specific-use="author" pub-id-type="doi">10.1007/978-3-662-55004-5_27</pub-id></mixed-citation></ref>
<ref id="b49"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>North</surname>, <given-names>A. C.</given-names></name>, &#x26; <name><surname>Hargreaves</surname>, <given-names>D. J.</given-names></name></person-group> (<year>1996</year>, <month>March</month>). <article-title>Responses to music in a dining area.</article-title> <source>Journal of Applied Social Psychology</source>, <volume>26</volume>(<issue>6</issue>), <fpage>491</fpage>-<lpage>501</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.1111/j.1559-1816.1996.tb02727.x</pub-id><issn>0021-9029</issn></mixed-citation></ref>
<ref id="b1"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>North</surname>, <given-names>A. C.</given-names></name>, <name><surname>Hargreaves</surname>, <given-names>D. J.</given-names></name>, &#x26; <name><surname>Hargreaves</surname>, <given-names>J. J.</given-names></name></person-group> (<year>2004</year>). <article-title>Uses of music in everyday life.</article-title> <source>Music Perception</source>, <volume>22</volume>(<issue>1</issue>), <fpage>41</fpage>-<lpage>77</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.1525/mp.2004.22.1.41</pub-id></mixed-citation></ref>
<ref id="b32"><mixed-citation specific-use="restruct" publication-type="book-chapter"><person-group person-group-type="author"><name><surname>Novembre</surname>, <given-names>G.</given-names></name>, &#x26; <name><surname>Keller</surname>, <given-names>P. E.</given-names></name></person-group> (<year>2018</year>). <chapter-title>Music and action</chapter-title>. In <person-group person-group-type="editor"><name><given-names>R.</given-names> <surname>Bader</surname></name> (<role>Ed.</role>),</person-group> <source>Springer handbook of systematic musicology</source> (pp. <fpage>523</fpage>-<lpage>537</lpage>). <publisher-name>Springer</publisher-name>., <pub-id specific-use="author" pub-id-type="doi">10.1007/978-3-662-55004-5_28</pub-id></mixed-citation></ref>
<ref id="b27"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Pearson</surname>, <given-names>D. G.</given-names></name>, &#x26; <name><surname>Craig</surname>, <given-names>T.</given-names></name></person-group> (<year>2014</year>). <article-title>The great outdoors? Exploring the mental health benefits of natural environments.</article-title> <source>Frontiers in Psychology</source>, <volume>5</volume>, <fpage>1178</fpage>. <ext-link ext-link-type="uri" xlink:href="http://dx.doi.org/fpsyg.2014.01178">http://dx.doi.org/fpsyg.2014.01178</ext-link> <pub-id pub-id-type="doi">10.3389/fpsyg.2014.01178</pub-id><pub-id pub-id-type="pmid">25374550</pub-id><issn>1664-1078</issn></mixed-citation></ref>
<ref id="b47"><mixed-citation specific-use="parsed" publication-type="conference"><person-group person-group-type="author"><name><surname>Plochl</surname>, <given-names>M.</given-names></name>, &#x26; <name><surname>Obleser</surname>, <given-names>J.</given-names></name></person-group> <article-title>Do auditory rhythms influence eye movement statistics?</article-title> Paper presented at the <source>Conference on Music &#x26; Eye Tracking</source>, <conf-loc>Frankfurt, Germany</conf-loc>, <conf-date>August 17th-18th, 2017</conf-date>.</mixed-citation></ref>
<ref id="b40"><mixed-citation specific-use="unparsed" publication-type="conference"><person-group person-group-type="author"><name><surname>Petruzzellis</surname>, <given-names>L.</given-names></name>, <name><surname>Chebat</surname>, <given-names>J. C.</given-names></name>, &#x26; <name><surname>Palumbo</surname>, <given-names>A.</given-names></name></person-group> <article-title>Hey dee-jay let's play that song and keep me shopping all day long.</article-title> The effect of famous background music on consumer shopping behavior. In: <person-group person-group-type="editor"><name><surname>Kubacki</surname> <given-names>K</given-names></name><role>, editor</role></person-group>. Ideas in marketing: Finding the new and polishing the old. Developments in marketing science: <source>Proceedings of the Academy of Marketing Science</source>. <publisher-loc>Cham</publisher-loc>: <publisher-name>Springer</publisher-name>; <year>2015</year>. p. <fpage>756</fpage>-<lpage>765</lpage>.</mixed-citation></ref>
<ref id="b54"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Rayner</surname>, <given-names>K.</given-names></name></person-group> (<year>2009</year>). <article-title>Eye movements and attention in reading, scene perception, and visual search.</article-title> <source>Quarterly Journal of Experimental Psychology</source>, <volume>62</volume>(<issue>8</issue>), <fpage>1457</fpage>-<lpage>1506</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.1080/17470210902816461</pub-id><pub-id pub-id-type="pmid">19449261</pub-id><issn>1747-0218</issn></mixed-citation></ref>
<ref id="b50"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Rentfrow</surname>, <given-names>P. J.</given-names></name>, &#x26; <name><surname>Gosling</surname>, <given-names>S. D.</given-names></name></person-group> (<year>2003</year>, <month>June</month>). <article-title>The do re mi's of everyday life: The structure and personality correlates of music preferences.</article-title> <source>Journal of Personality and Social Psychology</source>, <volume>84</volume>(<issue>6</issue>), <fpage>1236</fpage>-<lpage>1256</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.1037/0022-3514.84.6.1236</pub-id><pub-id pub-id-type="pmid">12793587</pub-id><issn>0022-3514</issn></mixed-citation></ref>
<ref id="b17"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Schafer</surname>, <given-names>T.</given-names></name>, &#x26; <name><surname>Fachner</surname>, <given-names>J.</given-names></name></person-group> (<year>2015</year>, <month>February</month>). <article-title>Listening to music reduces eye movements.</article-title> <source>Atten Percept Psychol.</source>, <volume>77</volume>(<issue>2</issue>), <fpage>551</fpage>-<lpage>559</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.3758/s13414-014-0777-1</pub-id><pub-id pub-id-type="pmid">25280523</pub-id><issn>1943-393X</issn></mixed-citation></ref>
<ref id="b35"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Simpson</surname>, <given-names>S. D.</given-names></name>, &#x26; <name><surname>Karageorghis</surname>, <given-names>C. I.</given-names></name></person-group> (<year>2006</year>, <month>October</month>). <article-title>The effects of synchronous music on 400-m sprint performance.</article-title> <source>Journal of Sports Sciences</source>, <volume>24</volume>(<issue>10</issue>), <fpage>1095</fpage>-<lpage>1102</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.1080/02640410500432789</pub-id><pub-id pub-id-type="pmid">17115524</pub-id><issn>0264-0414</issn></mixed-citation></ref>
<ref id="b19"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Song</surname>, <given-names>G.</given-names></name>, <name><surname>Pellerin</surname>, <given-names>D.</given-names></name>, &#x26; <name><surname>Granjon</surname>, <given-names>L.</given-names></name></person-group> (<year>2013</year>). <article-title>Different types of sounds influence gaze differently in videos.</article-title> <source>Journal of Eye Movement Research</source>, <volume>6</volume>(<issue>4</issue>), <fpage>1</fpage>-<lpage>13</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.16910/jemr.6.4.1</pub-id><issn>1995-8692</issn></mixed-citation></ref>
<ref id="b36"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Terry</surname>, <given-names>P. C.</given-names></name>, <name><surname>Karageorghis</surname>, <given-names>C. I.</given-names></name>, <name><surname>Saha</surname>, <given-names>A. M.</given-names></name>, &#x26; <name><surname>D'Auria</surname>, <given-names>S.</given-names></name></person-group> (<year>2012</year>, <month>January</month>). <article-title>Effects of synchronous music on treadmill running among elite triathletes.</article-title> <source>Journal of Science and Medicine in Sport</source>, <volume>15</volume>(<issue>1</issue>), <fpage>52</fpage>-<lpage>57</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.1016/j.jsams.2011.06.003</pub-id><pub-id pub-id-type="pmid">21803652</pub-id><issn>1440-2440</issn></mixed-citation></ref>
<ref id="b34"><mixed-citation specific-use="linked" publication-type="unknown"><person-group person-group-type="author"><name><surname>Styns</surname> <given-names>F</given-names></name>, <name><surname>van Noorden</surname> <given-names>L</given-names></name>, <name><surname>Moelants</surname> <given-names>D</given-names></name>, <name><surname>Leman</surname> <given-names>M.</given-names></name></person-group> <article-title>Walking on music.</article-title> Hum Movemen Sci. (<year>2007</year>) Oct;26(5):769-785. http://dx.doi.org/<pub-id specific-use="author" pub-id-type="doi">10.1016/j.humov.2007.07.007</pub-id>   </mixed-citation></ref>
<ref id="b25"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Valtchanov</surname>, <given-names>D.</given-names></name>, &#x26; <name><surname>Ellard</surname>, <given-names>C. G.</given-names></name></person-group> (<year>2015</year>, <month>September</month>). <article-title>Cognitive and affective responses to natural scenes: Effects of low level visual properties on preference, cognitive load and eye-movements.</article-title> <source>Journal of Environmental Psychology</source>, <volume>43</volume>, <fpage>184</fpage>-<lpage>195</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.1016/j.jenvp.2015.07.001</pub-id><issn>0272-4944</issn></mixed-citation></ref>
<ref id="b37"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Van Dyck</surname>, <given-names>E.</given-names></name>, <name><surname>Moens</surname>, <given-names>B.</given-names></name>, <name><surname>Buhmann</surname>, <given-names>J.</given-names></name>, <name><surname>Demey</surname>, <given-names>M.</given-names></name>, <name><surname>Coorevits</surname>, <given-names>E.</given-names></name>, <name><surname>Dalla Bella</surname>, <given-names>S.</given-names></name>, &#x26; <name><surname>Leman</surname>, <given-names>M.</given-names></name></person-group> (<year>2015</year>). <article-title>Spontaneous entrainment of running cadence to music tempo.</article-title> <source>Sports Medicine - Open</source>, <volume>1</volume>, <fpage>15</fpage>. <pub-id specific-use="author" pub-id-type="doi">10.1186/s40798-015-0025-9</pub-id><issn>2199-1170</issn></mixed-citation></ref>
<ref id="b38"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Waterhouse</surname>, <given-names>J.</given-names></name>, <name><surname>Hudson</surname>, <given-names>P.</given-names></name>, &#x26; <name><surname>Edwards</surname>, <given-names>B.</given-names></name></person-group> (<year>2010</year>, <month>August</month>). <article-title>Effects of music tempo upon submaximal cycling performance.</article-title> <source>Scandinavian Journal of Medicine &#x26; Science in Sports</source>, <volume>20</volume>(<issue>4</issue>), <fpage>662</fpage>-<lpage>669</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.1111/j.1600-0838.2009.00948.x</pub-id><pub-id pub-id-type="pmid">19793214</pub-id><issn>0905-7188</issn></mixed-citation></ref>
<ref id="b15"><mixed-citation specific-use="restruct" publication-type="journal"><person-group person-group-type="author"><name><surname>Wurtz</surname>, <given-names>P.</given-names></name>, <name><surname>Mueri</surname>, <given-names>R. M.</given-names></name>, &#x26; <name><surname>Wiesendanger</surname>, <given-names>M.</given-names></name></person-group> (<year>2009</year>, <month>April</month>). <article-title>Sight-reading of violinists: Eye movements anticipate the musical flow.</article-title> <source>Experimental Brain Research</source>, <volume>194</volume>(<issue>3</issue>), <fpage>445</fpage>-<lpage>450</lpage>. <pub-id specific-use="author" pub-id-type="doi">10.1007/s00221-009-1719-3</pub-id><pub-id pub-id-type="pmid">19205680</pub-id><issn>0014-4819</issn></mixed-citation></ref>
</ref-list>
</back>
</article>
