<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.0 20120330//EN" "JATS-journalpublishing1.dtd">

<article article-type="research-article" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:mml="http://www.w3.org/1998/Math/MathML">
 <front>
    <journal-meta>
	<journal-id journal-id-type="publisher-id">Jemr</journal-id>
      <journal-title-group>
        <journal-title>Journal of Eye Movement Research</journal-title>
      </journal-title-group>
      <issn pub-type="epub">1995-8692</issn>
	  <publisher>								
	  <publisher-name>Bern Open Publishing</publisher-name>
	  <publisher-loc>Bern, Switzerland</publisher-loc>
	</publisher>
    </journal-meta>
    <article-meta>
	<article-id pub-id-type="doi">10.16910/jemr.11.2.10</article-id> 
	  <article-categories>								
				<subj-group subj-group-type="heading">
					<subject>Research Article</subject>
				</subj-group>
		</article-categories>
      <title-group>
        <article-title>The impact of music and stretched time on pupillary responses and eye movements in slow-motion film scenes</article-title>
      </title-group>
	   <contrib-group> 
				<contrib contrib-type="author">
					<name>
						<surname>Hammerschmidt</surname>
						<given-names>David</given-names>
					</name>
					<xref ref-type="aff" rid="aff1">1</xref>
				</contrib>
				<contrib contrib-type="author">
					<name>
						<surname>Wöllner</surname>
						<given-names>Clemens</given-names>
					</name>
					<xref ref-type="aff" rid="aff1">1</xref>
				</contrib>				
        <aff id="aff1">
		<institution>Universität Hamburg, Hamburg</institution>,   <country>Germany</country>
        </aff>
		</contrib-group>   

		
	  <pub-date date-type="pub" publication-format="electronic"> 
		<day>20</day>  
		<month>5</month>
        <year>2018</year>
      </pub-date>
	  <pub-date date-type="collection" publication-format="electronic"> 
	  <year>2018</year>
	</pub-date>
      <volume>11</volume>
      <issue>2</issue>
	 <elocation-id>10.16910/jemr.11.2.10</elocation-id> 
	<permissions> 
	<copyright-year>2018</copyright-year>
	<copyright-holder>Hammerschmidt, D., and Wöllner, C.</copyright-holder>
	<license license-type="open-access">
  <license-p>This work is licensed under a Creative Commons Attribution 4.0 International License 
  (<ext-link ext-link-type="uri" xlink:href="https://creativecommons.org/licenses/by/4.0/">
    https://creativecommons.org/licenses/by/4.0/</ext-link>), which permits unrestricted use and redistribution provided that the original author and source are credited.</license-p>
</license>
	</permissions>
      <abstract>
        <p>This study investigated the effects of music and playback speed on arousal and visual perception in slow-motion scenes taken from commercial films. Slow-motion scenes are a ubiquitous film technique and highly popular. Yet the psychological effects of mediated time-stretching compared to real-time motion have not been empirically investigated. We hypothesised that music affects arousal and attentional processes. Furthermore, we assumed that playback speed influences viewers’ visual perception, resulting in a higher number of eye movements and larger gaze dispersion. Thirty-nine participants watched three film excerpts in a repeated-measures design in conditions with or without music and in slow motion vs. adapted real-time motion (both visual-only). Results show that music in slow-motion film scenes leads to higher arousal compared to no music, as indicated by larger pupil diameters in the former. There was no systematic effect of music on visual perception in terms of eye movements. Playback speed influenced visual perception in eye movement parameters such that slow motion resulted in more and shorter fixations as well as more saccades compared to adapted real-time motion. Furthermore, in slow motion there was a higher gaze dispersion and a smaller centre bias, indicating that individuals attended to more detail in slow-motion scenes.</p>
      </abstract>
      <kwd-group>
        <kwd>eye tracking</kwd>
        <kwd>gaze</kwd>
        <kwd>pupillometry</kwd>
        <kwd>saccades</kwd>
        <kwd>fixations</kwd>
        <kwd>blinks</kwd>
        <kwd>pupil diameter</kwd>
        <kwd>emotion</kwd> 
        <kwd>attention</kwd> 
        <kwd>perception</kwd>
        <kwd>film music</kwd>                                             
      </kwd-group>
    </article-meta>
  </front>	
  <body>

    <sec id="S1">
      <title>Introduction</title>

<p>The eyes are often called the window to our soul, which seems accurate in
the sense that a person’s eyes provide a lot of information regarding
emotional states (<xref ref-type="bibr" rid="b1">1</xref>). Music can modulate these states and is
used for psychological functions such as management of self-identity,
interpersonal relationships and mood in everyday life (<xref ref-type="bibr" rid="b2 b3 b4">2, 3, 4</xref>). In
other words, music can deeply move us. As for social and personal
contexts, music in film may induce emotions and associations as well,
and, even if not perceived consciously, may affect the viewer
substantially. Previous research has shown that film music can influence
the general and emotional meaning of a film scene (<xref ref-type="bibr" rid="b5 b6">5, 6</xref>), a
character’s likeability, and may modulate empathic concern and accuracy
in the viewer (<xref ref-type="bibr" rid="b7">7</xref>). It can affect cognitive functions such as
viewers’ attention (<xref ref-type="bibr" rid="b8">8</xref>) and memory (<xref ref-type="bibr" rid="b9">9</xref>). Although there
is considerable evidence for strong links between music and visual film
perception, relatively little empirical research has been done on the
interaction between these two domains.</p>

<p>Slow motion, the artificial slowing down of playback
speed, is a relatively common film technique, which has recently seen a
rise in popularity even outside of the commercial film domain, as the
numbers of slow-motion videos on various internet platforms indicate.
This trend is substantially due to technical advances and the inclusion
of slow-motion functions in many smartphones, indicating and driving
fascination as well as demand for such time-stretched videos. In film,
slow-motion scenes are typically combined with emotionally expressive
music (<xref ref-type="bibr" rid="b10 b11">10, 11</xref>). We propose that they simulate psychological
situations of high emotional significance. In life-threatening
situations, for instance, it is known that a majority of individuals
subjectively perceive time to be slowed down (<xref ref-type="bibr" rid="b12">12</xref>). In fact, due
to higher arousal, it can be assumed that their cognitive processing is
faster and that individuals can thus attend to more detail in shorter
time. As a consequence, time appears to have passed more slowly in
retrospect as compared to normal situations. Film techniques have long
used these effects in decelerating playback speed. The viewers may focus
on different parts of a scene and grasp more detail of the presented
situation. They may also associate moments of heightened emotional
states with these film scenes. To the best of our knowledge, no
research has empirically investigated effects of slow-motion scenes on
viewers. We thus aimed to investigate the impact of emotional music
and playback speed in slow-motion film scenes on viewers’ eye
behaviour, both in terms of arousal and visual attention.</p>

<p>Examining individuals’ pupillary responses and eye movements provides
insights into the underlying mechanisms of emotional film perception. In
addition, eye movements are genuine processes of embodied cognition,
since the muscles involved in the movements facilitate perception and
attentional control (<xref ref-type="bibr" rid="b13 b14">13, 14</xref>). Nevertheless, only saccades are
under conscious control. It seems plausible that in highly emotional
situations such as in slow-motion scenes, visual attention is guided by
bodily processes that are to some degree co-experienced by viewers in
action-perception coupling. Pupillary changes have been shown to be
partly determined by emotional arousal and also correlate with skin
conductance changes in picture viewing, supporting the assumption that
the sympathetic nervous system modulates these processes (<xref ref-type="bibr" rid="b1">1</xref>).
One of the few experiments investigating pupillary responses in relation
to music was carried out by Gingras, Marin, Puig-Waldmüller, and Fitch
(<xref ref-type="bibr" rid="b15">15</xref>). Results of their study revealed correlations between
arousal and tension assessment as well as pupillary responses,
suggesting that pupil diameter is a psychophysiological parameter
sensitive to emotions evoked by music. A recent study by Laeng, Eidet,
Sulutvedt, and Panksepp (<xref ref-type="bibr" rid="b16">16</xref>) yielded similar conclusions. In
their study, pupil diameter was evaluated for music-induced chills.
Results showed that pupil size was larger at times when participants
experienced chills while listening to music, indicating that measuring
pupillary responses can reveal temporally fine-grained changes of
induced arousal.</p>

<p>When processing information consciously, visual attention makes it
necessary that one’s eyes are focused on the specific area from which
the information is to be extracted (<xref ref-type="bibr" rid="b17">17</xref>). Visual attention and
the oculomotor system are strongly linked as explained by the premotor
theory of spatial attention (<xref ref-type="bibr" rid="b18">18</xref>). Studies found that eye
movements performed while watching dynamic scenes are highly consistent
across viewers as well as in repeated viewing (<xref ref-type="bibr" rid="b19 b20 b21">19, 20, 21</xref>). This
consistency has been shown to be highest for strongly edited films, such
as Hollywood movies, compared to natural scenes (<xref ref-type="bibr" rid="b21">21</xref>) or street
scenes (<xref ref-type="bibr" rid="b20">20</xref>), suggesting that eye movements are influenced by
editing style and constrained to its dynamics (<xref ref-type="bibr" rid="b22 b23">22, 23</xref>). In line
with this finding are further results regarding shot cuts, abrupt
transitions from one scene to another, which affect gaze behaviour more
strongly than contextual information does (<xref ref-type="bibr" rid="b24 b25">24, 25</xref>). This is probably due to
low-level visual features driving viewers’ attention (<xref ref-type="bibr" rid="b26">26</xref>). A
commonly observed gaze behaviour while watching dynamic scenes is the
so-called centre bias, which describes the tendency to look at the
centre of a motion picture and is regarded as the optimal position for
gaining an overview of a dynamic scene (<xref ref-type="bibr" rid="b27 b28 b21">27, 28, 21</xref>). The centre
bias seems to occur across various video genres (<xref ref-type="bibr" rid="b24">24</xref>) and can
even be observed in static scenes (<xref ref-type="bibr" rid="b29">29</xref>). Furthermore, motion has
been shown to be a strong predictor for eye movements, since motion and
temporal changes are considered to be one of the highest attractors of
attention (<xref ref-type="bibr" rid="b30 b31 b24">30, 31, 24</xref>).</p>

<p>The impact of sound on eye movements has not been studied
extensively, despite the fact that auditory and visual information can
profoundly influence each other, as the well-known “McGurk
effect” demonstrates (<xref ref-type="bibr" rid="b32">32</xref>). A more recent example of
how an auditory signal can influence visual perception stems from van
der Burg, Olivers, Bronkhorst, and Theeuwes (<xref ref-type="bibr" rid="b33">33</xref>), who showed
that a non-spatial auditory signal can improve spatial visual search,
called the “pip and pop effect”. Examples of visual information in
musical performance videos, such as musicians’ body movements,
influencing auditory perception can be found in (<xref ref-type="bibr" rid="b34 b35">34, 35</xref>).</p>

<p>Sound may influence visual processes even on a more basic level.
Smith and Martin-Portugues Santacreu (<xref ref-type="bibr" rid="b36">36</xref>) investigated
match-action editing in film, an editing technique causing global change
blindness, which describes the inability to detect shot cuts in edited
film. This blindness occurs when a cut coincides with a sudden onset of
motion. In their study, the authors varied audio conditions (original
soundtrack vs. silence) in eighty film clips. Results show that sound
plays an important role in creating editing blindness. Cut detection
rate was significantly reduced and cut detection time was faster in the
silent condition. This suggests that with audio, viewers were either
more engaged with the visual content or allocated fewer cognitive
resources to cut detection.</p>

<p>One of the few experiments investigating the impact of music on eye
movements was carried out by Schäfer and Fachner (<xref ref-type="bibr" rid="b37">37</xref>).
Participants watched pictures and video clips while listening to their
favourite music, unknown music, or no music. Results showed that music
had a significant effect on individuals’ eye movements: participants
fixated longer, performed fewer saccades, and blinked more often in the
music conditions than in the visual-only condition,
indicating that music reduces eye movements. Musical preference
(favourite vs. unknown music) did not influence eye movements. The
authors suggest that when listening to music, people may shift their
attention away from processing sensory information, and instead direct
their attention towards inner experiences such as emotions, thoughts and
memories. This assumption is also based on previous findings, suggesting
that higher blink rate is associated with decreased exogenous attention
(<xref ref-type="bibr" rid="b38">38</xref>), whereas high vigilance is associated with a higher
fixation rate (<xref ref-type="bibr" rid="b39">39</xref>). In line with this assumption are results of
an earlier study by Stern, Walrath, and Goldstein (<xref ref-type="bibr" rid="b40">40</xref>), showing
that sustained visual attention is associated with a decreased blink
rate. As Schäfer and Fachner point out, the conclusion regarding
attentional shifts is preliminary and needs further investigation.
Nonetheless, music might cause attentional shifts away from the
environment towards inward experiences (<xref ref-type="bibr" rid="b41 b42 b43">41, 42, 43</xref>).</p>

<p>Effects of film music on visual attention were also investigated by
Mera and Stumpf (<xref ref-type="bibr" rid="b44">44</xref>) using a film scene from “The Artist”. In
their study, the scene was presented in three conditions: silence,
focusing music that matched the narrative dynamics of the scene, or
distracting music that was expected to shift participants’ visual
attention frequently. Results showed that music influenced visual
attention in terms of fixation parameters. While distracting music
increased the number of fixations, focusing music increased fixation
duration, suggesting that music may guide the visual exploration of
dynamic scenes. Compared to the silent condition, both music conditions
led to less scene exploration. The authors conclude that targets were
focused more quickly with music and that the overall focus of attention
was reduced. Another study shows that music can influence fixations
while watching film scenes (<xref ref-type="bibr" rid="b45">45</xref>). Effects of music (soft vs.
intense music) on fixation durations varied between film scenes, showing
that music can shorten and lengthen fixation durations according to
visual dynamics. Auer et al. (<xref ref-type="bibr" rid="b46">46</xref>) investigated the influence of
music on viewers’ eye movements using two scenes, one from a documentary
and one from a film, and three different musical conditions: horror film
music, documentary film music, or no music. Their results showed no
influence of music on the number of fixations. An unexpected event in
the scenes (a red X occurring on screen) was perceived more often in the
conditions with music than without music, leading Auer et al. to the
conclusion that music systematically affected viewers’ visual attention
and related eye movements.</p>

<p>Other research, nevertheless, did not find systematic effects of
non-diegetic sounds, such as underlying music, on eye movements.
Coutrot, Guyader, Ionescu, and Caplier (<xref ref-type="bibr" rid="b47">47</xref>) investigated the
influence of film music on eye movements using 50 video sequences
including the corresponding soundtracks. Results indicate that in the
beginning of scene exploration, eye movement dispersion was not affected
by sound, yet at a later phase in scene perception, dispersion was lower
and the distance to the centre was higher with soundtracks than without,
also shown by larger saccades and differences in fixation locations. The
authors point out that the effect of sound is not constant over time and
is strongly affected by shot cuts, since no effect between conditions
was observed immediately after the shot cuts. In a further study,
Coutrot and Guyader (<xref ref-type="bibr" rid="b48">48</xref>) investigated different types of
non-diegetic sounds (unrelated speech, abrupt natural sounds, or
continuous natural sounds) in one-shot conversation scenes taken from
Hollywood-like French movies showing complex natural environments. There
were no differences in gaze dispersion, saccadic amplitudes, fixation
durations, scanpaths, or fixation ratios. They hypothesised that
“unrelated [non-diegetic] soundtracks are not correlated enough with the
visual information to be bound to it, preventing any further
integration” (p. 14). In a study by Smith (<xref ref-type="bibr" rid="b49">49</xref>), gaze behaviour
did not significantly change while watching the film “Alexander Nevsky”
with music compared to no music. The author suggests that the visual
information was prioritised over its auditory counterpart. Attentional
synchrony between participants, describing the spontaneous clustering of
gaze during film viewing, was highest immediately after shot cuts and in
scenes with minimal pictorial detail, and dropped significantly in more
complex ones.</p>

<p>Taken together, music in film seems to affect the viewer’s perception
in a considerable manner. Compared to visual-only scene perception,
music can influence the focus of attention and the interpretation of a
scene and its characters. Furthermore, music seems to cause a reduction
of eye movements while watching static and, to a certain extent, dynamic
scenes, resulting in less gaze dispersion and less emphasis on the
centre of a scene. These effects are significantly reduced in strongly
edited scenes such as Hollywood movies compared to natural dynamic
scenes. It is plausible that the inherent scene dynamics and shot cuts
outweigh potential influences of the auditory signal. Not much is known
about how systematic the influence of music is on visual film
perception. Previous research shows strong links between music and
visual material in film perception, yet studies that have looked into
such effects on dynamic scene perception are sparse and their
results are, to some extent, inconclusive. There is a clear need for
more empirical investigations on the cross-modal effects involved in
film perception. Findings along these lines may not only be informative
for film producers and sound designers in various genres, but may also
enhance our knowledge of audiovisual perception more generally. We could
not find any study that has empirically investigated the role of music
in slow-motion film scenes, and how stretched time would affect viewers’
visual attention compared to the same scenes in real-time motion.</p>

<p>In the current study, we addressed three main hypotheses concerning
the impact of music in slow-motion film scenes on viewers’ physiological
and attentional responses. Based on the research discussed above, we
first hypothesised that presenting scenes with music compared to no
music results in viewers being more aroused, and that music influences
visual attention and perception. Specifically, we expected that average
pupil diameter would be larger when watching slow-motion scenes with
music than without music, indicating higher arousal (<xref ref-type="bibr" rid="b15 b16">15, 16</xref>),
and that music would cause a reduction in gaze behaviour
(<xref ref-type="bibr" rid="b44 b37">44, 37</xref>). Second, we assumed that playback speed (slow motion
vs. adapted real-time motion) influences viewers’ visual perception,
allowing for a more dispersed gaze behaviour and more attention to
detail. Third, since previous research showed that gaze behaviour is
strongly constrained by the dynamics of a given scene, we expected that
different slow-motion scenes would influence gaze behaviour according to
the scene dynamics – that is, the number of shot cuts and the pictorial
complexity (<xref ref-type="bibr" rid="b22">22</xref>).</p>
    </sec>
	
    <sec id="S2">
      <title>Methods</title>

<p>The current study was part of a larger research project investigating
the effects of music and playback speed in slow-motion scenes on
subjectively reported emotional meaning, psychophysiological responses
and perceived durations based on video clips from different genres (cf.
(<xref ref-type="bibr" rid="b50">50</xref>)). In the current study, we focused on analyses of eye
movement parameters and pupillary responses in slow-motion scenes taken
from commercial films including the corresponding soundtracks.</p>

    <sec id="S2a">
      <title>Participants</title>

<p>Forty-two participants took part in the study. Three participants
were excluded from analysis: two due to technical failures in the
recording process and one due to an uncorrected vision impairment.
Analysis was therefore based on data from thirty-nine participants,
among whom twenty-one were male, with a mean age of 24.00 years
(<italic>SD</italic> = 4.23). All of them had normal or
corrected-to-normal vision and hearing. Self-reported musical experience
(playing an instrument actively) varied between none and fifteen years
(<italic>M</italic> = 6.33, <italic>SD</italic> = 5.34). None of the
participants had extensive experience in film making (<italic>M</italic>
= 2.38, <italic>SD</italic> = 1.76), rated on a seven-point scale
ranging from 1 (not at all) to 7 (very much). Participants took part in
accordance with the guidelines of the local Ethics Committee.</p>
    </sec>
	
    <sec id="S2b">
      <title>Design</title>

<p>Participants watched slow-motion film excerpts in a multimodal
repeated-measures design. The excerpts were presented in original
audiovisual (slow motion with music) and in manipulated visual-only
conditions (slow motion without music and adapted real-time motion
without music). The design consisted of the factor Modality (audiovisual
vs. visual-only, both for original slow-motion scenes) and factor Tempo
(slow motion vs. adapted real-time motion, both visual-only). A third
factor consisted of the three film excerpts, correspondingly with three
levels. Taken together, each participant watched a total of 3 x 2 x 2
stimuli.</p>
    </sec>
	
    <sec id="S2c">
      <title>Materials</title>

<p>Film excerpts were meant to account for different dynamics and
complexities in slow-motion scenes, and were thus selected according to
the following criteria: original slow-motion scenes with non-diegetic
music as soundtracks, no spoken words nor any other diegetic sounds, and
varying complexity (i.e., number of shot cuts, camera movement, number
of actors visible, and amount of human motion). The three selected film
excerpts are specified in Table 1. All three scenes were presented with
their corresponding soundtracks (film music) in the audiovisual
condition.</p>

<p>The first slow-motion scene was taken from “A Clockwork Orange”
(<xref ref-type="bibr" rid="b51">51</xref>), in which character Alex attacks his friends Georgie and
Dim next to a river. The scene included four shot cuts and was combined
with the music “La gazza ladra – Overture” (The Thieving Magpie) by
Gioachino Rossini. The second slow-motion scene was taken from “Forrest
Gump” (<xref ref-type="bibr" rid="b52">52</xref>) in which character Forrest is chased by other
children and, while running away from them, breaks free of his leg braces.
The scene included eight shot cuts and was presented with the music “Run
Forrest Run” by Alan Silvestri. The third slow-motion scene was taken
from “Silent Youth” (<xref ref-type="bibr" rid="b53">53</xref>), showing multiple people from behind
walking along a pedestrian passageway in a Berlin train station. The
scene was filmed as a one-take shot, therefore included no shot cuts and
used a static camera position. This scene was presented with an
atmospheric piano sound playing repeated D3 notes, composed by
Florian Mönks.</p>

<p>Apart from the described film music, there were no other sounds
audible in the excerpts. For illustrations of the film excerpts and
their different shots, see Figure 1. Film excerpts were manipulated
using Premiere Pro CC 2016 (Adobe Systems). For the visual-only
conditions, audio tracks were removed completely, thus the excerpts were
presented silently. In order to compare playback speeds, excerpts were
sped up to match real-time motion. Appropriate speed-up
factors for each excerpt were determined in a pilot study, resulting in
different speed-up factors for each excerpt (Table 1). In the pilot
study, four experienced participants including the authors rated
different adapted playback speeds for each excerpt until unanimous
agreement was reached on appropriate real-time motion.</p>

<fig id="fig01" fig-type="figure" position="float">
					<label>Figure 1.</label>
					<caption>
						<p>Illustration of film excerpts showing single frames approximately one second after shot onset for “A Clockwork Orange” and “Forrest Gump“. “Silent Youth” is illustrated by single frames every eight seconds since the scene did not include shot cuts. Timecode values (25-fps) specify the time from scene onset.</p>
					</caption>
					<graphic id="graph01" xlink:href="jemr-11-02-j-figure-01.png"/>
				</fig>

<table-wrap id="t01" position="float">
					<label>Table 1.</label>
					<caption>
						<p>Details of the slow-motion film excerpts.</p>
					</caption>
					<table frame="hsides" rules="groups" cellpadding="3">
						<thead>
      <tr>
        <th><bold>Excerpt</bold></th>
        <th colspan="4"><bold>Duration (sec)</bold></th>
      </tr>
    </thead>
    <tbody>
      <tr>
        <td></td>
        <td>Start of excerpt (min:sec)</td>
        <td>Slow motion</td>
        <td>Adapted real-time motion</td>
        <td>Speed-up factor</td>
      </tr>
      <tr>
        <td>CO: A Clockwork Orange</td>
        <td>31:56</td>
        <td>22.07</td>
        <td>5.66</td>
        <td>3.9</td>
      </tr>
      <tr>
        <td>FG: Forrest Gump</td>
        <td>16:32</td>
        <td>26.20</td>
        <td>10.92</td>
        <td>2.4</td>
      </tr>
      <tr>
        <td>SY: Silent Youth</td>
        <td>69:00</td>
        <td>40.00</td>
        <td>20.00</td>
        <td>2.0</td>
      </tr>
    </tbody>
  </table>
</table-wrap>
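
<p>The adapted real-time durations in Table 1 follow directly from dividing each slow-motion duration by its speed-up factor, which can be verified in a few lines of Python (values copied from Table 1):</p>

```python
# Table 1 check: adapted real-time duration = slow-motion duration / speed-up factor
excerpts = {
    "CO: A Clockwork Orange": (22.07, 3.9, 5.66),
    "FG: Forrest Gump":       (26.20, 2.4, 10.92),
    "SY: Silent Youth":       (40.00, 2.0, 20.00),
}
for name, (slow_s, factor, real_s) in excerpts.items():
    # Rounding to two decimals reproduces the tabulated values
    assert round(slow_s / factor, 2) == real_s
```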

    </sec>
	
    <sec id="S2d">
      <title>Procedure</title>

<p>After providing informed consent, participants were introduced to the
task to watch the film excerpts attentively. Participants sat 60–80 cm
in front of a computer monitor. A REDn eye tracker (SensoMotoric
Instruments) was placed at the bottom of the monitor, and the system was
calibrated for each participant before stimulus presentation. Stimuli
were presented centred on 75% of the screen with black edges on each
side. Before a stimulus appeared, a white fixation cross on black
background was shown for two seconds in the centre of the screen.
Participants were asked to look at it as soon as it was visible until
the stimulus onset. Stimuli were presented via E-Prime 2 Pro (Psychology
Software Tools). Stimuli were presented in individually randomised
orders for each participant, including all film excerpts under all
conditions in one block. Audio was presented via headphones
(Beyerdynamic DT-880 Pro) using a Steinberg UR242 audio interface. Each
participant was tested individually and under uniform conditions (e.g.
same room, position, and room brightness).</p>
    </sec>
	
    <sec id="S2e">
      <title>Data analysis</title>

<p>Pupil diameters were calculated for each sample (60 Hz) and
individually for the left and right eye of each participant and for each
stimulus. Since missing samples (5.45%), recorded as zero values, were
most likely caused by blinks, we interpolated the zero values of each time series
using Piecewise Cubic Hermite Interpolating Polynomial (PCHIP) in Matlab
R2016b (MathWorks). After interpolation, data was averaged over both
eyes and stimulus duration before separate repeated-measures Analyses of
Variance (ANOVAs) were run on average pupil diameter according to
factors Modality and Tempo. Each of the three excerpts was analysed as a
factor as well, since the different dynamics and complexities were
expected to cause participants’ gaze to vary considerably. We analysed
fixation duration, fixation frequency, saccadic frequency and blink
frequency, averaged over both eyes of each participant and stimulus as
measures for visual perception. These eye movement parameters can be
considered standard measures in eye movement research (<xref ref-type="bibr" rid="b54 b55">54, 55</xref>),
and are well-suited for examining systematic effects of scene perception
during test conditions. For event detection, we used an interpolated
dispersion based algorithm (BeGaze 3.7, SensoMotoric Instruments). The
minimum fixation duration was set to 100 ms using the default settings
for maximum dispersion. Before ANOVAs were computed, data was checked
for outliers: values exceeding three standard deviations from the mean
were discarded, removing six data points (0.43%). If the data did not
meet the sphericity assumption, a
Greenhouse-Geisser correction was used. Post-hoc comparisons were
calculated with a Bonferroni adjustment.</p>
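
<p>The blink interpolation described above can be sketched as follows. This is an illustrative Python equivalent of the Matlab PCHIP step; the function name and argument handling are ours, not part of the original analysis pipeline:</p>

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

def interpolate_blinks(pupil, rate_hz=60):
    """Replace zero-valued samples (assumed to be blinks) using a
    Piecewise Cubic Hermite Interpolating Polynomial (PCHIP)."""
    pupil = np.asarray(pupil, dtype=float)
    t = np.arange(len(pupil)) / rate_hz   # sample times at 60 Hz
    valid = pupil > 0                     # zero values mark missing data
    interp = PchipInterpolator(t[valid], pupil[valid])
    out = pupil.copy()
    out[~valid] = interp(t[~valid])       # fill only the missing samples
    return out

# A short pupil-diameter series (mm) with a simulated blink (zeros)
series = [3.1, 3.2, 0.0, 0.0, 3.4, 3.5]
cleaned = interpolate_blinks(series)      # zeros replaced by smooth values
```

<p>PCHIP is shape-preserving: interpolated values stay between the neighbouring valid samples and introduce no overshoot, which is a sensible property for pupil data that is subsequently averaged over time.</p>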

<p>To check for general effects of test conditions on gaze dispersion,
dwell times for gridded areas were calculated. Dwell time is the summed
duration of all fixations and saccades within a defined area. For this
purpose, each
stimulus was divided into 16 x 16 gridded areas, resulting in 256
defined grids, each covering 0.4% of the area that the eye tracker was
calibrated for, including the black edges. We computed dwell times on
every grid for each eye of each participant. Dwell times were then
averaged over both eyes and participants and normalised over time. This
procedure resulted in standardised dwell time profiles for each excerpt
and condition, in which each grid shows the average dwell time per
second in milliseconds. In the last step, these dwell time profiles were
averaged over all film excerpts according to conditions (slow motion vs.
adapted real-time motion, audiovisual vs. visual-only) to check for
general effects, independent from specific scene dynamics. Dwell time
profiles were compared using chi<sup>2</sup>-tests of goodness of fit to
check for differences in gaze dispersions between conditions, and
paired-samples t-tests were used to compare dwell times of centre
grids.</p>
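<p>The grid-based dwell-time computation can be sketched as follows. This is a minimal illustration with hypothetical gaze coordinates and event durations; the actual analysis worked on per-eye data over the full calibrated area including the black edges:</p>

```python
import numpy as np

def dwell_time_profile(x, y, durations, width, height, n=16, total_ms=None):
    """Accumulate event durations (fixations and saccades) into an
    n x n grid and normalise to average dwell time per second.

    x, y       -- gaze coordinates of each event (pixels)
    durations  -- event durations in ms
    total_ms   -- stimulus duration in ms (defaults to summed durations)
    """
    grid = np.zeros((n, n))
    col = np.clip((np.asarray(x) / width * n).astype(int), 0, n - 1)
    row = np.clip((np.asarray(y) / height * n).astype(int), 0, n - 1)
    np.add.at(grid, (row, col), durations)   # sum durations per grid cell
    total_ms = total_ms or float(np.sum(durations))
    return grid / (total_ms / 1000.0)        # ms of dwell per second

# Two hypothetical events on a 1024 x 768 stimulus lasting 2 s.
grid = dwell_time_profile([10, 950], [10, 500], [400, 600],
                          width=1024, height=768, total_ms=2000)
```

The number of "active" grids used later in the analysis then corresponds to counting the non-zero cells of such a profile.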
    </sec>
    </sec>

    <sec id="S3">
      <title>Results</title>

<p>The results are presented in the following way: Results for effects
of music (i.e., Modality) on pupillary responses, eye movements, and
gaze dispersion are reported first, followed by effects of playback
speed (i.e., Tempo) on the same parameters. Effects of the three
slow-motion film excerpts are included in main analyses of the factors
Modality and Tempo.</p>

    <sec id="S3a">
      <title>Music effects: Audiovisual vs. visual-only</title>

<p>Pupillary responses: The ANOVA on average pupil diameter yielded a
main effect for factor Modality [<italic>F</italic>(1, 38) = 65.66, <italic>p</italic> &#x3C; .001,
<italic>ƞ<sub>P</sub></italic><sup>2</sup> = .63], indicating that participants’ pupil
diameters were larger in the audiovisual compared to the visual-only
condition. On average, pupil diameter differed by 0.12 mm between
conditions (<italic>SE</italic> = 0.01), suggesting that participants were more aroused
when watching the slow-motion excerpts with music than without music
(Table 2). As expected, there was also a main effect for factor Excerpt
[<italic>F</italic>(2, 76) = 44.61, <italic>p</italic> &#x3C; .001, <italic>ƞ<sub>P</sub></italic><sup>2</sup> = .54].
Post-hoc analysis revealed that pupil diameters were largest for
“Forrest Gump” (henceforth FG) compared to the other two excerpts (both
p &#x3C; .001). “A Clockwork Orange” (henceforth CO) and “Silent Youth”
(henceforth SY) did not differ in average pupil diameter (p &#x3E; .05).
The interaction between main factors was not significant (p &#x3E;
.05).</p>

<table-wrap id="t02" position="float">
					<label>Table 2.</label>
					<caption>
						<p>Eye tracking parameters (means and standard deviations) of
each film excerpt according to factors Modality and Tempo.</p>
					</caption>
					<table frame="hsides" rules="groups" cellpadding="3">
    <thead>
      <tr>
        <th><bold>Eye tracking</bold>
        <bold>parameters</bold></th>
        <th><bold>Excerpt</bold></th>
        <th colspan="2"><bold>Factor Modality</bold></th>
        <th colspan="2"><bold>Factor Tempo</bold></th>
      </tr>
    </thead>
    <tbody>
      <tr>
        <td></td>
        <td></td>        
        <td>Audiovisual</td>
        <td>Visual-only</td>
        <td>Slow motion</td>
        <td>Adapted real-time motion</td>
      </tr>
      <tr>
        <td></td>
        <td></td>        
        <td><italic>M</italic> (<italic>SD</italic>)</td>
        <td><italic>M</italic> (<italic>SD</italic>)</td>
        <td><italic>M</italic> (<italic>SD</italic>)</td>
        <td><italic>M</italic> (<italic>SD</italic>)</td>
      </tr>
      <tr>
        <td rowspan="4">Pupil Diameter
        (mm)</td>
        <td>CO</td>
        <td>4.23 (0.59)</td>
        <td>4.11 (0.60)</td>
        <td>4.11 (0.60)</td>
        <td>4.30 (0.60)</td>
      </tr>
      <tr>

        <td>FG</td>
        <td>4.43 (0.63)</td>
        <td>4.29 (0.64)</td>
        <td>4.29 (0.64)</td>
        <td>4.35 (0.63)</td>
      </tr>
      <tr>

        <td>SY</td>
        <td>4.18 (0.61)</td>
        <td>4.09 (0.60)</td>
        <td>4.09 (0.60)</td>
        <td>4.21 (0.62)</td>
      </tr>
      <tr>

        <td>Mean</td>
        <td>4.28 (0.60)***</td>
        <td>4.16 (0.60)</td>
        <td>4.16 (0.60)</td>
        <td>4.29 (0.61)***</td>
      </tr>
      <tr>
        <td rowspan="4">Fixations/sec</td>
        <td>CO</td>
        <td>1.77 (0.36)</td>
        <td>1.74 (0.31)</td>
        <td>1.74 (0.31)</td>
        <td>1.57 (0.45)</td>
      </tr>
      <tr>

        <td>FG</td>
        <td>1.34 (0.39)</td>
        <td>1.38 (0.33)</td>
        <td>1.38 (0.33)</td>
        <td>1.34 (0.41)</td>
      </tr>
      <tr>

        <td>SY</td>
        <td>1.29 (0.39)</td>
        <td>1.42 (0.40)</td>
        <td>1.42 (0.40)</td>
        <td>1.23 (0.50)</td>
      </tr>
      <tr>

        <td>Mean</td>
        <td>1.47 (0.27)</td>
        <td>1.52 (0.28)</td>
        <td>1.52 (0.28)**</td>
        <td>1.38 (0.36)</td>
      </tr>
      <tr>
        <td rowspan="4">Mean Fixation Duration (ms)</td>
        <td>CO</td>
        <td>528.69 (136.57)</td>
        <td>538.04 (112.83)</td>
        <td>538.40 (114.37)<sup>1</sup></td>
        <td>667.13 (244.06)</td>
      </tr>
      <tr>

        <td>FG</td>
        <td>737.09 (248.76)</td>
        <td>693.02 (191.69)</td>
        <td>696.86 (192.85)<sup>1</sup></td>
        <td>756.91 (272.34)</td>
      </tr>
      <tr>

        <td>SY</td>
        <td>723.10 (258.13)</td>
        <td>659.13 (217.95)</td>
        <td>652.93 (217.53)<sup>1</sup></td>
        <td>807.24 (362.86)</td>
      </tr>
      <tr>

        <td>Mean</td>
        <td>674.45 (165.12)</td>
        <td>645.10 (168.62)</td>
        <td>629.28 (174.92)<sup>1</sup></td>
        <td>768.11 (228.14)***</td>
      </tr>
      <tr>
        <td rowspan="4">Saccades/sec</td>
        <td>CO</td>
        <td>1.64 (0.35)</td>
        <td>1.64 (0.33)</td>
        <td>1.64 (0.33)</td>
        <td>1.52 (0.44)</td>
      </tr>
      <tr>

        <td>FG</td>
        <td>1.22 (0.37)</td>
        <td>1.27 (0.32)</td>
        <td>1.27 (0.32)</td>
        <td>1.23 (0.37)</td>
      </tr>
      <tr>

        <td>SY</td>
        <td>1.13 (0.40)</td>
        <td>1.26 (0.39)</td>
        <td>1.26 (0.39)</td>
        <td>1.09 (0.49)</td>
      </tr>
      <tr>

        <td>Mean</td>
        <td>1.33 (0.26)</td>
        <td>1.39 (0.28)</td>
        <td>1.39 (0.28)**</td>
        <td>1.28 (0.35)</td>
      </tr>
      <tr>
        <td rowspan="4">Blinks/sec</td>
        <td>CO</td>
        <td>0.21 (0.23)</td>
        <td>0.17 (0.19)</td>
        <td>0.17 (0.19)</td>
        <td>0.06 (0.10)</td>
      </tr>
      <tr>

        <td>FG</td>
        <td>0.24 (0.21)</td>
        <td>0.19 (0.15)</td>
        <td>0.19 (0.15)</td>
        <td>0.17 (0.19)</td>
      </tr>
      <tr>

        <td>SY</td>
        <td>0.29 (0.21)</td>
        <td>0.29 (0.23)</td>
        <td>0.29 (0.23)</td>
        <td>0.26 (0.22)</td>
      </tr>
      <tr>

        <td>Mean</td>
        <td>0.27 (0.23)*</td>
        <td>0.25 (0.21)</td>
        <td>0.25 (0.21)**</td>
        <td>0.20 (0.22)</td>
      </tr>
    </tbody>
  </table>
					<table-wrap-foot>
						<fn id="FN1">
<p>Note. Asterisks indicate significantly higher values of the main
effects between conditions for factors Modality (both slow motion) and
Tempo (both visual-only).</p>

<p><sup>1</sup> Fixation duration values between visual-only and slow
motion, which are otherwise similar, vary slightly due to listwise
discard of outliers in comparisons.</p>            
						</fn>
					</table-wrap-foot>  
</table-wrap>

<p>As shown in Figure 2, the larger pupil diameter in the audiovisual
condition, compared to visual-only, was consistent across the three
excerpts and relatively stable over time. Especially
for CO and FG, which included more editing and close-up shots than SY,
the progression of pupil diameter seems to be remarkably similar across
conditions. Despite the higher arousal in the audiovisual condition,
then, there seems to be a high consistency in scene perception across
both conditions.</p>

<fig id="fig02" fig-type="figure" position="float">
					<label>Figure 2.</label>
					<caption>
<p>Average pupil diameters (solid lines) and standard errors (shaded areas) for each sample point in audiovisual and visual-only conditions. Vertical grey lines represent stimulus onsets (at the 2-second mark) and shot cuts.</p>
					</caption>
					<graphic id="graph02" xlink:href="jemr-11-02-j-figure-02.png"/>
				</fig>

<p>Eye movements: We analysed eye movement parameters in relation to the
factor Modality (Table 2). Fixation frequency did not differ between
audiovisual and visual-only conditions [<italic>F</italic>(1, 38) =
2.41, <italic>p</italic> &#x3E; .05,
<italic>ƞ<sub>P</sub></italic><sup>2</sup> = .06], suggesting that
participants had comparable numbers of fixations per second with or
without music. Fixation frequency differed between excerpts
[<italic>F</italic>(2, 76) = 40.62, <italic>p</italic> &#x3C; .001,
<italic>ƞ<sub>P</sub></italic><sup>2</sup> = .52], and participants
performed more fixations while watching CO than the other two films
(both <italic>p</italic> &#x3C; .001). No further post-hoc differences
were observed, and main factors did not interact (<italic>p</italic>
&#x3E; .05).</p>

<p>Fixation durations did not significantly differ between conditions
either, as factor Modality showed no main effect [<italic>F</italic>(1,
38) = 2.64, <italic>p</italic> &#x3E; .05,
<italic>ƞ<sub>P</sub></italic><sup>2</sup> = .06]. Fixations were
slightly longer in the audiovisual condition, yet the effect did not
reach significance. Excerpts influenced average fixation durations
[<italic>F</italic>(2, 76) = 21.67, <italic>p</italic> &#x3C; .001,
<italic>ƞ<sub>P</sub></italic><sup>2</sup> = .36]. On average,
participants had shorter fixations when watching CO than when watching
FG or SY (both <italic>p</italic> &#x3C; .001). No further post-hoc
effects and no interactions between main factors were observed
(<italic>p</italic> &#x3E;.05).</p>

<p>Saccadic frequency was not influenced by Modality
[<italic>F</italic>(1, 38) = 3.35, <italic>p</italic> &#x3E; .05,
<italic>ƞ<sub>P</sub></italic><sup>2</sup> = .08], suggesting that music
in slow-motion scenes did not affect the number of saccades per second.
The factor Excerpt led to a main effect in saccades performed per second
[<italic>F</italic>(2, 76) = 42.01, <italic>p</italic> &#x3C; .001,
<italic>ƞ<sub>P</sub></italic><sup>2</sup> = .53], indicating that
participants performed fewer saccades while watching CO than the other
two films (both <italic>p</italic> &#x3C; .001). There were no
interactions (<italic>p</italic> &#x3E; .05).</p>

<p>The fourth eye movement parameter we analysed was blink frequency per
second. Results for the factor Modality revealed a main effect
[<italic>F</italic>(1, 36) = 5.37, <italic>p</italic> &#x3C; .05,
<italic>ƞ<sub>P</sub></italic><sup>2</sup> = .13], suggesting that participants blinked more often with music
compared to no music. Blink frequency differed between excerpts
[<italic>F</italic>(1.43, 51.53) = 9.99, <italic>p</italic> &#x3C; .001,
<italic>ƞ<sub>P</sub></italic><sup>2</sup> = .22]. Participants blinked
more often while watching SY than the other two excerpts (both
<italic>p</italic> &#x3C; .05), with no further post-hoc or interaction
effects (<italic>p</italic> &#x3E; .05).</p>

<p>Dwell time profiles: In order to assess whether participants
perceived slow-motion scenes with and without music differently in terms
of gaze dispersion, we compared dwell time profiles between conditions,
averaged across the three films (Figure 3). Comparing the number of
active grids, meaning the number of grids participants gazed at, offers
a simple measure of gaze dispersion. Participants actively looked at 116
grids in the audiovisual condition compared to 131 grids in the
visual-only condition, out of a total number of 256 grids. A
chi<sup>2</sup>-test resulted in no significant differences between the
audiovisual and visual-only conditions regarding the dispersion of
actively viewed grids [<italic>χ</italic><sup>2</sup>(1,
<italic>N</italic> = 247) = 0.46, <italic>p</italic> &#x3E; .05,
<italic>ω</italic> = .04]. To check for possible effects on centre bias
between conditions, the four centre grids were analysed. Dwell times of
these grids were averaged and compared in a paired-samples t-test.
Average dwell time per second for the audiovisual condition was 41.88 ms
(<italic>SD</italic> = 10.85) compared to 44.16 ms (<italic>SD</italic>
= 12.27) in the visual-only condition, yielding no significant effect
[<italic>t</italic>(38) = 1.33, <italic>p</italic> &#x3E; .05,
<italic>d</italic> = .20].</p>
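<p>Such a goodness-of-fit comparison on active-grid counts can be sketched as follows. Since the expected frequencies underlying the reported statistic are not spelled out here, a uniform expectation is assumed, so the resulting value need not match the one reported above:</p>

```python
import numpy as np
from scipy.stats import chisquare

# Observed numbers of actively viewed grids per condition (from the text).
observed = np.array([116, 131])   # audiovisual vs. visual-only
n = observed.sum()                # N = 247

# Goodness-of-fit against a uniform expectation (assumption; the
# original analysis may have used different expected frequencies).
chi2, p = chisquare(observed)
omega = np.sqrt(chi2 / n)         # Cohen's w effect size
```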

<fig id="fig03" fig-type="figure" position="float">
					<label>Figure 3.</label>
					<caption>
						<p>Dwell time profiles averaged over excerpts according to conditions. Colours from blue to red represent average dwell times per second.</p>
					</caption>
					<graphic id="graph03" xlink:href="jemr-11-02-j-figure-03.png"/>
				</fig>

<p>Familiarity: Since familiarity with excerpts may influence scene
perception, we collected familiarity ratings for the visual scenes and
film music separately. Out of the three excerpts, participants were most
familiar with FG (<italic>M</italic> = 5.69, <italic>SD</italic> = 2.24)
and CO (<italic>M</italic> = 5.17, <italic>SD</italic> = 2.24), rated on
a discrete point scale from 1 (not at all) to 7 (very much). As
expected, SY was rather unfamiliar (<italic>M</italic> = 1.50,
<italic>SD</italic> = 1.44). Familiarity ratings of the film music
revealed that “La gazza ladra” from CO was the most familiar one
(<italic>M</italic> = 3.63, <italic>SD</italic> = 1.89) followed by “Run
Forrest Run” from FG (<italic>M</italic> = 3.47, <italic>SD</italic> =
2.27). The music from SY was generally not known by participants
(<italic>M</italic> = 1.42, <italic>SD</italic> = 1.05). Several Pearson
correlations were calculated (alpha corrected for multiple correlations)
for familiarity of each excerpt and eye movement parameters as well as
pupil diameter. No correlations were found between scene familiarity and
pupil diameter (all <italic>r</italic> &#x3C; .30, all <italic>p</italic>
&#x3E; .05) or eye movement parameters (all <italic>r</italic> &#x3C; .30,
all <italic>p</italic> &#x3E; .05) for any of the film or music excerpts
apart from SY. For this largely unknown film, music familiarity
correlated with fixation frequency (<italic>r</italic> = .50,
<italic>p</italic> &#x3C; .01) and saccadic frequency (<italic>r</italic>
= .50, <italic>p</italic> &#x3C; .01). These results suggest that
familiarity with the films was generally not related to eye movements or
pupillary responses.</p>
    </sec>

    <sec id="S3b">
      <title>Tempo effects: Slow motion vs. adapted real-time motion</title>  

<p>In the next section, we report results for the factor Tempo,
comparing slow motion to adapted real-time motion in scene
perception.</p>

<p>Pupillary responses: The ANOVA on average pupil diameter according to
conditions slow motion and adapted real-time motion yielded a main
effect for factor Tempo [<italic>F</italic>(1, 38) = 26.49,
<italic>p</italic> &#x3C; .001, <italic>ƞ<sub>P</sub></italic><sup>2</sup>
= .41], suggesting that participants’ pupil diameters were larger in
adapted real-time motion than in slow motion (Table 2). On average,
pupil diameter differed by 0.12 mm between conditions
(<italic>SE</italic> = .02). Factor Excerpt showed a main effect as well
[<italic>F</italic>(2, 76) = 17.45, <italic>p</italic> &#x3C; .001,
<italic>ƞ<sub>P</sub></italic><sup>2</sup> = .32]. Post-hoc analyses
indicate that average pupil diameters were largest for FG compared to
the other two films (both <italic>p</italic> &#x3C; .001), which did not
differ from each other (<italic>p</italic> &#x3E; .05). The interaction
between main factors reached significance [<italic>F</italic>(2, 76) =
3.60, <italic>p</italic> &#x3C; .05,
<italic>ƞ<sub>P</sub></italic><sup>2</sup> = .09], indicating that,
although pupil diameter was larger in the adapted real-time condition
for all three excerpts, the size of this difference varied between film
excerpts.</p>

<p>Eye movements: Results for factor Tempo with regard to eye movement
parameters are presented in Table 2. Fixation frequency was influenced
by Tempo [<italic>F</italic>(1, 38) = 11.35, <italic>p</italic> &#x3C;
.005, <italic>ƞ<sub>P</sub></italic><sup>2</sup> = .23], suggesting that
participants fixated more often per second in the slow motion than in
the adapted real-time motion condition. Factor Excerpt influenced
fixation frequency as well [<italic>F</italic>(2, 76) = 25.03,
<italic>p</italic> &#x3C; .001, <italic>ƞ<sub>P</sub></italic><sup>2</sup>
= .40]. While FG and SY showed similar numbers of fixations per second
(<italic>p</italic> &#x3E; .05), fixation frequency was higher for CO than
for the other excerpts (both <italic>p</italic> &#x3C; .001).</p>

<p>Fixation durations were longer in the adapted real-time motion than
in the slow-motion condition [<italic>F</italic>(1, 36) = 18.96,
<italic>p</italic> &#x3C; .001, <italic>ƞ<sub>P</sub></italic><sup>2</sup>
= .34], thus slow motion caused shorter fixations. The excerpts also
affected fixation durations [<italic>F</italic>(2, 72) = 8.03,
<italic>p</italic> &#x3C; .005, <italic>ƞ<sub>P</sub></italic><sup>2</sup>
= .18]; participants’ fixations were on average shortest for CO (both
<italic>p</italic> &#x3C; .01) and did not differ between FG and SY
(<italic>p</italic> &#x3E; .05).</p>

<p>The number of saccades per second was affected by factor Tempo
[<italic>F</italic>(1, 38) = 8.85, <italic>p</italic> &#x3C; .01,
<italic>ƞ<sub>P</sub></italic><sup>2</sup> = .19]. Watching the excerpts
in adapted real-time motion led to fewer saccades performed than in slow
motion. Saccadic frequency differed between excerpts as well
[<italic>F</italic>(2, 76) = 34.88, <italic>p</italic> &#x3C; .001,
<italic>ƞ<sub>P</sub></italic><sup>2</sup> = .48]. Most saccades were
performed during CO (both <italic>p</italic> &#x3C; .001) and again, no
difference was observed between the other two excerpts
(<italic>p</italic> &#x3E; .05).</p>

<p>Blink frequency was influenced by both main factors. Factor Tempo
affected blink frequency, showing that participants blinked more often
per second while watching the scenes in slow motion than in adapted
real-time motion [<italic>F</italic>(1, 36) = 11.23, <italic>p</italic>
&#x3C; .005, <italic>ƞ<sub>P</sub></italic><sup>2</sup> = .24]. Factor
Excerpt influenced blink frequency [<italic>F</italic>(1.58, 56.94) =
29.20, <italic>p</italic> &#x3C; .001,
<italic>ƞ<sub>P</sub></italic><sup>2</sup> = .45]. Participants blinked
most often while watching SY, followed by FG and CO, which showed the
lowest blink frequency (all <italic>p</italic> &#x3C; .001). Interactions
between main factors were not significant for any eye movement
parameters (all <italic>p</italic> &#x3E; .05). Taken together, slow
motion led to more eye movements with shorter fixations, which were
further influenced by the film excerpts.</p>

<p>Dwell time profiles: Did participants attend differently to slow
motion compared to adapted real-time motion? To answer this question,
dwell time profiles were compared for conditions slow motion and adapted
real-time motion (Figure 3). The chi<sup>2</sup>-test revealed an effect
on gaze dispersion [<italic>χ</italic><sup>2</sup>(1, <italic>N</italic>
= 209) = 6.72, <italic>p</italic> &#x3C; .01, <italic>ω</italic> = .18].
Participants actively looked at 131 grids in the slow-motion condition
compared to 78 grids in adapted real-time motion. Results of average
centre dwell times yielded a strong effect [<italic>t</italic>(38) =
3.97, <italic>p</italic> &#x3C; .001, <italic>d</italic> = .83], showing
that participants looked longer at the centre in adapted real-time
motion (<italic>M</italic> = 59.75 ms, <italic>SD</italic> = 23.71)
compared to slow motion (<italic>M</italic> = 44.16 ms,
<italic>SD</italic> = 12.27). These results indicate that faster
playback speed caused the viewers to look longer and more frequently at
the centre of the screen, while slow motion led to more dispersed gaze
behaviour.</p>
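<p>The paired comparison of centre dwell times can be sketched as follows on hypothetical data. Cohen's d is computed here as the mean difference divided by the standard deviation of the differences, which is one common convention for paired designs and may differ from the one used above:</p>

```python
import numpy as np
from scipy.stats import ttest_rel

def paired_t_with_d(a, b):
    """Paired-samples t-test plus Cohen's d for paired data
    (mean difference over the SD of the differences)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    t, p = ttest_rel(a, b)
    diff = a - b
    d = diff.mean() / diff.std(ddof=1)
    return t, p, d

# Hypothetical centre-grid dwell times (ms per second) for 39 viewers.
rng = np.random.default_rng(0)
slow = rng.normal(44, 12, size=39)          # slow-motion condition
real = slow + rng.normal(15, 10, size=39)   # built-in condition effect
t, p, d = paired_t_with_d(real, slow)
```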
    </sec>
    </sec>

    <sec id="S4">
      <title>Discussion</title>  

<p>This study aimed at finding out whether music influences arousal and
eye movements in slow-motion film scenes compared to conditions without
music, and how playback speed affects visual perception. Furthermore, we
expected to find differences between slow-motion film excerpts, since
all three excerpts consisted of different dynamics and complexities as
well as speed factors.</p>

<p>While arousal was higher in conditions with film music, eye movements
were only affected by playback speed. These findings provide new
insights into audiovisual interactions in the perception of emotional
film scenes.</p>

<p>We hypothesised that music compared to no music would cause higher
arousal in the viewer and a reduction of eye movements. Results of
average pupil diameter support our hypothesis regarding higher arousal,
showing that pupil diameters were indeed larger in the music condition.
Our results are in line with previous studies showing that music affects
autonomic emotional responses in terms of arousal, and that pupillometry
is well-suited to investigate these processes (<xref ref-type="bibr" rid="b15 b16">15, 16</xref>). This
finding is further corroborated by peripheral physiological responses we
recently investigated in another paper (<xref ref-type="bibr" rid="b50">50</xref>). Bodily arousal as
measured by Galvanic skin conductance, heart and respiration rate all
increased when music was present as compared to visual stimuli without
music.</p>

<p>On the other hand, our results do not support the assumption that
music causes a reduction in eye movements, since no effects were found
regarding fixation parameters and saccades, dwell time profiles or
centre dwell times between conditions. Only blink frequency was affected
by music, showing that participants blinked more often with music than
without. Contrary to other research, there were also no effects on eye
movement parameters (<xref ref-type="bibr" rid="b44 b37">44, 37</xref>), and we did not find systematic
effects of music on viewers’ scene perception as suggested by Auer et
al. (<xref ref-type="bibr" rid="b46">46</xref>). Our results are more in line with Coutrot and Guyader
(<xref ref-type="bibr" rid="b48">48</xref>) and Smith (<xref ref-type="bibr" rid="b49">49</xref>), who found no influence of
non-diegetic sounds on eye movements, using speech, natural sounds, and
music. Even though gaze behaviour did not change, pupil diameter was
nevertheless impacted, as our results show. This is a novel finding and
should be further investigated in future studies since it raises a
number of questions: For example, can this effect be observed for other
non-diegetic sounds as well or is it specific to film music?
Furthermore, it should be investigated if this effect depends on the
degree of coherence between visual and auditory semantics (e.g. sad
visual scene combined with happy music and vice versa).</p>

<p>Playback speed (slow motion vs. adapted real-time motion) was
expected to influence scene perception, so that slow motion allows for
more attention to detail. Our results show that slow motion compared to
adapted real-time motion influenced all eye movement parameters. Slow
motion caused participants to fixate more often, and fixations were
generally shorter than in adapted real-time motion. Furthermore,
participants performed more saccades in the slow-motion condition. Gaze
dispersion was also affected by playback speed, indicating that
participants gazed in more areas while watching the excerpts in slow
motion, as measured by the number of active grids. Correspondingly,
average centre dwell times were affected, showing that participants
focused their gaze more towards the centre in the adapted real-time
condition. When watching scenes in slow motion, viewers may thus engage
in different cognitive processing (cf. <xref ref-type="bibr" rid="b12">12</xref>), which is reflected
in their visual attention to detail.</p>

<p>Average pupil diameter was affected by playback speed as well,
suggesting that participants were more aroused when watching the
excerpts in adapted real-time motion than slow motion. Somewhat contrary
to these results, blink frequency was higher in the slow-motion
condition. Based on previous findings, we expected a higher blink
frequency in the adapted real-time motion condition since previous
research linked reduced eye movements with an increase in blink rate
(<xref ref-type="bibr" rid="b38 b39 b40">38, 39, 40</xref>). A possible explanation for this finding could be that
in our case, blink frequency reflected cognitive load. Studies suggest
that blink rate may function as a measure of cognitive load, increasing
when cognitive load is high (<xref ref-type="bibr" rid="b56 b57">56, 57</xref>).
If this is indeed the case, then watching scenes in slow motion compared
to real-time motion should increase cognitive load. A possible reason
might be the longer exposure time to visual information, allowing the
viewer to perceive more details of the image, so that more visual
information is parsed and stored in working memory. Influences of music
or diegetic sounds can be ruled out since both conditions were presented
silently (visually-only). In a related study (<xref ref-type="bibr" rid="b50">50</xref>), we found
that slow motion, as compared to real-time motion, affected cognitive
dimensions of perceived duration, which was underestimated in slow
motion. Valence was also more positive in slow motion. These findings
indicate that spectators do indeed perceive differences between both
conditions that affect them in attention and emotion. Time estimates are
clearly influenced by cognitive load (<xref ref-type="bibr" rid="b58">58</xref>).</p>

<p>As expected, the three film excerpts influenced all eye parameters.
“A Clockwork Orange” differed particularly from the other two excerpts
for factors Modality and Tempo, which is most likely caused by the
dynamics of the scene. In all conditions, participants fixated more
frequently, and fixations lasted for a shorter amount of time.
Participants also performed more saccades compared to “Forrest Gump” and
“Silent Youth”. Familiarity with the individual excerpt (scene and
music) did not yield any systematic results. Only music familiarity with
“Silent Youth”, which was generally the most unfamiliar one to the
participants, correlated with fixation and saccadic frequencies. Further
research should investigate familiarity in relation to scene perception
and attention to detail, for instance by studying areas of interest in
gaze behaviour.</p>

<p>Our results partly support the conclusion by Coutrot et al.
(<xref ref-type="bibr" rid="b47">47</xref>), stating that effects of music on visual attention in
dynamic scenes may not be consistent over time. In this regard, no
general effects of music on eye movement parameters across time were
found in our study. It is possible that participants prioritised visual
over auditory information as Smith (<xref ref-type="bibr" rid="b49">49</xref>) suggests, or that the
dynamics of excerpts constrained individuals’ gaze behaviour to a large
extent, outweighing potential effects of non-diegetic sounds
(<xref ref-type="bibr" rid="b48 b19 b20 b21 b22">48, 19, 20, 21, 22</xref>). Results concerning participants’ blink behaviour
support the finding of Schäfer and Fachner (<xref ref-type="bibr" rid="b37">37</xref>) that
music causes viewers to blink more often. This would be in line with
their assumption that in audiovisual contexts, music leads to more
attentional shifts between exogenous and endogenous attention.</p>

<p>Our study used realistic excerpts taken from three commercial films.
Since we were interested in both the effects of slow motion and the
effects of the underlying music, the excerpts chosen varied considerably
in terms of content and form. Among these features, the emotional
valence of the scenes as well as the number of shot cuts, and further
filmmaking decisions such as brightness of the footage, were different
across excerpts. This may constitute a limitation of our study with
regard to the generalisability of the findings. On the other hand, all
three film excerpts consisted of slow-motion scenes, showed the
movements of more than one character, and lasted at least 22 seconds,
with a deceleration factor of at least 2. Viewers in our
experiment could thus follow the characters’ movements and perceive them
in slow motion in comparison with real-time motion.</p>

<p>A further source of variance across films stems from factors such as
luminance. Eye movements and pupillary responses may change due to
low-level visual features of the material, irrespective of the content
(<xref ref-type="bibr" rid="b22 b24 b25">22, 24, 25</xref>). Changes in playback speed, leading to different
exposure time of these features, may alter participants’ ocular
responses to a large extent, typically without them being aware of the
autonomic changes for instance in pupillary responses (<xref ref-type="bibr" rid="b1">1</xref>).
Since we did not find evidence for differences in eye movements and gaze
behaviour between audiovisual and visual-only conditions, we conclude
that the strong effect of music on pupil dilations was not influenced by
different exposure to luminance. Future studies testing the effects of
slow-motion scenes may present novel material that controls for these
factors. In addition, purpose-written music could be used that does not
depend on playback speed. This music could vary according to different
emotions such as happy or sad, in order to find out whether the semantic
relation between music and visual scene dynamics may influence pupillary
responses.</p>

<p>We decided to use dwell time profiles as a measure of gaze
distribution, since they include fixations and saccades of each
participant. A limitation of this method is that it relies on the number
of grids. Future research could use more fine-grained metrics which are
more data-driven such as Normalized Scanpath Salience (NSS)
(<xref ref-type="bibr" rid="b59">59</xref>), especially when looking into temporal aspects of gaze
distribution. Nevertheless, as a measure of gaze distribution according
to conditions, independently from individual scene dynamics, these
analyses still revealed robust effects in our study.</p>

<p>Finally, in order to estimate the effects of slowing down, film
scenes could systematically be decelerated in a number of versions for
each scene, and responses be measured in controlled conditions that take
into account the number of shot cuts, quantity of motion and further
low-level visual characteristics such as luminance. Another interesting
aspect that should be taken into account is cognitive load. Future
studies may vary the amount of visual and auditory information in a
systematic way to reassess the assumption of increased cognitive
activity when watching scenes in slow motion. Nevertheless, in our study
there was a strong effect of music on pupillary responses, and of
playback speed on eye movements and gaze dispersion. These results were
found across the different film scenes, in spite of the variety in
visual and dynamic features. This finding suggests that there are
underlying psychophysiological mechanisms in the perception of films
with highly expressive music that go beyond the characteristics of a
given example.</p>

<p>We conclude that music affects viewers' arousal levels when watching
slow-motion scenes taken from commercial films. When music was present,
pupil diameters were larger, which relates to the emotional dimension
of arousal. Music did not, however, influence gaze behaviour in a
systematic way, since neither main effects on fixations and saccades
nor significant differences in gaze dispersion were found. Playback
speed strongly influenced visual perception: at higher playback speed,
viewers focused their gaze more towards the centre, with fewer eye
movements, longer fixations, and larger pupil diameters. These findings
not only offer new insights into the perception of films, but may also
inform further research into the perception of audiovisual material in
relation to temporal expansion and contraction.</p>
    </sec>

    <sec id="S5" sec-type="COI-statement">
      <title>Ethics and Conflict of Interest</title> 

<p>The authors declare that the contents of the article are in agreement
with the ethics described in
<ext-link ext-link-type="uri" xlink:href="http://biblio.unibe.ch/portale/elibrary/BOP/jemr/ethics.html" xlink:show="new">http://biblio.unibe.ch/portale/elibrary/BOP/jemr/ethics.html</ext-link>
and that there is no conflict of interest regarding the publication of
this paper.</p>
    </sec>

    <sec id="S6">
      <title>Acknowledgements</title> 

<p>This research was supported by the European Research Council (grant
agreement: 725319, PI: Clemens Wöllner) for the five-year project “Slow
motion: Transformations of musical time in perception and performance”
(SloMo).</p>

<p>We wish to thank Henning Albrecht and Jesper Hohagen for their
contribution to stimuli preparation and data collection.</p>
    </sec>
  </body>
<back>
<ref-list>
<ref id="b12"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Arstila</surname>, <given-names>V.</given-names></name></person-group> (<year>2012</year>). <article-title>Time Slows Down during Accidents.</article-title> <source>Frontiers in Psychology</source>, <volume>3</volume>, <fpage>196</fpage>. <pub-id pub-id-type="doi" specific-use="author">10.3389/fpsyg.2012.00196</pub-id><pub-id pub-id-type="pmid">22754544</pub-id><issn>1664-1078</issn></mixed-citation></ref>
<ref id="b46"><mixed-citation publication-type="conference" specific-use="parsed"><person-group person-group-type="author"><name><surname>Auer</surname> <given-names>K</given-names></name>, <name><surname>Vitouch</surname> <given-names>O</given-names></name>, <name><surname>Koreimann</surname> <given-names>S</given-names></name>, <name><surname>Pesjak</surname> <given-names>G</given-names></name>, <name><surname>Leitner</surname> <given-names>G</given-names></name>, <name><surname>Hitz</surname> <given-names>M</given-names></name></person-group>. <article-title>When music drives vision: Influences of film music on viewers&#8217; eye movements.</article-title> In: <person-group person-group-type="editor"><name><surname>Cambouropoulos</surname> <given-names>E</given-names></name>, <name><surname>Tsougras</surname> <given-names>C</given-names></name>, <name><surname>Mavromatis</surname> <given-names>P</given-names></name>, <name><surname>Pastiadis</surname> <given-names>K</given-names></name><role>, editors</role></person-group>. <source>Proceeding of the 12th International Conference on Music Perception and Cognition and the 8th Triennial Conference of the European Society for the Cognitive Sciences of Music</source>; <year>2012</year>.</mixed-citation></ref>
<ref id="b34"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Behne</surname>, <given-names>K. E.</given-names></name>, &#x26; <name><surname>W&#246;llner</surname>, <given-names>C.</given-names></name></person-group> (<year>2011</year>). <article-title>Seeing or hearing the pianists?: A synopsis of an early audiovisual perception experiment and a replication.</article-title> <source>Musicae Scientiae</source>, <volume>15</volume>(<issue>3</issue>), <fpage>324</fpage>&#8211;<lpage>342</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.1177/1029864911410955</pub-id><issn>1029-8649</issn></mixed-citation></ref>
<ref id="b18"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Rizzolatti</surname>, <given-names>G.</given-names></name>, <name><surname>Riggio</surname>, <given-names>L.</given-names></name>, <name><surname>Dascola</surname>, <given-names>I.</given-names></name>, &#x26; <name><surname>Umilt&#225;</surname>, <given-names>C.</given-names></name></person-group> (<year>1987</year>). <article-title>Reorienting attention across the horizontal and vertical meridians: Evidence in favor of a premotor theory of attention.</article-title> <source>Neuropsychologia</source>, <volume>25</volume>(<issue>1</issue>, <supplement>1A</supplement>), <fpage>31</fpage>&#8211;<lpage>40</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.1016/0028-3932(87)90041-8</pub-id><pub-id pub-id-type="pmid">3574648</pub-id><issn>0028-3932</issn></mixed-citation></ref>
<ref id="b58"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Block</surname>, <given-names>R. A.</given-names></name>, <name><surname>Hancock</surname>, <given-names>P. A.</given-names></name>, &#x26; <name><surname>Zakay</surname>, <given-names>D.</given-names></name></person-group> (<year>2010</year>). <article-title>How cognitive load affects duration judgments: A meta-analytic review.</article-title> <source>Acta Psychologica</source>, <volume>134</volume>(<issue>3</issue>), <fpage>330</fpage>–<lpage>343</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.1016/j.actpsy.2010.03.006</pub-id><pub-id pub-id-type="pmid">20403583</pub-id><issn>0001-6918</issn></mixed-citation></ref>
<ref id="b22"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Boccignone</surname>, <given-names>G.</given-names></name>, &#x26; <name><surname>Ferraro</surname>, <given-names>M.</given-names></name></person-group> (<year>2004</year>). <article-title>Modelling gaze shift as a constrained random walk.</article-title> <source>Physica A</source>, <volume>331</volume>(<issue>1-2</issue>), <fpage>207</fpage>&#8211;<lpage>218</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.1016/j.physa.2003.09.011</pub-id><issn>0378-4371</issn></mixed-citation></ref>
<ref id="b9"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Boltz</surname>, <given-names>M. G.</given-names></name></person-group> (<year>2001</year>). <article-title>Musical Soundtracks as a Schematic Influence on the Cognitive Processing of Filmed Events.</article-title> <source>Music Perception</source>, <volume>18</volume>(<issue>4</issue>), <fpage>427</fpage>&#8211;<lpage>454</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.1525/mp.2001.18.4.427</pub-id><issn>0730-7829</issn></mixed-citation></ref>
<ref id="b1"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Bradley</surname>, <given-names>M. M.</given-names></name>, <name><surname>Miccoli</surname>, <given-names>L.</given-names></name>, <name><surname>Escrig</surname>, <given-names>M. A.</given-names></name>, &#x26; <name><surname>Lang</surname>, <given-names>P. J.</given-names></name></person-group> (<year>2008</year>). <article-title>The pupil as a measure of emotional arousal and autonomic activation.</article-title> <source>Psychophysiology</source>, <volume>45</volume>(<issue>4</issue>), <fpage>602</fpage>&#8211;<lpage>607</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.1111/j.1469-8986.2008.00654.x</pub-id><pub-id pub-id-type="pmid">18282202</pub-id><issn>0048-5772</issn></mixed-citation></ref>
<ref id="b13"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Bridgeman</surname>, <given-names>B.</given-names></name>, &#x26; <name><surname>Tseng</surname>, <given-names>P.</given-names></name></person-group> (<year>2011</year>). <article-title>Embodied cognition and the perception-action link.</article-title> <source>Physics of Life Reviews</source>, <volume>8</volume>(<issue>1</issue>), <fpage>73</fpage>&#8211;<lpage>85</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.1016/j.plrev.2011.01.002</pub-id><pub-id pub-id-type="pmid">21257354</pub-id><issn>1571-0645</issn></mixed-citation></ref>
<ref id="b10"><mixed-citation publication-type="thesis" specific-use="unparsed"><person-group person-group-type="author"><name><surname>Brockmann</surname> <given-names>T.</given-names></name></person-group> Die Zeitlupe: Anatomie eines filmischen Stilmittels (Dissertation); <year>2013</year>. (Z&#252;rcher Filmstudien; vol. 33).</mixed-citation></ref>
<ref id="b26"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Carmi</surname>, <given-names>R.</given-names></name>, &#x26; <name><surname>Itti</surname>, <given-names>L.</given-names></name></person-group> (<year>2006</year>). <article-title>Visual causes versus correlates of attentional selection in dynamic scenes.</article-title> <source>Vision Research</source>, <volume>46</volume>(<issue>26</issue>), <fpage>4333</fpage>&#8211;<lpage>4345</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.1016/j.visres.2006.08.019</pub-id><pub-id pub-id-type="pmid">17052740</pub-id><issn>0042-6989</issn></mixed-citation></ref>
<ref id="b47"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Coutrot</surname>, <given-names>A.</given-names></name>, <name><surname>Guyader</surname>, <given-names>N.</given-names></name>, <name><surname>Ionescu</surname>, <given-names>G.</given-names></name>, &#x26; <name><surname>Caplier</surname>, <given-names>A.</given-names></name></person-group> (<year>2012</year>). <article-title>Influence of soundtrack on eye movements during video exploration.</article-title> <source>Journal of Eye Movement Research</source>, <volume>5</volume>(<issue>4</issue>), <fpage>2</fpage>. <pub-id pub-id-type="doi" specific-use="author">10.16910/jemr.5.4.2</pub-id><issn>1995-8692</issn></mixed-citation></ref>
<ref id="b48"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Coutrot</surname>, <given-names>A.</given-names></name>, &#x26; <name><surname>Guyader</surname>, <given-names>N.</given-names></name></person-group> (<year>2014</year>). <article-title>How saliency, faces, and sound influence gaze in dynamic social scenes.</article-title> <source>Journal of Vision (Charlottesville, Va.)</source>, <volume>14</volume>(<issue>8</issue>), <fpage>5</fpage>. <pub-id pub-id-type="doi" specific-use="author">10.1167/14.8.5</pub-id><pub-id pub-id-type="pmid">24993019</pub-id><issn>1534-7362</issn></mixed-citation></ref>
<ref id="b17"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Deubel</surname>, <given-names>H.</given-names></name>, &#x26; <name><surname>Schneider</surname>, <given-names>W. X.</given-names></name></person-group> (<year>1996</year>). <article-title>Saccade target selection and object recognition: Evidence for a common attentional mechanism.</article-title> <source>Vision Research</source>, <volume>36</volume>(<issue>12</issue>), <fpage>1827</fpage>&#8211;<lpage>1837</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.1016/0042-6989(95)00294-4</pub-id><pub-id pub-id-type="pmid">8759451</pub-id><issn>0042-6989</issn></mixed-citation></ref>
<ref id="b21"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Dorr</surname>, <given-names>M.</given-names></name>, <name><surname>Martinetz</surname>, <given-names>T.</given-names></name>, <name><surname>Gegenfurtner</surname>, <given-names>K. R.</given-names></name>, &#x26; <name><surname>Barth</surname>, <given-names>E.</given-names></name></person-group> (<year>2010</year>). <article-title>Variability of eye movements when viewing dynamic natural scenes.</article-title> <source>Journal of Vision (Charlottesville, Va.)</source>, <volume>10</volume>(<issue>10</issue>), <fpage>28</fpage>. <pub-id pub-id-type="doi" specific-use="author">10.1167/10.10.28</pub-id><pub-id pub-id-type="pmid">20884493</pub-id><issn>1534-7362</issn></mixed-citation></ref>
<ref id="b54"><mixed-citation publication-type="book" specific-use="restruct"><person-group person-group-type="author"><name><surname>Duchowski</surname>, <given-names>A.</given-names></name></person-group> (<year>2007</year>). <source>Eye Tracking Methodology: Theory and Practice</source>. <publisher-loc>London</publisher-loc>: <publisher-name>Springer-Verlag London Limited</publisher-name>.</mixed-citation></ref>
<ref id="b6"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Ellis</surname>, <given-names>R. J.</given-names></name>, &#x26; <name><surname>Simons</surname>, <given-names>R. F.</given-names></name></person-group> (<year>2005</year>). <article-title>The Impact of Music on Subjective and Physiological Indices of Emotion While Viewing Films.</article-title> <source>Psychomusicology: Music, Mind, and Brain</source>, <volume>19</volume>(<issue>1</issue>), <fpage>15</fpage>&#8211;<lpage>40</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.1037/h0094042</pub-id><issn>0275-3987</issn></mixed-citation></ref>
<ref id="b41"><mixed-citation publication-type="book-chapter" specific-use="unparsed"><person-group person-group-type="author"><name><surname>Fachner</surname> <given-names>J.</given-names></name></person-group> Time is the key: Music and Altered States of Consciousness. In: Cardena E, editor. Altering consciousness. - Vol. 1-2, Multidisciplinary perspectives. Santa Barbara Calif.: Praeger; <year>2011</year>. p. 355&#8211;76.</mixed-citation></ref>
<ref id="b15"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Gingras</surname>, <given-names>B.</given-names></name>, <name><surname>Marin</surname>, <given-names>M. M.</given-names></name>, <name><surname>Puig-Waldm&#252;ller</surname>, <given-names>E.</given-names></name>, &#x26; <name><surname>Fitch</surname>, <given-names>W. T.</given-names></name></person-group> (<year>2015</year>). <article-title>The Eye is Listening: Music-Induced Arousal and Individual Differences Predict Pupillary Responses.</article-title> <source>Frontiers in Human Neuroscience</source>, <volume>9</volume>, <fpage>619</fpage>. <pub-id pub-id-type="doi" specific-use="author">10.3389/fnhum.2015.00619</pub-id><pub-id pub-id-type="pmid">26617511</pub-id><issn>1662-5161</issn></mixed-citation></ref>
<ref id="b19"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Goldstein</surname>, <given-names>R. B.</given-names></name>, <name><surname>Woods</surname>, <given-names>R. L.</given-names></name>, &#x26; <name><surname>Peli</surname>, <given-names>E.</given-names></name></person-group> (<year>2007</year>). <article-title>Where people look when watching movies: Do all viewers look at the same place?</article-title> <source>Computers in Biology and Medicine</source>, <volume>37</volume>(<issue>7</issue>), <fpage>957</fpage>&#8211;<lpage>964</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.1016/j.compbiomed.2006.08.018</pub-id><pub-id pub-id-type="pmid">17010963</pub-id><issn>0010-4825</issn></mixed-citation></ref>
<ref id="b4"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Hargreaves</surname>, <given-names>D. J.</given-names></name>, &#x26; <name><surname>North</surname>, <given-names>A. C.</given-names></name></person-group> (<year>1999</year>). <article-title>The Functions of Music in Everyday Life: Redefining the Social in Music Psychology.</article-title> <source>Psychology of Music</source>, <volume>27</volume>(<issue>1</issue>), <fpage>71</fpage>&#8211;<lpage>83</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.1177/0305735699271007</pub-id><issn>0305-7356</issn></mixed-citation></ref>
<ref id="b20"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Hasson</surname>, <given-names>U.</given-names></name>, <name><surname>Landesman</surname>, <given-names>O.</given-names></name>, <name><surname>Knappmeyer</surname>, <given-names>B.</given-names></name>, <name><surname>Vallines</surname>, <given-names>I.</given-names></name>, <name><surname>Rubin</surname>, <given-names>N.</given-names></name>, &#x26; <name><surname>Heeger</surname>, <given-names>D. J.</given-names></name></person-group> (<year>2008</year>). <article-title>Neurocinematics: The Neuroscience of Film.</article-title> <source>Projections.</source>, <volume>2</volume>(<issue>1</issue>), <fpage>1</fpage>&#8211;<lpage>26</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.3167/proj.2008.020102</pub-id></mixed-citation></ref>
<ref id="b42"><mixed-citation publication-type="book" specific-use="restruct"><person-group person-group-type="author"><name><surname>Herbert</surname>, <given-names>R.</given-names></name></person-group> (<year>2011</year>). <source>Everyday music listening: Absorption, dissociation and trancing</source>. <publisher-loc>Farnham, Surrey, Burlington, VT</publisher-loc>: <publisher-name>Ashgate</publisher-name>.</mixed-citation></ref>
<ref id="b43"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Herbert</surname>, <given-names>R.</given-names></name></person-group> (<year>2013</year>). <article-title>An empirical study of normative dissociation in musical and non-musical everyday life experiences.</article-title> <source>Psychology of Music</source>, <volume>41</volume>(<issue>3</issue>), <fpage>372</fpage>&#8211;<lpage>394</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.1177/0305735611430080</pub-id><issn>0305-7356</issn></mixed-citation></ref>
<ref id="b53"><mixed-citation publication-type="book" specific-use="unparsed"><person-group person-group-type="author"><name><surname>Hirsch</surname> <given-names>H.</given-names></name><name><surname>Kemmesies</surname> <given-names>D.</given-names></name><name><surname>von Grünhagen</surname> <given-names>A.</given-names></name></person-group> Silent Youth: Milieu Film Production; <year>2012</year>.</mixed-citation></ref>
<ref id="b7"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Hoeckner</surname>, <given-names>B.</given-names></name>, <name><surname>Wyatt</surname>, <given-names>E. W.</given-names></name>, <name><surname>Decety</surname>, <given-names>J.</given-names></name>, &#x26; <name><surname>Nusbaum</surname>, <given-names>H.</given-names></name></person-group> (<year>2011</year>). <article-title>Film music influences how viewers relate to movie characters.</article-title> <source>Psychology of Aesthetics, Creativity, and the Arts</source>, <volume>5</volume>(<issue>2</issue>), <fpage>146</fpage>&#8211;<lpage>153</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.1037/a0021544</pub-id><issn>1931-3896</issn></mixed-citation></ref>
<ref id="b55"><mixed-citation publication-type="book" specific-use="unparsed"><person-group person-group-type="author"><name><surname>Holmqvist</surname> <given-names>K</given-names></name>, <name><surname>Nystr&#246;m</surname> <given-names>M</given-names></name>, <name><surname>Andersson</surname> <given-names>R</given-names></name>, <name><surname>Dewhurst</surname> <given-names>R</given-names></name>, <name><surname>Jarodzka</surname> <given-names>H</given-names></name>, &#x26; <name><surname>van de Weijer</surname> <given-names>J</given-names></name></person-group> (<year>2011</year>). <source>Eye tracking: A comprehensive guide to methods and measures</source>. <publisher-loc>Oxford</publisher-loc>: <publisher-name>Oxford University Press</publisher-name>. p. 537.</mixed-citation></ref>
<ref id="b30"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Itti</surname>, <given-names>L.</given-names></name></person-group> (<year>2005</year>). <article-title>Quantifying the contribution of low-level saliency to human eye movements in dynamic scenes.</article-title> <source>Visual Cognition</source>, <volume>12</volume>(<issue>6</issue>), <fpage>1093</fpage>&#8211;<lpage>1123</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.1080/13506280444000661</pub-id><issn>1350-6285</issn></mixed-citation></ref>
<ref id="b2"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Juslin</surname>, <given-names>P. N.</given-names></name>, &#x26; <name><surname>Laukka</surname>, <given-names>P.</given-names></name></person-group> (<year>2004</year>). <article-title>Expression, Perception, and Induction of Musical Emotions: A Review and a Questionnaire Study of Everyday Listening.</article-title> <source>Journal of New Music Research</source>, <volume>33</volume>(<issue>3</issue>), <fpage>217</fpage>&#8211;<lpage>238</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.1080/0929821042000317813</pub-id><issn>0929-8215</issn></mixed-citation></ref>
<ref id="b3"><mixed-citation publication-type="book-chapter" specific-use="unparsed"><person-group person-group-type="editor"><name><surname>Juslin</surname> <given-names>PN</given-names></name>, <name><surname>Sloboda</surname> <given-names>JA</given-names></name><role>, editors</role></person-group>. Handbook of music and emotion: Theory, research, applications. 1st ed. Oxford: Oxford University Press; <year>2011</year>. 975 p. (Series in affective science).</mixed-citation></ref>
<ref id="b14"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>K&#246;nig</surname>, <given-names>P.</given-names></name>, <name><surname>Wilming</surname>, <given-names>N.</given-names></name>, <name><surname>Kietzmann</surname>, <given-names>T. C.</given-names></name>, <name><surname>Ossand&#243;n</surname>, <given-names>J. P.</given-names></name>, <name><surname>Onat</surname>, <given-names>S.</given-names></name>, <name><surname>Ehinger</surname>, <given-names>B. V.</given-names></name>, <etal>. . .</etal> <name><surname>Kaspar</surname>, <given-names>K.</given-names></name></person-group> (<year>2016</year>). <article-title>Eye movements as a window to cognitive processes.</article-title> <source>Journal of Eye Movement Research</source>, <volume>9</volume>(<issue>5</issue>), <fpage>1</fpage>&#8211;<lpage>16</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.16910/jemr.9.5.3</pub-id><issn>1995-8692</issn></mixed-citation></ref>
<ref id="b51"><mixed-citation publication-type="unknown" specific-use="unparsed"><person-group person-group-type="author"><name><surname>Kubrick</surname> <given-names>S.</given-names></name></person-group> A Clockwork Orange: Hawk Films Limited; <year>1971</year>.</mixed-citation></ref>
<ref id="b16"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Laeng</surname>, <given-names>B.</given-names></name>, <name><surname>Eidet</surname>, <given-names>L. M.</given-names></name>, <name><surname>Sulutvedt</surname>, <given-names>U.</given-names></name>, &#x26; <name><surname>Panksepp</surname>, <given-names>J.</given-names></name></person-group> (<year>2016</year>). <article-title>Music chills: The eye pupil as a mirror to music&#8217;s soul.</article-title> <source>Consciousness and Cognition</source>, <volume>44</volume>, <fpage>161</fpage>&#8211;<lpage>178</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.1016/j.concog.2016.07.009</pub-id><pub-id pub-id-type="pmid">27500655</pub-id><issn>1053-8100</issn></mixed-citation></ref>
<ref id="b31"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Le Meur</surname>, <given-names>O.</given-names></name>, <name><surname>Le Callet</surname>, <given-names>P.</given-names></name>, &#x26; <name><surname>Barba</surname>, <given-names>D.</given-names></name></person-group> (<year>2007</year>). <article-title>Predicting visual fixations on video based on low-level visual features.</article-title> <source>Vision Research</source>, <volume>47</volume>(<issue>19</issue>), <fpage>2483</fpage>&#8211;<lpage>2498</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.1016/j.visres.2007.06.015</pub-id><pub-id pub-id-type="pmid">17688904</pub-id><issn>0042-6989</issn></mixed-citation></ref>
<ref id="b5"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Lipscomb</surname>, <given-names>S. D.</given-names></name>, &#x26; <name><surname>Kendall</surname>, <given-names>R. A.</given-names></name></person-group> (<year>1994</year>). <article-title>Perceptual judgement of the relationship between musical and visual components in film.</article-title> <source>Psychomusicology: Music, Mind, and Brain</source>, <volume>13</volume>(<issue>1-2</issue>), <fpage>60</fpage>&#8211;<lpage>98</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.1037/h0094101</pub-id><issn>0275-3987</issn></mixed-citation></ref>
<ref id="b8"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Marshall</surname>, <given-names>S. K.</given-names></name>, &#x26; <name><surname>Cohen</surname>, <given-names>A. J.</given-names></name></person-group> (<year>1988</year>). <article-title>Effects of Musical Soundtracks on Attitudes toward Animated Geometric Figures.</article-title> <source>Music Perception</source>, <volume>6</volume>(<issue>1</issue>), <fpage>95</fpage>&#8211;<lpage>112</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.2307/40285417</pub-id><issn>0730-7829</issn></mixed-citation></ref>
<ref id="b32"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>McGurk</surname>, <given-names>H.</given-names></name>, &#x26; <name><surname>MacDonald</surname>, <given-names>J.</given-names></name></person-group> (<year>1976</year>). <article-title>Hearing lips and seeing voices.</article-title> <source>Nature</source>, <volume>264</volume>(<issue>5588</issue>), <fpage>746</fpage>&#8211;<lpage>748</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.1038/264746a0</pub-id><pub-id pub-id-type="pmid">1012311</pub-id><issn>0028-0836</issn></mixed-citation></ref>
<ref id="b44"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Mera</surname>, <given-names>M.</given-names></name>, &#x26; <name><surname>Stumpf</surname>, <given-names>S.</given-names></name></person-group> (<year>2014</year>). <article-title>Eye-Tracking Film Music.</article-title> <source>Music and the Moving Image.</source>, <volume>7</volume>(<issue>3</issue>), <fpage>3</fpage>&#8211;<lpage>23</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.5406/musimoviimag.7.3.0003</pub-id></mixed-citation></ref>
<ref id="b24"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Mital</surname>, <given-names>P. K.</given-names></name>, <name><surname>Smith</surname>, <given-names>T. J.</given-names></name>, <name><surname>Hill</surname>, <given-names>R. L.</given-names></name>, &#x26; <name><surname>Henderson</surname>, <given-names>J. M.</given-names></name></person-group> (<year>2011</year>). <article-title>Clustering of Gaze During Dynamic Scene Viewing is Predicted by Motion.</article-title> <source>Cognitive Computation</source>, <volume>3</volume>(<issue>1</issue>), <fpage>5</fpage>&#8211;<lpage>24</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.1007/s12559-010-9074-z</pub-id><issn>1866-9956</issn></mixed-citation></ref>
<ref id="b59"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Peters</surname>, <given-names>R. J.</given-names></name>, <name><surname>Iyer</surname>, <given-names>A.</given-names></name>, <name><surname>Itti</surname>, <given-names>L.</given-names></name>, &#x26; <name><surname>Koch</surname>, <given-names>C.</given-names></name></person-group> (<year>2005</year>). <article-title>Components of bottom-up gaze allocation in natural images.</article-title> <source>Vision Research</source>, <volume>45</volume>(<issue>18</issue>), <fpage>2397</fpage>–<lpage>2416</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.1016/j.visres.2005.03.019</pub-id><pub-id pub-id-type="pmid">15935435</pub-id><issn>0042-6989</issn></mixed-citation></ref>
<ref id="b11"><mixed-citation publication-type="book-chapter" specific-use="restruct"><person-group person-group-type="author"><name><surname>Rogers</surname>, <given-names>S.</given-names></name></person-group> (<year>2013</year>). <chapter-title>Truth, Lies, and Meaning in Slow Motion Images</chapter-title>. In <person-group person-group-type="editor"><name><given-names>A. P.</given-names> <surname>Shimamura</surname></name> (<role>Ed.</role>),</person-group> <source>Psychocinematics, Exploring cognition at the movies</source> (pp. <fpage>149</fpage>&#8211;<lpage>164</lpage>). <publisher-loc>Oxford</publisher-loc>: <publisher-name>Oxford Univ. Press</publisher-name>. <pub-id pub-id-type="doi">10.1093/acprof:oso/9780199862139.003.0008</pub-id></mixed-citation></ref>
<ref id="b37"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Sch&#228;fer</surname>, <given-names>T.</given-names></name>, &#x26; <name><surname>Fachner</surname>, <given-names>J.</given-names></name></person-group> (<year>2015</year>). <article-title>Listening to music reduces eye movements.</article-title> <source>Attention, Perception &#x26; Psychophysics</source>, <volume>77</volume>(<issue>2</issue>), <fpage>551</fpage>&#8211;<lpage>559</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.3758/s13414-014-0777-1</pub-id><pub-id pub-id-type="pmid">25280523</pub-id><issn>1943-3921</issn></mixed-citation></ref>
<ref id="b38"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Schleicher</surname>, <given-names>R.</given-names></name>, <name><surname>Galley</surname>, <given-names>N.</given-names></name>, <name><surname>Briest</surname>, <given-names>S.</given-names></name>, &#x26; <name><surname>Galley</surname>, <given-names>L.</given-names></name></person-group> (<year>2008</year>). <article-title>Blinks and saccades as indicators of fatigue in sleepiness warnings: Looking tired?</article-title> <source>Ergonomics</source>, <volume>51</volume>(<issue>7</issue>), <fpage>982</fpage>&#8211;<lpage>1010</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.1080/00140130701817062</pub-id><pub-id pub-id-type="pmid">18568959</pub-id><issn>0014-0139</issn></mixed-citation></ref>
<ref id="b57"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Siegle</surname>, <given-names>G. J.</given-names></name>, <name><surname>Ichikawa</surname>, <given-names>N.</given-names></name>, &#x26; <name><surname>Steinhauer</surname>, <given-names>S.</given-names></name></person-group> (<year>2008</year>). <article-title>Blink before and after you think: Blinks occur prior to and following cognitive load indexed by pupillary responses.</article-title> <source>Psychophysiology</source>, <volume>45</volume>(<issue>5</issue>), <fpage>679</fpage>–<lpage>687</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.1111/j.1469-8986.2008.00681.x</pub-id><pub-id pub-id-type="pmid">18665867</pub-id><issn>0048-5772</issn></mixed-citation></ref>
<ref id="b39"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Smith</surname>, <given-names>C. N.</given-names></name>, <name><surname>Hopkins</surname>, <given-names>R. O.</given-names></name>, &#x26; <name><surname>Squire</surname>, <given-names>L. R.</given-names></name></person-group> (<year>2006</year>). <article-title>Experience-dependent eye movements, awareness, and hippocampus-dependent memory.</article-title> <source>The Journal of Neuroscience : The Official Journal of the Society for Neuroscience</source>, <volume>26</volume>(<issue>44</issue>), <fpage>11304</fpage>&#8211;<lpage>11312</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.1523/JNEUROSCI.3071-06.2006</pub-id><pub-id pub-id-type="pmid">17079658</pub-id><issn>0270-6474</issn></mixed-citation></ref>
<ref id="b23"><mixed-citation publication-type="book-chapter" specific-use="restruct"><person-group person-group-type="author"><name><surname>Smith</surname>, <given-names>T. J.</given-names></name></person-group> (<year>2013</year>). <chapter-title>Watching You Watch Movies: Using Eye Tracking to Inform Cognitive Film Theory</chapter-title>. In <person-group person-group-type="editor"><name><given-names>A. P.</given-names> <surname>Shimamura</surname></name> (<role>Ed.</role>),</person-group> <source>Psychocinematics, Exploring cognition at the movies</source> (pp. <fpage>165</fpage>&#8211;<lpage>192</lpage>). <publisher-loc>Oxford</publisher-loc>: <publisher-name>Oxford Univ. Press</publisher-name>. <pub-id pub-id-type="doi">10.1093/acprof:oso/9780199862139.003.0009</pub-id></mixed-citation></ref>
<ref id="b36"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Smith</surname>, <given-names>T. J.</given-names></name>, &#x26; <name><surname>Martin-Portugues Santacreu</surname>, <given-names>J. Y.</given-names></name></person-group> (<year>2016</year>). <article-title>Match-Action: The Role of Motion and Audio in Creating Global Change Blindness in Film.</article-title> <source>Media Psychology</source>, <volume>20</volume>(<issue>2</issue>), <fpage>317</fpage>&#8211;<lpage>348</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.1080/15213269.2016.1160789</pub-id><issn>1521-3269</issn></mixed-citation></ref>
<ref id="b49"><mixed-citation publication-type="book-chapter" specific-use="unparsed"><person-group person-group-type="author"><name><surname>Smith</surname> <given-names>TJ</given-names></name></person-group>. <chapter-title>Audiovisual correspondences in Sergei Eisenstein&#8217;s Alexander Nevsky: A case study in viewer attention.</chapter-title> In: Nannicelli T, Taberham P, editors. Cognitive media theory. New York, NY, London: Routledge; <year>2014</year>. (AFI film readers).</mixed-citation></ref>
<ref id="b40"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Stern</surname>, <given-names>J. A.</given-names></name>, <name><surname>Walrath</surname>, <given-names>L. C.</given-names></name>, &#x26; <name><surname>Goldstein</surname>, <given-names>R.</given-names></name></person-group> (<year>1984</year>). <article-title>The endogenous eyeblink.</article-title> <source>Psychophysiology</source>, <volume>21</volume>(<issue>1</issue>), <fpage>22</fpage>&#8211;<lpage>33</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.1111/j.1469-8986.1984.tb02312.x</pub-id><pub-id pub-id-type="pmid">6701241</pub-id><issn>0048-5772</issn></mixed-citation></ref>
<ref id="b29"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Tatler</surname>, <given-names>B. W.</given-names></name>, <name><surname>Baddeley</surname>, <given-names>R. J.</given-names></name>, &#x26; <name><surname>Gilchrist</surname>, <given-names>I. D.</given-names></name></person-group> (<year>2005</year>). <article-title>Visual correlates of fixation selection: Effects of scale and time.</article-title> <source>Vision Research</source>, <volume>45</volume>(<issue>5</issue>), <fpage>643</fpage>&#8211;<lpage>659</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.1016/j.visres.2004.09.017</pub-id><pub-id pub-id-type="pmid">15621181</pub-id><issn>0042-6989</issn></mixed-citation></ref>
<ref id="b27"><mixed-citation publication-type="unknown" specific-use="linked"><person-group person-group-type="author"><name><surname>Tatler</surname> <given-names>BW</given-names></name></person-group>. The central fixation bias in scene viewing: Selecting an optimal viewing position independently of motor biases and image feature distributions. J Vis. <year>2007</year>;7(14):4.1-17. doi:<pub-id pub-id-type="doi" specific-use="author">10.1167/7.14.4</pub-id></mixed-citation></ref>
<ref id="b52"><mixed-citation publication-type="book" specific-use="unparsed"><person-group person-group-type="author"><name><surname>Tisch</surname> <given-names>S.</given-names></name><name><surname>Finerman</surname> <given-names>W.</given-names></name><name><surname>Zemeckis</surname> <given-names>R.</given-names></name></person-group> Forrest Gump: Paramount Pictures;<year>1994</year>. </mixed-citation></ref>
<ref id="b28"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Tseng</surname>, <given-names>P. H.</given-names></name>, <name><surname>Carmi</surname>, <given-names>R.</given-names></name>, <name><surname>Cameron</surname>, <given-names>I. G. M.</given-names></name>, <name><surname>Munoz</surname>, <given-names>D. P.</given-names></name>, &#x26; <name><surname>Itti</surname>, <given-names>L.</given-names></name></person-group> (<year>2009</year>). <article-title>Quantifying center bias of observers in free viewing of dynamic natural scenes.</article-title> <source>Journal of Vision (Charlottesville, Va.)</source>, <volume>9</volume>(<issue>7</issue>), <fpage>4</fpage>. <pub-id pub-id-type="doi" specific-use="author">10.1167/9.7.4</pub-id><pub-id pub-id-type="pmid">19761319</pub-id><issn>1534-7362</issn></mixed-citation></ref>
<ref id="b56"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Valtchanov</surname>, <given-names>D.</given-names></name>, &#x26; <name><surname>Ellard</surname>, <given-names>C. G.</given-names></name></person-group> (<year>2015</year>). <article-title>Cognitive and affective responses to natural scenes: Effects of low level visual properties on preference, cognitive load and eye-movements.</article-title> <source>Journal of Environmental Psychology</source>, <volume>43</volume>, <fpage>43184</fpage>–<lpage>43195</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.1016/j.jenvp.2015.07.001</pub-id><issn>0272-4944</issn></mixed-citation></ref>
<ref id="b33"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Van der Burg</surname>, <given-names>E.</given-names></name>, <name><surname>Olivers</surname>, <given-names>C. N. L.</given-names></name>, <name><surname>Bronkhorst</surname>, <given-names>A. W.</given-names></name>, &#x26; <name><surname>Theeuwes</surname>, <given-names>J.</given-names></name></person-group> (<year>2008</year>). <article-title>Pip and pop: Nonspatial auditory signals improve spatial visual search.</article-title> <source>Journal of Experimental Psychology. Human Perception and Performance</source>, <volume>34</volume>(<issue>5</issue>), <fpage>1053</fpage>&#8211;<lpage>1065</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.1037/0096-1523.34.5.1053</pub-id><pub-id pub-id-type="pmid">18823194</pub-id><issn>0096-1523</issn></mixed-citation></ref>
<ref id="b35"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Vuoskoski</surname>, <given-names>J. K.</given-names></name>, <name><surname>Thompson</surname>, <given-names>M. R.</given-names></name>, <name><surname>Clarke</surname>, <given-names>E. F.</given-names></name>, &#x26; <name><surname>Spence</surname>, <given-names>C.</given-names></name></person-group> (<year>2014</year>). <article-title>Crossmodal interactions in the perception of expressivity in musical performance.</article-title> <source>Attention, Perception &#x26; Psychophysics</source>, <volume>76</volume>(<issue>2</issue>), <fpage>591</fpage>&#8211;<lpage>604</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.3758/s13414-013-0582-2</pub-id><pub-id pub-id-type="pmid">24233641</pub-id><issn>1943-3921</issn></mixed-citation></ref>
<ref id="b45"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Wallengren</surname>, <given-names>A. K.</given-names></name>, &#x26; <name><surname>Strukelj</surname>, <given-names>A.</given-names></name></person-group> (<year>2015</year>). <article-title>Film Music and Visual Attention: A Pilot Experiment using Eye-Tracking.</article-title> <source>Music and the Moving Image.</source>, <volume>8</volume>(<issue>2</issue>), <fpage>69</fpage>&#8211;<lpage>80</lpage>. <pub-id pub-id-type="doi" specific-use="author">10.5406/musimoviimag.8.2.0069</pub-id></mixed-citation></ref>
<ref id="b25"><mixed-citation publication-type="journal" specific-use="restruct"><person-group person-group-type="author"><name><surname>Wang</surname>, <given-names>H. X.</given-names></name>, <name><surname>Freeman</surname>, <given-names>J.</given-names></name>, <name><surname>Merriam</surname>, <given-names>E. P.</given-names></name>, <name><surname>Hasson</surname>, <given-names>U.</given-names></name>, &#x26; <name><surname>Heeger</surname>, <given-names>D. J.</given-names></name></person-group> (<year>2012</year>). <article-title>Temporal eye movement strategies during naturalistic viewing.</article-title> <source>Journal of Vision (Charlottesville, Va.)</source>, <volume>12</volume>(<issue>1</issue>), <fpage>16</fpage>. <pub-id pub-id-type="doi" specific-use="author">10.1167/12.1.16</pub-id><pub-id pub-id-type="pmid">22262911</pub-id><issn>1534-7362</issn></mixed-citation></ref>
<ref id="b50"><mixed-citation publication-type="journal" specific-use="unparsed"><person-group person-group-type="author"><name><surname>W&#246;llner</surname> <given-names>C</given-names></name>, <name><surname>Hammerschmidt</surname> <given-names>D</given-names></name>, <name><surname>Albrecht</surname> <given-names>H</given-names></name></person-group>. <article-title>Slow motion in films and video clips: Music influences perceived duration and emotion, autonomic physiological activation and pupillary responses.</article-title> <source>PLoS ONE.</source>  (in press). </mixed-citation></ref>
</ref-list>
</back>
</article>
