Audio-visual integration during overt visual attention
Abstract
How do different sources of information arising from different modalities interact to control where we look? To answer this question under conditions approximating real-world situations, we presented natural images and spatially localized sounds in (V)isual, Audio-visual (AV) and (A)uditory conditions and measured subjects' eye movements. Our results demonstrate that eye movements in the AV condition are spatially biased towards the part of the image corresponding to the sound source. Interestingly, this spatial bias depends on the probability that a given image region is fixated (its saliency) in the V condition. This indicates that fixation behaviour in the AV condition is the result of an integration process. Regression analysis shows that this integration is best accounted for by a linear combination of unimodal saliencies.
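To make the final claim concrete, the following is a minimal, hypothetical sketch of how such a regression could be set up; it is not the authors' analysis code. The variable names, the synthetic saliency values, and the use of ordinary least squares are all illustrative assumptions. The model fitted is the linear combination named in the abstract, S_AV ≈ w_V·S_V + w_A·S_A.

    import numpy as np

    # Illustrative sketch (assumed setup, not the published pipeline):
    # model audio-visual saliency as a linear combination of unimodal
    # saliencies and recover the weights by least-squares regression.

    rng = np.random.default_rng(0)

    # Hypothetical saliency values over N image regions, e.g. fixation
    # probabilities per region in the V and A conditions.
    n_regions = 500
    s_v = rng.random(n_regions)   # visual saliency per region
    s_a = rng.random(n_regions)   # auditory saliency per region

    # Simulate AV saliency from a "true" linear integration plus noise.
    true_w = np.array([0.7, 0.3])
    s_av = true_w[0] * s_v + true_w[1] * s_a \
           + 0.05 * rng.standard_normal(n_regions)

    # Fit S_AV = w_V * S_V + w_A * S_A by ordinary least squares.
    X = np.column_stack([s_v, s_a])
    w, residuals, rank, _ = np.linalg.lstsq(X, s_av, rcond=None)
    print(f"estimated weights: w_V={w[0]:.3f}, w_A={w[1]:.3f}")

Under this assumed setup, the recovered weights quantify the relative contribution of each modality to overt attention in the bimodal condition.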
Published
2008-09-17
How to Cite
Quigley, C., Onat, S., Harding, S., Cooke, M., & König, P. (2008). Audio-visual integration during overt visual attention. Journal of Eye Movement Research, 1(2). https://doi.org/10.16910/jemr.1.2.4
Issue
Vol. 1 No. 2 (2008)
Section
Articles
License
Copyright (c) 2008 Cliodhna Quigley, Selim Onat, Sue Harding, Martin Cooke, Peter König
This work is licensed under a Creative Commons Attribution 4.0 International License.