PUMA
Istituto di Scienza e Tecnologie dell'Informazione
Nguyen K., Dellepiane M., Viaud-Delmon I., Warusfel O. Investigation of auditory-visual integration in VR Environments. In: 8th International Multisensory Research Forum (Sydney, 5-7 July 2007).
 
 
Abstract (English)
Investigating the temporal and spatial constraints under which visual and auditory stimuli are perceived as a unique percept, or as spatially coincident, has been the topic of numerous studies. However, these findings have so far been derived in extremely simplified stimulation contexts, consisting of combinations of elementary auditory and visual stimuli usually displayed in dark and anechoic conditions. The present experiment is conducted in a VR environment using a passive stereoscopic display and binaural audio rendering. Subjects have to indicate the point of subjective spatial alignment (PSSA) between a horizontally moving visual stimulus that crosses the direction of a stationary sound. Auditory stimuli are displayed over headphones using individualized head-related transfer functions, and the visual stimulus is integrated into a visual background texture in order to convey visual perspective. Two types of audio stimuli are used to evaluate the influence of auditory localisation acuity on auditory-visual integration: periodic white noise bursts providing optimal localisation cues, and periodic 1 kHz tone bursts. The present study will indicate whether previous findings (Lewald et al., Behavioural Brain Research, 2001) still hold in more complex audio-visual contexts such as those offered by cutting-edge VR environments.
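As an illustration (not part of the original abstract), the two auditory stimulus types contrasted in the experiment could be synthesised roughly as follows. The sample rate, burst duration, and repetition period are assumed values chosen for the sketch; the abstract does not specify them:

```python
import numpy as np

SR = 44100          # sample rate in Hz -- assumed, not stated in the abstract
BURST_MS = 50       # burst duration in ms -- illustrative value only
PERIOD_MS = 250     # burst repetition period in ms -- illustrative value only

def periodic_bursts(make_burst, n_periods=4, sr=SR):
    """Place one burst at the start of each otherwise silent period."""
    burst = make_burst(int(sr * BURST_MS / 1000))
    period = np.zeros(int(sr * PERIOD_MS / 1000))
    period[:burst.size] = burst
    return np.tile(period, n_periods)

def white_noise_burst(n):
    """Broadband noise: rich localisation cues across the spectrum."""
    return np.random.default_rng(0).uniform(-1.0, 1.0, n)

def tone_burst(n, freq=1000.0, sr=SR):
    """1 kHz pure tone: impoverished spectral localisation cues."""
    t = np.arange(n) / sr
    return np.sin(2 * np.pi * freq * t)

noise_stim = periodic_bursts(white_noise_burst)  # optimal-cue condition
tone_stim = periodic_bursts(tone_burst)          # degraded-cue condition
```

In the actual experiment these signals would then be convolved with the listener's individualized head-related transfer functions before headphone playback; that rendering step is omitted here.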
Subject: Neuroscience; Crossmodal; Perception; I.m Computing Methodologies, Miscellaneous





 


For further information, contact: Librarian http://puma.isti.cnr.it
