Effect of visual representation and acoustic scene complexity on auditory perception in reverberant audio-visual environments
For hearing research and the evaluation of hearing systems, complex acoustic environments (CAEs) in which auditory perception is accompanied by visual cues are ecologically relevant. Loudspeaker reproduction can be used to create virtual acoustic environments, combined with head-mounted displays (HMDs) that provide a stereoscopic visual representation of the desired environment. However, wearing an HMD can alter acoustic cues as represented by the head-related transfer function. Here we investigate how the psychoacoustic measures sound source localization, distance perception, loudness, speech intelligibility, and listening effort are affected by a visual representation of the scene using an HMD. The scene consisted of a ring of eight (virtual) loudspeakers that simultaneously played a target speech stimulus and different numbers of nonsense speech interferers in several spatial conditions. The acoustic environment was either anechoic or had a reverberation time (T60) of about 1.5 s. The complexity of the scene was additionally varied by assessing the psychoacoustic measures either in isolated consecutive measurements or simultaneously after a single presentation of the target. Results show that the HMD had no significant effect on the data. Loudness and distance perception yielded significantly different results when measured simultaneously rather than consecutively in isolation.