Project Details
Auditory-object-based perception modelling for complex scenes (A02)
Subject Area
Acoustics
Term
since 2018
Project identifier
Deutsche Forschungsgemeinschaft (DFG) - Project number 352015383
In complex, realistic auditory scenes, listeners can be expected to extract and compare information from multiple sound sources only sub-optimally, depending on factors such as duration, spectral content, spatial position, and reverberation. This project aims to develop a generalized model for auditory perception and quality in complex scenes that takes into account a perceptual decomposition of the scene into auditory objects. The model will extend the successful concepts of energetic and amplitude modulation masking with informational masking, helping to i) better understand auditory perception in realistic conditions and ii) provide a tool for the instrumental prediction of audio quality and auditory perception abilities.
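As a rough illustration of the modulation-domain (envelope-power) masking idea that such models build on, the sketch below computes an envelope-power signal-to-noise ratio in a few modulation bands for a toy amplitude-modulated target in noise. All signals, band edges, and parameters are assumptions chosen for demonstration; the sketch does not reproduce the A02 model or its treatment of auditory objects and informational masking.

```python
# Illustrative sketch only: a toy envelope-power (modulation-domain) SNR computation
# in the spirit of amplitude-modulation masking models. Not the project's actual model.
import numpy as np
from scipy.signal import hilbert

fs = 16000                      # sampling rate in Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)   # 1 s of signal

rng = np.random.default_rng(0)
# Target: 1 kHz tone with 4 Hz amplitude modulation; masker: white noise (toy signals).
target = 0.1 * (1 + np.sin(2 * np.pi * 4 * t)) * np.sin(2 * np.pi * 1000 * t)
masker = 0.05 * rng.standard_normal(t.size)

def envelope(x):
    """Hilbert envelope of the signal (slow amplitude fluctuations)."""
    return np.abs(hilbert(x))

def env_band_power(env, f_lo, f_hi):
    """Power of the AC part of the envelope within one modulation band, via the FFT."""
    env_ac = env - env.mean()
    spec = np.fft.rfft(env_ac)
    freqs = np.fft.rfftfreq(env_ac.size, 1 / fs)
    sel = (freqs >= f_lo) & (freqs < f_hi)
    return np.sum(np.abs(spec[sel]) ** 2) / env_ac.size ** 2

# A few modulation bands (Hz); edges chosen for illustration only.
mod_bands = [(1, 2), (2, 4), (4, 8), (8, 16)]
env_mix = envelope(target + masker)   # mixture envelope carries the target modulation
env_msk = envelope(masker)            # masker-alone envelope

for f_lo, f_hi in mod_bands:
    snr = env_band_power(env_mix, f_lo, f_hi) / max(env_band_power(env_msk, f_lo, f_hi), 1e-12)
    print(f"{f_lo:2d}-{f_hi:2d} Hz modulation band: envelope-power SNR = {10 * np.log10(snr):5.1f} dB")
```

In such a scheme, the 4-8 Hz band shows the largest envelope-power SNR because it contains the target's modulation; a full model would apply this per auditory filter and combine the band-wise cues with further masking components.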
DFG Programme
Collaborative Research Centres
Subproject of
SFB 1330: Hearing acoustics: Perceptual principles, Algorithms and Applications (HAPPAA)
Applicant Institution
Carl von Ossietzky Universität Oldenburg
Project Head
Dr. Stephan Ewert