An oscillatory correlation model of auditory streaming

Authors: DeLiang Wang, Peter Chang

Affiliation: Department of Computer Science and Engineering, Center for Cognitive Science, The Ohio State University, Columbus, OH 43210, USA

Abstract: We present a neurocomputational model for auditory streaming, a prominent phenomenon of auditory scene analysis. The proposed model represents auditory scene analysis by oscillatory correlation, where a perceptual stream corresponds to a synchronized assembly of neural oscillators and different streams correspond to desynchronized oscillator assemblies. The underlying neural architecture is a two-dimensional network of relaxation oscillators with lateral excitation and global inhibition, where one dimension represents time and the other represents frequency. By employing dynamic connections along the frequency dimension and a random element in global inhibition, the proposed model produces a temporal coherence boundary and a fission boundary that closely match those from the psychophysical data of auditory streaming. Several issues are discussed, including how to represent physical time and how to relate shifting synchronization to auditory attention.
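The building block of the network described in the abstract is a relaxation oscillator of the kind used in LEGION networks (a Terman–Wang oscillator): a fast excitatory variable coupled to a slow recovery variable, which alternates between a silent and an active phase when stimulated. The following is a minimal sketch of a single such oscillator under forward-Euler integration; the parameter values and function names here are illustrative assumptions, not the paper's.

```python
import numpy as np

def terman_wang(I=0.8, eps=0.02, gamma=6.0, beta=0.1,
                dt=0.01, steps=60000):
    """Simulate one Terman-Wang relaxation oscillator with Euler steps.

    dx/dt = 3x - x^3 + 2 - y + I            (fast excitatory variable)
    dy/dt = eps * (gamma*(1 + tanh(x/beta)) - y)   (slow recovery variable)

    With stimulus I > 0 the nullclines intersect on the middle branch
    of the cubic, producing a stable limit cycle (relaxation oscillation).
    Parameter values are illustrative, not taken from the paper.
    """
    x, y = -2.0, 0.0
    xs = np.empty(steps)
    for t in range(steps):
        dx = 3.0 * x - x**3 + 2.0 - y + I
        dy = eps * (gamma * (1.0 + np.tanh(x / beta)) - y)
        x += dt * dx
        y += dt * dy
        xs[t] = x
    return xs

xs = terman_wang()
# A stimulated oscillator alternates between a silent phase (x < 0)
# and an active phase (x > 0); count the transitions between them.
crossings = int(np.sum(np.diff(np.sign(xs)) != 0))
```

In the full model, many such units are arranged on a time-frequency grid; lateral excitation along the grid synchronizes oscillators within one stream, while global inhibition desynchronizes different streams.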
| |
Keywords: Auditory streaming; Oscillatory correlation; Relaxation oscillator; Shifting synchronization; LEGION
|