Acoustic perception and acoustic memory of letters

1968 ◽  
Vol 28 ◽  
pp. 161-170 ◽  
Author(s):  
Teodor Künnapas

Author(s):  
Mike Chemistruck ◽  
Andrew Allen ◽  
John Snyder ◽  
Nikunj Raghuvanshi

We model acoustic perception for AI agents efficiently in complex scenes containing many sound events. The key idea is to employ perceptual parameters that capture how each sound event propagates through the scene to the agent's location, naturally conforming virtual perception to human perception. We propose a simplified auditory masking model that limits localization capability in the presence of distracting sounds, and we show that anisotropic reflections, as well as the initial sound, serve as useful localization cues. Our system is simple, fast, and modular, and produces natural results in our tests: agents navigate through passageways and portals by sound alone, and anticipate or track occluded but audible targets. Source code is provided.
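The abstract's simplified auditory masking model could be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the `SoundEvent` type, the per-event loudness values, and the 6 dB masking threshold are all assumptions chosen for the example; the rule is simply that an event loses its localization cue when some distracting event exceeds it by more than the threshold.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class SoundEvent:
    name: str
    loudness_db: float                      # perceived loudness at the agent, in dB
    direction: Tuple[float, float, float]   # arrival direction of the initial sound

# Assumed threshold: a distractor more than 6 dB louder masks localization.
MASKING_THRESHOLD_DB = 6.0

def localizable_events(events: List[SoundEvent]) -> List[SoundEvent]:
    """Return events whose localization cues survive masking by louder sounds."""
    result = []
    for e in events:
        # Loudness of the strongest competing (distracting) event.
        loudest_distractor = max(
            (d.loudness_db for d in events if d is not e),
            default=float("-inf"),
        )
        # An event stays localizable unless a distractor exceeds it
        # by more than the masking threshold.
        if loudest_distractor - e.loudness_db <= MASKING_THRESHOLD_DB:
            result.append(e)
    return result

# Example: quiet footsteps are masked by loud machinery; the door,
# within 6 dB of the machinery, remains localizable.
footsteps = SoundEvent("footsteps", 60.0, (1.0, 0.0, 0.0))
machinery = SoundEvent("machinery", 70.0, (0.0, 1.0, 0.0))
door = SoundEvent("door", 68.0, (0.0, 0.0, 1.0))
audible = [e.name for e in localizable_events([footsteps, machinery, door])]
# audible == ["machinery", "door"]
```

A per-pair or energy-summing masking rule would be a natural refinement, but the pairwise loudest-distractor comparison is enough to show how masking gates which events an agent can localize.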


2019 ◽  
Vol 13 ◽  
Author(s):  
Shaowei Jin ◽  
Huaping Liu ◽  
Bowen Wang ◽  
Fuchun Sun

Author(s):  
Longchuan Yan ◽  
Jun Du ◽  
Qingming Huang ◽  
Shuqiang Jiang