Toward a Novel Human Interface for Conceptualizing Spatial Information in Non-Speech Audio
We developed an interface concept that uses non-speech audio in wearable devices to support visually impaired persons. The main purpose is to enable visually impaired persons to conceptualize spatial information freely through non-speech audio, without relying on conventional means such as artificial pattern recognition and voice-synthesizer systems. Subjects participated in experiments that evaluated their ability to localize pattern-associated sounds while navigating various virtual 3-D acoustic environments. The results showed that sound effects such as reverberation, reflection, and variable z-coordinate movement enhance the ability to localize pattern-associated sounds. The subjects were also evaluated on their ability to conceptualize spatial information from cues in “artificial” and “natural” sounds. This evaluation revealed that “natural” sounds are essential for improving everyday listening skills and the ability to conceptualize spatial information.
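As a rough illustration of the kind of cues a virtual 3-D acoustic environment provides for localizing a sound source, the sketch below computes distance, azimuth, an interaural time difference (ITD), and a distance-based gain for a point source relative to a listener. This is not the authors' rendering method; it is a minimal sketch assuming a spherical-head (Woodworth) ITD approximation and an inverse-distance attenuation law, with the head radius and speed of sound as stated constants.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, dry air at roughly 20 °C (assumption)
HEAD_RADIUS = 0.0875    # m, nominal adult head radius (assumption)

def localization_cues(source, listener=(0.0, 0.0, 0.0)):
    """Return (distance_m, azimuth_deg, itd_s, gain) for a point source.

    Uses the Woodworth rigid-sphere ITD approximation and a clamped
    inverse-distance gain; both are simplifications of real binaural
    rendering (no HRTF, reverberation, or elevation cues).
    """
    dx, dy, dz = (s - l for s, l in zip(source, listener))
    distance = math.sqrt(dx * dx + dy * dy + dz * dz)
    # Azimuth measured from straight ahead (+y), positive to the right (+x)
    azimuth = math.atan2(dx, dy)
    # Woodworth ITD: time-of-arrival difference between the two ears
    itd = (HEAD_RADIUS / SPEED_OF_SOUND) * (math.sin(azimuth) + azimuth)
    # Inverse-distance attenuation, clamped so nearby sources do not blow up
    gain = 1.0 / max(distance, 1.0)
    return distance, math.degrees(azimuth), itd, gain

# Example: a source 2 m away, 45 degrees to the listener's right
d, az, itd, g = localization_cues((math.sqrt(2.0), math.sqrt(2.0), 0.0))
```

Varying the z-coordinate of `source` changes distance (and thus gain) but not this simple azimuth/ITD model, which is one reason elevation is the hardest dimension to convey with such basic cues.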