A support system for visually impaired persons to understand three-dimensional visual information using acoustic interface

Author(s):  
Y. Kawai ◽  
F. Tomita
2000 ◽  
Vol 120 (5) ◽  
pp. 648-655 ◽  
Author(s):  
Yoshihiro Kawai ◽  
Makoto Kobayashi ◽  
Hiroki Minagawa ◽  
Masayuki Miyakawa ◽  
Fumiaki Tomita

2021 ◽  
Vol 21 (1) ◽  
pp. 3-10
Author(s):  
Ingmar BEŠIĆ ◽  
Zikrija AVDAGIĆ ◽  
Kerim HODŽIĆ

Visual impairments often pose serious restrictions on a visually impaired person, and a considerable number of people, especially among the aging population, depend on assistive technology to sustain their quality of life. Developing and testing assistive technology for the visually impaired requires gathering information and conducting studies on both healthy and visually impaired individuals in a controlled environment. We propose a test setup for visually impaired persons by creating an RFID-based assistive environment, the Visual Impairment Friendly RFID Room. The test setup can be used to evaluate RFID object localization and its use by visually impaired persons. To a certain extent, every impairment has individual characteristics, as different individuals may respond better to different subsets of visual information. We use a virtual reality prototype both to simulate visual impairment and to map the full visual information onto the subset that a visually impaired person can perceive. Time-domain color mapping with real-time image processing is used to evaluate the virtual reality prototype, targeting color vision deficiency.
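The abstract does not detail the time-domain color mapping, so the sketch below is only a guess at the general idea: hues that a dichromat tends to confuse are modulated over time so they become temporally distinguishable. The targeted hue range, modulation depth, and period are assumptions, and OpenCV/NumPy are assumed to be available.

```python
# Minimal sketch of a time-domain color mapping, NOT the authors' algorithm:
# hues that a protanope may confuse are shifted by a time-varying offset so
# that they "flicker" and become temporally distinguishable.
import numpy as np
import cv2  # OpenCV, assumed available


def time_domain_color_map(frame_bgr, t, period_s=1.0):
    """Apply an oscillating hue shift to red-like pixels (hypothetical scheme)."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV).astype(np.int16)
    hue = hsv[:, :, 0]                          # OpenCV hue range: 0..179
    red_like = (hue < 10) | (hue > 170)         # hues assumed hard to distinguish
    offset = int(30 * np.sin(2 * np.pi * t / period_s))  # oscillating hue shift
    hue[red_like] = (hue[red_like] + offset) % 180
    return cv2.cvtColor(hsv.astype(np.uint8), cv2.COLOR_HSV2BGR)


# Usage: feed successive camera frames together with a timestamp, e.g.
# out = time_domain_color_map(frame, cv2.getTickCount() / cv2.getTickFrequency())
```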


2014 ◽  
Vol 26 (6) ◽  
pp. 735-742
Author(s):  
Noriyuki Kawarazaki ◽  
Yuhei Kaneishi ◽  
Nobuyuki Saito ◽  
Takashi Asakawa ◽  
...  

[Figure: Supporting system of choral singing]
Visually impaired persons may find it difficult to take part in a chorus or other singing group because they cannot see the beat indicated by the conductor's hand movements. This paper provides a chorus support system for visually impaired persons using a depth image sensor. The system consists of an electric music baton with an acceleration sensor, a radio module, haptic interface devices with vibration motors, a depth image sensor, and a PC. The electric music baton transmits a signal indicating the conductor's motion to visually impaired players based on the sensed acceleration. Since the conductor must give individual instructions to players, we use a depth image sensor to determine the direction in which the conductor's baton points; this direction is estimated from the conductor's posture. We also attempted to develop a chorus support system that does not use the electric music baton: the beat is obtained from the position of maximum velocity of the conductor's hand motion, measured with the depth image sensor. The effectiveness of our system is demonstrated by several experimental results.
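A minimal sketch of the beat-extraction step described above, i.e. picking the frames at which the conductor's hand speed peaks. It assumes the depth sensor already yields 3-D hand coordinates at a fixed frame rate; the speed threshold and peak-picking rule are illustrative, not the paper's exact method.

```python
# Hedged sketch of beat detection from tracked hand positions, assuming a
# depth sensor provides 3-D hand coordinates at a fixed frame rate.
import numpy as np


def detect_beats(hand_xyz, fps=30.0, min_speed=0.5):
    """Return frame indices where the hand speed reaches a local maximum.

    hand_xyz  : (N, 3) array of hand positions in metres, one row per frame.
    min_speed : speed threshold in m/s to suppress small tremors (assumed value).
    """
    velocity = np.diff(hand_xyz, axis=0) * fps           # m/s between frames
    speed = np.linalg.norm(velocity, axis=1)
    beats = []
    for i in range(1, len(speed) - 1):
        if speed[i] > min_speed and speed[i] >= speed[i - 1] and speed[i] > speed[i + 1]:
            beats.append(i)                               # local speed maximum
    return beats


# Each detected index could then trigger the vibration motors of the haptic
# interface worn by the visually impaired singer.
```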


2006 ◽  
Vol 18 (Supplement) ◽  
pp. 56-56 ◽  
Author(s):  
Yuuri Miki ◽  
Chiharu Sasagawa ◽  
Akihiko Hanafusa ◽  
Teruhiko Fuwa

2010 ◽  
Vol 22 (4) ◽  
pp. 152-159 ◽  
Author(s):  
Syuri Terada ◽  
Akihiko Hanafusa ◽  
Tomozumi Ikeda ◽  
Teruhiko Fuwa

Author(s):  
Takafumi Matsumaru ◽  
Masashi Narita

This paper presents a newly developed calligraphy-stroke learning support system. The system incorporates the following functions: a) displaying brushwork, trajectory, and handwriting; b) recording and playback of an expert's calligraphy-stroke; and c) teaching a learner a calligraphy-stroke. The following features of the system demonstrate the contributions of our study. (1) The system, which consists of a sensor and a projector, is simple and compact, so it can be easily introduced into ordinary educational settings and practical learning situations. (2) A three-dimensional calligraphy-stroke is taught by presenting two-dimensional visual information. (3) A trajectory region is generated in the form of continuous squares, calculated using a brush model based on the brush position information measured by the sensor. (4) Handwriting is expressed by mapping a handwriting texture image according to the ink concentration and the brush handling state. The results of the trial experiment suggest the effectiveness of the learning support function in terms of letter form and calligraphy-stroke.
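One plausible reading of feature (3), sketched below: each measured brush-tip sample is turned into a square whose side shrinks as the tip rises, and drawing the squares consecutively yields the trajectory region. The linear brush model and the parameter values are assumptions, not the authors' model.

```python
# Illustrative sketch only: one way to turn measured brush positions into a
# "continuous squares" trajectory region. The brush model here (square width
# decreasing linearly with tip height) is an assumed simplification.
import numpy as np


def trajectory_squares(tip_xyz, max_width=40.0, contact_height=10.0):
    """Return one (x, y, side) square per sample where the brush touches paper.

    tip_xyz        : (N, 3) array of brush-tip samples (x, y in px, z in mm).
    contact_height : z value (mm) at or above which the brush is treated as lifted.
    """
    squares = []
    for x, y, z in tip_xyz:
        if z >= contact_height:                           # brush lifted, no ink
            continue
        side = max_width * (1.0 - z / contact_height)     # lower tip -> wider square
        squares.append((x, y, side))
    return squares


# Rendering the consecutive, overlapping squares approximates the drawn stroke;
# a projector can then overlay this region on the paper in real time.
```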


Sensors ◽  
2021 ◽  
Vol 21 (21) ◽  
pp. 7351
Author(s):  
Dominik Osiński ◽  
Marta Łukowska ◽  
Dag Roar Hjelme ◽  
Michał Wierzchoń

The successful development of a system realizing color sonification would enable auditory representation of the visual environment. The primary beneficiaries of such a system would be people who cannot directly access visual information, the visually impaired community. Despite the plethora of sensory substitution devices, developing systems that provide intuitive color sonification remains a challenge. This paper presents design considerations, development, and the usability audit of a sensory substitution device that converts spatial color information into soundscapes. The implemented wearable system uses a dedicated color space and continuously generates natural, spatialized sounds based on the information acquired from a camera. We developed two head-mounted prototype devices and two graphical user interface (GUI) versions. The first GUI is dedicated to researchers, and the second has been designed to be easily accessible for visually impaired persons. Finally, we ran fundamental usability tests to evaluate the new spatial color sonification algorithm and to compare the two prototypes. Furthermore, we propose recommendations for the development of the next iteration of the system.
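For illustration only, a toy version of spatial color sonification: hue is mapped to pitch, horizontal image position to stereo pan, and brightness to loudness. The published system uses a dedicated color space and natural spatialized sounds, so this sine-tone mapping is a sketch of the general idea rather than the authors' algorithm; all parameter values are assumptions.

```python
# Hypothetical sketch of spatial color sonification: hue -> pitch,
# horizontal position -> stereo pan, brightness -> loudness.
import numpy as np

SAMPLE_RATE = 44100


def sonify_pixel(hue_deg, brightness, x_norm, duration=0.2):
    """Render one stereo tone for a sampled image region.

    hue_deg    : hue in degrees (0-360), mapped onto a two-octave pitch range.
    brightness : 0-1, mapped to amplitude.
    x_norm     : horizontal position, 0 (left) .. 1 (right), used for panning.
    """
    freq = 220.0 * 2.0 ** (2.0 * hue_deg / 360.0)          # 220 Hz .. 880 Hz
    t = np.linspace(0.0, duration, int(SAMPLE_RATE * duration), endpoint=False)
    tone = brightness * np.sin(2.0 * np.pi * freq * t)
    left, right = tone * (1.0 - x_norm), tone * x_norm      # simple linear pan
    return np.stack([left, right], axis=1)                  # (samples, 2) stereo


# Scanning the camera image column by column and concatenating the resulting
# buffers yields a continuously updated soundscape for the whole scene.
```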


2008 ◽  
Vol 20 (Supplement) ◽  
pp. 41-41
Author(s):  
Shuri Terada ◽  
Akihiko Hanafusa ◽  
Teruhiko Fuwa ◽  
Tomozumi Ikeda
