A Smart-Phone Based System to Detect Warning Sound for Hearing Impaired People

Author(s):  
Koichiro Takeuchi ◽  
Tetsuya Matsumoto ◽  
Yoshinori Takeuchi ◽  
Hiroaki Kudo ◽  
Noboru Ohnishi
Author(s):  
Kavita Pandey ◽  
Dhiraj Pandey ◽  
Vatsalya Yadav ◽  
Shriya Vikhram

Background: According to the WHO report, around 4.07% of the world's population is visually impaired, and about 90% of visually impaired people live in the lower economic strata. In today's fast-moving technology landscape, most inventions overlook the needs of these people. Because technologies are designed mainly for mainstream users, visually impaired people often find them inaccessible. This inaccessibility arises primarily from cost (for example, a Perkins Brailler costs 80-248 dollars simply to provide Braille input) and from the hassle of carrying bulky equipment. Objective: Keeping all this in mind, and aiming to make technology their best friend, MAGIC-1 has been designed. The goal is to provide a solution in the form of an application that helps visually impaired users in their daily activities. Method: The proposed solution assists visually impaired users through smartphone technology. For visually impaired users who have ever wished for a touch-based guide on a smartphone, MAGIC-1 consolidates all the features important to their daily activities into a single application. Results: The performance of the solution as a whole, and of its individual features, has been tested with sample visually impaired users in terms of usability, utility, and other metrics. Their typing performance has also been measured in terms of Errors per Word and Words per Minute. Conclusion: MAGIC-1, the proposed solution, works as an assistant that helps visually impaired users overcome their daily struggles and stay more connected to the world. A visually impaired user can communicate via their mobile device with features such as eyes-free texting using Braille and voice calling, and can easily get help in an emergency through SOS calling and video assistance.
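The abstract reports typing performance as Words per Minute and Errors per Word but does not give the formulas. The sketch below is a minimal, hypothetical illustration assuming the conventional text-entry definitions (one "word" = 5 characters for WPM, character-level edit distance for errors); the TypingSession container and function names are not from the paper.

```python
from dataclasses import dataclass

def levenshtein(a: str, b: str) -> int:
    # Classic dynamic-programming edit distance between two strings.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

@dataclass
class TypingSession:
    reference: str     # text the participant was asked to enter
    transcribed: str   # text the participant actually entered
    seconds: float     # time taken for the trial

def words_per_minute(s: TypingSession) -> float:
    # Conventional text-entry metric: one "word" = 5 characters.
    return (len(s.transcribed) / 5.0) / (s.seconds / 60.0)

def errors_per_word(s: TypingSession) -> float:
    # Character-level edit distance, normalised by the number of reference words.
    return levenshtein(s.reference, s.transcribed) / max(1, len(s.reference.split()))

# Example: "hello world" typed as "helo world" in 12 seconds
# -> WPM = (10 / 5) / (12 / 60) = 10.0, errors per word = 1 / 2 = 0.5
session = TypingSession("hello world", "helo world", 12.0)
print(words_per_minute(session), errors_per_word(session))
```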


2021 ◽  
Vol 11 (8) ◽  
pp. 3439
Author(s):  
Debashis Das Chakladar ◽  
Pradeep Kumar ◽  
Shubham Mandal ◽  
Partha Pratim Roy ◽  
Masakazu Iwamura ◽  
...  

Sign language is a visual language used by hearing-impaired people to communicate through hand and finger movements. Indian Sign Language (ISL) is a well-developed and standardized means of communication for hearing-impaired people living in India. However, people who use spoken language often face difficulty communicating with a hearing-impaired person because they lack knowledge of sign language. In this study, we developed a 3D avatar-based sign language learning system that converts input speech/text into the corresponding ISL sign movements. The system consists of three modules. First, the input speech is converted into an English sentence. Then, that English sentence is converted into the corresponding ISL sentence using Natural Language Processing (NLP) techniques. Finally, the motion of the 3D avatar is defined based on the ISL sentence. The translation module achieves a Sign Error Rate (SER) of 10.50.
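The abstract reports the translation module's quality as a Sign Error Rate of 10.50 but does not spell out the computation. A common convention, assumed here, is to define SER analogously to Word Error Rate: the edit distance between the predicted and reference sign-gloss sequences, normalised by the reference length and expressed as a percentage. The gloss sequences in the example are hypothetical.

```python
def sign_error_rate(reference: list[str], hypothesis: list[str]) -> float:
    """Edit distance over sign glosses, normalised by reference length (percent)."""
    # Dynamic-programming Levenshtein distance over gloss tokens.
    prev = list(range(len(hypothesis) + 1))
    for i, ref_gloss in enumerate(reference, 1):
        curr = [i]
        for j, hyp_gloss in enumerate(hypothesis, 1):
            curr.append(min(prev[j] + 1,                              # deletion
                            curr[j - 1] + 1,                          # insertion
                            prev[j - 1] + (ref_gloss != hyp_gloss)))  # substitution
        prev = curr
    return 100.0 * prev[-1] / max(1, len(reference))

# Hypothetical ISL gloss sequences: one missing gloss out of four -> 25.0 % SER
print(sign_error_rate(["TOMORROW", "I", "SCHOOL", "GO"],
                      ["TOMORROW", "SCHOOL", "GO"]))
```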


Author(s):  
Ganesh Bhutkar ◽  
Yohannes Kurniawan ◽  
Johan Johan ◽  
Dhananjay Bhole ◽  
Shrikant Salve ◽  
...  

2011 ◽  
Vol 36 (4) ◽  
pp. 683-694 ◽  
Author(s):  
Marek Niewiarowicz ◽  
Tomasz Kaczmarek

Abstract This article presents the results of investigations of the angle of directional hearing acuity (ADHA) as a measure of spatial hearing ability, with special emphasis on people with hearing impairments. A modified version of the method proposed by Zakrzewski was used: ADHA values were determined for 8 azimuths in the horizontal plane at the height of the listener's head. The experimental procedure was the two-alternative forced-choice (2AFC) method, based on a new system of listener responses (left vs. right, instead of no difference vs. difference in the location of the sound sources). Investigations were carried out on two groups of subjects: normal-hearing people (9 persons) and hearing-impaired people with sensorineural hearing loss and tinnitus (9 persons). Different acoustic signals were used in the experiment: sinusoidal signals (pure tones), 1/3-octave noise, amplitude-modulated 1/3-octave noise, CCITT speech and traffic noises, and signals matched to the individual character of each subject's tinnitus. The results generally showed better localization of the sound source for noise-type signals than for tonal signals. Differences in ADHA values for particular signals between the two groups of subjects were negligible; significant differences were, however, found for the tinnitus and traffic noise signals. The new system of listener responses proved efficient, giving less dispersion of results than the standard system.
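The abstract does not detail how individual ADHA values were derived from the left/right 2AFC responses. The sketch below is only an illustrative assumption, estimating ADHA as the smallest angular offset at which the proportion of correct left/right judgements reaches a 75% criterion; the function name and trial format are hypothetical.

```python
from collections import defaultdict

def adha_from_2afc(trials, criterion=0.75):
    """Estimate the angle of directional hearing acuity from 2AFC trials.

    trials: iterable of (offset_deg, correct) pairs, where `correct` is True
    when the listener's left/right judgement matched the actual displacement
    of the test source relative to the reference azimuth.
    Returns the smallest offset whose proportion of correct responses reaches
    the criterion, or None if the criterion is never reached.
    """
    counts = defaultdict(lambda: [0, 0])          # offset -> [n_correct, n_total]
    for offset, correct in trials:
        counts[offset][0] += int(correct)
        counts[offset][1] += 1
    for offset in sorted(counts):
        n_correct, n_total = counts[offset]
        if n_correct / n_total >= criterion:
            return offset
    return None

# Hypothetical data for one azimuth: 2 deg at chance level, 4 deg clearly resolved
trials = [(2, True), (2, False), (2, True), (2, False),
          (4, True), (4, True), (4, True), (4, False)]
print(adha_from_2afc(trials))   # -> 4
```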


2012 ◽  
Vol 19 (3) ◽  
pp. 601-618
Author(s):  
Jin-Sook Lee ◽  
Jang Won Moon ◽  
최은영
