Hardcopy Text Recognition and Vocalization for Visually Impaired and Illiterates in Bilingual Language

Author(s):
K. Shanmugam,
B. Vanathi
Author(s):
Tudor Dumitras,
Matthew Lee,
Pablo Quinones,
Asim Smailagic,
Dan Siewiorek,
...

Author(s):
Melchiezhedhieck J. Bongao,
Arvin F. Almadin,
Christian L. Falla,
Juan Carlo F. Greganda,
...

This Raspberry Pi single-board-computer-based wearable device performs real-time object and text recognition using a convolutional neural network built with TensorFlow deep learning, the Python and C++ programming languages, and an SQLite database. It detects stationary objects, road signs, and Philippine peso (PHP) money bills, recognizes printed text through a camera, and translates the results into audible English and Filipino output. The system also reports battery status through an Arduino microcontroller unit and provides a switch to select among object detection mode, text recognition mode, and battery status report mode. The device is intended to compensate for the inability of visually impaired people to identify objects and to read printed text, and to reduce the assistance they require. Descriptive quantitative research, the Waterfall system development life cycle, and the Evolutionary Prototyping model were used as the methodologies of this study. Visually impaired persons and the Persons with Disability Affairs Office of the City Government of Biñan, Laguna, Philippines served as the main respondents of the survey conducted. The results indicate that the object detection, text recognition, and related features were accurate and reliable, a significant improvement over current means for visually impaired people to detect objects and recognize printed text.
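For illustration only, a minimal Python sketch of the capture-detect-announce loop such a device could run is given below. This is not the authors' code: the model file detect.tflite, the label file labels.txt, the 0.6 confidence threshold, and the use of the tflite_runtime, OpenCV, and pyttsx3 packages are all assumptions, and the output-tensor layout follows the common quantized TFLite SSD convention.

# Hypothetical sketch of the wearable's object-detection mode: grab a camera
# frame, run a TFLite CNN, and speak the labels of confident detections.
import cv2
import numpy as np
import pyttsx3
from tflite_runtime.interpreter import Interpreter

interpreter = Interpreter(model_path="detect.tflite")  # assumed model file
interpreter.allocate_tensors()
input_detail = interpreter.get_input_details()[0]
output_details = interpreter.get_output_details()

labels = [line.strip() for line in open("labels.txt")]  # assumed label map
tts = pyttsx3.init()  # offline text-to-speech engine

camera = cv2.VideoCapture(0)
while True:
    ok, frame = camera.read()
    if not ok:
        break
    # Resize the frame to the model's expected (quantized, uint8) input shape.
    _, height, width, _ = input_detail["shape"]
    resized = cv2.resize(frame, (width, height))
    interpreter.set_tensor(input_detail["index"], np.expand_dims(resized, 0))
    interpreter.invoke()
    # Assumes the classic TFLite SSD output order: boxes, classes, scores, count.
    classes = interpreter.get_tensor(output_details[1]["index"])[0]
    scores = interpreter.get_tensor(output_details[2]["index"])[0]
    for cls, score in zip(classes, scores):
        if score > 0.6:  # illustrative confidence threshold
            tts.say(labels[int(cls)])  # announce the detected object aloud
    tts.runAndWait()
camera.release()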


Author(s):
Juliana Damasio Oliveira,
Olimar Teixeira Borges,
Vanessa Stangherlin Machado Paixão-Cortes,
Marcia de Borba Campos,
Rafael Mendes Damasceno

2021, Vol 6 (1)
Author(s):
Binay Kumar Pandey,
Digvijay Pandey,
Subodh Wariya,
Gaurav Aggarwal,
Rahul Rastogi

Author(s):  
Mrs. Ritika Dhabliya

Independent travel is a notable challenge for visually impaired people, while the growing availability of cost-efficient, high-performance, and portable digital imaging devices has created a great opportunity to supplement conventional scanning for document image acquisition. We propose a camera-based visual assistance system built on a Raspberry Pi for reading text and detecting the movement of objects. Visually impaired people face various difficulties in performing their daily tasks: they are wholly or partially dependent on someone for help, these problems can cause them to lose their hope of living in this competitive society, and they must seek guidance from others throughout the day. This paper aims to make the visually impaired person fully independent in all respects. The proposed system acts as a virtual eye that communicates with the outside surroundings through a camera, which serves as a constant source of information for the system. The signals obtained from the input devices are analyzed using image processing in LabVIEW, and the system responds to the visually impaired person through speech processing units. The processed information about the surroundings is conveyed through a speaker (the output unit), by which visually impaired people can move about and carry out their work easily on their own. In addition, the visually impaired person can automatically control some home appliances, such as a fan, using a wireless communication system.
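The paper implements its image processing in LabVIEW; as an illustrative stand-in only, an equivalent capture-recognize-speak step could be sketched in Python as below. The pytesseract and pyttsx3 packages, the Otsu-threshold preprocessing, and camera index 0 are all assumptions, not the authors' design.

# Hypothetical Python stand-in for the camera -> image processing -> speech
# pipeline described above (the paper itself uses LabVIEW for this stage).
import cv2
import pytesseract
import pyttsx3

tts = pyttsx3.init()          # offline text-to-speech engine
camera = cv2.VideoCapture(0)  # assumed camera index

ok, frame = camera.read()
if ok:
    # Binarize with Otsu's threshold to improve OCR on unevenly lit print.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    text = pytesseract.image_to_string(binary)  # requires Tesseract installed
    if text.strip():
        tts.say(text)        # read the recognized text aloud
        tts.runAndWait()
camera.release()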

