VisualPal: A mobile app for object recognition for the visually impaired

Author(s): Shagufta Md. Rafique Bagwan, L. J. Sankpal
2021, Vol 5 (4), pp. 409
Author(s): Lee Ruo Yee, Hazalila Kamaludin, Noor Zuraidin Mohd Safar, Norfaradilla Wahid, Noryusliza Abdullah, ...

Intelligence Eye is an Android-based mobile application developed to help blind and visually impaired users detect light and objects. Its object recognition module uses a Region-based Convolutional Neural Network (R-CNN), while the light detection module provides vibration feedback according to the measured light value. Voice guidance directs users through the application and announces the results of object recognition. The neural network model for object recognition was trained with TensorFlow Lite, and the application was built in Android Studio using Java and extensible markup language (XML). Future work could enhance the application by increasing the detection capacity of the object recognition module, adding a menu setting for vibration intensity in the light detection module, and supporting multiple languages for the voice guidance.
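As a rough illustration of how a light detection module might drive vibration feedback, here is a minimal Python sketch mapping a sensor light reading to a vibration duration; the value range, scale, and 500 ms ceiling are assumptions for illustration, not details taken from the Intelligence Eye application:

```python
def vibration_pattern(light_value, max_light=255):
    """Map a light sensor reading (0..max_light) to a vibration
    duration in milliseconds: darker scenes produce a longer buzz.
    Thresholds and scale are illustrative only."""
    if not 0 <= light_value <= max_light:
        raise ValueError("light value out of range")
    level = light_value / max_light          # 0.0 (dark) .. 1.0 (bright)
    duration_ms = int((1.0 - level) * 500)   # dark -> long vibration
    return duration_ms
```

On Android the returned duration would feed into the platform vibrator API; the sketch only shows the light-to-feedback mapping itself.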


2016, Vol 29 (4-5), pp. 337-363
Author(s): Giles Hamilton-Fletcher, Thomas D. Wright, Jamie Ward

Visual sensory substitution devices (SSDs) can represent visual characteristics through distinct patterns of sound, giving a visually impaired user access to visual information. Previous SSDs have either avoided colour or assigned sounds to colours in a largely unprincipled way. This study introduces a new tablet-based SSD termed the ‘Creole’ (so called because it combines tactile scanning with image sonification) and a new algorithm for converting colour to sound that is based on established cross-modal correspondences (intuitive mappings between different sensory dimensions). To test the utility of correspondences, we examined the colour–sound associative memory and object recognition abilities of sighted users whose device was coded either in line with or opposite to sound–colour correspondences. Users with the correspondence-based mappings showed improved colour memory and made fewer colour errors. Interestingly, the colour–sound mappings that produced the largest improvements in the associative memory task also produced the greatest gains in recognising realistic objects featuring those colours, indicating a transfer of abilities from memory to recognition. These users were also marginally better at matching sounds to images varying in luminance, even though luminance was coded identically across the different versions of the device. These findings are discussed with relevance to both colour and correspondences in sensory substitution.
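To make the idea of a correspondence-based colour-to-sound mapping concrete, here is a minimal Python sketch using two widely reported cross-modal correspondences (lighter colours map to higher pitch, more saturated colours to louder tones). The specific frequency range and amplitude values are assumptions for illustration, not the Creole's actual algorithm:

```python
import colorsys

def colour_to_sound(r, g, b):
    """Map an RGB colour (components in 0..1) to a (frequency_hz,
    amplitude) pair using two common cross-modal correspondences:
    higher lightness -> higher pitch, higher saturation -> louder.
    The exact ranges here are illustrative only."""
    h, lightness, saturation = colorsys.rgb_to_hls(r, g, b)
    freq_hz = 220.0 + lightness * (880.0 - 220.0)  # dark A3 .. bright A5
    amplitude = 0.2 + 0.8 * saturation             # grey quiet, vivid loud
    return freq_hz, amplitude
```

A correspondence-opposed version of the device, as tested in the study, would simply invert one of these mappings (e.g. darker colours to higher pitch) while leaving the rest of the encoding unchanged.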


2020, Vol 7 (1), pp. 1823158
Author(s): Mohammed Noman, Vladimir Stankovic, Ayman Tawfik

Author(s): Cheptoo K. Priscah, Khamadi I. D. Shem, Maina Jane

Information is power: a valuable commodity in everyday human activity, as the current digital divide shows. Equal access to information is essential to the development of an information society, and this applies equally to people with disabilities. The aim of this study was to establish the information-seeking behaviour of visually impaired (VI) students using University of Nairobi library services. The study population comprised two units of analysis: 32 visually impaired students and 6 librarians in charge of services for those students. The study carried out a survey by means of questionnaires and employed Wilson's (1999) model of information behaviour as a framework for mapping the students' information patterns. Statistical Package for the Social Sciences (SPSS) and Microsoft Excel 2016 were used to analyse the data. The response rate was 78.95% across all units of analysis. The findings indicated that the majority of respondents (72%) were female, and that most visually impaired students got assistance from a sighted person or used computers, speech synthesizers, screen readers, braille prints, and audiobooks. A small number of respondents (2.44%) used a mobile app called Tap Tap. Overall, the majority (68.3%) of the visually impaired students relied on aiding tools.
