Communication System for Vocationally and Visually Disabled People Using Embedded System

2018 ◽  Vol 15 (6) ◽  pp. 2082-2088
Author(s):  
Shivin Sinha ◽  
Arpit Singhal ◽  
R Mohanasundaram ◽  
H Abdulgaffar ◽  
Navin Kumar

2020 ◽  Vol 2020 ◽  pp. 1-10
Author(s):  
Mohammad J. M. Zedan ◽  
Ali I. Abduljabbar ◽  
Fahad Layth Malallah ◽  
Mustafa Ghanem Saeed

Nowadays, much research attention is focused on human–computer interaction (HCI), specifically in terms of biosignals, which have recently been used for remote control, offering benefits especially for disabled people and for protection against contagions such as coronavirus. In this paper, a biosignal type, namely, the facial emotional signal, is proposed to control electronic devices remotely via emotional vision recognition. The objective is to convert just two facial emotions, a smiling or nonsmiling vision signal captured by the camera, into a remote-control signal. The methodology combines the fields of machine learning (for smile recognition) and embedded systems (for IoT remote control). For smile recognition, the GENKI-4K database is used to train a model built from the following sequence of steps: real-time video, snapshot image, preprocessing, face detection, feature extraction using HOG, and finally SVM for classification. The achieved recognition rate is up to 89% for training and testing with 10-fold cross-validation of the SVM. For the IoT part, Arduino and MCU (Tx and Rx) nodes transfer the resulting biosignal remotely as server and client via the HTTP protocol. Promising experimental results were achieved in experiments with 40 participants who used their emotional biosignals to control several devices over Wi-Fi, such as opening and closing a door and turning an alarm on or off. The system implementing this research is developed in MATLAB; it connects a webcam to an Arduino and an MCU node as an embedded system.
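
The abstract includes no code, but the described pipeline (face detection, HOG feature extraction, SVM classification, then an HTTP request to the embedded node) can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' MATLAB implementation: it assumes OpenCV's Haar-cascade face detector, scikit-image's HOG, and scikit-learn's SVM, the device URL is hypothetical, and loading of the GENKI-4K images is omitted.

# Sketch of the described pipeline: face detection -> HOG features -> SVM,
# then forwarding the smile/no-smile decision to the embedded node over HTTP.
# Assumptions (not from the paper): Haar cascade, skimage HOG, sklearn SVM,
# and a hypothetical device endpoint URL.
import cv2
import requests
from skimage.feature import hog
from sklearn.svm import SVC

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def face_hog(gray_frame):
    """Detect the largest face and return its HOG descriptor, or None."""
    faces = face_detector.detectMultiScale(gray_frame, 1.3, 5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
    face = cv2.resize(gray_frame[y:y + h, x:x + w], (64, 64))
    return hog(face, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))

def train(X, y):
    """Train a linear SVM on HOG vectors X with labels y (1 = smiling, 0 = not)."""
    clf = SVC(kernel="linear")
    clf.fit(X, y)
    return clf

def control_loop(clf, device_url="http://192.168.1.50/door"):  # hypothetical endpoint
    cap = cv2.VideoCapture(0)          # webcam snapshot source
    ok, frame = cap.read()
    cap.release()
    if not ok:
        return
    feat = face_hog(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    if feat is not None:
        smiling = int(clf.predict([feat])[0])
        # Forward the decision to the embedded (server) node over HTTP.
        requests.get(device_url, params={"smile": smiling}, timeout=2)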


Author(s):  
Ricardo Vergaz Benito ◽  
César Vega-Colado ◽  
María Begoña Coco ◽  
Rubén Cuadrado ◽  
Juan Carlos Torres-Zafra ◽  
...  

The aim of the chapter is to review the most recent advances in electro-optical technologies applied to visually disabled people. The World Health Organization (WHO) estimates that 285 million people in the world have some kind of visual impairment, with 246 million of them in a partially sighted or Low Vision (LV) condition. The top three causes of visual impairment are uncorrected refractive errors, cataracts, and glaucoma, followed by age-related macular degeneration. Meanwhile, head-mounted displays and electro-optical materials used in liquid crystal or electrochromic devices can be used in technical aids for LV. In this chapter, the authors review how disabled people receive real-world information using these new technologies, how the recently developed electro-optical technical aids can improve visual perception, and how these LV aids work, from a technological point of view.


Author(s):  
John W. Mullennix ◽  
Steven E. Stern

A frequently overlooked form of CMC is computer-synthesized speech (CSS). Although the first CSS systems were rather crude and unintelligible, newer systems are fairly intelligible and are widely used for a number of applications, most importantly as aids for people with speech or visual disabilities. In this chapter, we briefly review the development of CSS technology and discuss the work on perception and comprehension of CSS. Then, we examine how CSS use influences interactions between disabled people and nondisabled people. We conclude by emphasizing that the development of CSS systems should take into account various social psychological factors rooted in prejudice toward and stigmatization of disabled people.


2018 ◽  Vol 7 (2) ◽  pp. 453
Author(s):  
Siti Nur Suhaila Mirin ◽  
Khalil Azha Mohd Annuar ◽  
Chai Pui Yook

This paper describes the development of a smart wheelchair system with voice recognition and touch control using an embedded system. An Android application is developed and installed on an Android smartphone. The system is divided into two main modes: voice recognition mode and touch mode. In voice recognition mode, elderly or physically disabled users can provide voice input, for example, “go”, “reverse”, “turn to the left”, “turn to the right”, and “stop”, and the wheelchair moves according to the command given. In touch mode, the user selects the desired direction from the four quadrants displayed on the screen of the Android smartphone to control the wheelchair. An Arduino Uno executes all commands, and an MD30C motor driver and an HC-05 Bluetooth module are used in the system. The system is designed to save the user's time and energy.
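
The abstract does not detail the command protocol, so the following is only an illustrative sketch of the command path: a recognized phrase (or touch quadrant) is mapped to a one-byte code and sent over the HC-05 Bluetooth serial link to the Arduino Uno driving the MD30C motor driver. The port name and the command byte assignments are assumptions, and the sketch runs on a host computer with pyserial rather than in the actual Android app.

# Illustrative sketch, not the paper's Android implementation: map a recognized
# phrase to a command byte and send it over the Bluetooth serial link (HC-05).
# Port name and byte assignments are assumptions for illustration.
import serial

COMMANDS = {
    "go": b"F",
    "reverse": b"B",
    "turn to the left": b"L",
    "turn to the right": b"R",
    "stop": b"S",
}

def send_command(phrase, port="/dev/rfcomm0", baud=9600):
    """Translate a recognized phrase into a command byte and send it to the wheelchair."""
    code = COMMANDS.get(phrase.strip().lower())
    if code is None:
        return False  # unrecognized phrase: ignore rather than move the chair
    with serial.Serial(port, baud, timeout=1) as link:
        link.write(code)
    return True

# Touch mode would call the same function with the label of the selected quadrant.
send_command("turn to the left")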


2012 ◽  Vol 532-533 ◽  pp. 667-671
Author(s):  
Yan Xia Li ◽  
Mo Li Zhang ◽  
Dan Mei Niu ◽  
Xiao Ling Zhang

Embedded systems are subject to requirements of limited resources and real-time operation, which makes designing a network communication system based on an embedded system difficult. In this paper, we propose an embedded network architecture using the Modbus TCP/IP protocol and analyze the composition of the Modbus TCP/IP protocol. Modbus protocol messages are encapsulated in TCP/IP for transmission over the Internet, which extends the range of application of the Modbus protocol. The network communication system is designed and implemented on an ARM platform using non-blocking sockets in Linux. The system is usable, stable, and worth wider adoption.
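
To make the framing concrete, the sketch below shows a Modbus TCP request (the MBAP header of transaction id, protocol id, length, and unit id, followed by a Read Holding Registers PDU) sent over a non-blocking TCP socket. It is a minimal illustration only, not the authors' ARM/Linux implementation; the server address, unit id, and register numbers are assumptions.

# Sketch of Modbus TCP framing over a non-blocking socket.
# Host, unit id, and register addresses are illustrative assumptions.
import select
import socket
import struct

def read_holding_registers(host, unit=1, start=0, count=2, port=502, tid=1):
    # PDU: function code 0x03 (Read Holding Registers), start address, quantity.
    pdu = struct.pack(">BHH", 0x03, start, count)
    # MBAP: transaction id, protocol id (0 = Modbus), length of unit id + PDU, unit id.
    frame = struct.pack(">HHHB", tid, 0, len(pdu) + 1, unit) + pdu

    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.setblocking(False)               # non-blocking, as in the paper's design
    try:
        sock.connect((host, port))
    except BlockingIOError:
        pass                              # connect completes asynchronously
    _, writable, _ = select.select([], [sock], [], 2.0)
    if not writable:
        raise TimeoutError("connect timed out")
    sock.sendall(frame)
    readable, _, _ = select.select([sock], [], [], 2.0)
    if not readable:
        raise TimeoutError("no response")
    reply = sock.recv(260)                # MBAP (7 bytes) + response PDU
    sock.close()
    # Register values follow the 9-byte header (MBAP + function code + byte count).
    return struct.unpack(">" + "H" * count, reply[9:9 + 2 * count])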


2019 ◽  Vol 8 (4) ◽  pp. 1739-1742

Braille is a tactile writing system used by the blind and the visually impaired. It is traditionally written on embossed paper; users can write Braille with a slate and stylus or type it on a Braille writer. In this project we develop a new approach that uses a Braille framework to help students read. Electronic Braille readers are becoming popular worldwide, day by day, among visually disabled people. Our project, an Electronic Braille Alphabet Reader for learners, is developed to support multiple languages. With this project we aim to give learners a learning unit and a dedicated device that reduces their effort and instills the motivation to learn the basic letters of Braille, on the way to fully fledged use by the visually impaired.
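
The abstract does not specify the reader's hardware, so the sketch below only illustrates the core data an alphabet reader needs: a mapping from letters to 6-dot Braille cell patterns, with the output expressed as a bitmask that hypothetical actuator pins (solenoids or vibration motors) could consume.

# Illustrative mapping from letters to 6-dot Braille cells (dots numbered 1-3 in
# the left column, 4-6 in the right). The hardware output is only hinted at as a
# bitmask; the actual device in the paper is not described at this level.
BRAILLE_CELLS = {
    "a": {1}, "b": {1, 2}, "c": {1, 4}, "d": {1, 4, 5}, "e": {1, 5},
    "f": {1, 2, 4}, "g": {1, 2, 4, 5}, "h": {1, 2, 5}, "i": {2, 4},
    "j": {2, 4, 5}, "k": {1, 3}, "l": {1, 2, 3}, "m": {1, 3, 4},
}

def cell_bits(letter):
    """Return the 6-dot pattern for a letter as a bitmask (dot 1 = least significant bit)."""
    dots = BRAILLE_CELLS.get(letter.lower(), set())
    return sum(1 << (dot - 1) for dot in dots)

# Example: 'd' raises dots 1, 4, and 5 -> bitmask 0b011001.
assert cell_bits("d") == 0b011001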

