The Analysis of the Possibility of Using Viola-Jones Algorithm to Recognise Hand Gestures in Human-Machine Interaction

2017 ◽  
Vol 40 (1) ◽  
pp. 109-144
Author(s):  
Piotr Golański ◽  
Marek Szczekala

The article concerns the application of computer-aided maintenance systems for technical objects in difficult conditions. Difficult conditions are understood as those in which maintenance takes place in a specific location that makes it hard, or even impossible, to use a computer. In such cases, computers integrated with workwear, so-called wearable computers, should be used, with which communication is possible through hand gestures. The article describes the results of an analysis of the usefulness of one of the image-recognition methods, based on the Viola-Jones algorithm. This algorithm makes it possible to obtain a model of the recognised image, which can then be used as a pattern in an application programme detecting that image.
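The core of the Viola-Jones detector is the rapid evaluation of Haar-like rectangle features over an integral image, so that any rectangle sum costs only four lookups. A minimal illustrative sketch of that mechanism (not the authors' implementation; function names are ours):

```python
def integral_image(img):
    """Build a summed-area table with a zero border row/column.
    img is a list of rows of grayscale values."""
    h, w = len(img), len(img[0])
    ii = [[0] * (w + 1) for _ in range(h + 1)]
    for y in range(h):
        row_sum = 0
        for x in range(w):
            row_sum += img[y][x]
            ii[y + 1][x + 1] = ii[y][x + 1] + row_sum
    return ii

def rect_sum(ii, x, y, w, h):
    """Sum of pixels in the w*h rectangle at (x, y), in four lookups."""
    return ii[y + h][x + w] - ii[y][x + w] - ii[y + h][x] + ii[y][x]

def haar_two_rect_vertical(ii, x, y, w, h):
    """A two-rectangle 'edge' feature: left half minus right half."""
    half = w // 2
    return rect_sum(ii, x, y, half, h) - rect_sum(ii, x + half, y, half, h)
```

In the full algorithm, thousands of such features are scored by AdaBoost-selected weak classifiers arranged in a cascade; the integral image is what keeps each feature evaluation constant-time regardless of window size.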

2018 ◽  
Vol 14 (1) ◽  
pp. 41-50
Author(s):  
Mohammed Tawfeeq ◽  
Ayam Abbass

The evolution of wireless communication technology increases human-machine interaction capabilities, especially in controlling robotic systems. This paper introduces an effective wireless system for controlling the directions of a wheeled robot based on online hand gestures. The hand gesture images are captured and processed to be recognized and classified using a neural network (NN). The NN is trained on extracted features to distinguish five different gestures; accordingly, it produces five different signals. These signals are transmitted to control the directions of the cited robot. The main contribution of this paper is that the gesture-recognition technique requires only two features, which can be extracted in a very short time using a quite simple methodology; this makes the proposed technique well suited to online interaction. In this methodology, the preprocessed image is partitioned column-wise into two half segments, and one feature is extracted from each half. This feature represents the ratio of white to black pixels in the segment histogram. The NN showed very high accuracy in recognizing all of the proposed gesture classes. The NN output signals are transmitted to the robot microcontroller wirelessly using Bluetooth. Accordingly, the microcontroller guides the robot in the desired direction. The overall system showed high performance in controlling the robot movement directions.
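The two-feature extraction described above (split the binarized image into left and right halves, take the white-to-black pixel ratio of each) can be sketched in a few lines. This is our reading of the method, not the authors' code; the handling of an all-white half is an assumption:

```python
def gesture_features(binary_img):
    """Extract the two features: white/black pixel ratio of the
    left and right column-wise halves of a binarized image.
    binary_img is a list of rows of 0/1 pixels (1 = white)."""
    h, w = len(binary_img), len(binary_img[0])
    mid = w // 2
    features = []
    for lo, hi in ((0, mid), (mid, w)):
        white = sum(row[x] for row in binary_img for x in range(lo, hi))
        black = h * (hi - lo) - white
        # assumption: an all-white half maps to infinity
        features.append(white / black if black else float("inf"))
    return features
```

The resulting two-element vector would then be fed to the NN classifier; the appeal of the method is that both features cost a single pass over the image.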


This paper presents a review of recent work on facial expression and hand gesture recognition. Facial expressions and hand gestures are used to express emotions without oral communication. The human brain can identify a person's emotions from expressions or hand gestures within a fraction of a second. Research has been conducted on human–machine interaction (HMI), and the expectation is that systems based on such HMI algorithms should respond similarly. Furthermore, when a person intends to express emotions orally, he or she automatically uses complementary facial expressions and hand gestures. Extant systems are designed to express these emotions through HMIs without oral communication. Other systems have added various combinations of hand gestures and facial expressions as videos or images. The meanings or emotions conveyed by particular hand gestures and expressions are predefined in these cases, and the systems were trained and tested accordingly. Further, certain extant systems have separately defined the meanings of such hand gestures and facial expressions.


2016 ◽  
Vol 16 (16) ◽  
pp. 6425-6432 ◽  
Author(s):  
Hari Prabhat Gupta ◽  
Haresh S. Chudgar ◽  
Siddhartha Mukherjee ◽  
Tanima Dutta ◽  
Kulwant Sharma

2021 ◽  
pp. 1-9
Author(s):  
Harshadkumar B. Prajapati ◽  
Ankit S. Vyas ◽  
Vipul K. Dabhi

Facial expression recognition (FER) has attracted considerable attention from researchers in the field of computer vision because of its usefulness in security, robotics, and HMI (Human-Machine Interaction) systems. We propose a CNN (Convolutional Neural Network) architecture to address FER. To show the effectiveness of the proposed model, we evaluate its performance on the JAFFE dataset. We derive a concise CNN architecture to address the issue of expression classification. The objective of the various experiments is to achieve convincing performance while reducing computational overhead. The proposed CNN model is very compact compared to other state-of-the-art models. We achieved a highest accuracy of 97.10% and an average accuracy of 90.43% over the top 10 best runs without applying any pre-processing methods, which justifies the effectiveness of our model. Furthermore, we also include a visualization of the CNN layers to observe what the CNN learns.
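The building blocks of any such compact CNN are convolution with a nonlinearity followed by pooling. A dependency-free sketch of those two operations (illustrative only; the paper's actual architecture, filter counts, and layer sizes are not reproduced here):

```python
def conv2d_relu(img, kernel):
    """Valid 2-D cross-correlation of a single-channel image with one
    kernel, followed by ReLU. Both arguments are lists of rows."""
    kh, kw = len(kernel), len(kernel[0])
    h, w = len(img), len(img[0])
    out = []
    for y in range(h - kh + 1):
        row = []
        for x in range(w - kw + 1):
            s = sum(img[y + i][x + j] * kernel[i][j]
                    for i in range(kh) for j in range(kw))
            row.append(max(s, 0.0))  # ReLU nonlinearity
        out.append(row)
    return out

def max_pool(fm, size=2):
    """Non-overlapping max pooling: keeps the strongest activation in
    each size*size window, shrinking the feature map."""
    h, w = len(fm), len(fm[0])
    return [[max(fm[y + i][x + j] for i in range(size) for j in range(size))
             for x in range(0, w - size + 1, size)]
            for y in range(0, h - size + 1, size)]
```

Stacking a few such conv/pool stages and ending with a small fully connected classifier is what keeps a model of this kind compact relative to larger state-of-the-art networks.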


Author(s):  
Xiaochen Zhang ◽  
Lanxin Hui ◽  
Linchao Wei ◽  
Fuchuan Song ◽  
Fei Hu

Electric power wheelchairs (EPWs) enhance the mobility of the elderly and the disabled, while the human-machine interaction (HMI) determines how precisely the human intention is delivered and how efficiently human-machine system cooperation is conducted. A bibliometric quantitative analysis of 1154 publications related to this research field, published between 1998 and 2020, was conducted. We identified the development status, contributors, hot topics, and potential future research directions of this field. We believe that the combination of intelligence and humanization in EPW HMI systems, based on human-machine collaboration, is an emerging trend in EPW HMI methodology research. Particular attention should be paid to evaluating the applicability and benefits of an EPW HMI methodology for its users, as well as how much it contributes to society. This study offers researchers a comprehensive understanding of EPW HMI studies over the past 22 years, along with the latest trends drawn from their evolutionary footprints and forward-looking insights regarding future research.


ATZ worldwide ◽  
2021 ◽  
Vol 123 (3) ◽  
pp. 46-49
Author(s):  
Tobias Hesse ◽  
Michael Oehl ◽  
Uwe Drewitz ◽  
Meike Jipp

Healthcare ◽  
2021 ◽  
Vol 9 (7) ◽  
pp. 834
Author(s):  
Magbool Alelyani ◽  
Sultan Alamri ◽  
Mohammed S. Alqahtani ◽  
Alamin Musa ◽  
Hajar Almater ◽  
...  

Artificial intelligence (AI) is a broad, umbrella term that encompasses the theory and development of computer systems able to perform tasks normally requiring human intelligence. The aim of this study is to assess the attitude of the radiology community in Saudi Arabia toward the applications of AI. Methods: Data for this study were collected using electronic questionnaires in 2019 and 2020. The study included a total of 714 participants. Data analysis was performed using SPSS Statistics (version 25). Results: The majority of the participants (61.2%) had read or heard about the role of AI in radiology. We also found that radiologists' responses differed statistically from those of other specialists, and that radiologists tended to read more about AI. In addition, 82% of the participants thought that AI must be included in the curriculum of medical and allied health colleges, and 86% of the participants agreed that AI would be essential in the future. Even though human–machine interaction was considered to be one of the most important skills in the future, 89% of the participants thought that it would never replace radiologists. Conclusion: Because AI plays a vital role in radiology, it is important to ensure that radiologists and radiographers have at least a minimum understanding of the technology. Our findings show an acceptable level of knowledge regarding AI technology and support including AI applications in the curricula of medical and health sciences colleges.

