EIT-based Gesture Recognition Training with Augmented Reality

Author(s):  
Christian Giesser ◽  
Christian Gibas ◽  
Armin Gruenewald ◽  
Tanja Joan Eiler ◽  
Vanessa Schmuecker ◽  
...  
Author(s):  
Zeenat S. AlKassim ◽  
Nader Mohamed

In this chapter, the authors discuss a unique technology known as Sixth Sense Technology, highlighting its future opportunities for integrating the digital world with the real world. Challenges in implementing such technologies are also discussed, along with a review of the different possible implementation approaches. The review explores inventions in areas related to Sixth Sense Technology, namely augmented reality (AR), computer vision, image processing, gesture recognition, and artificial intelligence, and then categorizes and compares them. Lastly, recommendations are given for improving this unique technology, which has the potential to create a new trend in human-computer interaction (HCI) in the coming years.


2012 ◽  
Vol 7 (1) ◽  
pp. 468-472
Author(s):  
Yimin Chen ◽  
Qiming Li ◽  
Chen Huang ◽  
Congli Ye ◽  
Yun Li ◽  
...  

Author(s):  
Rafael Radkowski ◽  
Christian Stritzke

This paper presents a comparison between 2D and 3D interaction techniques for augmented reality (AR) applications. The interaction techniques are based on hand gestures and a computer-vision-based hand gesture recognition system. We compared 2D and 3D gestures for interaction in AR applications. The 3D recognition system is based on a video camera that provides a depth image in addition to each 2D color image, so spatial interactions become possible. Our main question during this work was: do depth images and 3D interaction techniques improve interaction with AR applications, and with virtual 3D objects in particular? To answer it, we tested and compared the hand gesture recognition systems. The results show two things: first, depth images enable more robust hand recognition and gesture identification; second, they are a strong indication that 3D hand gesture interaction techniques are more intuitive than 2D hand gesture interaction techniques. In summary, the results emphasize that depth images improve hand gesture interaction for AR applications.
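
The robustness gain the abstract attributes to depth images can be illustrated with a minimal sketch: unlike color, depth lets the system keep only pixels in the distance band where the interacting hand is expected. The function name, the band limits, and the toy depth values below are illustrative assumptions, not the authors' method.

```python
def segment_hand(depth_image, near_mm=400, far_mm=700):
    """Return a binary mask of pixels whose depth (in millimetres)
    falls inside the band where the interacting hand is expected."""
    return [
        [1 if near_mm <= d <= far_mm else 0 for d in row]
        for row in depth_image
    ]

# Toy 2x4 depth frame: background at ~900 mm, hand at ~500-650 mm.
depth = [
    [900, 900, 650, 640],
    [880, 500, 480, 910],
]
mask = segment_hand(depth)
# mask → [[0, 0, 1, 1], [0, 1, 1, 0]]
```

A color-only segmenter would have to model skin tone and lighting; the depth band sidesteps both, which is one plausible reading of why the depth-based recognizer was found more robust.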


Fingertip detection plays a central role in many vision-based applications. Recent technologies such as virtual reality and augmented reality build on fingertip detection as a foundation, and it is also valuable for human-computer interaction (HCI). Fingertip detection and tracking can therefore be applied in domains ranging from games to robot control, and from augmented reality to smart homes. The most interesting field for fingertip detection is gesture recognition, since, in the context of interaction with machines, gestures are among the simplest and most efficient means of communication. This paper analyses the various works done in the area of fingertip detection. A review of real-time fingertip detection methods is presented, covering the different techniques and tools, and some challenges and research directions are also highlighted. Many researchers use fingertip detection in HCI systems, which have applications in user identification, smart homes, and more. A comparison of results by different researchers is also included.
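
One family of real-time fingertip detectors of the kind such surveys cover is the k-curvature test: a contour point is a fingertip candidate when the angle between the vectors to its k-th neighbours is sharp. The sketch below is a hedged illustration of that idea only; the contour, k, and the angle threshold are made-up assumptions, not taken from any surveyed paper.

```python
import math

def k_curvature_fingertips(contour, k=2, max_angle_deg=55.0):
    """Return contour points whose k-neighbour angle is below the
    threshold, i.e. sharp protrusions such as fingertips."""
    n = len(contour)
    tips = []
    for i, (x, y) in enumerate(contour):
        ax, ay = contour[(i - k) % n]      # neighbour k steps back
        bx, by = contour[(i + k) % n]      # neighbour k steps ahead
        v1 = (ax - x, ay - y)
        v2 = (bx - x, by - y)
        dot = v1[0] * v2[0] + v1[1] * v2[1]
        norm = math.hypot(*v1) * math.hypot(*v2)
        if norm == 0:
            continue
        cos = max(-1.0, min(1.0, dot / norm))
        if math.degrees(math.acos(cos)) < max_angle_deg:
            tips.append((x, y))
    return tips

# Toy closed contour with one sharp apex at (2, 4).
contour = [(0, 0), (1, 2), (2, 4), (3, 2), (4, 0), (4, -1), (0, -1)]
k_curvature_fingertips(contour)
# → [(2, 4)]
```

In a real pipeline the contour would come from hand segmentation, and candidates would be further filtered (e.g. by distance from the palm centre) to reject knuckles and wrist corners.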


Author(s):  
Minghui Sun ◽  
Xinyu Wu ◽  
Zhihua Fan ◽  
Liyan Dong

Human-computer interaction (HCI) has developed rapidly in recent years, and more and more researchers are interested in applying HCI techniques to education. Compared with traditional real-world approaches, gesture recognition is considered a reasonable alternative since it is vivid and flexible. However, most educational equipment today provides augmented reality without any interaction. This paper presents a prototype that is not only based on an augmented reality system but also gives particular attention to interactive design. Accessibility is achieved through mobile devices and the dynamic switching of gesture recognition. With this interactive method, children are able to interact with virtual objects easily and naturally. Consequently, they can gain a deeper understanding of what they learn, and the quality of education is improved.


2021 ◽  
Vol 11 (21) ◽  
pp. 9789
Author(s):  
Jiaqi Dong ◽  
Zeyang Xia ◽  
Qunfei Zhao

Augmented reality assisted assembly training (ARAAT) is an effective and affordable technique for labor training in the automobile and electronics industries. In general, most ARAAT tasks are conducted through real-time hand operations. In this paper, we propose a dynamic gesture recognition and prediction algorithm that evaluates the standard and achievement of the hand operations for a given task in ARAAT. We consider that a given task can be decomposed into a series of hand operations, and each hand operation into several continuous actions. Each action is then associated with a standard gesture based on the practical assembly task, so that the standard and achievement of the actions included in the operations can be identified and predicted from the sequences of gestures instead of from the performance throughout the whole task. Based on practical industrial assembly, we specified five typical tasks, three typical operations, and six standard actions. We used Zernike moments combined with histograms of oriented gradients (HOG) to represent the 2D static features of standard gestures and linearly interpolated motion trajectories to represent the 3D dynamic features, and chose a directional pulse-coupled neural network as the classifier to recognize the gestures. In addition, we defined an action unit to reduce the dimensionality of the features and the computational cost. During gesture recognition, we optimized the gesture boundaries iteratively by calculating the score probability density distribution, reducing the interference of invalid gestures and improving precision. The proposed algorithm was evaluated on four datasets, and the experimental results showed increased recognition accuracy and reduced computational cost.
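
The linearly interpolated motion trajectories mentioned in the abstract imply a normalisation step: dynamic gestures of different durations must be resampled onto a fixed number of points before their 3D features can be compared. The sketch below shows one plausible form of that step under stated assumptions; the function name, parameters, and toy trajectory are illustrative, not taken from the paper.

```python
def resample_trajectory(points, n_samples):
    """Linearly interpolate a list of (x, y, z) points onto n_samples
    evenly spaced samples (assumes n_samples >= 2 and len(points) >= 2)."""
    m = len(points)
    out = []
    for i in range(n_samples):
        t = i * (m - 1) / (n_samples - 1)  # fractional index along path
        lo = int(t)
        hi = min(lo + 1, m - 1)
        frac = t - lo
        out.append(tuple(
            a + frac * (b - a)             # per-coordinate interpolation
            for a, b in zip(points[lo], points[hi])
        ))
    return out

# A 3-point hand path resampled to 5 points.
traj = [(0.0, 0.0, 0.0), (2.0, 2.0, 0.0), (4.0, 0.0, 2.0)]
resample_trajectory(traj, 5)
# → [(0.0, 0.0, 0.0), (1.0, 1.0, 0.0), (2.0, 2.0, 0.0), (3.0, 1.0, 1.0), (4.0, 0.0, 2.0)]
```

After resampling, every gesture yields a feature vector of identical length regardless of how quickly the operator performed it, which is what makes sequence-level comparison against standard gestures feasible.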

