Touch Gesture Recognition System based on 1D Convolutional Neural Network with Two Touch Sensor Orientation Settings

Author(s):  
Joo-Hye Park ◽  
Ju-Hwan Seo ◽  
Young-Hoon Nho ◽  
Dong-Soo Kwon
Sensors ◽  
2021 ◽  
Vol 21 (7) ◽  
pp. 2540
Author(s):  
Zhipeng Yu ◽  
Jianghai Zhao ◽  
Yucheng Wang ◽  
Linglong He ◽  
Shaonan Wang

In recent years, surface electromyography (sEMG)-based human–computer interaction has been developed to improve people's quality of life. Gesture recognition based on the instantaneous values of sEMG has the advantages of accurate prediction and low latency. However, the low generalization ability of such hand gesture recognition methods limits their application to new subjects and new hand gestures, and imposes a heavy training burden. To address this, a convolutional neural network-based transfer learning (TL) strategy for instantaneous gesture recognition is proposed to improve the generalization performance of the target network. CapgMyo and NinaPro DB1 are used to evaluate the validity of the proposed strategy. Compared with the non-transfer-learning (non-TL) strategy, the proposed strategy improves the average accuracy of new-subject and new-gesture recognition by 18.7% and 8.74%, respectively, when up to three repeated gestures are employed. The TL strategy also reduces the training time by a factor of three. Experiments verify the transferability of spatial features and the validity of the proposed strategy in improving the recognition accuracy for new subjects and new gestures while reducing the training burden. The proposed TL strategy provides an effective way of improving the generalization ability of gesture recognition systems.
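The generic recipe behind such a TL strategy is to reuse the convolutional feature extractor trained on the source subjects and retrain only the classification head on the new subject or new gestures. The abstract does not specify the network architecture, so the model below (a small 1-D CNN for multi-channel sEMG samples) and all layer sizes are illustrative assumptions, in PyTorch:

```python
import torch
import torch.nn as nn

# Hypothetical source network: a 1-D conv feature extractor plus a linear
# classifier head. The authors' exact architecture is not given in the
# abstract; this only illustrates the generic transfer-learning recipe.
class GestureNet(nn.Module):
    def __init__(self, n_channels=10, n_classes=8):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # global pooling over time
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):
        return self.classifier(self.features(x).squeeze(-1))

def transfer(source_net, n_new_classes):
    """Freeze the (transferable) spatial features, replace the head."""
    for p in source_net.features.parameters():
        p.requires_grad = False                 # keep transferred features
    source_net.classifier = nn.Linear(64, n_new_classes)  # retrained head
    return source_net

net = transfer(GestureNet(), n_new_classes=5)
trainable = sum(p.numel() for p in net.parameters() if p.requires_grad)
```

Because only the small head is updated, far fewer parameters are trained per new subject, which is consistent with the reported reduction in training time.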


2018 ◽  
Vol 2018 ◽  
pp. 1-10 ◽  
Author(s):  
Saad Albawi ◽  
Oguz Bayat ◽  
Saad Al-Azawi ◽  
Osman N. Ucan

Recently, social touch gesture recognition has been considered an important topic for the touch modality, as it can lead to highly efficient and realistic human-robot interaction. In this paper, a deep convolutional neural network is selected to implement a social touch recognition system from raw input samples (sensor data) only. The touch gesture recognition is performed on a dataset previously recorded from numerous subjects performing various social gestures. This dataset is dubbed the Corpus of Social Touch, where touch was performed on a mannequin arm. A leave-one-subject-out cross-validation method is used to evaluate system performance. The proposed method can recognize gestures in nearly real time after acquiring a minimum number of frames (on average, between 0.2% and 4.19% of the original frame lengths) with a classification accuracy of 63.7%. The achieved classification accuracy is competitive with existing algorithms. Furthermore, the proposed system outperforms other classification algorithms in terms of classification ratio and touch recognition time without data preprocessing on the same dataset.
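Leave-one-subject-out (LOSO) cross-validation, as used in this evaluation, holds out all samples of one subject per fold and trains on the rest, so reported accuracy reflects generalization to unseen subjects. A minimal sketch of the splitting logic (the toy subject labels are illustrative, not from the dataset):

```python
import numpy as np

# Leave-one-subject-out split: each subject is held out once as the test
# set while all remaining subjects form the training set.
def loso_splits(subject_ids):
    subject_ids = np.asarray(subject_ids)
    for subj in np.unique(subject_ids):
        test_mask = subject_ids == subj
        yield np.where(~test_mask)[0], np.where(test_mask)[0]

# Toy example: 6 samples from 3 subjects -> 3 folds.
subjects = [1, 1, 2, 2, 3, 3]
folds = list(loso_splits(subjects))
```

Each yielded pair is (train indices, test indices); per-fold accuracies are then averaged across subjects.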


2020 ◽  
Vol 9 (3) ◽  
pp. 996-1004 ◽  
Author(s):  
Muhammad Biyan Priatama ◽  
Ledya Novamizanti ◽  
Suci Aulia ◽  
Erizka Banuwati Candrasari

Public services are available to all communities, including people with disabilities. One obstacle that prevents persons with disabilities from participating in various community activities and enjoying the public services available to the community is the barrier to information and communication. One way to communicate with people with disabilities is through hand gestures. Hand gesture recognition technology is therefore needed to help the public interact with people with disabilities. This study proposes a reliable hand gesture recognition system using the convolutional neural network method. First, pre-processing is carried out to separate the foreground from the background. The foreground is then transformed using the discrete wavelet transform (DWT), and the most significant subband is retained. The last step is image classification with a convolutional neural network. The training and test sets contain 400 and 100 images, respectively, covering five classes: A, B, C, #5, and pointing. The resulting hand gesture recognition system achieved an accuracy of 100% on dataset A and 90% on dataset B.
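The DWT step decomposes the foreground image into subbands and keeps the most significant one (typically the LL approximation subband, which concentrates most of the image energy at half the resolution). The paper's wavelet choice is not stated in the abstract, so the one-level Haar decomposition below, with simple averaging rather than orthonormal scaling, is an illustrative assumption:

```python
import numpy as np

# One-level 2-D Haar-style DWT, keeping only the LL (approximation)
# subband. Averaging is used instead of the orthonormal 1/sqrt(2)
# scaling, which only changes the subband's overall scale.
def haar_ll(img):
    img = np.asarray(img, dtype=float)
    h, w = img.shape
    img = img[: h - h % 2, : w - w % 2]       # make dimensions even
    rows = (img[0::2, :] + img[1::2, :]) / 2  # low-pass along rows
    ll = (rows[:, 0::2] + rows[:, 1::2]) / 2  # low-pass along columns
    return ll                                 # half-resolution subband

img = np.arange(16, dtype=float).reshape(4, 4)
ll = haar_ll(img)   # 2x2 approximation of the 4x4 input
```

The resulting LL subband is what would be fed to the CNN classifier in place of the full-resolution foreground.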


2021 ◽  
Author(s):  
Mark Benedict D. Jarabese ◽  
Charlie S. Marzan ◽  
Jenelyn Q. Boado ◽  
Rushaine Rica Mae F. Lopez ◽  
Lady Grace B. Ofiana ◽  
...  

Author(s):  
Muhammad Muaaz ◽  
Ali Chelli ◽  
Martin Wulf Gerdes ◽  
Matthias Pätzold

A human activity recognition (HAR) system acts as the backbone of many human-centric applications, such as active assisted living and in-home monitoring for elderly and physically impaired people. Although existing Wi-Fi-based human activity recognition methods report good results, their performance is affected by changes in the ambient environment. In this work, we present Wi-Sense, a human activity recognition system that uses a convolutional neural network (CNN) to recognize human activities based on environment-independent fingerprints extracted from the Wi-Fi channel state information (CSI). First, Wi-Sense captures the CSI by using a standard Wi-Fi network interface card. Wi-Sense applies the CSI ratio method to reduce the noise and the impact of the phase offset. In addition, it applies principal component analysis to remove redundant information. This step not only reduces the data dimension but also removes the environmental impact. Thereafter, we compute the spectrogram of the processed data, which reveals environment-independent, time-variant micro-Doppler fingerprints of the performed activity. We use these spectrogram images to train a CNN. We evaluate our approach on a human activity data set collected from nine volunteers in an indoor environment. Our results show that Wi-Sense can recognize these activities with an overall accuracy of 97.78%. To stress the applicability of the proposed Wi-Sense system, we provide an overview of the standards involved in health information systems and systematically describe how the Wi-Sense HAR system can be integrated into the eHealth infrastructure.
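The PCA step in this pipeline projects the multi-subcarrier CSI streams onto their dominant principal components, discarding redundant and static (environment-dependent) variation before the spectrogram is computed. A minimal SVD-based sketch of that dimensionality-reduction step, where the data shape and the number of retained components are illustrative assumptions:

```python
import numpy as np

# Sketch of a PCA reduction step on CSI amplitude streams: keep the
# dominant components, discard redundant/static information. The number
# of components retained in the paper is not stated here; 3 is assumed.
def pca_reduce(csi, n_components=3):
    csi = np.asarray(csi, dtype=float)
    centered = csi - csi.mean(axis=0)         # remove the static offset
    # SVD-based PCA: rows of vt are principal directions, ordered by
    # decreasing singular value (i.e., decreasing explained variance).
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:n_components].T     # projected time series

rng = np.random.default_rng(0)
csi = rng.normal(size=(256, 30))   # 256 time samples, 30 subcarriers
reduced = pca_reduce(csi)          # (256, 3) time series
```

A spectrogram (short-time Fourier transform) of the reduced time series would then expose the micro-Doppler fingerprints used as CNN input.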


Author(s):  
Preeti Arora ◽  
Vinod M Kapse ◽  
Sapna Sinha ◽  
Saksham Gera
