Motion Recognition
Recently Published Documents


TOTAL DOCUMENTS

724
(FIVE YEARS 236)

H-INDEX

31
(FIVE YEARS 5)

Sensors ◽  
2022 ◽  
Vol 22 (1) ◽  
pp. 402
Author(s):  
Zhanjun Hao ◽  
Juan Niu ◽  
Xiaochao Dang ◽  
Zhiqiang Qiao

Motion recognition has a wide range of applications at present. Recently, motion recognition by analyzing the channel state information (CSI) in Wi-Fi packets has been favored by more and more scholars. Because CSI collected in the wireless signal environment of human activity usually carries a large amount of human-related information, a motion-recognition model trained for a specific person usually does not work well in predicting another person's motion. To deal with these differences, we propose a person-independent action-recognition model called WiPg, which is built from a convolutional neural network (CNN) and a generative adversarial network (GAN). Using CSI data for 14 yoga movements performed by 10 experimenters with different body types, the model was trained and tested, and recognition results independent of body type were obtained. The experimental results show that the average correct rate of WiPg can reach 92.7% for recognition of the 14 yoga poses, and that WiPg realizes "cross-personnel" movement recognition with excellent recognition performance.
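Approaches like the one described above typically convert the raw complex CSI estimates into normalized amplitude features before feeding them to a CNN. A minimal sketch of that common preprocessing step (array shapes and the function name are illustrative, not taken from the paper):

```python
import numpy as np

def csi_amplitude_features(csi: np.ndarray) -> np.ndarray:
    """Turn complex CSI samples (packets x subcarriers) into
    per-subcarrier normalized amplitude features for a CNN input."""
    amp = np.abs(csi)                        # amplitude of each complex channel estimate
    mean = amp.mean(axis=0, keepdims=True)   # per-subcarrier mean over packets
    std = amp.std(axis=0, keepdims=True) + 1e-8
    return (amp - mean) / std                # zero-mean, unit-variance per subcarrier

# Example: 100 packets, 30 subcarriers of synthetic complex CSI
rng = np.random.default_rng(0)
csi = rng.normal(size=(100, 30)) + 1j * rng.normal(size=(100, 30))
feats = csi_amplitude_features(csi)
```

The normalization removes per-subcarrier gain offsets, so the downstream classifier sees motion-induced variation rather than static channel bias.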


Life ◽  
2022 ◽  
Vol 12 (1) ◽  
pp. 64
Author(s):  
Dongdong Bu ◽  
Shuxiang Guo ◽  
He Li

The surface electromyography (sEMG) signal is widely used as a control source for upper limb exoskeleton rehabilitation robots. However, the traditional way of controlling the exoskeleton robot with the sEMG signal requires complex sEMG features to be specially extracted and calculated. Moreover, owing to the heavy computational load and individual differences, real-time control of the exoskeleton robot cannot be realized. Therefore, this paper proposes a novel method using an improved detection algorithm to recognize limb joint motion and detect joint angle based on sEMG images, aiming to obtain a high-security and fast-processing action-recognition strategy. In this paper, MobileNetV2 combined with the Ghost module was used as the feature-extraction network to obtain the pretraining model. Then, the target detection network Yolo-V4 was used to estimate the six movement categories of the upper limb joints and to predict the joint movement angles. The experimental results showed that the proposed motion-recognition method was feasible. Approximately 78 of every 100 pictures were correctly identified, and the processing time for a single picture on the PC side was 17.97 ms. For the training data, the mAP@0.5 reached 82.3%, and the mAP@0.5–0.95 reached 0.42; for the validation data, the average recognition accuracy reached 80.7%.
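The mAP@0.5 figure reported above is built on per-box Intersection over Union: a detection counts as a true positive when its IoU with a ground-truth box meets the threshold. A small, self-contained IoU helper (this is the standard computation, not code from the paper):

```python
def iou(box_a, box_b):
    """IoU of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)   # intersection area
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# Two half-overlapping 10x10 boxes: IoU = 50 / 150 = 1/3,
# which falls below the 0.5 threshold used for mAP@0.5.
score = iou((0, 0, 10, 10), (5, 0, 15, 10))
```

mAP@0.5–0.95 averages average precision over IoU thresholds from 0.5 to 0.95 in steps of 0.05, which is why it is reported as a stricter, lower number.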


Author(s):  
Qiming Li ◽  
Lu Xu ◽  
Xiaoyan Yang

Pose estimation is the basis and key of human motion recognition. In image-based two-dimensional human pose estimation, in order to reduce the adverse effects of mutual occlusion among multiple people and improve the accuracy of motion recognition, a structurally symmetrical two-dimensional multi-person pose estimation model combined with face detection is proposed in this paper. First, transfer learning is used to initialize each sub-branch network model. Then, MTCNN is used for face detection to predict the number of people in the image. According to the number of people, the image is input into the improved two-branch OpenPose network. In addition, a double-judgment algorithm is proposed to correct false detections by MTCNN. The experimental results show that, compared with TensorPose, the latest improved method based on OpenPose, the Average Precision (AP) (Intersection over Union [Formula: see text]) on the validation set is 8.8 higher. Furthermore, compared with OpenPose, the mean AP ([Formula: see text]) is 1.7 higher on the validation set and 1.3 higher on the Test-dev test set.
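The AP numbers quoted above follow COCO-style keypoint evaluation, where a match between a predicted and a ground-truth pose is scored by Object Keypoint Similarity (OKS) rather than box IoU. A minimal sketch of the standard OKS formula (the function name and argument layout are illustrative; `k` holds the per-keypoint falloff constants and `area` the object scale):

```python
import math

def oks(pred, gt, k, area):
    """Object Keypoint Similarity between predicted and ground-truth
    keypoints (lists of (x, y)), per-keypoint constants k, and object
    scale s^2 = area, as in COCO-style pose evaluation."""
    total = 0.0
    for (px, py), (gx, gy), ki in zip(pred, gt, k):
        d2 = (px - gx) ** 2 + (py - gy) ** 2       # squared pixel distance
        total += math.exp(-d2 / (2 * area * ki ** 2))
    return total / len(gt)                          # average over keypoints
```

A perfect prediction yields OKS = 1, and the score decays smoothly as keypoints drift, so AP at a given OKS threshold plays the same role for poses that AP at an IoU threshold plays for boxes.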


2021 ◽  
Vol 27 (11) ◽  
pp. 835-844
Author(s):  
Jae Hoon Son ◽  
Dong Hwi Kang ◽  
Dong-Hwan Hwang
