CNN-LSTM Network Based Prediction of Human Joint Angles Using Multi-Band SEMG and Historical Angles

Author(s):  
Yuze Jiao ◽  
Weiqun Wang ◽  
Zeng-Guang Hou ◽  
Shixin Ren ◽  
Jiaxing Wang ◽  
...  
Author(s):  
D. Panariello ◽  
S. Grazioso ◽  
T. Caporaso ◽  
A. Palomba ◽  
G. Di Gironimo ◽  
...  

2016 ◽  
Vol 20 (2) ◽  
pp. 498-507 ◽  
Author(s):  
Dai Meng ◽  
Todd Shoepe ◽  
Gustavo Vejarano

2020 ◽  
Vol 32 (5) ◽  
pp. 863-875
Author(s):  
Seigo Kimura ◽  
Ryuji Suzuki ◽  
Katsuki Machida ◽  
Rie Nishihama ◽  
Manabu Okui ◽  
...  

In recent years, the burden per worker has increased due to a decrease in the working population. Wearable assist suits have been developed as one way of addressing this problem. To bring assist suits into practical use, a motion judgment interface is needed to judge the wearer's motion. Therefore, in our study, a motion judgment algorithm is proposed for assist suits based on variable viscoelasticity. The proposed algorithm judges sitting, standing-up, stance, sitting-down, and gait using only the joint angle information of the suit, and it is verified using human joint angles obtained by motion capture. The motion judgment rate is 90% or more for sitting, standing-up, stance, and sitting-down, and 80% or more for gait, confirming the usefulness of the approach. Based on these results, further verification is performed on an actual machine. In a series of motions from sitting through standing-up and stance to gait, motion judgment succeeds five times out of five for sitting, standing-up, and stance, and once for gait. In a series of motions from sitting through standing-up and stance to sitting-down, motion judgment succeeds five times for sitting; five times for stance and sitting-down; and three times for standing-up. These results confirm that the proposed method can judge motion from angle information alone, although the success rate depends on the motion.
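The abstract reports judgment rates but not the decision rules themselves, so the following is only a minimal sketch of how a threshold-based judgment from joint angles alone might look in Python; the joint set, thresholds, velocity signs, and the JointAngles/judge_motion names are illustrative assumptions, not the paper's algorithm.

```python
# Hypothetical sketch: classify motion from the suit's joint angles only.
# Thresholds, joint choice, and state logic are illustrative assumptions;
# the abstract does not publish the actual decision rules.
from dataclasses import dataclass

@dataclass
class JointAngles:
    hip: float   # degrees, flexion positive
    knee: float  # degrees, flexion positive

def judge_motion(prev: JointAngles, curr: JointAngles, dt: float = 0.01) -> str:
    """Judge sitting, standing-up, stance, sitting-down, or gait."""
    hip_vel = (curr.hip - prev.hip) / dt    # deg/s
    knee_vel = (curr.knee - prev.knee) / dt
    if curr.hip > 70 and curr.knee > 70:
        if abs(hip_vel) < 5:
            return "sitting"            # deeply flexed, nearly still
        return "standing-up" if hip_vel < 0 else "sitting-down"
    if curr.hip < 20 and curr.knee < 20:
        # Near-extended joints: quiet stance vs. cyclic knee motion in gait.
        return "stance" if abs(knee_vel) < 10 else "gait"
    # Transitional posture: the velocity sign picks the transfer direction.
    return "standing-up" if hip_vel < 0 else "sitting-down"

# Example: hip extending rapidly from a seated posture -> "standing-up".
print(judge_motion(JointAngles(hip=85, knee=90), JointAngles(hip=84, knee=89)))
```

In practice a state machine with hysteresis would be needed to avoid chattering between classes; the sketch only shows the angle-only principle.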


Sensors ◽  
2020 ◽  
Vol 20 (4) ◽  
pp. 1185 ◽  
Author(s):  
Baoping Xiong ◽  
Nianyin Zeng ◽  
Yurong Li ◽  
Min Du ◽  
Meilan Huang ◽  
...  

Introduction: The human joint moment is a critical parameter for rehabilitation assessment and human-robot interaction, and it can be predicted using an artificial neural network (ANN) model. However, a challenge remains: the lack of an effective approach to determining the input variables for the ANN model in joint moment prediction, which determines the number of input sensors and the complexity of prediction. Methods: To address this research gap, this study develops a mathematical model based on the Hill muscle model to determine the online input variables of the ANN for the prediction of joint moments. In this method, the muscle activation, the muscle-tendon moment velocity and length in the Hill muscle model, and the muscle-tendon moment arm are translated into online measurable variables, i.e., muscle electromyography (EMG) and the angles and angular velocities of the joints spanned by the muscles. To test the predictive ability of these input variables, an ANN model is designed and trained to predict joint moments. The ANN model with the online measurable input variables is tested on experimental data collected from ten healthy subjects running at speeds of 2, 3, 4, and 5 m/s on a treadmill. The variance accounted for (VAF) between the predicted and inverse-dynamics moments is used to evaluate the prediction accuracy. Results: The results suggest that the method can predict joint moments with higher accuracy (mean VAF = 89.67±5.56%) than that obtained by using other joint angles and angular velocities as inputs (mean VAF = 86.27±6.6%), evaluated by jack-knife cross-validation. Conclusions: The proposed method provides a powerful tool for predicting joint moments from online measurable variables, which establishes a theoretical basis for optimizing the input sensors and detection complexity of the prediction system. It may facilitate research on exoskeleton robot control and real-time gait analysis in motor rehabilitation.
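As a rough illustration of this pipeline (EMG envelopes, joint angles, and angular velocities in; joint moment out; VAF as the accuracy metric), a minimal sketch using scikit-learn follows. The array shapes, network size, and random placeholder data are assumptions, not the paper's setup; the vaf function implements the standard definition, VAF = 100 * (1 - var(measured - predicted) / var(measured)).

```python
# Minimal sketch of the ANN joint-moment predictor. The random arrays
# below merely stand in for preprocessed experimental data; input
# dimensions and layer sizes are illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor

def vaf(measured: np.ndarray, predicted: np.ndarray) -> float:
    """Variance accounted for, in percent."""
    return 100.0 * (1.0 - np.var(measured - predicted) / np.var(measured))

rng = np.random.default_rng(0)
n = 2000
emg = rng.random((n, 4))         # EMG envelopes of four spanned muscles
angles = rng.random((n, 2))      # angles of the spanned joints (rad)
velocities = rng.random((n, 2))  # joint angular velocities (rad/s)
X = np.hstack([emg, angles, velocities])
y = rng.random(n)                # inverse-dynamics joint moment (N·m)

model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                     random_state=0).fit(X[:1500], y[:1500])
print(f"VAF = {vaf(y[1500:], model.predict(X[1500:])):.2f} %")
```

On real data the study uses jack-knife (leave-one-subject-out) cross-validation rather than the single train/test split shown here.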


2020 ◽  
Vol 17 (03) ◽  
pp. 1950039
Author(s):  
Xinwei Li ◽  
Su Liu ◽  
Ying Chang ◽  
Sujiao Li ◽  
Yuanjie Fan ◽  
...  

Exoskeletons for motion assistance have attracted increasing attention due to their advantages in rehabilitation and assistance in daily life. This research designed a method for estimating human joint torque from the kinetic human–machine interaction between the operator's elbow joint torque and the output of the exoskeleton. The human elbow joint torque estimate was obtained by a back propagation (BP) neural network with physiological and physical input elements, including shoulder posture, activation of elbow-related muscles, elbow joint position, and angular velocity. An elbow-powered exoskeleton was developed to verify the validity of the estimation. The average correlation coefficients between the estimated and measured three shoulder joint angles are 97.9%, 96.2%, and 98.1%, which shows that the estimated joint angles are consistent with the measured ones. The average root-mean-square error between the estimated elbow joint torque and the measured values is about 0.143 N·m. The experimental results show that the proposed strategy performs well in human joint torque estimation.
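A similar minimal sketch of the BP-network torque estimator is given below, evaluated with the correlation coefficient and root-mean-square error reported in the abstract; the input layout, network size, and placeholder data are assumptions rather than the paper's configuration.

```python
# Hypothetical sketch: estimate elbow torque from shoulder posture,
# muscle activations, elbow angle, and angular velocity. Placeholder
# data; dimensions and architecture are illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
n = 1000
shoulder = rng.random((n, 3))    # three shoulder joint angles (rad)
activation = rng.random((n, 2))  # activations of elbow-related muscles
elbow = rng.random((n, 2))       # elbow angle (rad) and velocity (rad/s)
X = np.hstack([shoulder, activation, elbow])
tau = rng.random(n)              # measured elbow joint torque (N·m)

net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                   random_state=1).fit(X[:800], tau[:800])
est = net.predict(X[800:])
r = np.corrcoef(tau[800:], est)[0, 1]                 # correlation coefficient
rmse = float(np.sqrt(np.mean((tau[800:] - est) ** 2)))  # RMSE in N·m
print(f"r = {r:.3f}, RMSE = {rmse:.3f} N·m")
```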


1995 ◽  
Vol 08 (01) ◽  
pp. 58-60 ◽  
Author(s):  
T. M. Caporn

Summary: The feline temporomandibular joint (TMJ) is inherently more stable than the canine or human joint through the close congruity of the feline mandibular fossa and condyle. Rostral luxation of the feline TMJ is resisted by a relatively large bony eminence. Traumatic luxations of the feline TMJ are therefore often associated with fractures of the mandibular fossa and/or condyle (1). The anatomy of the temporomandibular joint shows variations between species; these are highlighted by comparing the human, canine and feline temporomandibular articulations.

