Development of an Open Source Software for Real Time Optical Motion Capture

Author(s):  
David Lunardi Flam ◽  
Joao Victor Boechat Gomide ◽  
Arnaldo De Albuquerque Araujo

Author(s):  
David Lunardi Flam ◽  
Daniel Pacheco de Queiroz ◽  
Thatyene Louise Alves de Souza Ramos ◽  
Arnaldo de Albuquerque Araujo ◽  
Joao Victor Boechat Gomide

2021 ◽  
Author(s):  
Patrick Slade ◽  
Ayman Habib ◽  
Jennifer L. Hicks ◽  
Scott L. Delp

Analyzing human motion is essential for diagnosing movement disorders and guiding rehabilitation interventions for conditions such as osteoarthritis, stroke, and Parkinson’s disease. Optical motion capture systems are the current standard for estimating kinematics but require expensive equipment located in a predefined space. While wearable sensor systems can estimate kinematics in any environment, existing systems are generally less accurate than optical motion capture. Further, many wearable sensor systems require a computer in close proximity and rely on proprietary software, making it difficult for researchers to reproduce experimental findings. Here, we present OpenSenseRT, an open-source and wearable system that estimates upper and lower extremity kinematics in real time by using inertial measurement units and a portable microcontroller. We compared the OpenSenseRT system to optical motion capture and found an average RMSE of 4.4 degrees across 5 lower-limb joint angles during three minutes of walking (n = 5) and an average RMSE of 5.6 degrees across 8 upper extremity joint angles during a Fugl-Meyer task (n = 5). The open-source software and hardware are scalable, tracking between 1 and 14 body segments, with one sensor per segment. Kinematics are estimated in real time using a musculoskeletal model and inverse kinematics solver. The computation frequency depends on the number of tracked segments but is sufficient for real-time measurement for many tasks of interest; for example, the system can track up to 7 segments at 30 Hz in real time. The system uses off-the-shelf parts costing approximately $100 USD plus $20 for each tracked segment. The OpenSenseRT system is accurate, low-cost, and simple to replicate, enabling movement analysis in labs, clinics, homes, and free-living settings.
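The validation above reports accuracy as RMSE between IMU-estimated and optical-motion-capture joint angles. A minimal sketch of that comparison, using synthetic angle traces rather than real data (the function name `rmse` and the traces are illustrative, not from the paper's code):

```python
import numpy as np

def rmse(estimated, reference):
    """Root-mean-square error between two joint-angle traces (degrees)."""
    estimated = np.asarray(estimated, dtype=float)
    reference = np.asarray(reference, dtype=float)
    return float(np.sqrt(np.mean((estimated - reference) ** 2)))

# Synthetic traces: a mocap "ground truth" knee angle and an IMU
# estimate offset by a constant 3-degree bias.
t = np.linspace(0, 2 * np.pi, 100)
mocap_knee = 30 * np.sin(t)   # reference joint angle (deg)
imu_knee = mocap_knee + 3.0   # IMU estimate with a 3-deg bias

print(rmse(imu_knee, mocap_knee))  # a constant 3-deg bias yields RMSE of exactly 3.0
```

In the paper, this metric is averaged over multiple joint angles and subjects (e.g., 4.4 degrees across 5 lower-limb angles during walking).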


2017 ◽  
Vol 33 (6-8) ◽  
pp. 993-1003 ◽  
Author(s):  
Shihong Xia ◽  
Le Su ◽  
Xinyu Fei ◽  
Han Wang

2021 ◽  
Author(s):  
Mazen Al Borno ◽  
Johanna O'Day ◽  
Vanessa Ibarra ◽  
James Dunne ◽  
Ajay Seth ◽  
...  

Background: The ability to measure joint kinematics in natural environments over long durations using inertial measurement units (IMUs) could enable at-home monitoring and personalized treatment of neurological and musculoskeletal disorders. However, drift, or the accumulation of error over time, inhibits the accurate measurement of movement over long durations. We sought to develop an open-source workflow to estimate lower extremity joint kinematics from IMU data that is accurate and capable of assessing and mitigating drift. Methods: We computed IMU-based estimates of kinematics using sensor fusion and an inverse kinematics approach with a constrained biomechanical model. We measured kinematics for 11 subjects as they performed two 10-minute trials: walking and a repeated sequence of varied lower-extremity movements. To validate the approach, we compared the joint angles computed with IMU orientations to the joint angles computed from optical motion capture using root mean square (RMS) difference and Pearson correlations, and estimated drift using a linear regression on each subject's RMS differences over time. Results: IMU-based kinematic estimates agreed with optical motion capture; median RMS differences over all subjects and all minutes were between 3 and 6 degrees for all joint angles except hip rotation, and correlation coefficients were moderate to strong (r = 0.60 to 0.87). We observed minimal drift in the RMS differences over ten minutes; the average slopes of the linear fits to these data were near zero (-0.14 to 0.17 deg/min). Conclusions: Our workflow produced joint kinematics consistent with those estimated by optical motion capture, and could mitigate kinematic drift even in trials of continuous walking without rest, obviating the need for the explicit sensor recalibration (e.g., sitting or standing still for a few seconds, or zero-velocity updates) used in current drift-mitigation approaches. This could enable long-duration measurements, bringing the field one step closer to estimating kinematics in natural environments.
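The drift assessment described above fits a line to each subject's per-minute RMS differences and reads drift off the slope. A minimal sketch of that computation (the function name `drift_slope` and the RMS values below are illustrative assumptions, not the authors' code or data):

```python
import numpy as np

def drift_slope(rms_per_minute):
    """Slope (deg/min) of a least-squares line through per-minute RMS differences."""
    minutes = np.arange(1, len(rms_per_minute) + 1)
    slope, _intercept = np.polyfit(minutes, rms_per_minute, deg=1)
    return float(slope)

# Hypothetical per-minute RMS differences (degrees) over a 10-minute trial:
rms_diffs = [4.1, 4.0, 4.2, 4.1, 4.3, 4.2, 4.1, 4.3, 4.2, 4.2]
print(drift_slope(rms_diffs))  # near zero, indicating minimal drift
```

A slope near zero means the IMU estimate is not diverging from optical motion capture over time, which is the paper's criterion for minimal drift (reported slopes were -0.14 to 0.17 deg/min).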


2007 ◽  
Vol 6 (4) ◽  
pp. 11-20 ◽  
Author(s):  
Frank Hülsken ◽  
Christian Eckes ◽  
Roland Kuck ◽  
Jörg Unterberg ◽  
Sophie Jörg

We report on the workflow for the creation of realistic virtual anthropomorphic characters. 3D models of human heads have been reconstructed from real people by following a structured-light approach to 3D reconstruction. We describe how these high-resolution models have been simplified and articulated with blend shape and mesh skinning techniques to ensure real-time animation. The full-body models have been created manually based on photographs. We present a system for capturing whole-body motions, including the fingers, based on an optical motion capture system with 6-DOF rigid bodies and cybergloves. The motion capture data was processed in one system, mapped to a virtual character, and visualized in real time. We developed tools and methods for quick post-processing. To demonstrate the viability of our system, we captured a library consisting of more than 90 gestures.
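The blend-shape technique mentioned above deforms a neutral mesh by adding a weighted sum of per-shape vertex offsets. A minimal sketch under that standard formulation (the function name, the tiny 4-vertex "mesh", and the single target shape are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def apply_blend_shapes(neutral, shapes, weights):
    """Blend a neutral mesh (N x 3 vertex array) with weighted shape offsets."""
    result = neutral.astype(float).copy()
    for shape, w in zip(shapes, weights):
        result += w * (shape - neutral)  # add this shape's weighted offset
    return result

neutral = np.zeros((4, 3))                          # tiny 4-vertex "mesh"
smile = neutral.copy()
smile[0] = [0.0, 1.0, 0.0]                          # one target shape: move vertex 0 up
blended = apply_blend_shapes(neutral, [smile], [0.5])
# vertex 0 moves halfway toward the target; all other vertices stay put
```

Because the deformation is a fixed linear combination per vertex, it maps well onto real-time animation, which is why the paper pairs it with mesh skinning for the simplified head models.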

