Identify Finger Rotation Angles With ArUco Markers and Action Cameras

2021 ◽  
Author(s):  
Tianyun Yuan ◽  
Yu (Wolf) Song ◽  
Gerald A. Kraan ◽  
Richard H. M. Goossens

Abstract Measuring the motion of human hand joints is a challenging task due to the high number of degrees of freedom (DOFs). In this study, we proposed a low-cost hand tracking system built on action cameras and ArUco markers to measure finger joint rotation angles. The lens distortion of each camera was first corrected via intra-calibration, and the videos of different cameras were aligned to the reference camera using a dynamic time warping based method. Two methods were proposed and implemented for extracting the rotation angles of finger joints: one is based on the 3D positions of the markers obtained via inter-calibration between cameras, named the pos-based method; the other is based on the relative marker orientation information from individual cameras, named the rot-based method. An experiment was conducted to evaluate the effectiveness of the proposed system. The right hand of a volunteer was included in this practical study: the movement of the fingers was recorded, and the finger rotation angles were calculated with each of the two proposed methods. The results indicated that although the rot-based method may collect fewer data points than the pos-based method, it was more stable and reliable. Therefore, the rot-based method is recommended for measuring finger joint rotation in practical setups.
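The rot-based method described above relies on the relative orientation of two markers seen by the same camera. A minimal sketch of that idea, assuming the marker poses have already been recovered as 3x3 rotation matrices (e.g., from ArUco pose estimation) in a shared camera coordinate system; the function name and thresholds are illustrative, not taken from the paper:

```python
import numpy as np

def relative_rotation_angle(R_parent, R_child):
    """Angle (degrees) of the relative rotation between two marker poses.

    R_parent, R_child: 3x3 rotation matrices of the markers attached to
    adjacent finger segments, both expressed in the same camera frame.
    """
    R_rel = R_parent.T @ R_child  # orientation of child relative to parent
    # Recover the rotation angle from the trace of the relative rotation
    cos_theta = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    return np.degrees(np.arccos(cos_theta))

# Example: child marker rotated 30 degrees about the shared z-axis
c, s = np.cos(np.radians(30)), np.sin(np.radians(30))
Rz = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
print(round(relative_rotation_angle(np.eye(3), Rz), 1))  # 30.0
```

Because the relative rotation is computed per camera, this sketch needs no inter-camera calibration, which matches the paper's observation that the orientation-based approach is usable from individual views.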

Author(s):  
Tianyun Yuan ◽  
Yu Song ◽  
Gerald A. Kraan ◽  
Richard HM Goossens

Abstract Measuring the motions of human hand joints is often a challenge due to the high number of degrees of freedom. In this study, we proposed a hand tracking system utilizing action cameras and ArUco markers to continuously measure the rotation angles of hand joints. Three methods were developed to estimate the joint rotation angles. The pos-based method transforms marker positions to a reference coordinate system (RCS) and extracts a hand skeleton to identify the rotation angles. Similarly, the orient-x-based method calculates the rotation angles from the transformed x-orientations of the detected markers in the RCS. In contrast, the orient-mat-based method first identifies the rotation angles in each camera coordinate system using the detected orientations, and then synthesizes the results for each joint. Experiment results indicated that the repeatability errors with one camera, across different marker sizes, were around 2.64 to 27.56 degrees using the marker positions and 0.60 to 2.36 degrees using the marker orientations. When multiple cameras were employed to measure the joint rotation angles, the angles measured by the three methods were comparable with those measured by a goniometer, although larger deviations occurred when using the pos-based method. Further analysis indicated that the orient-mat-based method can describe more types of joint rotations, and its effectiveness was verified by capturing the hand movements of several participants. Thus, it is recommended for measuring joint rotation angles in practical setups.
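The pos-based method above extracts a hand skeleton in a common reference coordinate system and reads joint angles off it. A minimal sketch of the skeleton step, assuming three marker centers already transformed into the RCS; the function name is hypothetical:

```python
import numpy as np

def joint_angle_from_positions(p_prox, p_joint, p_dist):
    """Flexion angle (degrees) at a joint from three 3D marker positions.

    p_prox, p_joint, p_dist: marker centers on the proximal segment, the
    joint, and the distal segment, all expressed in the same RCS.
    """
    u = np.asarray(p_prox, float) - np.asarray(p_joint, float)
    v = np.asarray(p_dist, float) - np.asarray(p_joint, float)
    cos_a = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

# A joint flexed to a right angle
print(joint_angle_from_positions([0, 1, 0], [0, 0, 0], [1, 0, 0]))  # 90.0
```

This also illustrates why position errors propagate strongly here: the angle depends on two short difference vectors, so small 3D position noise on fingertip-scale segments can produce the larger deviations the abstract reports for the pos-based method.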


2012 ◽  
Vol 235 ◽  
pp. 68-73
Author(s):  
Hai Bo Pang ◽  
You Dong Ding

Hand gestures provide an attractive alternative to cumbersome interface devices for human–computer interaction, and many hand gesture recognition methods based on visual analysis have been proposed. In our research, we exploit multiple cues, including divergence features, vorticity features, and the hand motion direction vector. Divergence and vorticity are derived from the optical flow for hand gesture recognition in videos, and these features are then processed with principal component analysis. The hand tracking algorithm finds the hand centroid in every frame and computes the hand motion direction vector. Finally, we introduced a dynamic time warping method to verify the robustness of our features. The experimental results demonstrate that the proposed approach yields a satisfactory recognition rate.
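Dynamic time warping, used above to match feature sequences of different lengths, can be sketched in its classic dynamic-programming form; this is a generic illustration of the algorithm on 1D feature sequences, not the paper's specific implementation:

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic time warping distance between two 1D sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)  # accumulated-cost matrix
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Each cell extends the cheapest of the three allowed moves
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# A gesture feature and a time-stretched copy align with zero cost
print(dtw_distance([0, 1, 2, 3], [0, 0, 1, 2, 3, 3]))  # 0.0
```

The warping path makes the comparison invariant to the speed at which a gesture is performed, which is why DTW is a natural robustness check for motion-derived features.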


2021 ◽  
Vol 38 (2) ◽  
pp. 369-377
Author(s):  
Güneş Ekim ◽  
Ayten Atasoy ◽  
Nuri İkizler

Patients with motor neuron conditions such as paralysis, locked-in syndrome, and amyotrophic lateral sclerosis can see and hear what is happening in their environment, but cannot communicate with it. For these patients, who have no physical function other than eye movements, being able to express their needs, feelings, and thoughts is very important. Therefore, a system that converts eye-blink signals to speech was developed in this study. The main goals of the designed system are high accuracy, low cost, high speed, and independence from environmental factors; it is also important that it causes as little discomfort to the patient as possible. Using Morse-coded signals generated by voluntary eye-blinks, captured with the single-channel wireless NeuroSky MindWave Mobile device, eliminates the need for cost-increasing equipment such as a camera or eye tracker and removes dependence on environmental factors such as light. With Dynamic Time Warping (DTW), an algorithm that works quickly and accurately in the time domain and requires no training process was implemented. In this way, speech output was produced with impressive accuracy.
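The Morse-coding stage described above can be sketched as a simple decoder that classifies blink durations into dots and dashes and groups them into letters at long pauses. The duration thresholds, function name, and abbreviated Morse table below are illustrative assumptions, not values from the paper:

```python
MORSE = {"...": "S", "---": "O", ".-": "A"}  # abbreviated table for the sketch

def blinks_to_text(durations, gaps, dash_threshold=0.4, letter_gap=1.0):
    """Decode voluntary eye-blinks into Morse letters.

    durations: blink lengths in seconds (short = dot, long = dash);
    gaps: pause after each blink (a long pause ends the current letter).
    """
    text, symbol = [], ""
    for d, g in zip(durations, gaps):
        symbol += "-" if d >= dash_threshold else "."
        if g >= letter_gap:  # letter boundary reached
            text.append(MORSE.get(symbol, "?"))
            symbol = ""
    return "".join(text)

# Three short, three long, three short blinks -> "SOS"
print(blinks_to_text(
    [0.1, 0.1, 0.1, 0.6, 0.6, 0.6, 0.1, 0.1, 0.1],
    [0.2, 0.2, 1.5, 0.2, 0.2, 1.5, 0.2, 0.2, 1.5]))  # SOS
```

In the actual system, DTW replaces fixed thresholds by matching the recorded blink signal against templates, which tolerates the natural variation in how quickly each patient blinks.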


Technologies ◽  
2019 ◽  
Vol 7 (3) ◽  
pp. 56 ◽  
Author(s):  
Ioannis Rallis ◽  
Eftychios Protopapadakis ◽  
Athanasios Voulodimos ◽  
Nikolaos Doulamis ◽  
Anastasios Doulamis ◽  
...  

The UNESCO convention for the safeguarding of Intangible Cultural Heritage (ICH) highlights that intangible elements of cultural heritage are equally important to tangible ones. One of the most important domains of ICH is folkloric dance. A dance choreography is a time-varying 3D process (4D modelling), which includes dynamic co-interactions among different actors, emotional and style attributes, and supplementary elements such as music tempo and costumes. Current research focuses on the use of depth acquisition sensors to handle kinesiology issues. Skeleton data extracted in real time contain a significant amount of information (data and metadata), allowing for various choreography-based analytics. In this paper, a trajectory interpretation method for Greek folkloric dances is presented. We focus on matching trajectory patterns from a choreographic database to new ones originating from different sensor types, such as VICON and Kinect II. A Dynamic Time Warping (DTW) algorithm is then proposed to identify similarities and dissimilarities among the choreographic trajectories. The goal is to evaluate the performance of the low-cost Kinect II sensor for dance choreography against the accurate but high-cost VICON-based choreographies. Experiments on real-life dances were carried out to show the effectiveness of the proposed DTW methodology and the ability of Kinect II to localize dances in 3D space.
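Comparing a Kinect II trajectory against a VICON reference, as described above, amounts to running DTW with a multivariate local cost: the Euclidean distance between 3D skeleton points replaces the scalar difference. A minimal sketch under that assumption (the function name and sample data are illustrative):

```python
import numpy as np

def dtw_trajectory(A, B):
    """DTW alignment cost between two 3D trajectories (N x 3 and M x 3),
    using the Euclidean distance between points as the local cost."""
    A, B = np.asarray(A, float), np.asarray(B, float)
    n, m = len(A), len(B)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(A[i - 1] - B[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# The same path sampled at different rates (e.g., VICON vs. Kinect II
# frame rates) still aligns with zero cost
ref = [[0, 0, 0], [1, 0, 0], [2, 0, 0]]
captured = [[0, 0, 0], [0, 0, 0], [1, 0, 0], [2, 0, 0], [2, 0, 0]]
print(dtw_trajectory(ref, captured))  # 0.0
```

A low alignment cost between the two sensors' trajectories is exactly the evidence the paper seeks for the low-cost Kinect II being adequate for choreography capture.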

