Experimental verification of turbidity tolerance of stereo-vision-based 3D pose estimation system

2018 ◽ Vol 24 (3) ◽ pp. 756-779
Author(s): Myo Myint, Khin Nwe Lwin, Naoki Mukada, Daiki Yamada, Takayuki Matsuno, ...

2021 ◽ Vol 2021 ◽ pp. 1-17
Author(s): Cui Li, Derong Chen, Jiulu Gong, Yangyu Wu

Many objects in the real world have circular features. In general, a circular feature's pose is represented by a 5-DoF (degree-of-freedom) vector ξ = [X, Y, Z, α, β]ᵀ. Measuring the accuracy of a circular feature's pose in each direction, and the correlation between directions, is a difficult task. This paper proposes a closed-form solution for estimating the accuracy of the pose transformation of a circular feature, using the covariance matrix of ξ as the accuracy measure. The relationship between the pose of a 3D object's circular feature and the observed 2D points is analyzed to yield an implicit function; the Gauss–Newton method is then employed to compute the partial derivatives of this function with respect to each point, after which the covariance matrix is computed from both the 2D points and their extraction error. In addition, the method uses the covariance matrix of the 5-DoF pose variables to optimize the pose estimator: based on the pose covariance, a minimize-mean-square-error (Min-MSE) metric is introduced to guide the selection of good 2D imaging points, reducing the total amount of noise introduced into the pose estimator. This work thus provides an accuracy-estimation method for 2D–3D object pose estimation using circular features. Finally, the effectiveness of the accuracy-estimation method is validated on both random data sets and synthetic images, and various synthetic image sequences illustrate the performance and advantages of the proposed pose optimization method.
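The covariance propagation the abstract describes can be sketched with standard first-order (Gauss–Newton) error propagation: assuming isotropic 2D point-extraction noise with standard deviation σ and a Jacobian J of the image points with respect to the pose ξ, the pose covariance is approximately σ²(JᵀJ)⁻¹. The Jacobian and score function below are hypothetical stand-ins, not the paper's actual derivation from its implicit function.

```python
import numpy as np

def pose_covariance(J, sigma_px):
    """First-order pose covariance from a least-squares (Gauss-Newton) fit.

    J        : (2N, 5) Jacobian of the N projected 2D points w.r.t. the
               5-DoF pose xi = [X, Y, Z, alpha, beta]^T
    sigma_px : std. dev. of isotropic 2D point-extraction noise (pixels)

    Returns the (5, 5) covariance matrix sigma^2 * (J^T J)^-1.
    """
    JTJ = J.T @ J
    return sigma_px ** 2 * np.linalg.inv(JTJ)

def min_mse_score(J):
    """Min-MSE-style scalar score for a candidate 2D point set: the trace
    of the unit-noise pose covariance. Smaller means less noise reaches
    the pose estimate."""
    return float(np.trace(np.linalg.inv(J.T @ J)))
```

Comparing `min_mse_score` across candidate subsets of imaging points mirrors the Min-MSE selection idea: keep the subset whose Jacobian yields the smallest propagated pose variance.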


2016 ◽ Vol 22 (1) ◽ pp. 8-16
Author(s): Wijenayake Udaya, Sung-In Choi, Soon-Yong Park

2017 ◽ Vol 56 (24) ◽ pp. 6822
Author(s): Zhifeng Luo, Ke Zhang, Zhigang Wang, Jian Zheng, Yixin Chen

Author(s): Jun Liu, Henghui Ding, Amir Shahroudy, Ling-Yu Duan, Xudong Jiang, ...

2019 ◽ Vol 5 (1) ◽ pp. 9-12
Author(s): Jyothsna Kondragunta, Christian Wiede, Gangolf Hirtz

Abstract: Better handling of neurological or neurodegenerative disorders such as Parkinson's Disease (PD) is only possible with early identification of the relevant symptoms. Although the disease itself cannot be cured, its effects can be delayed with proper care and treatment; early identification of PD symptoms therefore plays a key role. Recent studies state that gait abnormalities are clearly evident when people suffering from PD perform dual cognitive tasks, and research has also shown that early identification of abnormal gait leads to earlier identification of PD. Novel technologies provide many options for the identification and analysis of human gait, and can be broadly classified as wearable and non-wearable. As PD is more prominent in elderly people, wearable sensors may hinder a person's natural movement and are considered out of the scope of this paper. Non-wearable technologies, especially Image Processing (IP) approaches, capture data on a person's gait through optical sensors. Existing IP approaches to gait analysis are restricted by parameters such as angle of view, background, and occlusions caused by objects or by the subject's own body movements. To date, no research has analyzed gait through 3D pose estimation. As deep learning has proven efficient for 2D pose estimation, we propose 3D pose estimation together with a proper dataset. This paper outlines the advantages and disadvantages of state-of-the-art methods for gait analysis in early PD identification, and further outlines the importance of extracting gait parameters from 3D pose estimation using deep learning.
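As a minimal illustration of the kind of gait parameters that 3D pose estimation makes available, the sketch below computes step length and cadence from 3D keypoints. The joint layout (metres, one (x, y, z) triple per joint, z vertical) and the function names are assumptions for illustration, not part of the paper's method.

```python
import numpy as np

def step_length(left_ankle, right_ankle):
    """Ground-plane distance between the two 3D ankle positions at a
    heel strike, assuming z is the vertical axis (units: metres)."""
    d = np.asarray(left_ankle, float) - np.asarray(right_ankle, float)
    return float(np.linalg.norm(d[:2]))  # ignore the vertical component

def cadence(n_steps, duration_s):
    """Steps per minute over a walking bout."""
    return 60.0 * n_steps / duration_s
```

Parameters like these, tracked over time, are the kind of quantities a gait-analysis pipeline would compare against clinical thresholds for abnormal gait.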

