Recovery Of Motion Parameters Using Optical Flow

Author(s):
Phil Greenway

Author(s):
Toru Tamaki

We propose a method for extracting human limb regions by combining optical-flow-based motion segmentation with nonlinear-optimization-based image registration. First, rotating limb regions with rough boundaries are extracted and motion parameters are estimated for an approximated model. The extracted region and estimated parameters then serve as initial values for a nonlinear optimization that minimizes the residuals between two successive frames and refines the motion parameters. Combining the two steps reduces computational cost and avoids the initialization problem of the optimization. Finally, using the estimated parameters, the limb region is extracted by a Bayesian classifier to obtain accurate region boundaries. Experimental results on real images are shown.
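The first step estimates motion parameters for a rotating limb from optical flow. As an illustrative sketch (not the paper's exact formulation), a pure rotation about an unknown joint centre c with angular velocity w produces the flow field v = w · perp(p − c), which is linear in the unknowns and can be fitted by least squares; the result could then seed the nonlinear refinement:

```python
import numpy as np

def fit_rotation_from_flow(pts, flow):
    """Hypothetical helper: least-squares fit of angular velocity w and
    rotation centre (cx, cy) from a pure-rotation flow field.
    Model: vx = -w*(y - cy),  vy = w*(x - cx)
    Linear in (w, b, c) with b = w*cy and c = -w*cx:
        vx = -w*y + b,   vy = w*x + c
    """
    x, y = pts[:, 0], pts[:, 1]
    n = len(pts)
    A = np.zeros((2 * n, 3))
    A[0::2, 0] = -y          # rows for vx
    A[0::2, 1] = 1.0
    A[1::2, 0] = x           # rows for vy
    A[1::2, 2] = 1.0
    rhs = flow.reshape(-1)   # interleaved (vx, vy) per point
    (w, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    centre = np.array([-c / w, b / w])
    return w, centre
```

The recovered (w, centre) can serve as the initial state for an optimizer that minimizes the inter-frame residual, which is the role the paper assigns to the segmentation stage.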


2015
Vol 2015
pp. 1-14
Author(s):
Ming Yang
Xiaolin Gu
Hao Lu
Chunxiang Wang
Lei Ye

A precise navigation map is crucial in many fields. This paper proposes a panorama-based method to detect and recognize lane markings and traffic signs on the road surface. First, to deal with the limited field of view and the occlusion problem, a vision-based sensing system is designed that consists of a surround view system and a panoramic system. Second, to detect and identify traffic signs on the road surface, a sliding-window-based detection method is proposed; template matching and an SVM (Support Vector Machine) are used to recognize the traffic signs. Third, to avoid the occlusion problem, vision-based ego-motion estimation is used to detect and remove other vehicles. Because surround view images contain little dynamic information and limited gray-scale variation, an improved ICP (Iterative Closest Point) algorithm is introduced to obtain the ego-motion parameters. For panoramic images, an optical flow algorithm is used: results from the surround view system help filter the optical flow and optimize the ego-motion parameters, and other vehicles are detected from the optical flow features. Experimental results show that the method handles different kinds of lane markings and traffic signs well.
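The detection stage combines a sliding window with template matching. A minimal sketch of that combination, assuming normalized cross-correlation (NCC) as the matching score (the paper does not specify its exact score function), looks like this:

```python
import numpy as np

def ncc_score(window, template):
    """Normalized cross-correlation between a window and a template."""
    w = window - window.mean()
    t = template - template.mean()
    denom = np.sqrt((w * w).sum() * (t * t).sum())
    return 0.0 if denom == 0 else float((w * t).sum() / denom)

def sliding_window_detect(image, template, stride=1):
    """Slide the template over the image and return the best-scoring
    top-left position and its NCC score. Illustrative only: a real
    pipeline would scan multiple scales and pass candidate windows
    to an SVM for sign recognition."""
    th, tw = template.shape
    best_score, best_pos = -2.0, None
    for y in range(0, image.shape[0] - th + 1, stride):
        for x in range(0, image.shape[1] - tw + 1, stride):
            s = ncc_score(image[y:y + th, x:x + tw], template)
            if s > best_score:
                best_score, best_pos = s, (y, x)
    return best_pos, best_score
```

In practice the windows that score above a threshold would be fed to the SVM classifier rather than keeping only the single best match.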


2004
Vol 37 (4)
pp. 767-779
Author(s):
Sang-Cheol Park
Hyoung-Suk Lee
Seong-Whan Lee

2014
Vol 2014
pp. 1-7
Author(s):
B. R. Wang
Y. L. Jin
D. L. Shao
Y. Xu

Image jitter occurs in video captured by an autonomous robot moving on a brick road, which reduces the precision of vision-based robot operation. To compensate for the jitter, affine transformation kinematics are established to obtain the six image motion parameters. A feature-point-pair detection method is designed based on the eigenvalues of the gradient matrix of feature windows, and the motion-parameter equations are solved by the least squares method using matching point pairs obtained from the optical flow. The condition number of the coefficient matrix is proposed to quantitatively analyze the effect of matching errors on parameter-solving errors. A Kalman filter is adopted to smooth the image motion parameters. Computed cases show that more point pairs yield more precise motion parameters. Integrated jitter-compensation software was developed with feature-point detection in subwindows, and practical experiments were conducted on two mobile robots. Results show that the compensation time is less than the frame sample time and that the Kalman filter is effective for compensating robot vision jitter.
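The six affine motion parameters can be solved from matched point pairs by least squares, and the condition number of the coefficient matrix indicates how matching errors amplify into parameter errors. A minimal sketch of this step (an illustrative implementation, not the paper's code):

```python
import numpy as np

def estimate_affine(src, dst):
    """Least-squares fit of dst ≈ A @ src_i + t (six parameters).
    Each matched pair contributes two linear equations:
        x' = a11*x + a12*y + tx
        y' = a21*x + a22*y + ty
    Returns A (2x2), t (2,) and the condition number of the
    coefficient matrix, which quantifies error amplification."""
    n = src.shape[0]
    M = np.zeros((2 * n, 6))
    M[0::2, 0:2] = src       # rows for x'
    M[0::2, 4] = 1.0
    M[1::2, 2:4] = src       # rows for y'
    M[1::2, 5] = 1.0
    b = dst.reshape(-1)      # interleaved (x', y') per pair
    p, *_ = np.linalg.lstsq(M, b, rcond=None)
    A = p[:4].reshape(2, 2)
    t = p[4:]
    return A, t, float(np.linalg.cond(M))
```

A large condition number warns that small matching errors may produce large parameter errors, which is consistent with the paper's observation that more point pairs give more precise motion parameters.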

