Dynamic Kernel-Based Progressive Particle Filter for 3D Human Motion Tracking

Author(s): Shih-Yao Lin, I-Cheng Chang
2010, Vol. 43(10), pp. 3621-3635

Author(s): I-Cheng Chang, Shih-Yao Lin
2015, Vol. 110, pp. 164-177

Author(s): Jigang Liu, Dongquan Liu, Justin Dauwels, Hock Soon Seah

Author(s): Tony Tung, Takashi Matsuyama

This chapter presents a new formulation of the problem of human motion tracking in video. Tracking remains challenging when strong appearance changes occur, as in videos of humans in motion. Most trackers rely on a predefined template or on a training dataset to achieve detection and tracking; they are therefore ill-suited to tracking objects whose appearance is not known in advance. A solution is to use an online method that iteratively updates a subspace of reference target models. In addition, we propose to integrate color and motion cues in a particle filter framework to track human body parts. The algorithm switches between two modes, detection and tracking. Detection steps use trained classifiers to update the estimated positions of the tracking windows, whereas tracking steps rely on an adaptive color-based particle filter coupled with optical flow estimation. The Earth Mover's Distance is used to compare color models in a global fashion, and constraints on flow features prevent drift. The proposed method proves effective at tracking body parts in motion and can cope with full appearance changes. Experiments were performed on challenging real-world videos with poorly textured models and non-linear motions.
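
To make the tracking mode concrete, the following minimal Python sketch implements one predict/weight/resample cycle of a color-based particle filter whose likelihood is an Earth Mover's Distance between hue histograms, with an optional optical-flow drift term in the prediction step. The hue-only color model, the window size, and all function and parameter names (hue_histogram, particle_filter_step, lam, and so on) are illustrative assumptions, not the chapter's implementation, which additionally uses trained detectors and an online subspace of reference models.

# Sketch of a color-based particle filter step with an EMD likelihood.
# Assumptions: frame_hue is a 2D array of hue values in [0, 1); the
# reference histogram ref_hist is already normalized.
import numpy as np
from scipy.stats import wasserstein_distance  # 1D Earth Mover's Distance

def hue_histogram(frame_hue, cx, cy, w, h, bins=16):
    """Normalized hue histogram of the w-by-h patch centred at (cx, cy)."""
    x0, x1 = int(cx - w / 2), int(cx + w / 2)
    y0, y1 = int(cy - h / 2), int(cy + h / 2)
    patch = frame_hue[max(y0, 0):y1, max(x0, 0):x1]
    hist, _ = np.histogram(patch, bins=bins, range=(0.0, 1.0))
    return hist / (hist.sum() + 1e-12)

def particle_filter_step(particles, weights, frame_hue, ref_hist,
                         flow=None, win=(32, 64), sigma_pos=4.0, lam=20.0):
    """One predict/weight/resample cycle.

    particles : (N, 2) array of (x, y) window centres
    flow      : optional (H, W, 2) dense optical-flow field used as a drift term
    ref_hist  : reference hue histogram of the tracked body part
    """
    n = len(particles)
    # Predict: optional optical-flow drift plus Gaussian diffusion.
    if flow is not None:
        xi = particles[:, 0].astype(int).clip(0, flow.shape[1] - 1)
        yi = particles[:, 1].astype(int).clip(0, flow.shape[0] - 1)
        particles = particles + flow[yi, xi]
    particles = particles + np.random.normal(0.0, sigma_pos, size=particles.shape)
    particles[:, 0] = np.clip(particles[:, 0], 0, frame_hue.shape[1] - 1)
    particles[:, 1] = np.clip(particles[:, 1], 0, frame_hue.shape[0] - 1)

    # Weight: EMD between each particle's color model and the reference.
    bin_centres = np.linspace(0.0, 1.0, len(ref_hist), endpoint=False)
    for i, (cx, cy) in enumerate(particles):
        hist = hue_histogram(frame_hue, cx, cy, *win, bins=len(ref_hist))
        emd = wasserstein_distance(bin_centres, bin_centres, hist, ref_hist)
        weights[i] = np.exp(-lam * emd)
    weights /= weights.sum()

    # Resample (multinomial) and report the mean of the resampled set.
    idx = np.random.choice(n, size=n, p=weights)
    particles, weights = particles[idx], np.full(n, 1.0 / n)
    return particles, weights, particles.mean(axis=0)

Using the Earth Mover's Distance rather than a bin-wise distance compares the color models globally, so a small hue shift caused by lighting changes lowers the likelihood gradually instead of collapsing it, which is one way a tracker can tolerate strong appearance changes.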

