Learning Structured Hough Voting for Joint Object Detection and Occlusion Reasoning

Author(s): Tao Wang, Xuming He, Nick Barnes
2013 ◽ Vol 117 (9) ◽ pp. 1190-1202
Author(s): Min Sun, Shyam Sunder Kumar, Gary Bradski, Silvio Savarese

Author(s): Xiaoqin Kuang, Nong Sang, Feifei Chen, Runmin Wang, Changxin Gao

2014 ◽ Vol 2014 ◽ pp. 1-20
Author(s): Yimin Lin, Naiguang Lu, Xiaoping Lou, Fang Zou, Yanbin Yao, ...

This paper introduces invariant Hough random ferns (IHRF), which incorporate rotation and scale invariance into the local feature description, random ferns classifier training, and Hough voting stages. The method is especially suited to object detection under changes in object appearance and scale, partial occlusion, and pose variation. Its efficacy is validated through experiments on a large set of challenging benchmark datasets, and the results demonstrate that it outperforms conventional state-of-the-art methods such as bounding-box-based and part-based approaches. Additionally, we propose an efficient clustering scheme based on the appearance of local patches and their geometric relations, which provides pixel-accurate, top-down segmentations from IHRF back-projections. This refined segmentation improves the quality of online object tracking because it avoids the drifting problem. An online tracking framework based on IHRF is therefore established, trained and updated in each frame to distinguish and segment the object from the background. Finally, experimental results on both object segmentation and long-term object tracking show that the method yields accurate and robust tracking in a variety of complex scenarios, especially under severe occlusion and nonrigid deformation.
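
The abstract outlines the core IHRF stages: fern tests evaluated on local patches, per-leaf storage of displacement votes, and accumulation of those votes in Hough space. The Python code below is a minimal sketch of plain (non-invariant) random-fern Hough voting for orientation only; the HoughFern class, its pixel-pair tests, and the dense accumulator are simplified placeholders, and the rotation/scale-invariant descriptors, vote clustering, back-projection segmentation, and online updating described in the abstract are not reproduced here.

```python
import numpy as np


class HoughFern:
    """One random fern: a fixed set of pairwise pixel tests on a grayscale patch.
    Each leaf (binary code) stores displacement votes toward the object centre."""

    def __init__(self, patch_size=16, num_tests=8, rng=None):
        rng = rng if rng is not None else np.random.default_rng(0)
        self.num_tests = num_tests
        # Random pixel pairs (y1, x1, y2, x2); each comparison yields one bit.
        self.tests = rng.integers(0, patch_size, size=(num_tests, 4))
        # Leaf index -> list of (dy, dx) offsets from patch centre to object centre.
        self.votes = {leaf: [] for leaf in range(2 ** num_tests)}

    def leaf_index(self, patch):
        # Binary code from pairwise intensity comparisons.
        bits = patch[self.tests[:, 0], self.tests[:, 1]] > patch[self.tests[:, 2], self.tests[:, 3]]
        return int(np.dot(bits.astype(np.int64), 1 << np.arange(self.num_tests)))

    def train(self, patch, offset_to_centre):
        # Training patches sampled from annotated objects deposit their offset
        # to the object centre at the leaf they fall into.
        self.votes[self.leaf_index(patch)].append(offset_to_centre)


def detect(image, ferns, patch_size=16, stride=4):
    """Slide patches over the image, let every fern cast its stored votes,
    and return the accumulator peak as the hypothesised object centre."""
    h, w = image.shape
    accumulator = np.zeros((h, w), dtype=np.float32)
    half = patch_size // 2
    for y in range(0, h - patch_size + 1, stride):
        for x in range(0, w - patch_size + 1, stride):
            patch = image[y:y + patch_size, x:x + patch_size]
            cy, cx = y + half, x + half
            for fern in ferns:
                for dy, dx in fern.votes[fern.leaf_index(patch)]:
                    vy, vx = cy + dy, cx + dx
                    if 0 <= vy < h and 0 <= vx < w:
                        accumulator[vy, vx] += 1.0
    peak = np.unravel_index(np.argmax(accumulator), accumulator.shape)
    return peak, accumulator
```

In a full pipeline one would train an ensemble of such ferns on annotated patches and sum their votes; the peak of the combined Hough image gives the detection hypothesis, and the patches that voted for it can be traced back, which is the role the back-projection step plays for segmentation in IHRF.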


Author(s): S. C. Radopoulou, M. Sun, F. Dai, I. Brilakis, S. Savarese
