Real-Time Object Tracking Based on Local Texture Feature with Correlation Filter

Author(s):  
Meina Qiao ◽  
Tian Wang ◽  
Yuan Dong ◽  
Jingang Shi ◽  
Jing Teng ◽  
...  
2019 ◽  
Vol 358 ◽  
pp. 33-43 ◽  
Author(s):  
Gengzheng Pan ◽  
Guochun Chen ◽  
Wenxiong Kang ◽  
Junhui Hou

Sensors ◽  
2019 ◽  
Vol 19 (10) ◽  
pp. 2362 ◽  
Author(s):  
Yijin Yang ◽  
Yihong Zhang ◽  
Demin Li ◽  
Zhijie Wang

Correlation filter-based methods have recently achieved remarkable accuracy and speed in visual object tracking. However, most existing correlation filter-based methods are not robust to significant appearance changes of the target, especially under deformation, illumination variation, and rotation. In this paper, a novel parallel correlation filters (PCF) framework is proposed for real-time visual object tracking. Firstly, the proposed method constructs two parallel correlation filters, one for tracking appearance changes of the target and the other for tracking its translation. Secondly, by merging the response maps of these two filters with a weighted sum, the proposed method accurately locates the center position of the target. Finally, in the training stage, a new distribution of the correlation output is proposed to replace the original Gaussian distribution, yielding more accurate correlation filters and preventing model drift. Extensive qualitative and quantitative experiments on the common object tracking benchmarks OTB-2013 and OTB-2015 demonstrate that the proposed PCF tracker outperforms most state-of-the-art trackers while maintaining real-time performance.
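
The fusion step can be illustrated with a minimal Python sketch that merges two response maps with fixed weights and takes the peak of the merged map as the target center. The weights, map sizes, and random stand-in responses are assumptions for illustration only; the paper's filter learning and its modified correlation output distribution are not reproduced here.

import numpy as np

def fuse_responses(resp_appearance, resp_translation, w_app=0.5, w_trans=0.5):
    # Weighted merge of the two parallel correlation-filter response maps.
    fused = w_app * resp_appearance + w_trans * resp_translation
    # The target center is taken at the maximum of the merged response.
    row, col = np.unravel_index(np.argmax(fused), fused.shape)
    return fused, (row, col)

# Example with random stand-in maps (placeholders, not real filter outputs).
rng = np.random.default_rng(0)
resp_app = rng.random((64, 64))
resp_trans = rng.random((64, 64))
_, center = fuse_responses(resp_app, resp_trans)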


2018 ◽  
Vol 15 (3) ◽  
pp. 583-596 ◽  
Author(s):  
Ce Li ◽  
Xingchao Liu ◽  
Xiangbo Su ◽  
Baochang Zhang

2020 ◽  
Vol 10 (2) ◽  
pp. 713 ◽  
Author(s):  
Jungsup Shin ◽  
Heegwang Kim ◽  
Dohun Kim ◽  
Joonki Paik

Object tracking has long been an active research topic in the image processing and computer vision fields, with a wide range of applications. For practical use, an object tracker should be not only accurate but also fast under real-time streaming conditions. Recently, deep feature-based trackers have been proposed to achieve higher accuracy, but they are unsuitable for real-time tracking because of their extremely slow processing speed. Slow processing is itself a major cause of degraded tracking accuracy under real-time streaming, since the processing delay forces frames to be skipped. To increase tracking accuracy while preserving processing speed, this paper presents an improved kernelized correlation filter (KCF)-based tracking method that integrates three functional modules: (i) tracking failure detection, (ii) re-tracking using multiple search windows, and (iii) motion vector analysis to select a preferred search window. Under real-time streaming conditions, the proposed method yields better tracking accuracy than the original KCF, and when a target moves very rapidly, it outperforms a deep learning-based tracker such as the multi-domain convolutional neural network (MDNet).
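
A minimal sketch of how the three modules could fit together is given below. The confidence measure (a peak-to-sidelobe-style ratio), the threshold, the window offsets, and the kcf_response callback are illustrative assumptions rather than the paper's exact formulation.

import numpy as np

def confidence(response):
    # Peak-to-sidelobe-style confidence of a correlation response map.
    peak = response.max()
    sidelobe = np.delete(response.ravel(), response.argmax())
    return (peak - sidelobe.mean()) / (sidelobe.std() + 1e-6)

def track_with_retracking(frame, kcf_response, last_pos, motion_vec,
                          conf_thresh=8.0, step=32):
    # (i) Failure detection: low confidence of the response at the last position.
    resp = kcf_response(frame, last_pos)
    if confidence(resp) >= conf_thresh:
        return last_pos
    # (ii) Re-tracking: evaluate several candidate search windows.
    # (iii) Motion vector analysis: candidates are shifted along the recent motion.
    offsets = [(0, 0), (step, 0), (-step, 0), (0, step), (0, -step)]
    best_pos, best_score = last_pos, -np.inf
    for dy, dx in offsets:
        cand = (last_pos[0] + dy + motion_vec[0], last_pos[1] + dx + motion_vec[1])
        score = confidence(kcf_response(frame, cand))
        if score > best_score:
            best_pos, best_score = cand, score
    # Return the center of the most confident candidate window as the re-tracked position.
    return best_pos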


2020 ◽  
Vol 10 (9) ◽  
pp. 3021 ◽  
Author(s):  
Wangpeng He ◽  
Heyi Li ◽  
Wei Liu ◽  
Cheng Li ◽  
Baolong Guo

Object tracking is a challenging research task because of drastic appearance changes of the target and a lack of training samples. Most online learning trackers are hampered by complications such as drift under occlusion, out-of-view targets, or fast motion. In this paper, a real-time object tracking algorithm termed "robust sum of template and pixel-wise learners" (rStaple) is proposed to address these problems. It combines multi-feature correlation filters with a color histogram. Firstly, a combination of specific features is extracted from the search area around the target, and the feature channels are merged to train a translation correlation filter online. Secondly, the target state is determined by a discriminating mechanism, wherein the model update procedure stops when the target is occluded or out of view and is re-activated when the target reappears. In addition, the score map is enhanced by computing a color histogram score over the search area. The target position is then estimated by combining the enhanced color histogram score with the correlation filter response map. Finally, a scale filter is trained for multi-scale detection to obtain the final tracking result. Extensive experimental results on a large benchmark dataset demonstrate that the proposed rStaple is superior to several state-of-the-art algorithms in terms of accuracy and efficiency.
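
The fusion of the correlation-filter response with the color histogram score can be sketched as follows. The hue-channel back-projection, the number of bins, and the merge factor alpha are assumptions for illustration; the occlusion-aware update gating and the scale filter described above are omitted.

import numpy as np

def histogram_score(hue_patch, fg_hist, bg_hist, bins=16):
    # Per-pixel foreground likelihood from foreground/background color histograms.
    idx = np.clip((hue_patch.astype(int) * bins) // 256, 0, bins - 1)
    fg, bg = fg_hist[idx], bg_hist[idx]
    return fg / (fg + bg + 1e-6)

def fuse_scores(cf_response, color_score, alpha=0.3):
    # Linear merge of the correlation response and the enhanced color score,
    # followed by peak localization of the target position.
    merged = (1.0 - alpha) * cf_response + alpha * color_score
    return np.unravel_index(np.argmax(merged), merged.shape)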

