Online Long-Term Object Tracking Based on Compressed Haar-Like Features via Sparse Representation
Online long-term tracking is a challenging problem because data streams change over time. In this paper, sparse representation is applied to visual tracking by selecting the candidate sample with the minimal reconstruction error, computed over compressed Haar-like features. Most existing sparse-representation tracking algorithms introduce l1 regularization into the PCA reconstruction using the samples directly, which leads to high computational complexity and cannot adapt to occlusion, rotation, or changes in scale. Our model update not only uses samples from the training set, but also generates warped versions (including scale variation, rotation, occlusion, and illumination changes) of the previous tracking result. Moreover, rather than feeding the samples into the sparse representation directly, we use Haar-like features compressed into a very low-dimensional space. In addition, we employ a robust and fast algorithm that exploits the spatio-temporal context to predict the target location in the next frame, which reduces the search range of the detector. We demonstrate that the proposed method tracks objects well under pose and scale variation, rotation, occlusion, and illumination changes, with real-time performance on challenging image sequences.
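The two core ingredients of the abstract (compressing high-dimensional Haar-like feature vectors with a sparse random projection, then scoring candidate samples by their l1-regularized reconstruction error against a template dictionary) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the very sparse random matrix follows the standard Achlioptas-style construction, the lasso problem is solved with plain ISTA, and all dimensions, the `lam` parameter, and the iteration count are assumed for illustration.

```python
import numpy as np


def sparse_random_projection(n_low, n_high, rng, s=3.0):
    """Very sparse random matrix with entries in {-1, 0, +1}.

    Each entry is +-1 with probability 1/(2s) and 0 otherwise, scaled so the
    projection approximately preserves distances (Johnson-Lindenstrauss style).
    """
    R = rng.choice([-1.0, 0.0, 1.0], size=(n_low, n_high),
                   p=[1.0 / (2 * s), 1.0 - 1.0 / s, 1.0 / (2 * s)])
    return R * np.sqrt(s / n_high)


def reconstruction_error(D, x, lam=0.01, n_iter=300):
    """Score a compressed candidate x against template dictionary D.

    Solves min_a 0.5 * ||D a - x||^2 + lam * ||a||_1 with ISTA
    (gradient step + soft thresholding) and returns the residual norm.
    """
    L = np.linalg.norm(D, 2) ** 2          # ||D||_2^2 bounds the step size
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ a - x)           # gradient of the smooth term
        z = a - grad / L
        a = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return np.linalg.norm(D @ a - x)


# --- toy usage: the candidate matching a template scores lowest -------------
rng = np.random.default_rng(0)
n_high, n_low = 200, 10                     # raw Haar-like dim -> compressed dim
R = sparse_random_projection(n_low, n_high, rng)

templates = rng.standard_normal((5, n_high))        # 5 target templates
D = (templates @ R.T).T                             # compressed, columns = templates
D /= np.linalg.norm(D, axis=0)                      # normalize dictionary columns

good = D[:, 0]                                      # candidate equal to a template
bad = rng.standard_normal(n_low)
bad /= np.linalg.norm(bad)                          # unrelated candidate

err_good = reconstruction_error(D, good)
err_bad = reconstruction_error(D, bad)
```

The candidate whose compressed features are best explained by a sparse combination of the templates (here `good`) yields the smaller residual, which is the selection rule the abstract describes.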