instance search
Recently Published Documents

TOTAL DOCUMENTS: 54 (last five years: 13)
H-INDEX: 10 (last five years: 1)

2021, Vol 2107 (1), pp. 012007
Author(s): Mohd Fauzi Abu Hassan, Azurahisham Sah Pri, Zakiah Ahmad, Tengku Mohd Azahar Tuan Dir

Abstract This paper investigates single-target tracking of arbitrary objects. Tracking is a difficult problem owing to challenges such as scale variation, motion, background clutter, and illumination changes. To achieve better tracking performance under these severe conditions, this paper proposes a covariance descriptor based on a multi-layer instance search region. Our results show that the proposed approach significantly improves performance in terms of centre location error (in pixels) compared to a covariance descriptor computed over a fixed bounding box. This work offers an effective way of choosing the best layer for the descriptor; future work will build on it, for example by considering target motion during tracking.
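The abstract does not spell out the descriptor, but the classic region covariance formulation it builds on can be sketched as below. The feature set (position, intensity, gradient magnitudes), the generalized-eigenvalue distance, and all function names are assumptions drawn from the standard formulation, not the authors' exact design.

```python
# Hedged sketch of a region covariance descriptor (standard Tuzel-style
# formulation); the feature set and the multi-layer selection at the end
# are assumptions, not the paper's exact design.
import numpy as np
from scipy.linalg import eigh

def covariance_descriptor(patch):
    """Return the d x d covariance of per-pixel features over a patch."""
    h, w = patch.shape
    ys, xs = np.mgrid[0:h, 0:w]
    iy, ix = np.gradient(patch.astype(np.float64))
    # Per-pixel features: position, intensity, gradient magnitudes.
    feats = np.stack([xs.ravel(), ys.ravel(), patch.ravel(),
                      np.abs(ix).ravel(), np.abs(iy).ravel()])
    return np.cov(feats)

def covariance_distance(c1, c2, eps=1e-6):
    """Dissimilarity of two covariance matrices via their generalized
    eigenvalues (the usual metric for region covariance matching)."""
    d = c1.shape[0]
    lam = eigh(c1 + eps * np.eye(d), c2 + eps * np.eye(d), eigvals_only=True)
    return np.sqrt(np.sum(np.log(lam) ** 2))

# A multi-layer search would compute covariance_descriptor over several
# nested regions ("layers") around the previous target location and keep
# the layer with the smallest covariance_distance to the target model.
```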


2021
Author(s): Yi-Geng Hong, Hui-Chu Xiao, Wan-Lei Zhao

2020, Vol 34 (07), pp. 11037-11044
Author(s): Lianghua Huang, Xin Zhao, Kaiqi Huang

A key capability of a long-term tracker is to search for targets over very large areas (typically the entire image) to handle possible target absences or tracking failures. However, a strong baseline for global instance search is currently lacking. In this work, we aim to bridge this gap. Specifically, we propose GlobalTrack, a pure global instance search based tracker that makes no assumptions about the temporal consistency of the target's position and scale. GlobalTrack is built on a two-stage object detector and can perform full-image, multi-scale search for arbitrary instances with only a single query as the guide. We further propose a cross-query loss to improve the robustness of our approach against distractors. With no online learning, no penalty on position or scale changes, no scale smoothing and no trajectory refinement, our pure global instance search based tracker achieves performance comparable to, and sometimes much better than, state-of-the-art approaches that typically require complex post-processing, on four large-scale tracking benchmarks: 52.1% AUC on LaSOT, 63.8% success rate on TLP, 60.3% MaxGM on OxUvA and 75.4% normalized precision on TrackingNet. More importantly, our tracker accumulates no errors: any temporary tracking failure does not affect its performance on future frames, making it ideal for long-term tracking. We hope this work will serve as a strong baseline for long-term tracking and stimulate future research in this area.
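A minimal sketch of the two ingredients the abstract names, query-guided detection and the cross-query loss, might look as follows. Pooling the query to a single channel vector is a simplification, and the module names, shapes, and loss layout here are illustrative assumptions rather than the paper's exact implementation.

```python
# Hedged sketch of query-guided detection and a cross-query loss in the
# spirit of GlobalTrack; module names, shapes, and the pooled-query
# modulation are illustrative assumptions, not the paper's exact design.
import torch
import torch.nn as nn
import torch.nn.functional as F

class QueryGuidedModulation(nn.Module):
    """Condition full-image search features on a single query instance."""
    def __init__(self, channels):
        super().__init__()
        self.proj = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, search_feat, query_feat):
        # search_feat: (B, C, H, W) backbone features of the entire image.
        # query_feat:  (B, C, h, w) features of the query box.
        q = F.adaptive_avg_pool2d(query_feat, 1)  # (B, C, 1, 1)
        # Channel-wise modulation; the result would feed a standard
        # two-stage detector head (RPN + R-CNN) for full-image search.
        return self.proj(search_feat * q)

def cross_query_loss(scores, labels):
    # scores: (Q, N) matching scores of Q queries against N shared proposals.
    # labels: (Q, N) binary, 1 only where a proposal matches that query's
    # own instance, so other instances act as hard negatives (distractors).
    return F.binary_cross_entropy_with_logits(scores, labels.float())
```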


Author(s): Dong Feng, Man-Gui Liang, Feng Gao, Yi-Cheng Huang, Xin-Feng Zhang, ...

Author(s): Daqun Li, Yi Yu, Xiaolin Chen

Abstract To improve the limited tracking ability of the fully-convolutional Siamese network (SiamFC) in complex scenes, an object tracking framework combining a Siamese network with a re-detection mechanism (Siam-RM) is proposed. The mechanism adopts the Siamese instance search tracker (SINT) as the re-detection network: when multiple peaks appear on the response map of SiamFC, the more accurate re-detection network re-determines the location of the object. Meanwhile, to adapt to changes in the object's appearance, a generative model is employed to construct the templates of SiamFC. Furthermore, a high-confidence template-updating method is used to prevent the template from being contaminated. Objective evaluation on the popular online tracking benchmark (OTB) shows that the tracking accuracy and success rate of the proposed framework reach 79.8% and 63.8%, respectively. Results on several representative video sequences demonstrate that, compared to SiamFC, the framework achieves higher accuracy and robustness in scenes with fast motion, occlusion, background clutter, and illumination variation.
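The re-detection trigger and the high-confidence template update described above can be illustrated with a short sketch. The peak test, the thresholds (0.7 and 0.9), and the function names are assumptions for illustration; the abstract only states that re-detection fires when multiple peaks appear and that the template updates only at high confidence.

```python
# Hedged sketch of the re-detection trigger and high-confidence template
# update; the peak test, thresholds (0.7, 0.9), and names are assumptions.
import numpy as np
from scipy.ndimage import maximum_filter

def count_strong_peaks(response, rel_thresh=0.7, size=5):
    """Count local maxima within rel_thresh of the global maximum."""
    is_local_max = response == maximum_filter(response, size=size)
    is_strong = response >= rel_thresh * response.max()
    return int(np.count_nonzero(is_local_max & is_strong))

def needs_redetection(response):
    # Multiple comparable peaks mean the SiamFC response is ambiguous,
    # so the SINT re-detection network should re-locate the object.
    return count_strong_peaks(response) > 1

def update_template(old, new, peak_value, conf_thresh=0.9):
    # Update only when the response peak is confident, so occluded or
    # cluttered frames do not contaminate the template.
    return new if peak_value >= conf_thresh else old
```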

