Enforcing Template Representability and Temporal Consistency for Adaptive Sparse Tracking
Xue Yang, Fei Han, Hua Wang, Hao Zhang
IJCAI - 2016
Sparse representation has been widely studied in visual tracking and has shown promising tracking performance. Despite considerable progress, visual tracking remains a challenging task due to appearance variations over time. In this paper, we propose a novel sparse tracking algorithm that addresses temporal appearance changes by enforcing template representability and temporal consistency (TRAC). By modeling temporal consistency, our algorithm mitigates drift away from the tracking target. By exploiting the templates' long-term and short-term representability, the proposed method adaptively updates the dictionary using the most descriptive templates, which significantly improves robustness to target appearance changes. We compare our TRAC algorithm against state-of-the-art approaches on 12 challenging benchmark image sequences. Both qualitative and quantitative results demonstrate that our algorithm significantly outperforms previous state-of-the-art trackers.
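The sketch below illustrates, in simplified form, the core idea suggested by the abstract: sparse coding of a candidate's appearance over a template dictionary, with an added temporal-consistency penalty that discourages the current code from drifting away from the previous frame's solution. The objective, the penalty weights (lambda_l1, mu_temporal), and all function names are illustrative assumptions rather than the paper's exact formulation; the adaptive long-term/short-term dictionary update is not shown.

# A minimal, illustrative sketch of sparse-coding-based tracking with a
# temporal-consistency term, in the spirit of the TRAC idea described in the
# abstract. The objective and all names here are assumptions for illustration,
# not the paper's exact formulation.
import numpy as np

def soft_threshold(x, t):
    """Element-wise soft-thresholding (proximal map of the l1 norm)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def solve_sparse_code(y, D, c_prev, lambda_l1=0.1, mu_temporal=0.5,
                      n_iters=200, step=None):
    """Minimize 0.5*||y - D c||^2 + lambda_l1*||c||_1
               + 0.5*mu_temporal*||c - c_prev||^2 via ISTA (proximal gradient)."""
    n_templates = D.shape[1]
    c = np.zeros(n_templates)
    if step is None:
        # Lipschitz constant of the smooth part: sigma_max(D)^2 + mu_temporal
        step = 1.0 / (np.linalg.norm(D, 2) ** 2 + mu_temporal)
    for _ in range(n_iters):
        grad = D.T @ (D @ c - y) + mu_temporal * (c - c_prev)
        c = soft_threshold(c - step * grad, step * lambda_l1)
    return c

# Toy usage: 10 templates of a 64-dimensional appearance feature, one candidate.
rng = np.random.default_rng(0)
D = rng.standard_normal((64, 10))                 # template dictionary (columns)
c_prev = np.zeros(10); c_prev[2] = 1.0            # code from the previous frame
y = D @ c_prev + 0.05 * rng.standard_normal(64)   # current candidate feature
c = solve_sparse_code(y, D, c_prev)
print(np.round(c, 3))                             # sparse code, biased toward c_prev

As in typical sparse trackers, the candidate whose sparse code gives the lowest reconstruction error would then be selected as the target, and the template dictionary would be updated adaptively using the most descriptive templates, as described in the abstract.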
Cite this paper
MLA
Yang, Xue, et al. "Enforcing template representability and temporal consistency for adaptive sparse tracking." arXiv preprint arXiv:1605.00170 (2016).
BibTeX
@article{yang2016enforcing,
  title={Enforcing template representability and temporal consistency for adaptive sparse tracking},
  author={Yang, Xue and Han, Fei and Wang, Hua and Zhang, Hao},
  journal={arXiv preprint arXiv:1605.00170},
  year={2016}
}