Correlation Filters Based on Strong Spatio-Temporal for Robust RGB-T Tracking

    https://doi.org/10.1142/S0218126622500414 | Cited by: 1 (Source: Crossref)

    In this paper, we propose a strong spatio-temporal mechanism with correlation filters to solve multi-modality tracking tasks. First, we use the features of the previous four frames as spatio-temporal features and aggregate them into the filter learning and target positioning of the adjacent frame. Second, we enhance the temporal and spatial characteristics of the current-frame filter by learning from the previous four frame filters and a spatial penalty. Experimental results on the GTOT, VOT-TIR2019 and RGBT234 datasets show that our strong spatio-temporal correlation filters achieve excellent performance.
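    The mechanism described above can be summarized by a minimal sketch of the per-frame objective, assuming a standard spatially regularized correlation-filter formulation extended with a multi-frame temporal term; the notation (D feature channels, weights w, trade-off μ) is illustrative and not necessarily the paper's exact formulation:

    \[
    E(\mathbf{f}_t) = \frac{1}{2}\Big\|\sum_{d=1}^{D} \mathbf{x}_t^{d} * \mathbf{f}_t^{d} - \mathbf{y}\Big\|^2
    + \frac{1}{2}\sum_{d=1}^{D}\big\|\mathbf{w}\odot\mathbf{f}_t^{d}\big\|^2
    + \frac{\mu}{2}\sum_{k=1}^{4}\big\|\mathbf{f}_t - \mathbf{f}_{t-k}\big\|^2
    \]

    Here \(\mathbf{x}_t\) denotes the features aggregated from the previous four frames and the current frame, \(\mathbf{y}\) is the desired response, \(\mathbf{w}\) is the spatial penalty, and \(\mathbf{f}_{t-k}\) are the filters learned in the four preceding frames, so the last term pulls the current filter toward its recent history while the second term suppresses boundary regions.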

    This paper was recommended by Regional Editor Tongquan Wei.
