1. Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences, Changchun 130033, China
2. University of Chinese Academy of Sciences, Beijing 100049, China
3. Unit 32035 of the Chinese People's Liberation Army, Xi'an 710000, China
[ "刘刚阳(1996—),男,甘肃天水人,硕士研究生,2019年于兰州交通大学获得学士学位,主要从事图像处理与视觉目标跟踪方面的研究。E-mail:liugangyang20@mails.ucas.ac.cn" ]
刘刚阳, 胡博, 王宇庆. 基于改进ECO-HC的目标跟踪方法[J]. 液晶与显示, 2023,38(8):1118-1127. DOI: 10.37188/CJLCD.2022-0290.
LIU Gang-yang, HU Bo, WANG Yu-qing. Target tracking method based on improved ECO-HC[J]. Chinese Journal of Liquid Crystals and Displays, 2023,38(8):1118-1127. DOI: 10.37188/CJLCD.2022-0290.
基于相关滤波的跟踪方法帧率高，但容易发生跟踪漂移导致目标丢失。针对此问题，本文以相关滤波方法ECO-HC为基线，提出了一个用于描述图像场景变化幅度的参数D，根据实时计算出的该参数值以不同的间隔数更新模型，实现模型的自适应更新，使算法在以较高帧率运行的同时兼具良好的准确率，此效果在图像光照变化复杂、目标变形或被遮挡的情况下更加明显。实验结果表明，ECO-HC嵌入本文方法后，在光照变化的情况下成功率提高了1.6%，准确率提高了2.2%；在目标离开视场的情况下成功率提高了3.7%，准确率提高了3.4%。在本文实验环境中平均帧率达到60 FPS。
Tracking methods based on correlation filtering have a high frame rate but are prone to tracking drift, which results in target loss. Aiming at this problem, this paper takes the correlation filtering method ECO-HC as the baseline and proposes a parameter D that describes the change amplitude of the image scene. By computing the value of this parameter in real time and updating the model at different intervals accordingly, adaptive model updating is realized, so that the algorithm runs at a high frame rate while maintaining good accuracy. This effect is more obvious in the case of complex illumination changes, target deformation or occlusion. Experimental results show that, after the proposed method is embedded into ECO-HC, the success rate and accuracy are improved by 1.6% and 2.2% respectively under illumination variation; when the target leaves the field of view, the success rate is improved by 3.7% and the accuracy by 3.4%. In our experimental environment, the average frame rate reaches 60 FPS.
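The abstract describes computing a scene-change parameter D every frame and then switching the model-update interval according to its value. The sketch below only illustrates that adaptive-interval idea under assumed definitions: the change measure, the thresholds and the interval values are placeholders, not the paper's actual formulation of D or of the ECO-HC update rule.

```python
import numpy as np

def scene_change_measure(prev_patch: np.ndarray, curr_patch: np.ndarray) -> float:
    """Hypothetical scene-change measure D: normalized mean absolute difference
    between consecutive grayscale search-region patches (the paper's actual
    definition of D is not given in this excerpt)."""
    prev = prev_patch.astype(np.float32) / 255.0
    curr = curr_patch.astype(np.float32) / 255.0
    return float(np.mean(np.abs(curr - prev)))

def select_update_interval(d: float, low: float = 0.05, high: float = 0.15) -> int:
    """Map the measured change D to a model-update interval: large scene changes
    trigger frequent updates, small changes allow sparse updates.
    The thresholds and intervals are illustrative, not the paper's values."""
    if d >= high:
        return 1   # scene changing rapidly: update the filter every frame
    if d >= low:
        return 3   # moderate change: update every few frames
    return 6       # nearly static scene: update sparsely to keep the frame rate high

# Usage sketch inside a tracking loop (tracker.update_model is a placeholder name):
# d = scene_change_measure(prev_patch, patch)
# if frame_idx % select_update_interval(d) == 0:
#     tracker.update_model(patch)
```

The point of the interval switch is the trade-off stated in the abstract: updating rarely keeps the frame rate high, while updating immediately when D is large limits drift under illumination change, deformation or occlusion.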
目标跟踪；相关滤波；相关度；模型更新策略
target tracking; correlation filtering; correlation; model update strategy
崔艺涵,陈涛,陈宝刚.基于DPM和KCF的十字靶标检测与跟踪[J].液晶与显示,2018,33(12):1026-1032. doi: 10.3788/yjyxs20183312.1026
CUI Y H, CHEN T, CHEN B G. Detection and tracking method of cross target based on DPM detector and KCF tracker [J]. Chinese Journal of Liquid Crystals and Displays, 2018, 33(12): 1026-1032. (in Chinese). doi: 10.3788/yjyxs20183312.1026
BOLME D S, BEVERIDGE J R, DRAPER B A, et al. Visual object tracking using adaptive correlation filters [C]//2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition. San Francisco: IEEE, 2010: 2544-2550. doi: 10.1109/cvpr.2010.5539960
DANELLJAN M, HÄGER G, SHAHBAZ KHAN F, et al. Convolutional features for correlation filter based visual tracking [C]//Proceedings of the IEEE International Conference on Computer Vision Workshops. Santiago: IEEE, 2015: 621-629. doi: 10.1109/iccvw.2015.84
WANG F, WANG C L, CHEN M L, et al. Far-field super-resolution ghost imaging with a deep neural network constraint [J]. Light: Science & Applications, 2022, 11(1): 1. doi: 10.1038/s41377-021-00680-w
梁慧慧,何秋生,贾伟振,等.一种多特征融合的目标跟踪算法[J].液晶与显示,2020,35(6):583-594.
LIANG H H, HE Q S, JIA W Z, et al. Multi-feature fusion target tracking algorithm [J]. Chinese Journal of Liquid Crystals and Displays, 2020, 35(6): 583-594. (in Chinese)
WANG J J, YANG H R, XU N, et al. Long-term target tracking combined with re-detection [J]. EURASIP Journal on Advances in Signal Processing, 2021, 2021(1): 2. doi: 10.1186/s13634-020-00713-3
DANELLJAN M, HÄGER G, SHAHBAZ KHAN F, et al. Adaptive decontamination of the training set: a unified formulation for discriminative visual tracking [C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. Las Vegas: IEEE, 2016: 1430-1438. doi: 10.1109/cvpr.2016.159
KOCH G, ZEMEL R, SALAKHUTDINOV R. Siamese neural networks for one-shot image recognition [C]//Proceedings of the 32nd International Conference on Machine Learning. Lille: JMLR, 2015.
ZHANG J M, MA S G, SCLAROFF S. MEEM: robust tracking via multiple experts using entropy minimization [C]//13th European Conference on Computer Vision. Zurich: Springer, 2014: 188-203. doi: 10.1007/978-3-319-10599-4_13
NAM H, HAN B. Learning multi-domain convolutional neural networks for visual tracking [C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. Las Vegas: IEEE, 2016: 4293-4302. doi: 10.1109/cvpr.2016.465
DANELLJAN M, HÄGER G, SHAHBAZ KHAN F, et al. Learning spatially regularized correlation filters for visual tracking [C]//Proceedings of the IEEE International Conference on Computer Vision. Santiago: IEEE, 2015: 4310-4318. doi: 10.1109/iccv.2015.490
DANELLJAN M, BHAT G, SHAHBAZ KHAN F, et al. ECO: efficient convolution operators for tracking [C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. Honolulu: IEEE, 2017: 6931-6939. doi: 10.1109/cvpr.2017.733
DANELLJAN M, ROBINSON A, SHAHBAZ KHAN F, et al. Beyond correlation filters: learning continuous convolution operators for visual tracking [C]//14th European Conference on Computer Vision. Amsterdam: Springer, 2016: 472-488. doi: 10.1007/978-3-319-46454-1_29
KRISTAN M, LEONARDIS A, MATAS J, et al. The sixth visual object tracking VOT2018 challenge results [C]//Proceedings of the European Conference on Computer Vision (ECCV) Workshops. Munich: Springer, 2019: 3-53.
WU Y, LIM J, YANG M H. Object tracking benchmark [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2015, 37(9): 1834-1848. doi: 10.1109/tpami.2014.2388226
WU Y, LIM J, YANG M H. Online object tracking: a benchmark [C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. Portland: IEEE, 2013: 2411-2418. doi: 10.1109/cvpr.2013.312
JIA X, LU H C, YANG M H. Visual tracking via adaptive structural local sparse appearance model [C]//2012 IEEE Conference on Computer Vision and Pattern Recognition. Providence: IEEE, 2012: 1822-1829. doi: 10.1109/cvpr.2012.6247880
SEVILLA-LARA L, LEARNED-MILLER E. Distribution fields for tracking [C]//2012 IEEE Conference on Computer Vision and Pattern Recognition. Providence: IEEE, 2012: 1910-1917. doi: 10.1109/cvpr.2012.6247891
ZHONG W, LU H C, YANG M H. Robust object tracking via sparsity-based collaborative model [C]//2012 IEEE Conference on Computer Vision and Pattern Recognition. Providence: IEEE, 2012: 1838-1845. doi: 10.1109/cvpr.2012.6247882
BAO C L, WU Y, LING H B, et al. Real time robust L1 tracker using accelerated proximal gradient approach [C]//2012 IEEE Conference on Computer Vision and Pattern Recognition. Providence: IEEE, 2012: 1830-1837. doi: 10.1109/cvpr.2012.6247881
ROSS D A, LIM J, LIN R S, et al. Incremental learning for robust visual tracking [J]. International Journal of Computer Vision, 2008, 77(1/3): 125-141. doi: 10.1007/s11263-007-0075-7
ZHANG T Z, GHANEM B, LIU S, et al. Robust visual tracking via structured multi-task sparse learning [J]. International Journal of Computer Vision, 2013, 101(2): 367-383. doi: 10.1007/s11263-012-0582-z
ZHANG K H, ZHANG L, YANG M H. Real-time compressive tracking [C]//12th European Conference on Computer Vision. Florence: Springer, 2012: 864-877. doi: 10.1007/978-3-642-33712-3_62
WU Y, SHEN B, LING H B. Online robust image alignment via iterative convex optimization [C]//2012 IEEE Conference on Computer Vision and Pattern Recognition. Providence: IEEE, 2012: 1808-1814. doi: 10.1109/cvpr.2012.6247878
ZHU G, PORIKLI F, LI H D. Beyond local search: tracking objects everywhere with instance-specific proposals [C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. Las Vegas: IEEE, 2016: 943-951. doi: 10.1109/cvpr.2016.108
LEE H, KIM D. Salient region-based online object tracking [C]//2018 IEEE Winter Conference on Applications of Computer Vision (WACV). Lake Tahoe: IEEE, 2018: 1170-1177. doi: 10.1109/wacv.2018.00133