RpTrack: Robust Pig Tracking with Irregular Movement Processing and Behavioral Statistics
Abstract
1. Introduction
2. Materials and Methods
2.1. Materials
2.2. Methods
2.2.1. Improved Kalman Filter
2.2.2. Improved Trajectory Management
Algorithm 1: Improved trajectory update

Input:
T_i: the storage information for trajectory i, where X_i is the KF state recording the position information of trajectory i over the last K frames, P_i is the corresponding covariance matrix, s indicates the trajectory state (lost or tracked), and c represents the behavior category of trajectory i.
d: the position of the detection matched with trajectory i, where c_d denotes the behavior category of the detected target.
K: the maximum length of the trajectory information storage list.

Output:
T_i: the updated storage information for trajectory i, where X_i is the updated KF state, P_i is the corresponding updated covariance matrix, c denotes the behavior category of the trajectory, and s reflects the trajectory state.

1: /* update the historical KF state vectors and KF covariance matrices */
2: Step 1: Initialize an empty set
3:   T_new ← ∅
4: Step 2: Initialize the KF for detection d
5:   (x_d, P_d) ← KF initialization
6:   T_new ← T_new ∪ {(x_d, P_d)}   /* store the current motion information */
7: Step 3: Update the motion information
8:   if s == Tracked then
9:     for i ← 1 to |T_i| do
10:      if i < K then   /* ensure that at most K frames of position information are recorded */
11:        (x_i, P_i) ← KF update   /* conduct the KF update */
12:        T_new ← T_new ∪ {(x_i, P_i)}
13: T_i ← T_new
14: c ← c_d   /* update the trajectory behavior category */
15: Return T_i
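To make the update flow concrete, the following is a minimal Python sketch of Algorithm 1. It is not the paper's implementation: the SimpleKF constant-velocity filter, the track dictionary layout, and the function names are illustrative assumptions. The sketch only mirrors the control flow above: start a new buffer from the matched detection, re-run the KF update over the previously stored states while keeping at most K frames, and carry over the detection's behavior category.

```python
import numpy as np

class SimpleKF:
    """Illustrative constant-velocity Kalman filter over a 4-D observation
    (e.g., cx, cy, w, h); a stand-in for the tracker's KF, not the paper's model."""

    def __init__(self, dim=4):
        self.dim = dim
        self.F = np.eye(2 * dim)                 # state transition: position, velocity
        self.F[:dim, dim:] = np.eye(dim)         # position += velocity
        self.H = np.eye(dim, 2 * dim)            # observe positions only
        self.Q = np.eye(2 * dim) * 1e-2          # process noise (assumed value)
        self.R = np.eye(dim) * 1e-1              # measurement noise (assumed value)

    def initialize(self, z):
        x = np.zeros(2 * self.dim)
        x[:self.dim] = z                         # start the state at the detection
        return x, np.eye(2 * self.dim)

    def update(self, x, P, z):
        x = self.F @ x                           # predict
        P = self.F @ P @ self.F.T + self.Q
        S = self.H @ P @ self.H.T + self.R       # correct with measurement z
        G = P @ self.H.T @ np.linalg.inv(S)
        x = x + G @ (z - self.H @ x)
        P = (np.eye(2 * self.dim) - G @ self.H) @ P
        return x, P

def update_trajectory(track, detection, det_category, K, kf=None):
    """Sketch of Algorithm 1: rebuild the per-track buffer of (state, covariance)
    pairs from the newly matched detection, keeping at most K frames."""
    kf = kf or SimpleKF()
    new_buffer = [kf.initialize(detection)]       # Steps 1-2: init KF for d, store it
    if track["state"] == "Tracked":               # Step 3: update motion information
        for x_i, P_i in track["buffer"]:
            if len(new_buffer) < K:               # record at most K frames
                new_buffer.append(kf.update(x_i, P_i, detection))
    track["buffer"] = new_buffer
    track["category"] = det_category              # update the behavior category
    return track

# Hypothetical usage: a tracked pig whose matched detection is labeled "eat".
track = {"state": "Tracked", "buffer": [], "category": "lie"}
track = update_trajectory(track, np.array([320.0, 240.0, 80.0, 60.0]), "eat", K=2)
print(track["category"], len(track["buffer"]))    # eat 1
```

In a full tracker, the detection would come from the per-frame association step, and K is the hyperparameter whose effect is analyzed in Section 3.6.1.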
2.2.3. BIoU (Buffered IoU)
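Since the body of this subsection is not reproduced here, the following is a minimal sketch of buffered IoU in the spirit of the cited work by Yang et al. (WACV 2023): each box is expanded by a buffer proportional to its own width and height before the standard IoU is computed, which keeps the association cost informative for fast or irregularly moving pigs whose boxes no longer overlap. The function name and the buffer scale b are illustrative assumptions, not the paper's settings.

```python
def buffered_iou(box_a, box_b, b=0.3):
    """Sketch of BIoU: expand both boxes (x, y, w, h, top-left origin) by a buffer
    proportional to their own size, then compute the standard IoU.
    The buffer scale b = 0.3 is an illustrative value, not the paper's setting."""
    def expand(box):
        x, y, w, h = box
        return (x - b * w, y - b * h, w * (1 + 2 * b), h * (1 + 2 * b))

    ax, ay, aw, ah = expand(box_a)
    bx, by, bw, bh = expand(box_b)
    # intersection of the two buffered boxes
    ix, iy = max(ax, bx), max(ay, by)
    iw = max(0.0, min(ax + aw, bx + bw) - ix)
    ih = max(0.0, min(ay + ah, by + bh) - iy)
    inter = iw * ih
    union = aw * ah + bw * bh - inter
    return inter / union if union > 0 else 0.0

# Example: the plain IoU of these two boxes is 0, but their buffered versions
# overlap, so the detection can still be matched to the track.
print(buffered_iou((0, 0, 10, 10), (12, 0, 10, 10)))
```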
3. Experiments
3.1. Experimental Platform and Parameter Settings
3.2. Evaluation Metrics for Multi-Objective Tracking
3.3. Tracking Results
3.4. Comparison of Different MOT Algorithms
3.5. Behavioral Statistics
3.6. Ablation Experiments and Analysis
3.6.1. Effect of Different K Values in Improved Trajectory Management
3.6.2. Effect of Each Module in the RpTrack
4. Conclusions and Limitation Discussion
- Experimental results on both the public and private datasets demonstrate that RpTrack runs at an execution speed competitive with the SORT method. On the public dataset, RpTrack achieves a HOTA of 73.2%, MOTA of 95.5%, IDF1 of 85.6%, and IDSW of 148. On the private dataset, RpTrack achieves a HOTA of 80.8%, MOTA of 97.8%, IDF1 of 98.4%, and IDSW of 6, surpassing other leading tracking methods on all performance metrics.
- Comparisons of the visualization results confirm that the improved KF and the improved trajectory management effectively address the ID switches caused by irregular pig movements, and that BIoU alleviates the ID switches caused by missed detections under uneven lighting. These improvements enhance tracking stability.
- Pig behaviors are categorized into “stand”, “lie”, “eat”, and “other”. Based on RpTrack’s more precise tracking results, more accurate pig behavior statistics are obtained (a minimal counting sketch is given after this list).
- The improved trajectory management is sensitive to the hyperparameter K. Different values of K are needed in different tracking environments to achieve better tracking results, and larger values of K require more computation (Section 3.6.1).
- BIoU still has mismatch problems due to the bounding box extension.
- This work focuses on improvements to the tracker, so the problems specific to behavioral statistics are not studied in depth. As a result, RpTrack does not yet handle the effect of ID switches on the behavioral statistics; addressing this is a direction for our future work.
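As referenced in the contributions above, the behavior statistics are derived from the tracking output. The sketch below shows one hedged way such per-pig time budgets could be accumulated from per-frame track records; the record layout (frame, track_id, behavior) and the frame rate are assumptions for illustration and do not reflect RpTrack's actual output format.

```python
from collections import defaultdict

BEHAVIORS = ("stand", "lie", "eat", "other")

def behavior_statistics(track_records, fps=25.0):
    """Sketch: given per-frame tracker outputs as (frame, track_id, behavior) tuples,
    return the seconds each pig ID spent in each behavior category."""
    frame_counts = defaultdict(lambda: dict.fromkeys(BEHAVIORS, 0))
    for _frame, track_id, behavior in track_records:
        if behavior in BEHAVIORS:
            frame_counts[track_id][behavior] += 1
    # convert frame counts to seconds using the (assumed) frame rate
    return {tid: {b: n / fps for b, n in counts.items()}
            for tid, counts in frame_counts.items()}

# Example: pig 1 lies for two frames and eats for one frame.
records = [(0, 1, "lie"), (1, 1, "lie"), (2, 1, "eat")]
print(behavior_statistics(records, fps=1.0))
# {1: {'stand': 0.0, 'lie': 2.0, 'eat': 1.0, 'other': 0.0}}
```

Note that an ID switch would split one pig's records across two IDs, which is exactly why the limitation above matters for downstream statistics.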
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Conflicts of Interest
References
- Tzanidakis, C.; Simitzis, P.; Arvanitis, K.; Panagakis, P. An overview of the current trends in precision pig farming technologies. Livest. Sci. 2021, 249, 104530.
- Yin, M.; Ma, R.; Luo, H.; Li, J.; Zhao, Q.; Zhang, M. Non-contact sensing technology enables precision livestock farming in smart farms. Comput. Electron. Agric. 2023, 212, 108171.
- Matthews, S.G.; Miller, A.L.; Plötz, T.; Kyriazakis, I. Automated tracking to measure behavioural changes in pigs for health and welfare monitoring. Sci. Rep. 2017, 7, 17582.
- Zhang, L.; Gray, H.; Ye, X.; Collins, L.; Allinson, N. Automatic Individual Pig Detection and Tracking in Pig Farms. Sensors 2019, 19, 1188.
- Cowton, J.; Kyriazakis, I.; Bacardit, J. Automated Individual Pig Localisation, Tracking and Behaviour Metric Extraction Using Deep Learning. IEEE Access 2019, 7, 108049–108060.
- Girshick, R. Fast R-CNN. In Proceedings of the IEEE International Conference on Computer Vision, Santiago, Chile, 7–13 December 2015; pp. 1440–1448.
- Wojke, N.; Bewley, A.; Paulus, D. Simple online and realtime tracking with a deep association metric. In Proceedings of the 2017 IEEE International Conference on Image Processing (ICIP), Beijing, China, 17–20 September 2017.
- Guo, Q.; Sun, Y.; Orsini, C.; Bolhuis, J.E.; de Vlieg, J.; Bijma, P.; de With, P.H. Enhanced camera-based individual pig detection and tracking for smart pig farms. Comput. Electron. Agric. 2023, 211, 108009.
- Wang, Z.; Zheng, L.; Liu, Y.; Li, Y.; Wang, S. Towards real-time multi-object tracking. In European Conference on Computer Vision; Springer: Berlin/Heidelberg, Germany, 2020; pp. 107–122.
- Zhang, Y.; Wang, C.; Wang, X.; Zeng, W.; Liu, W. FairMOT: On the fairness of detection and re-identification in multiple object tracking. Int. J. Comput. Vis. 2021, 129, 3069–3087.
- Tu, S.; Liu, X.; Liang, Y.; Zhang, Y.; Huang, L.; Tang, Y. Behavior Recognition and Tracking Method of Group-Housed Pigs Based on Improved DeepSORT Algorithm. Nongye Jixie Xuebao/Trans. Chin. Soc. Agric. Mach. 2022, 53, 345–352.
- Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You only look once: Unified, real-time object detection. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 779–788.
- Jocher, G.; Chaurasia, A.; Stoken, A.; Borovec, J.; Kwon, Y.; Fang, J.; Michael, K.; Montes, D.; Nadar, J.; Skalski, P. Ultralytics/yolov5: v6.1–TensorRT, TensorFlow edge TPU and OpenVINO export and inference. Zenodo 2022.
- Kim, J.; Suh, Y.; Lee, J.; Chae, H.; Ahn, H.; Chung, Y.; Park, D. EmbeddedPigCount: Pig Counting with Video Object Detection and Tracking on an Embedded Board. Sensors 2022, 22, 2689.
- Odo, A.; Muns, R.; Boyle, L.; Kyriazakis, I. Video Analysis Using Deep Learning for Automated Quantification of Ear Biting in Pigs. IEEE Access 2023, 11, 59744–59757.
- Wang, C.Y.; Bochkovskiy, A.; Liao, H.Y.M. Scaled-YOLOv4: Scaling cross stage partial network. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Nashville, TN, USA, 20–25 June 2021; pp. 13024–13033.
- Wang, C.Y.; Bochkovskiy, A.; Liao, H.Y.M. YOLOv7: Trainable bag-of-freebies sets new state-of-the-art for real-time object detectors. arXiv 2022, arXiv:2207.02696.
- Han, S.; Fuentes, A.; Yoon, S.; Jeong, Y.; Kim, H.; Sun Park, D. Deep learning-based multi-cattle tracking in crowded livestock farming using video. Comput. Electron. Agric. 2023, 212, 108044.
- Huang, Y.; Xiao, D.; Liu, J.; Tan, Z.; Liu, K.; Chen, M. An Improved Pig Counting Algorithm Based on YOLOv5 and DeepSORT Model. Sensors 2023, 23, 6309.
- Zheng, Z.; Li, J.; Qin, L. YOLO-BYTE: An efficient multi-object tracking algorithm for automatic monitoring of dairy cows. Comput. Electron. Agric. 2023, 209, 107857.
- Van der Zande, L.E.; Guzhva, O.; Rodenburg, T.B. Individual detection and tracking of group housed pigs in their home pen using computer vision. Front. Anim. Sci. 2021, 2, 669312.
- Redmon, J.; Farhadi, A. YOLOv3: An incremental improvement. arXiv 2018, arXiv:1804.02767.
- Bewley, A.; Ge, Z.; Ott, L.; Ramos, F.; Upcroft, B. Simple online and realtime tracking. In Proceedings of the 2016 IEEE International Conference on Image Processing (ICIP), Phoenix, AZ, USA, 25–28 September 2016.
- Ge, Z.; Liu, S.; Wang, F.; Li, Z.; Sun, J. YOLOX: Exceeding YOLO series in 2021. arXiv 2021, arXiv:2107.08430.
- Psota, E.T.; Schmidt, T.; Mote, B.; Pérez, L.C. Long-Term Tracking of Group-Housed Livestock Using Keypoint Detection and MAP Estimation for Individual Animal Identification. Sensors 2020, 20, 3670.
- Kalman, R.E. A New Approach to Linear Filtering and Prediction Problems. J. Basic Eng. 1960, 82, 35–45.
- Aharon, N.; Orfaig, R.; Bobrovsky, B.-Z. BoT-SORT: Robust Associations Multi-Pedestrian Tracking. arXiv 2022, arXiv:2206.14651.
- Yang, F.; Odashima, S.; Masui, S.; Jiang, S. Hard to Track Objects with Irregular Motions and Similar Appearances? Make It Easier by Buffering the Matching Space. In Proceedings of the 2023 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), Waikoloa, HI, USA, 3–7 January 2023.
- Luiten, J.; Osep, A.; Dendorfer, P.; Torr, P.; Geiger, A.; Leal-Taixé, L.; Leibe, B. HOTA: A Higher Order Metric for Evaluating Multi-object Tracking. Int. J. Comput. Vis. 2020, 129, 548–578.
- Bernardin, K.; Stiefelhagen, R. Evaluating Multiple Object Tracking Performance: The CLEAR MOT Metrics. J. Image Video Proc. 2008, 2008, 246309.
- Zhang, Y.; Sun, P.; Jiang, Y.; Yu, D.; Weng, F.; Yuan, Z.; Luo, P.; Liu, W.; Wang, X. ByteTrack: Multi-object Tracking by Associating Every Detection Box. In Computer Vision–ECCV 2022; Springer Nature: Cham, Switzerland, 2022.
- Cao, J.; Pang, J.; Weng, X.; Khirodkar, R.; Kitani, K. Observation-Centric SORT: Rethinking SORT for Robust Multi-Object Tracking. In Proceedings of the 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Vancouver, BC, Canada, 17–24 June 2023.
Dataset | No. | Sparse | Dense | Day | Night | Light
---|---|---|---|---|---|---
Public | 0102 | √ | — | √ | — | uniform
Public | 0402 | — | √ | √ | — | uniform
Public | 0502 | √ | — | — | √ | uniform
Public | 0602 | — | √ | √ | — | uniform
Public | 0702 | √ | — | √ | — | uniform
Public | 0802 | — | √ | — | √ | uniform
Public | 0902 | — | √ | √ | — | uniform
Public | 1002 | — | √ | — | √ | uneven
Public | 1102 | √ | — | √ | — | uniform
Public | 1202 | √ | — | √ | — | uniform
Public | 1502 | — | √ | — | √ | uniform
Public | 01 | √ | — | √ | — | uniform
Public | 05 | √ | — | — | √ | uniform
Public | 11 | — | √ | √ | — | uniform
Public | 15 | — | √ | — | √ | uniform
Private | 0010 | — | √ | √ | — | uniform
Private | 0011 | — | √ | √ | — | uniform
Private | 0012 | — | √ | — | √ | uneven
Private | 0013 | — | √ | √ | — | uniform
Private | 0014 | √ | — | √ | — | uniform
Private | 0015 | √ | — | √ | — | uniform
Private | 0016 | √ | — | √ | — | uniform
Private | 0017 | √ | — | — | √ | uniform
Private | 0018 | √ | — | — | √ | uneven
Dataset | Video | HOTA/% | MOTA/% | IDF1/% | IDSW | FPS/(f/s)
---|---|---|---|---|---|---
Public dataset | 0102 | 90.6 | 99.9 | 100.0 | 0 | 73.8
Public dataset | 0402 | 90.1 | 99.8 | 100.0 | 0 | 70.1
Public dataset | 0502 | 84.8 | 99.8 | 100.0 | 0 | 73.6
Public dataset | 0602 | 80.0 | 98.6 | 99.3 | 0 | 69.0
Public dataset | 0702 | 87.5 | 99.9 | 100.0 | 0 | 71.7
Public dataset | 0802 | 92.1 | 100.0 | 100.0 | 0 | 71.3
Public dataset | 0902 | 88.8 | 96.6 | 98.3 | 0 | 70.2
Public dataset | 1002 | 73.8 | 97.1 | 98.6 | 0 | 70.3
Public dataset | 1102 | 88.8 | 97.1 | 98.6 | 0 | 69.1
Public dataset | 1202 | 86.0 | 97.1 | 98.5 | 2 | 69.6
Public dataset | 1502 | 77.6 | 98.4 | 99.2 | 0 | 68.5
Public dataset | 01 | 77.0 | 96.6 | 93.9 | 8 | 75.4
Public dataset | 05 | 79.0 | 98.8 | 94.4 | 12 | 73.6
Public dataset | 11 | 61.8 | 94.2 | 71.8 | 54 | 68.7
Public dataset | 15 | 66.4 | 91.8 | 78.9 | 69 | 69.2
Private dataset | 0010 | 80.0 | 99.0 | 97.8 | 2 | 65.7
Private dataset | 0011 | 81.4 | 97.1 | 98.5 | 0 | 69.7
Private dataset | 0012 | 79.0 | 98.2 | 99.1 | 0 | 69.6
Private dataset | 0013 | 81.9 | 97.6 | 98.8 | 0 | 69.6
Private dataset | 0014 | 84.1 | 99.9 | 100.0 | 0 | 74.7
Private dataset | 0015 | 85.5 | 99.6 | 99.8 | 0 | 74.9
Private dataset | 0016 | 86.1 | 99.8 | 99.9 | 0 | 74.3
Private dataset | 0017 | 81.2 | 99.9 | 100.0 | 0 | 73.6
Private dataset | 0018 | 66.2 | 88.2 | 90.7 | 4 | 74.8
Dataset | Method | HOTA/% | MOTA/% | IDF1/% | IDSW | FPS/(f/s)
---|---|---|---|---|---|---
Public | SORT | 65.2 | 95.0 | 72.8 | 242 | 75.3
Public | ByteTrack | 61.6 | 92.8 | 72.6 | 229 | 73.8
Public | C-BIoU | 70.1 | 95.2 | 79.7 | 369 | 79.1
Public | OC-SORT | 70.2 | 95.2 | 81.0 | 161 | 73.1
Public | BoT-SORT | 69.1 | 95.1 | 78.8 | 317 | 19.2
Public | RpTrack | 73.2 | 95.5 | 85.6 | 146 | 70.9
Private | SORT | 77.7 | 97.4 | 93.0 | 29 | 78.6
Private | ByteTrack | 73.3 | 93.2 | 90.1 | 41 | 79.3
Private | C-BIoU | 76.8 | 95.4 | 91.7 | 45 | 80.7
Private | OC-SORT | 78.6 | 97.4 | 94.3 | 18 | 78.1
Private | BoT-SORT | 78.8 | 97.0 | 93.4 | 35 | 37.4
Private | RpTrack | 80.8 | 97.8 | 98.4 | 6 | 72.9
K Values | Public HOTA/% | Public MOTA/% | Public IDF1/% | Public IDSW | Private HOTA/% | Private MOTA/% | Private IDF1/% | Private IDSW
---|---|---|---|---|---|---|---|---
- | 71.0 | 95.5 | 82.6 | 152 | 79.4 | 97.8 | 96.7 | 11 |
1 | 72.1 | 95.4 | 84.2 | 158 | 80.1 | 97.5 | 97.9 | 7 |
2 | 73.2 | 95.5 | 85.6 | 148 | 79.4 | 97.6 | 96.6 | 10 |
3 | 72.1 | 95.5 | 84.6 | 151 | 79.9 | 97.6 | 97.3 | 8 |
4 | 72.2 | 95.5 | 84.8 | 154 | 80.7 | 97.7 | 98.3 | 6 |
5 | 72.3 | 95.5 | 85.1 | 148 | 79.5 | 97.6 | 96.6 | 12 |
IKF: improved Kalman filter; ITM: improved trajectory management; BIoU: buffered IoU.

IKF | ITM | BIoU | Public HOTA/% | Public MOTA/% | Public IDF1/% | Public IDSW | Private HOTA/% | Private MOTA/% | Private IDF1/% | Private IDSW
---|---|---|---|---|---|---|---|---|---|---
— | — | — | 65.2 | 95.0 | 72.8 | 242 | 77.7 | 97.4 | 93.0 | 29
✔ | — | — | 70.5 | 95.5 | 81.8 | 146 | 79.7 | 97.7 | 96.6 | 11
✔ | ✔ | — | 70.7 | 95.5 | 82.0 | 142 | 80.2 | 97.8 | 97.2 | 8
✔ | ✔ | ✔ | 73.2 | 95.5 | 85.6 | 148 | 80.8 | 97.8 | 98.4 | 6
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).