Real-Time Detection and Localization of Weeds in Dictamnus dasycarpus Fields for Laser-Based Weeding Control
Abstract
1. Introduction
1.1. Main Objectives
- Build a dataset of the most widespread and abundant weeds in real Dictamnus dasycarpus fields, covering the major real-world disturbances.
- Propose a detection network that balances accuracy against model size to enable weed localization in complex scenarios.
- Develop a joint detection-and-tracking algorithm that achieves stable performance and fast matching of tracking trajectories.
- Train and refine the model in a goal-oriented way against real-world noise and other disturbances.
1.2. Research Contribution
2. Materials and Methods
2.1. Dataset Acquisition
2.2. Data Preprocessing and Augmentation
2.3. Weed-Detection Model
2.3.1. YOLO-Riny Weed-Detection Model
2.3.2. Constructing a Lightweight Backbone Network Using Partial Convolution
2.3.3. Lightweight Upsampling Operator: CARAFE
2.3.4. Constructing the Reparameterized Convolution RepBlock
2.4. Field Weed Tracking Model for Dictamnus dasycarpus
2.5. Experimental Evaluation Indicators
2.5.1. Evaluation Metrics for YOLO-Riny Model Performance
2.5.2. Evaluation Metrics for Tracking Algorithm Performance
3. Results and Discussion
3.1. Experimental Environment
3.2. Results and Analysis of the Weed-Detection Experiment
3.2.1. Ablation Research Analysis
3.2.2. Comparative Analysis of Multi-Model Performance
3.2.3. Analysis of Weed-Detection Visualization Experiment Results
3.2.4. Analysis of Weed-Detection Model Robustness
3.2.5. Results and Analysis of Weed Tracking Experiments
3.2.6. Embedded Systems Experiment
4. Conclusions
- This research uses a self-constructed weed dataset specific to the Dictamnus dasycarpus field environment, helping the model learn key weed features and strengthening its robustness against the complex background of these fields. This approach mitigates the interference of crop features with weed detection.
- By enhancing the backbone structure of the model, introducing a lightweight upsampling operator, and applying structural adjustments such as detector head decoupling to YOLOv7-tiny through structural reparameterization, detection accuracy for Digitaria sanguinalis and Acalypha australis improved by 5.4% and 10%, respectively. Overall precision and recall increased by 2.9% and 2.1%, respectively, and average precision rose by 1.9%. These improvements demonstrate that the proposed method significantly enhances the model’s feature perception across weed species, boosts overall generalization, and reduces omission and misdetection in the tracking algorithm.
- The YOLO-Riny detection model reduces the original network size by 2 MB and the floating-point operations by 2 GFLOPs, yielding a 10 ms improvement in inference time over an entire batch of images. For complex images, inference time is notably reduced, running 2–3 ms faster on multi-weed-category images than the original network. These improvements indicate that the proposed algorithm requires fewer computational resources, making it more suitable for deployment on resource-constrained devices. Consequently, it enables timely updates of target positions and bounding-box information while minimizing ID switches in complex scenes.
- To minimize the cost of repeatedly targeting the same weed during laser weeding, this paper combines the improved YOLO-Riny detection algorithm with the reparameterized ByteTrack algorithm to track and mark seven species of weeds in Dictamnus dasycarpus fields. The YOLO-Riny-ByteTrack model achieved a 3-percentage-point increase in MOTA and a 3.4-percentage-point increase in MOTP, while reducing the number of ID switches from 24 to 10 compared with the original model. This combination enhances target consistency across consecutive frames, improves tracking accuracy and stability, and consequently reduces resource and time waste in laser weeding.
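As a minimal illustration of the tracking metrics reported above (this is not the authors' evaluation code, and the per-frame counts below are hypothetical), MOTA and MOTP can be computed from false negatives, false positives, ID switches, and matched-pair overlaps accumulated over all frames:

```python
def mota(frames):
    """MOTA = 1 - (FN + FP + IDSW) / GT, with counts summed over all frames."""
    fn = sum(f["fn"] for f in frames)
    fp = sum(f["fp"] for f in frames)
    idsw = sum(f["idsw"] for f in frames)
    gt = sum(f["gt"] for f in frames)
    return 1.0 - (fn + fp + idsw) / gt

def motp(frames):
    """MOTP = average overlap (here IoU) over all matched track/detection pairs."""
    total_overlap = sum(sum(f["match_ious"]) for f in frames)
    matches = sum(len(f["match_ious"]) for f in frames)
    return total_overlap / matches

# Hypothetical per-frame statistics for a short two-frame clip.
frames = [
    {"gt": 10, "fn": 1, "fp": 0, "idsw": 0, "match_ious": [0.8, 0.9]},
    {"gt": 10, "fn": 0, "fp": 1, "idsw": 1, "match_ious": [0.7, 0.8]},
]
print(f"MOTA = {mota(frames):.3f}")  # 1 - 3/20 = 0.850
print(f"MOTP = {motp(frames):.3f}")  # mean IoU of the 4 matches = 0.800
```

This makes the conclusion concrete: fewer ID switches and fewer missed/spurious boxes raise MOTA directly, while tighter localization raises MOTP.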
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Hu, X.; Zhang, W.; Zhu, Q. Zhonghua Bencao; Shanghai Science and Technology Publications: Shanghai, China, 1998; pp. 225–226.
- Liang, J.R.; Shi, J.J.; Yuan, W.H.; Zhang, Y.; Ding, C.H. First Report of Chinese Medicinal Plant Dictamnus dasycarpus Leaf Spot Disease Caused by Fusarium scirpi in China. Plant Dis. 2024, 108, 2232.
- Wang, Z.; Xu, F.; An, S. Chemical constituents from the root bark of Dictamnus dasycarpus Turcz. China J. Chin. Mater. Medica 1992, 17, 551–576.
- Deng, Z.; Wang, T.; Zheng, Y.; Zhang, W.; Yun, Y.-H. Deep learning in food authenticity: Recent advances and future trends. Trends Food Sci. Technol. 2024, 144, 104344.
- Mesnage, R.; Székács, A.; Zaller, J.G. Herbicides: Brief History, Agricultural Use, and Potential Alternatives for Weed Control. In Herbicides; Mesnage, R., Zaller, J.G., Eds.; Elsevier: Amsterdam, The Netherlands, 2021; pp. 1–20.
- Jin, X.; Liu, T.; Yang, Z.; Xie, J.; Bagavathiannan, M.; Hong, X.; Xu, Z.; Chen, X.; Yu, J.; Chen, Y. Precision weed control using a smart sprayer in dormant bermudagrass turf. Crop Prot. 2023, 172, 106302.
- Sharma, V.; Tripathi, A.K.; Mittal, H. Technological revolutions in smart farming: Current trends, challenges & future directions. Comput. Electron. Agric. 2022, 201, 107217.
- Mwitta, C.; Rains, G.C.; Prostko, E. Evaluation of Diode Laser Treatments to Manage Weeds in Row Crops. Agronomy 2022, 12, 2681.
- Yu, K.; Ren, J.; Zhao, Y. Principles, developments and applications of laser-induced breakdown spectroscopy in agriculture: A review. Artif. Intell. Agric. 2020, 4, 127–139.
- Upadhyay, A.; Zhang, Y.; Koparan, C.; Rai, N.; Howatt, K.; Bajwa, S.; Sun, X. Advances in ground robotic technologies for site-specific weed management in precision agriculture: A review. Comput. Electron. Agric. 2024, 225, 109363.
- Quan, L.; Jiang, W.; Li, H.; Li, H.; Wang, Q.; Chen, L. Intelligent intra-row robotic weeding system combining deep learning technology with a targeted weeding mode. Biosyst. Eng. 2022, 216, 13–31.
- Zhu, H.; Zhang, Y.; Mu, D.; Bai, L.; Wu, X.; Zhuang, H.; Li, H. Research on improved YOLOx weed detection based on lightweight attention module. Crop Prot. 2024, 177, 106563.
- Guo, W.; Qiao, S.; Zhao, C.; Zhang, T. Defect detection for industrial neutron radiographic images based on modified YOLO network. Nucl. Instrum. Methods Phys. Res. Sect. A Accel. Spectrometers Detect. Assoc. Equip. 2023, 1056, 168694.
- Zhu, H.B.; Zhang, Y.Y.; Mu, D.L.; Bai, L.Z.; Zhuang, H.; Li, H. YOLOX-based blue laser weeding robot in corn field. Front. Plant Sci. 2022, 13, 1017803.
- Mwitta, C.J. Development of the Autonomous Diode Laser Weeding Robot. Ph.D. Thesis, University of Georgia, Athens, GA, USA, 2023.
- Liu, S.; Jin, Y.; Ruan, Z.; Ma, Z.; Gao, R.; Su, Z. Real-Time Detection of Seedling Maize Weeds in Sustainable Agriculture. Sustainability 2022, 14, 15088.
- Shao, Y.; Guan, X.; Xuan, G.; Gao, F.; Feng, W.; Gao, G.; Wang, Q.; Huang, X.; Li, J. GTCBS-YOLOv5s: A lightweight model for weed species identification in paddy fields. Comput. Electron. Agric. 2023, 215, 108461.
- Peng, H.; Li, Z.; Zhou, Z.; Shao, Y. Weed detection in paddy field using an improved RetinaNet network. Comput. Electron. Agric. 2022, 199, 107179.
- Tolias, A.; Papanicolaou, G.C.; Alexandropoulos, D. Fabrication of glass to PLA joints with an intermediate aluminum layer by using low-cost industrial nanosecond IR fiber lasers. Opt. Laser Technol. 2024, 175, 110811.
- Kuantama, E.; Zhang, Y.; Rahman, F.; Han, R.; Dawes, J.; Mildren, R.; Abir, T.A.; Nguyen, P. Laser-based drone vision disruption with a real-time tracking system for privacy preservation. Expert Syst. Appl. 2024, 255, 124626.
- Liu, Y.; Li, Y. Positioning accuracy improvement for target point tracking of robots based on Extended Kalman Filter with an optical tracking system. Robot. Auton. Syst. 2024, 179, 104751.
- Chai, J.; He, S.; Shin, H.-S.; Tsourdos, A. Domain-knowledge-aided airborne ground moving targets tracking. Aerosp. Sci. Technol. 2024, 144, 108807.
- Su, Y.; Cheng, T.; He, Z. Collaborative trajectory planning and transmit resource scheduling for multiple target tracking in distributed radar network system with GTAR. Signal Process. 2024, 223, 109550.
- Chen, G.; Xu, Y.; Yang, X.; Hu, H.; Cheng, H.; Zhu, L.; Zhang, J.; Shi, J.; Chai, X. Target tracking control of a bionic mantis shrimp robot with closed-loop central pattern generators. Ocean Eng. 2024, 297, 116963.
- Wang, Z.; Walsh, K.; Koirala, A. Mango Fruit Load Estimation Using a Video Based MangoYOLO—Kalman Filter—Hungarian Algorithm Method. Sensors 2019, 19, 2742.
- Li, X.; Wang, X.; Ong, P.; Yi, Z.; Ding, L.; Han, C. Fast Recognition and Counting Method of Dragon Fruit Flowers and Fruits Based on Video Stream. Sensors 2023, 23, 8444.
- Özlüoymak, Ö. Design and development of a servo-controlled target-oriented robotic micro-dose spraying system in precision weed control. Semin.-Cienc. Agrar. 2021, 42, 635–656.
- Zhang, Y.; Sun, P.; Jiang, Y.; Yu, D.; Yuan, Z.; Luo, P.; Liu, W.; Wang, X. ByteTrack: Multi-Object Tracking by Associating Every Detection Box. In Proceedings of the European Conference on Computer Vision, Montreal, QC, Canada, 11 October 2021.
- Li, X.; Orchard, M.T. New edge-directed interpolation. IEEE Trans. Image Process. 2001, 10, 1521–1527.
- Wang, C.-Y.; Bochkovskiy, A.; Liao, H.-Y.M. YOLOv7: Trainable Bag-of-Freebies Sets New State-of-the-Art for Real-Time Object Detectors. In Proceedings of the 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Vancouver, BC, Canada, 17–24 June 2023; pp. 7464–7475.
- Wang, J.; Chen, K.; Xu, R.; Liu, Z.; Loy, C.C.; Lin, D. CARAFE: Content-Aware ReAssembly of FEatures. In Proceedings of the 2019 IEEE/CVF International Conference on Computer Vision (ICCV), Seoul, Republic of Korea, 27 October–2 November 2019; pp. 3007–3016.
- Ding, X.; Zhang, X.; Ma, N.; Han, J.; Ding, G.; Sun, J. RepVGG: Making VGG-Style ConvNets Great Again. In Proceedings of the 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Nashville, TN, USA, 20–25 June 2021; pp. 13728–13737.
- Chen, J.; Kao, S.-h.; He, H.; Zhuo, W.; Wen, S.; Lee, C.-H.; Chan, S.-H.G. Run, Don’t Walk: Chasing Higher FLOPS for Faster Neural Networks. In Proceedings of the 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Vancouver, BC, Canada, 17–24 June 2023; pp. 12021–12031.
- Chollet, F. Xception: Deep Learning with Depthwise Separable Convolutions. In Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017; pp. 1800–1807.
- Wang, Y.; Jia, Y.; Tian, Y.; Xiao, J. Deep reinforcement learning with the confusion-matrix-based dynamic reward function for customer credit scoring. Expert Syst. Appl. 2022, 200, 117013.
Ablation results (species columns report per-class Accuracy/%):

| FasterNet | CARAFE | RepBlock | mAP/% | Chenopodium album | Digitaria sanguinalis | Poa annua | Acalypha australis | Commelina communis | Bidens pilosa | Capsella bursa-pastoris | Precision/% | Recall/% |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| × | × | × | 89.8 | 98.0 | 81.2 | 94.4 | 81.4 | 81.1 | 94.2 | 98.3 | 83.8 | 88.6 |
| √ | × | × | 88.7 | 90.7 | 86.6 | 87.5 | 85.9 | 83.6 | 92.0 | 94.6 | 85.6 | 89.1 |
| × | √ | × | 89.4 | 92.9 | 84.1 | 90.5 | 86.1 | 85.4 | 91.5 | 93.7 | 84.4 | 87.1 |
| × | × | √ | 89.4 | 97.5 | 80.1 | 92.2 | 80.6 | 80.8 | 94.0 | 98.0 | 83.2 | 88.4 |
| √ | √ | × | 89.7 | 91.9 | 87.4 | 91.5 | 87.3 | 84.4 | 92.1 | 93.4 | 86.1 | 88.1 |
| √ | × | √ | 89.8 | 94.7 | 82.0 | 87.9 | 90.1 | 83.3 | 94.5 | 95.9 | 84.7 | 89.2 |
| × | √ | √ | 90.4 | 94.3 | 85.4 | 92.1 | 86.3 | 87.2 | 93.0 | 94.2 | 87.7 | 85.6 |
| √ | √ | √ | 91.7 | 98.3 | 86.6 | 92.7 | 91.4 | 82.0 | 93.2 | 98.0 | 86.7 | 90.7 |
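The mAP, precision, and recall columns in the ablation table follow the standard detection definitions. A minimal sketch of how these are computed (with hypothetical counts and precision-recall points; not the authors' evaluation code) is:

```python
def precision_recall(tp, fp, fn):
    """Precision = TP/(TP+FP); Recall = TP/(TP+FN)."""
    return tp / (tp + fp), tp / (tp + fn)

def average_precision(points):
    """AP as the area under the interpolated precision-recall curve.
    `points` is a list of (recall, precision) pairs; mAP averages AP over classes."""
    pts = sorted(points)
    ap, prev_r = 0.0, 0.0
    for i, (r, _) in enumerate(pts):
        best_p = max(p for _, p in pts[i:])  # precision envelope at recall >= r
        ap += (r - prev_r) * best_p
        prev_r = r
    return ap

# Hypothetical counts for one weed class.
p, r = precision_recall(tp=90, fp=10, fn=10)
print(p, r)                                         # 0.9 0.9
print(average_precision([(0.5, 1.0), (1.0, 0.5)]))  # 0.75
```

The envelope step mirrors common practice (as in Pascal VOC-style all-point interpolation): precision at each recall level is replaced by the best precision achievable at any higher recall before integrating.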
| FasterNet | CARAFE | RepBlock | Floating-Point Operations (FLOPs)/G | Memory Usage/MB | GPU Speed/ms |
|---|---|---|---|---|---|
| × | × | × | 13.1 | 12.9 | 46 |
| √ | × | × | 11.9 | 11.4 | 40 |
| √ | √ | × | 11.6 | 11.6 | 39 |
| √ | √ | √ | 10.1 | 11.2 | 36 |
| Model | mAP/% | Memory Usage/MB | FLOPs/G | GPU Speed/ms |
|---|---|---|---|---|
| Faster R-CNN | 76.3 | 129.0 | 56.3 | 78 |
| ShuffleNetV2 | 74.2 | 12.3 | 13.6 | 41 |
| MobileNetV3 | 81.4 | 24.3 | 26.0 | 42 |
| GhostNetV2 | 82.9 | 15.0 | 14.1 | 41 |
| YOLOX | 85.3 | 33.3 | 27.9 | 44 |
| YOLOv5 | 87.3 | 43.2 | 22.4 | 47 |
| YOLOv7 | 92.6 | 77.4 | 24.2 | 49 |
| YOLOv8l | 92.9 | 48.6 | 160.0 | 42 |
| YOLOv8s | 89.1 | 12.1 | 29.4 | 38 |
| YOLO-Riny | 91.7 | 11.2 | 10.1 | 36 |
| Proposed Algorithm | MOTA/% | ID Switches | MOTP/% |
|---|---|---|---|
| YOLOv7-tiny + ByteTrack | 81.4 | 24 | 74.2 |
| YOLO-Riny-ByteTrack | 84.4 | 10 | 77.6 |
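The core idea behind the ByteTrack association used above is to match tracks against high-confidence detections first, then try to recover the still-unmatched tracks with low-confidence detections. The sketch below illustrates that two-stage scheme with greedy IoU matching (the real ByteTrack uses Kalman-predicted boxes and Hungarian assignment; thresholds and box values here are hypothetical, and this is not the paper's implementation):

```python
def iou(a, b):
    """IoU of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def greedy_match(tracks, boxes, thresh):
    """Greedily pair track boxes and detection boxes by descending IoU."""
    pairs = sorted(
        ((iou(t, d), ti, di) for ti, t in enumerate(tracks) for di, d in enumerate(boxes)),
        reverse=True,
    )
    used_t, used_d, matches = set(), set(), []
    for score, ti, di in pairs:
        if score < thresh:
            break
        if ti not in used_t and di not in used_d:
            matches.append((ti, di))
            used_t.add(ti)
            used_d.add(di)
    return matches

def associate(tracks, detections, score_thresh=0.6, iou_thresh=0.3):
    """Two-stage BYTE-style association: high-score detections first,
    then leftover tracks vs. low-score detections."""
    high = [d for d in detections if d["score"] >= score_thresh]
    low = [d for d in detections if d["score"] < score_thresh]
    first = greedy_match(tracks, [d["box"] for d in high], iou_thresh)
    matched = {ti for ti, _ in first}
    leftover = [ti for ti in range(len(tracks)) if ti not in matched]
    second = greedy_match([tracks[ti] for ti in leftover], [d["box"] for d in low], iou_thresh)
    # Second-stage detection indices refer to the `low` list.
    return first, [(leftover[ti], di) for ti, di in second]

# Two tracks; one detection is confident, the other weak (e.g. occluded weed).
tracks = [(0, 0, 10, 10), (20, 20, 30, 30)]
dets = [{"box": (1, 1, 11, 11), "score": 0.9}, {"box": (21, 21, 31, 31), "score": 0.4}]
first, second = associate(tracks, dets)
print(first, second)  # [(0, 0)] [(1, 0)]
```

Keeping low-score boxes in play is what lets a tracker hold onto a briefly occluded weed instead of spawning a new ID, which is why the combined model cuts ID switches in the table above.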
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Xu, Y.; Liu, Z.; Li, J.; Huang, D.; Chen, Y.; Zhou, Y. Real-Time Detection and Localization of Weeds in Dictamnus dasycarpus Fields for Laser-Based Weeding Control. Agronomy 2024, 14, 2363. https://doi.org/10.3390/agronomy14102363