FCB-YOLOv8s-Seg: A Malignant Weed Instance Segmentation Model for Targeted Spraying in Soybean Fields
Abstract
1. Introduction
- (1) A lightweight weed instance segmentation model named FCB-YOLOv8s-Seg was developed by incorporating an improved FasterNet backbone, a C2fSE attention module, and a BiFPN feature-fusion module.
- (2) The FCB-YOLOv8s-Seg model was trained on self-collected soybean field weed images, demonstrating a strong ability to discern subtle differences between weeds.
- (3) Comprehensive performance analysis and comparative experiments show that the FCB-YOLOv8s-Seg model outperforms existing baseline models in target detection and instance segmentation accuracy, while exhibiting better generalization ability and stability.
- (4) The FCB-YOLOv8s-Seg model was deployed on ground-based targeted spraying vehicles and tested in real soybean field environments, validating its effective segmentation performance and stable operation.
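The C2fSE module named above appends squeeze-and-excitation (SE) channel attention to YOLOv8's C2f block. A minimal NumPy sketch of the SE recalibration step only (the C2f convolutions are omitted, and the channel count and reduction ratio are illustrative assumptions, not values from the paper):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def se_block(x, w1, b1, w2, b2):
    """Squeeze-and-excitation channel attention.

    x: feature map of shape (C, H, W); w1/w2: weights of the
    squeeze (C -> C/r) and excite (C/r -> C) projections.
    """
    # Squeeze: global average pooling over spatial dims -> (C,)
    z = x.mean(axis=(1, 2))
    # Excite: bottleneck MLP with ReLU, then sigmoid gating
    s = np.maximum(z @ w1 + b1, 0.0)
    s = sigmoid(s @ w2 + b2)
    # Recalibrate: scale each channel by its attention weight
    return x * s[:, None, None]

# Toy example: 8 channels, reduction ratio r = 4 (assumed values)
rng = np.random.default_rng(0)
c, r = 8, 4
x = rng.normal(size=(c, 16, 16))
w1 = rng.normal(size=(c, c // r)) * 0.1
w2 = rng.normal(size=(c // r, c)) * 0.1
y = se_block(x, w1, np.zeros(c // r), w2, np.zeros(c))
assert y.shape == x.shape
```

Because the gate is a sigmoid, each channel is scaled by a factor in (0, 1), so informative channels are preserved while weak ones are suppressed before the features reach the neck.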
2. Materials and Methods
2.1. Weed Image Acquisition and Dataset Preprocessing
2.1.1. Weed Image Acquisition
2.1.2. Production of the Base Dataset
2.1.3. Data Division and Augmentation
2.2. YOLOv8s-Seg Network Model
2.3. Improvement Scheme of the YOLOv8s-Seg Model
2.3.1. Improvement of the Backbone Network Based on FasterNet
2.3.2. C2fSE Attention Module
2.3.3. Introduction and Optimization of the BiFPN Module
2.4. Model Training and Evaluation Indicators
2.4.1. Model Training Environment
2.4.2. Model Training and Testing Parameter Settings
2.4.3. Model Evaluation Indicators
3. Results and Discussion
3.1. Ablation Experiments
3.2. Segmentation Effect of the FCB-YOLOv8s-Seg Model in Different Scenes
3.3. Comparison of Performance Parameters with Other Models
3.4. Field Test and Result Analysis
4. Conclusions
- (1) Ablation experiments show that the FCB-YOLOv8s-Seg model significantly outperforms the original model on mAP metrics, achieving 95.18% mAP for bounding-box detection and 96.63% for segmentation, improvements of 5.08 and 7.43 percentage points, respectively. At the same time, inference FPS increased by 7.7% and model size decreased by 1.8%. The model's detection and segmentation accuracy surpasses that of several existing models while maintaining a balanced model size.
- (2) Detection results across different scenes show that the FCB-YOLOv8s-Seg model performs well in complex scenes and on fine-grained feature segmentation.
- (3) Comparative analysis with classic segmentation algorithms such as Mask R-CNN, YOLACT, and YOLOv5s-Seg demonstrates the feasibility and performance advantages of the FCB-YOLOv8s-Seg model in Cirsium setosum segmentation tasks.
- (4) In the field targeted spraying test, the average segmentation rate of the FCB-YOLOv8s-Seg model reached 91.3%, 6.38% higher than that of the original model, further demonstrating its practical reliability for smart agriculture applications.
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Conflicts of Interest
| Running Environment | Version |
| --- | --- |
| CPU | Intel Core i5-12600KF |
| GPU | NVIDIA GeForce RTX 3060 (12 GB) |
| Server environment | Ubuntu 20.04.3 LTS |
| Deep learning framework | PyTorch 3.8.18 |
| CUDA | 11.0 |
| Python version | 3.11 |
| Models | P_Obj (%) | R_Obj (%) | mAP_Obj (%) | P_Mask (%) | R_Mask (%) | mAP_Mask (%) | FPS | Size (MB) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| YOLOv8s-Seg | 82.2 | 88.0 | 90.1 | 81.7 | 87.5 | 89.2 | 75.47 | 22.7 |
| +FasterNet | 89.3 (+7.1) | 80.2 (−7.8) | 90.9 (+0.8) | 89.0 (+7.3) | 80.4 (−7.1) | 90.9 (+1.7) | 86.2 (+10.74) | 17.9 (−4.8) |
| +FasterNet+C2fSE | 90.9 (+1.6) | 85.9 (+5.7) | 92.9 (+2.0) | 89.9 (+0.9) | 85.9 (+5.5) | 91.8 (+0.9) | 81.3 (−4.91) | 18.1 (+0.2) |
| +FasterNet+C2fSE+BiFPN | 91.95 (+1.05) | 86.4 (+0.5) | 95.18 (+2.28) | 91.3 (+1.4) | 89.95 (+4.05) | 96.63 (+4.83) | 81.3 (+0) | 22.3 (+4.2) |

Values in parentheses are changes relative to the row above.
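The headline figures quoted in the conclusions follow from the first and last rows of this table. A quick Python sanity check (assuming, as the wording suggests, that the mAP gains are absolute percentage-point differences while the FPS and size changes are relative percentages):

```python
# Baseline (YOLOv8s-Seg) vs. full model (+FasterNet+C2fSE+BiFPN),
# values taken from the ablation table above.
base = {"map_obj": 90.1, "map_mask": 89.2, "fps": 75.47, "size_mb": 22.7}
full = {"map_obj": 95.18, "map_mask": 96.63, "fps": 81.3, "size_mb": 22.3}

# mAP gains: absolute percentage-point differences
map_obj_gain = round(full["map_obj"] - base["map_obj"], 2)      # 5.08
map_mask_gain = round(full["map_mask"] - base["map_mask"], 2)   # 7.43

# FPS and size: relative changes in percent
fps_change = round((full["fps"] / base["fps"] - 1) * 100, 1)            # +7.7
size_change = round((1 - full["size_mb"] / base["size_mb"]) * 100, 1)   # -1.8
```

All four recomputed values match the 5.08 pp, 7.43 pp, +7.7%, and −1.8% figures reported in the conclusions.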
| Models | mAP_Obj (%) | mAP_Mask (%) | Size (MB) |
| --- | --- | --- | --- |
| YOLOv5s-Seg | 88.80 | 87.65 | 14.5 |
| Mask R-CNN (ResNet-50) | 72.19 | 60.85 | 335 |
| YOLACT | 82.30 | 84.50 | 117 |
| YOLOv8s-Seg | 90.10 | 89.20 | 22.7 |
| FCB-YOLOv8s-Seg | 95.18 | 96.63 | 22.3 |
| Area Number | Speed (km/h) | Number of Cirsium arvense | YOLOv8s-Seg Segmentation Count | YOLOv8s-Seg Avg. Segmentation Rate (%) | FCB-YOLOv8s-Seg Segmentation Count | FCB-YOLOv8s-Seg Avg. Segmentation Rate (%) |
| --- | --- | --- | --- | --- | --- | --- |
| 1 | 2 | 97 | 87 | 86.60 | 95 | 94.16 |
|  | 3 |  | 85 |  | 91 |  |
|  | 4 |  | 80 |  | 88 |  |
| 2 | 2 | 142 | 130 | 83.80 | 136 | 89.67 |
|  | 3 |  | 119 |  | 127 |  |
|  | 4 |  | 108 |  | 119 |  |
| 3 | 2 | 106 | 96 | 84.91 | 102 | 90.88 |
|  | 3 |  | 91 |  | 97 |  |
|  | 4 |  | 83 |  | 90 |  |
| Total |  | 345 | 293.00 | 84.93 | 315 | 91.30 |

The average segmentation rate is the mean segmentation count across the three travel speeds divided by the area's weed total; the Total row sums these mean counts over all areas.
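The rate columns of the field-test table can be reproduced from the raw counts: the mean segmentation count over the three travel speeds, divided by the area's weed total. A short Python sketch of that arithmetic for the FCB-YOLOv8s-Seg results (illustrative, not code from the paper):

```python
# Per-area weed totals and FCB-YOLOv8s-Seg counts at 2, 3, and 4 km/h
areas = {
    1: (97, [95, 91, 88]),
    2: (142, [136, 127, 119]),
    3: (106, [102, 97, 90]),
}

# Per-area rate: mean count across speeds / weed total, in percent
rates = {}
for area, (total, counts) in areas.items():
    mean_count = sum(counts) / len(counts)
    rates[area] = round(100 * mean_count / total, 2)

# Overall rate: summed mean counts over summed weed totals
overall_mean = sum(sum(c) / len(c) for _, c in areas.values())
overall_total = sum(t for t, _ in areas.values())
overall_rate = round(100 * overall_mean / overall_total, 2)
```

This reproduces the per-area rates of 94.16%, 89.67%, and 90.88%, and the overall 91.30% average segmentation rate cited in the conclusions.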
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Yang, Z.; Wang, L.; Li, C.; Li, H. FCB-YOLOv8s-Seg: A Malignant Weed Instance Segmentation Model for Targeted Spraying in Soybean Fields. Agriculture 2024, 14, 2357. https://doi.org/10.3390/agriculture14122357