A Lightweight Cotton Field Weed Detection Model Enhanced with EfficientNet and Attention Mechanisms
Abstract
1. Introduction
2. Materials and Methods
2.1. Materials
2.2. Improved Efficient-Convolution Lightweight Weed Detection Model for Cotton Fields
2.2.1. Overall Structure of the Model
2.2.2. Improved Efficient Convolutional Lightweight Backbone Network
2.2.3. Head Network Combining Efficient Multi-Scale Attention with a Progressive Feature Pyramid
2.3. Experimental Environment
2.4. Model Evaluation Metrics
3. Results
3.1. Comparative Experimental Analysis of Lightweight Backbone Networks
3.2. Comparative Experimental Analysis of Backbone Networks with Attention Mechanisms
3.3. Comparative Experimental Analysis of the Improved Head-Network C2f Module Based on Attention Mechanisms
3.4. Ablation Study
3.5. Experimental Analysis of the Improved Model
3.6. Comparative Experimental Analysis of Detection Networks
3.7. Model Acceleration Test Based on TensorRT
4. Discussion
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Hu, K.; Wang, Z.; Coleman, G.; Bender, A.; Yao, T.; Zeng, S.; Song, D.; Schumann, A.; Walsh, M. Deep learning techniques for in-crop weed recognition in large-scale grain production systems: A review. Precis. Agric. 2024, 25, 1–29.
- Rani, S.V.J.; Kumar, P.S.; Priyadharsini, R.; Srividya, S.J.; Harshana, S. Automated weed detection system in smart farming for developing sustainable agriculture. Int. J. Environ. Sci. Technol. 2022, 19, 9083–9094.
- Lauwers, M.; De Cauwer, B.; Nuyttens, D.; Cool, S.R.; Pieters, J.G. Hyperspectral classification of Cyperus esculentus clones and morphologically similar weeds. Sensors 2020, 20, 2504.
- Xu, K.; Yuen, P.; Xie, Q.; Zhu, Y.; Cao, W.; Ni, J. WeedsNet: A dual attention network with RGB-D image for weed detection in natural wheat field. Precis. Agric. 2024, 25, 460–485.
- Li, J.; Chen, D.; Yin, X.; Li, Z. Performance evaluation of semi-supervised learning frameworks for multi-class weed detection. Front. Plant Sci. 2024, 15, 1396568.
- MacRae, A.W.; Webster, T.M.; Sosnoskie, L.M.; Culpepper, A.S.; Kichler, J.M. Cotton yield loss potential in response to length of Palmer amaranth (Amaranthus palmeri) interference. J. Cotton Sci. 2013, 17, 227–232.
- Mendoza-Bernal, J.; González-Vidal, A.; Skarmeta, A.F. A convolutional neural network approach for image-based anomaly detection in smart agriculture. Expert Syst. Appl. 2024, 247, 123210.
- Dang, F.; Chen, D.; Lu, Y.; Li, Z. YOLOWeeds: A novel benchmark of YOLO object detectors for multi-class weed detection in cotton production systems. Comput. Electron. Agric. 2023, 205, 107655.
- Ahmad, A.; Saraswat, D.; Aggarwal, V.; Etienne, A.; Hancock, B. Performance of deep learning models for classifying and detecting common weeds in corn and soybean production systems. Comput. Electron. Agric. 2021, 184, 1–30.
- Peteinatos, G.G.; Weis, M.; Andújar, D.; Rueda Ayala, V.; Gerhards, R. Potential use of ground-based sensor technologies for weed detection. Pest Manag. Sci. 2014, 70, 190–199.
- Farooq, A.; Jia, X.; Hu, J.; Zhou, J. Multi-resolution weed classification via convolutional neural network and superpixel based local binary pattern using remote sensing images. Remote Sens. 2019, 11, 1692.
- Shorewala, S.; Ashfaque, A.; Sidharth, R.; Verma, U. Weed density and distribution estimation for precision agriculture using semi-supervised learning. IEEE Access 2021, 9, 27971–27986.
- Dos Santos Ferreira, A.; Freitas, D.M.; Da Silva, G.G.; Pistori, H.; Folhes, M.T. Unsupervised deep learning and semi-automatic data labeling in weed discrimination. Comput. Electron. Agric. 2019, 165, 104963.
- Mu, Y.; Feng, R.; Ni, R.; Li, J.; Luo, T.; Liu, T.; Li, X.; Gong, H.; Guo, Y.; Sun, Y.; et al. A faster R-CNN-based model for the identification of weed seedling. Agronomy 2022, 12, 2867.
- Ilyas, T.; Lee, J.; Won, O.; Jeong, Y.; Kim, H. Overcoming field variability: Unsupervised domain adaptation for enhanced crop-weed recognition in diverse farmlands. Front. Plant Sci. 2023, 14, 1234616.
- Mu, Y.; Ni, R.; Fu, L.; Luo, T.; Feng, R.; Li, J.; Pan, H.; Wang, Y.; Sun, Y.; Gong, H.; et al. DenseNet weed recognition model combining local variance preprocessing and attention mechanism. Front. Plant Sci. 2023, 13, 1041510.
- Wang, Q.; Cheng, M.; Huang, S.; Cai, Z.; Zhang, J.; Yuan, H. A deep learning approach incorporating YOLO v5 and attention mechanisms for field real-time detection of the invasive weed Solanum rostratum Dunal seedlings. Comput. Electron. Agric. 2022, 199, 107194.
- Fan, X.; Chai, X.; Zhou, J.; Sun, T. Deep learning based weed detection and target spraying robot system at seedling stage of cotton field. Comput. Electron. Agric. 2023, 214, 108317.
- Chen, P.; Xia, T.; Yang, G. A new strategy for weed detection in maize fields. Eur. J. Agron. 2024, 159, 127289.
- Jin, X.; Sun, Y.; Che, J.; Bagavathiannan, M.; Yu, J.; Chen, Y. A novel deep learning-based method for detection of weeds in vegetables. Pest Manag. Sci. 2022, 78, 1861–1869.
- Singh, V.; Singh, D.; Kumar, H. Efficient application of deep neural networks for identifying small and multiple weed patches using drone images. IEEE Access 2024, 12, 71982–71996.
- Tan, M.; Le, Q. EfficientNet: Rethinking model scaling for convolutional neural networks. In Proceedings of the International Conference on Machine Learning, Long Beach, CA, USA, 9–15 June 2019; pp. 6105–6114.
- Lin, H.; Cheng, X.; Wu, X.; Shen, D. CAT: Cross attention in vision transformer. In Proceedings of the 2022 IEEE International Conference on Multimedia and Expo (ICME), Taipei, Taiwan, 18–22 July 2022; pp. 1–6.
- Hu, J.; Shen, L.; Sun, G. Squeeze-and-excitation networks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–23 June 2018; IEEE: New York, NY, USA; pp. 7132–7141.
- Yang, G.; Lei, J.; Zhu, Z.; Cheng, S.; Feng, Z.; Liang, R. AFPN: Asymptotic feature pyramid network for object detection. arXiv 2023, arXiv:2306.15988.
- Ouyang, D.; He, S.; Zhang, G.; Luo, M.; Guo, H.; Zhan, J.; Huang, Z. Efficient multi-scale attention module with cross-spatial learning. In Proceedings of the ICASSP 2023-2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Rhodes Island, Greece, 4–10 June 2023; pp. 1–5.
- Javanmardi, S.; Ashtiani, S.H.M.; Verbeek, F.J.; Martynenko, A. Computer-vision classification of corn seed varieties using deep convolutional neural network. J. Stored Prod. Res. 2021, 92, 101800.
- Zhou, D.; Hou, Q.; Chen, Y.; Feng, J.; Yan, S. Rethinking bottleneck structure for efficient mobile network design. In Proceedings of the Computer Vision-ECCV 2020: 16th European Conference, Glasgow, UK, 23–28 August 2020; Springer International Publishing: Berlin/Heidelberg, Germany, 2020; pp. 680–697.
- Cui, C.; Gao, T.; Wei, S.; Du, Y.; Guo, R.; Dong, S.; Lu, B.; Zhou, Y.; Lv, X.; Liu, Q.; et al. PP-LCNet: A lightweight CPU convolutional neural network. arXiv 2021, arXiv:2109.15099.
- Ma, N.; Zhang, X.; Zheng, H.T.; Sun, J. ShuffleNet V2: Practical guidelines for efficient CNN architecture design. In Proceedings of the European Conference on Computer Vision (ECCV), Munich, Germany, 8–14 September 2018; pp. 116–131.
- Woo, S.; Park, J.; Lee, J.Y.; Kweon, I.S. CBAM: Convolutional block attention module. In Proceedings of the European Conference on Computer Vision (ECCV), Munich, Germany, 8–14 September 2018; pp. 3–19.
- Qin, X.; Li, N.; Weng, C.; Su, D.; Li, M. Simple attention module based speaker verification with iterative noisy label detection. In Proceedings of the ICASSP 2022-2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Singapore, 22–27 May 2022; pp. 6722–6726.
- Li, X.; Wang, W.; Hu, X.; Yang, J. Selective kernel networks. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Long Beach, CA, USA, 15–20 June 2019; pp. 510–519.
- Li, Y.; Yao, T.; Pan, Y.; Mei, T. Contextual transformer networks for visual recognition. IEEE Trans. Pattern Anal. Mach. Intell. 2022, 45, 1489–1500.
- Misra, D.; Nalamada, T.; Arasanipalai, A.U.; Hou, Q. Rotate to attend: Convolutional triplet attention module. In Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision, Virtual, 5–9 January 2021; pp. 3139–3148.
- Wang, Q.; Wu, B.; Zhu, P.; Li, P.; Zuo, W.; Hu, Q. ECA-Net: Efficient channel attention for deep convolutional neural networks. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Seattle, WA, USA, 14–19 June 2020; pp. 11534–11542.
- Bolouri, F.; Kocoglu, Y.; Pabuayon, I.L.B.; Ritchie, G.L.; Sari-Sarraf, H. CottonSense: A high-throughput field phenotyping system for cotton fruit segmentation and enumeration on edge devices. Comput. Electron. Agric. 2024, 216, 108531.
- Muruganantham, P.; Wibowo, S.; Grandhi, S.; Samrat, N.H.; Islam, N. A systematic literature review on crop yield prediction with deep learning and remote sensing. Remote Sens. 2022, 14, 1990.
- Zhu, H.; Lin, C.; Liu, G.; Wang, D.; Qin, S.; Li, A.; Xu, J.-L.; He, Y. Intelligent agriculture: Deep learning in UAV-based remote sensing imagery for crop diseases and pests detection. Front. Plant Sci. 2024, 15, 1435016.
- Kashyap, P.K.; Kumar, S.; Jaiswal, A.; Prasad, M.; Gandomi, A.H. Towards precision agriculture: IoT-enabled intelligent irrigation systems using deep learning neural network. IEEE Sens. J. 2021, 21, 17479–17491.
- Kussul, N.; Lavreniuk, M.; Skakun, S.; Shelestov, A. Deep learning classification of land cover and crop types using remote sensing data. IEEE Geosci. Remote Sens. Lett. 2017, 14, 778–782.
- Maimaitijiang, M.; Sagan, V.; Sidike, P.; Daloye, A.M.; Erkbol, H.; Fritschi, F.B. Crop monitoring using satellite/UAV data fusion and machine learning. Remote Sens. 2020, 12, 1357.
- Navaneethan, S.; Sampath, J.L.; Kiran, S.S. Development of a multi-sensor fusion framework for early detection and monitoring of corn plant diseases. In Proceedings of the 2023 2nd International Conference on Automation, Computing and Renewable Systems (ICACRS), Pudukkottai, India, 11–13 December 2023; pp. 856–861.
| English | Latin | Quantity |
|---|---|---|
| Waterhemp | Amaranthus tuberculatus | 1967 |
| Morningglory | Ipomoea nil | 1372 |
| Purslane | Portulaca oleracea | 996 |
| Spotted Spurge | Euphorbia maculata | 1010 |
| Carpetweed | Mollugo verticillata | 995 |
| Ragweed | Ambrosia artemisiifolia | 917 |
| Eclipta | Eclipta prostrata | 877 |
| Prickly Sida | Sida spinosa | 509 |
| Palmer Amaranth | Amaranthus palmeri | 350 |
| Sicklepod | Senna obtusifolia | 249 |
| Goosegrass | Eleusine indica | 215 |
| Cutleaf Groundcherry | Physalis angulata | 144 |
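The class counts in the table above are strongly imbalanced, which matters when interpreting per-class detection results later in the paper. As a quick sanity check (a minimal sketch of ours based only on the table's quantities, not code from the paper), the annotations can be totaled and the largest-to-smallest class ratio computed:

```python
# Annotation counts per weed class, copied from the table above.
counts = {
    "Waterhemp": 1967, "Morningglory": 1372, "Purslane": 996,
    "Spotted Spurge": 1010, "Carpetweed": 995, "Ragweed": 917,
    "Eclipta": 877, "Prickly Sida": 509, "Palmer Amaranth": 350,
    "Sicklepod": 249, "Goosegrass": 215, "Cutleaf Groundcherry": 144,
}

total = sum(counts.values())                              # total annotated instances
imbalance = max(counts.values()) / min(counts.values())   # largest vs. smallest class

print(total)                # 9601
print(round(imbalance, 1))  # 13.7
```

A roughly 13.7:1 imbalance between the most and least frequent classes is one plausible reason the rarest classes can still score well only if their visual features are distinctive.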
| Environmental Parameter | Value |
|---|---|
| Operating system | CentOS |
| CPU | Intel(R) Xeon(R) CPU E5-2630 v4 (2.20 GHz, 10 cores) |
| Memory | 22 GB × 3 |
| GPU | NVIDIA Tesla P40 |
| Programming language | Python 3.8 |
| Experimental framework | PyTorch 2.0.1 |
| GPU acceleration library | CUDA 11.3 |
| Dataset | CottonWeedDet12 |
| Models | mAP (%) | Detection Time Per Image (ms) | FPS | Model Size (MB) | Params (MB) |
|---|---|---|---|---|---|
| YOLOv8 | 91.98 | 12.7 | 78.70 | 5.9 | 11.45 |
| YOLO-MobileNeXt | 91.30 | 13.6 | 75.53 | 4.8 | 8.34 |
| YOLO-PP-LCNet | 89.90 | 2.3 | 434.78 | 4.1 | 7.24 |
| YOLO-ShuffleNetV2 | 79.30 | 3.1 | 322.58 | 3.3 | 22.5 |
| YOLO-EfficientNet | 91.80 | 1.7 | 588.24 | 4.5 | 7.97 |
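The FPS column in these tables is simply the reciprocal of the per-image detection time. A minimal sketch (ours, not the authors' code) reproduces the conversion for two rows of the table above:

```python
def fps(ms_per_image: float) -> float:
    """Convert per-image detection time in milliseconds to frames per second."""
    return round(1000.0 / ms_per_image, 2)

# Values taken from the lightweight-backbone comparison table above.
print(fps(1.7))  # 588.24  (YOLO-EfficientNet)
print(fps(2.3))  # 434.78  (YOLO-PP-LCNet)
```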
| Models | mAP (%) | Inference Time (ms) | FPS | Model Size (MB) | Params (MB) |
|---|---|---|---|---|---|
| YOLOv8 | 91.98 | 12.7 | 78.70 | 5.9 | 11.45 |
| EfficientB0-EMA | 92.30 | 3.5 | 385.71 | 4.6 | 8.36 |
| EfficientB0-CBAM | 91.50 | 1.9 | 526.32 | 4.6 | 7.98 |
| EfficientB0-SE | 91.90 | 2.0 | 500.00 | 4.5 | 8.36 |
| EfficientB0-SimAM | 91.80 | 2.5 | 400.00 | 4.5 | 8.35 |
| EfficientB0-SK | 90.70 | 2.0 | 500.00 | 5.0 | 9.16 |
| EfficientB0-CoT | 92.40 | 2.7 | 370.37 | 4.6 | 8.44 |
| EfficientB0-CA | 92.10 | 2.1 | 476.19 | 4.6 | 8.36 |
| Models | mAP (%) | Inference Time (ms) | FPS | Model Size (MB) | Params (MB) |
|---|---|---|---|---|---|
| YOLOv8 | 91.98 | 12.7 | 78.70 | 5.9 | 11.45 |
| C2f_SE | 91.10 | 3.0 | 333.33 | 4.6 | 7.98 |
| C2f_CA | 91.50 | 1.9 | 526.32 | 4.6 | 7.98 |
| C2f_ECA | 91.50 | 19.2 | 52.08 | 4.6 | 7.99 |
| C2f_Triplet | 91.90 | 2.4 | 416.67 | 4.6 | 7.96 |
| C2f_EMA | 91.40 | 1.9 | 526.32 | 5.0 | 8.74 |
| YOLO-WL | 92.30 | 1.9 | 526.32 | 4.6 | 7.98 |
| Models | EfficientNet | CA | C2f_EMA | mAP (%) | Inference Time (ms) | Model Size (MB) |
|---|---|---|---|---|---|---|
| YOLOv8 | × | × | × | 91.98 | 12.70 | 5.9 |
| YOLO-EfficientNet | ✓ | × | × | 91.80 | 1.70 | 4.5 |
| YOLO-CA | × | ✓ | × | 92.10 | 2.70 | 4.6 |
| YOLO-C2f_EMA-AFPN | ✓ | × | ✓ | 91.40 | 1.90 | 5.0 |
| YOLO-WL | ✓ | ✓ | ✓ | 92.30 | 1.90 | 4.6 |
| Weeds | mAP (%) | Precision (%) | Recall (%) |
|---|---|---|---|
| Waterhemp | 81.5 | 92.4 | 55.7 |
| Morningglory | 90.0 | 93.0 | 86.4 |
| Purslane | 96.8 | 94.4 | 92.0 |
| Spotted Spurge | 89.8 | 84.8 | 81.2 |
| Carpetweed | 94.9 | 96.1 | 85.8 |
| Ragweed | 85.7 | 96.3 | 76.1 |
| Eclipta | 92.9 | 97.1 | 87.9 |
| Prickly Sida | 99.5 | 96.2 | 100.0 |
| Palmer Amaranth | 93.6 | 93.5 | 90.4 |
| Sicklepod | 93.5 | 90.9 | 94.4 |
| Goosegrass | 92.5 | 92.6 | 90.9 |
| Cutleaf Groundcherry | 96.6 | 97.9 | 89.8 |
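As a consistency check, the unweighted mean of the per-class AP values in the table should recover the overall mAP reported for YOLO-WL (92.30%). The sketch below (ours, using only the numbers from the table) performs that average:

```python
# Per-class AP values (%), copied row by row from the table above.
ap_per_class = [81.5, 90.0, 96.8, 89.8, 94.9, 85.7,
                92.9, 99.5, 93.6, 93.5, 92.5, 96.6]

# mAP is the unweighted mean of the per-class average precisions.
mAP = sum(ap_per_class) / len(ap_per_class)
print(round(mAP, 1))  # 92.3, consistent with the reported overall mAP of 92.30%
```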
| Models | mAP (%) | Inference Time (ms) | FPS | Model Size (MB) | Params (MB) |
|---|---|---|---|---|---|
| YOLOv3 | 93.10 | 54.2 | 18.45 | 791.0 | 395.13 |
| YOLOv5 | 91.80 | 12.7 | 78.74 | 5.0 | 9.56 |
| YOLOv7-tiny | 92.87 | 25.5 | 39.22 | 11.8 | 23.04 |
| YOLOv7 | 92.65 | 120.4 | 8.31 | 285.0 | 139.34 |
| YOLOv8 | 91.98 | 12.7 | 78.70 | 5.9 | 11.45 |
| YOLO-WL | 92.30 | 1.9 | 526.32 | 4.6 | 7.98 |
| Models | mAP (%) | Inference Time (ms) |
|---|---|---|
| YOLOv8 | 91.98 | 30.851 |
| YOLO-WL | 92.3 | 23.134 |
| YOLO-WL-RT | 92.1 | 2.443 |
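The TensorRT-optimized model (YOLO-WL-RT) trades a marginal drop in mAP (92.3% to 92.1%) for a large reduction in inference time. The speedups implied by the table can be computed directly (a minimal sketch of ours based on the table's timings):

```python
# Inference times (ms) from the TensorRT acceleration table above.
times = {"YOLOv8": 30.851, "YOLO-WL": 23.134, "YOLO-WL-RT": 2.443}

# Speedup of the TensorRT-optimized model relative to the other two.
speedup_vs_yolov8 = times["YOLOv8"] / times["YOLO-WL-RT"]
speedup_vs_wl = times["YOLO-WL"] / times["YOLO-WL-RT"]

print(round(speedup_vs_yolov8, 1))  # 12.6x faster than the YOLOv8 baseline
print(round(speedup_vs_wl, 1))      # 9.5x faster than the unoptimized YOLO-WL
```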
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Citation: Zheng, L.; Long, L.; Zhu, C.; Jia, M.; Chen, P.; Tie, J. A Lightweight Cotton Field Weed Detection Model Enhanced with EfficientNet and Attention Mechanisms. Agronomy 2024, 14, 2649. https://doi.org/10.3390/agronomy14112649