Weed Detection in Peanut Fields Based on Machine Vision
Abstract
1. Introduction
2. Materials and Methods
2.1. Materials
2.1.1. Data Acquisition
2.1.2. Data Augmentation and Annotation
2.2. Methods
2.2.1. EM-YOLOv4-Tiny Network
2.2.2. ECA Attention Mechanisms
2.2.3. Use of Complete Intersection over Union (CIoU) Loss
2.2.4. Soft-NMS Algorithm for Filtering Prediction Boxes
2.2.5. Model Performance Evaluation Indices
2.2.6. Model Training
3. Results
3.1. Performance Evaluation of EM-YOLOv4-Tiny
3.2. Performance Comparison of Improved Methods
3.3. Performance Comparison of Different Attention Mechanisms
3.4. Comparison of Performance with Different Network Models
3.5. Comparison of Performance under Different Scenarios
4. Discussion
4.1. Deep Learning for Weed Detection
4.2. Challenge of Small Target Detection
4.3. Limitations and Shortcomings
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A
Appendix B
References
Dataset | Train | Test | Total
---|---|---|---
Original Images | 700 | 155 | 855
Horizontal Flip | 500 | 0 | 500
Vertical Flip | 500 | 0 | 500
Brightness Increase | 500 | 0 | 500
Brightness Decrease | 500 | 0 | 500
Gaussian Noise | 500 | 0 | 500
Total | 3200 | 155 | 3355
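The five augmentation operations listed above (horizontal and vertical flips, brightness increase and decrease, and additive Gaussian noise) expand the training set from 700 to 3200 images while leaving the 155 test images untouched. The sketch below illustrates these transforms with OpenCV and NumPy; the brightness factors (1.3 and 0.7) and the noise standard deviation (15) are assumed values, not the authors' settings, and in practice the bounding-box annotations must be flipped alongside the images.

```python
import cv2
import numpy as np

def augment(image):
    """Generate the five augmented variants from the table above.

    Illustrative re-implementations only; brightness factors and
    noise variance are assumed values, not the paper's settings.
    """
    variants = {}
    variants["flip_horizontal"] = cv2.flip(image, 1)  # mirror left-right
    variants["flip_vertical"] = cv2.flip(image, 0)    # mirror top-bottom
    # Brightness shift: scale pixel values up or down, clipped to [0, 255].
    variants["brightness_up"] = cv2.convertScaleAbs(image, alpha=1.3, beta=0)
    variants["brightness_down"] = cv2.convertScaleAbs(image, alpha=0.7, beta=0)
    # Additive Gaussian noise with an assumed standard deviation of 15.
    noise = np.random.normal(0, 15, image.shape).astype(np.float32)
    noisy = np.clip(image.astype(np.float32) + noise, 0, 255)
    variants["gauss_noise"] = noisy.astype(np.uint8)
    return variants
```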
Configuration | Parameter
---|---
Operating System | Ubuntu 18.04.1 LTS
CPU | Intel(R) Xeon(R) Silver 4114 @ 2.20 GHz
GPU | NVIDIA Tesla T4
Acceleration Environment | CUDA 10.2, cuDNN 7.6.5
PyTorch | 1.2
Python | 3.6.2
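When reproducing this environment, a quick runtime check like the one below simply prints the versions PyTorch reports, which should match the table above.

```python
import torch

# Sanity check against the training environment listed above
# (PyTorch 1.2, CUDA 10.2, cuDNN 7.6.5, NVIDIA Tesla T4).
print("PyTorch:", torch.__version__)
print("CUDA:", torch.version.cuda)
print("cuDNN:", torch.backends.cudnn.version())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))
```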
Models | mAP (Small Targets)/% | mAP (All Targets)/% | Volume/MB | Time/ms
---|---|---|---|---
YOLOv4-Tiny | 79.53 | 87.71 | 22.4 | 6.0
EM-YOLOv4-Tiny | 89.65 | 94.54 | 28.7 | 10.4
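The ablation below starts by replacing the default anchors with ones re-estimated by K-Means over the ground-truth box sizes in the training set. The following is a minimal sketch of YOLO-style anchor clustering using 1 − IoU as the distance metric, assuming k = 9 anchors (three per detection scale after scale3 is added) and mean-based cluster updates; the paper's exact settings are not reproduced here.

```python
import numpy as np

def kmeans_anchors(boxes, k=9, iters=100, seed=0):
    """Cluster ground-truth (width, height) pairs into k anchors.

    boxes: (N, 2) float array of box widths and heights.
    Distance is 1 - IoU of co-located boxes, as in YOLO-style
    anchor estimation; k = 9 is an assumed value.
    """
    rng = np.random.default_rng(seed)
    centers = boxes[rng.choice(len(boxes), k, replace=False)]
    for _ in range(iters):
        # IoU between every box and every center (boxes share a corner).
        inter = np.minimum(boxes[:, None, 0], centers[None, :, 0]) * \
                np.minimum(boxes[:, None, 1], centers[None, :, 1])
        union = (boxes[:, 0] * boxes[:, 1])[:, None] + \
                (centers[:, 0] * centers[:, 1])[None, :] - inter
        assign = np.argmax(inter / union, axis=1)  # nearest center = max IoU
        new_centers = np.array([boxes[assign == i].mean(axis=0)
                                if np.any(assign == i) else centers[i]
                                for i in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return centers[np.argsort(centers.prod(axis=1))]  # sorted by area
```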
Method | Precision/% | Recall/% | mAP/% | F1 | Time/ms
---|---|---|---|---|---
YOLOv4-Tiny | 87.60 | 75.60 | 87.71 | 0.80 | 6.0
YOLOv4-Tiny + K-Means | 91.80 | 74.80 | 88.90 | 0.82 | 6.0
YOLOv4-Tiny + K-Means + Soft-NMS | 88.16 | 84.91 | 90.37 | 0.86 | 6.0
YOLOv4-Tiny + K-Means + Soft-NMS + scale3 | 95.40 | 82.90 | 93.72 | 0.89 | 9.0
YOLOv4-Tiny + K-Means + Soft-NMS + scale3 + ECA (EM-YOLOv4-Tiny) | 96.70 | 85.90 | 94.54 | 0.90 | 10.4
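The Soft-NMS row decays the scores of overlapping prediction boxes instead of discarding them outright, which is why recall jumps from 74.80% to 84.91% in the ablation above. Below is a minimal sketch of the Gaussian variant (Bodla et al., 2017), assuming sigma = 0.5 and a 0.001 score floor; these hyperparameters are illustrative, not the paper's.

```python
import numpy as np

def soft_nms(boxes, scores, sigma=0.5, score_thresh=0.001):
    """Gaussian Soft-NMS: decay overlapping scores by exp(-IoU^2 / sigma).

    boxes: (N, 4) array of [x1, y1, x2, y2]; scores: (N,) confidences.
    Returns indices of kept boxes in order of selection.
    """
    boxes, scores = boxes.astype(float), scores.astype(float).copy()
    keep, idxs = [], list(range(len(boxes)))
    while idxs:
        best = max(idxs, key=lambda i: scores[i])  # highest remaining score
        keep.append(best)
        idxs.remove(best)
        x1, y1, x2, y2 = boxes[best]
        for i in idxs:
            # IoU between the selected box and each remaining box.
            ix1, iy1 = max(x1, boxes[i][0]), max(y1, boxes[i][1])
            ix2, iy2 = min(x2, boxes[i][2]), min(y2, boxes[i][3])
            inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
            union = ((x2 - x1) * (y2 - y1)
                     + (boxes[i][2] - boxes[i][0]) * (boxes[i][3] - boxes[i][1])
                     - inter)
            iou = inter / union if union > 0 else 0.0
            scores[i] *= np.exp(-(iou ** 2) / sigma)  # Gaussian penalty
        idxs = [i for i in idxs if scores[i] >= score_thresh]
    return keep
```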
Method | Precision/% | Recall/% | mAP/% | F1 | Time/ms
---|---|---|---|---|---
Base-SE | 96.30 | 79.60 | 92.32 | 0.87 | 11.0
Base-CBAM | 97.50 | 80.80 | 93.15 | 0.88 | 12.0
Base-ECA (EM-YOLOv4-Tiny) | 96.70 | 85.90 | 94.54 | 0.90 | 10.4
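The ECA module that gives EM-YOLOv4-Tiny its edge over SE and CBAM in the table above avoids dimensionality-reducing fully connected layers: channel weights come from a single 1-D convolution over globally pooled channel descriptors. Below is a minimal PyTorch sketch of the standard ECA-Net module, assuming the usual adaptive kernel size (gamma = 2, b = 1); where the module is inserted in the network is not shown here.

```python
import math
import torch
import torch.nn as nn

class ECA(nn.Module):
    """Efficient Channel Attention (Wang et al., 2020): global average
    pooling followed by a 1-D convolution across channels, with no
    dimensionality reduction."""

    def __init__(self, channels, gamma=2, b=1):
        super().__init__()
        # Adaptive kernel size: nearest odd value to log2(C)/gamma + b/gamma.
        t = int(abs(math.log2(channels) / gamma + b / gamma))
        k = t if t % 2 else t + 1
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.conv = nn.Conv1d(1, 1, kernel_size=k, padding=k // 2, bias=False)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x):
        # x: (B, C, H, W) -> per-channel descriptor (B, C, 1, 1)
        y = self.pool(x)
        # Treat channels as a 1-D sequence: (B, 1, C) for the convolution.
        y = self.conv(y.squeeze(-1).transpose(-1, -2))
        y = self.sigmoid(y.transpose(-1, -2).unsqueeze(-1))
        return x * y  # reweight each channel of the input
```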
Model | mAP/% | F1 | Time/ms | Volume/MB | Parameters/×10⁶
---|---|---|---|---|---
Faster-RCNN | 84.90 | 0.78 | 121.0 | 111.4 | 28.3
YOLOv4 | 89.76 | 0.80 | 25.2 | 234.0 | 64.0
YOLOv5s | 87.78 | 0.86 | 15.0 | 27.1 | 7.1
Swin-Transformer | 89.70 | 0.89 | 20.4 | 117.8 | 30.8
DETR | 95.30 | 0.92 | 32.7 | 158.9 | 41.0
EM-YOLOv4-Tiny | 94.54 | 0.90 | 10.4 | 27.8 | 6.8
Scenarios | Precision/% | Recall/% | mAP/% | F1
---|---|---|---|---
Single Weed | 94.67 | 96.03 | 98.48 | 0.95
Sparsely Distributed | 95.97 | 93.21 | 98.16 | 0.94
Vigorous Growth | 90.24 | 89.52 | 94.30 | 0.90
Mean | 93.62 | 93.01 | 96.98 | 0.93
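For reference, the indices reported throughout these tables follow the usual detection definitions; the sketch below uses hypothetical counts, and mAP (which additionally averages precision over recall levels per class at a fixed IoU threshold) is omitted.

```python
def detection_metrics(tp, fp, fn):
    """Precision, recall, and F1 from true/false positives and false
    negatives at a fixed IoU threshold."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Hypothetical counts, chosen only to mirror the "Vigorous Growth"
# row above (precision ~0.90, recall ~0.89, F1 ~0.90).
p, r, f1 = detection_metrics(tp=90, fp=10, fn=11)
print(round(p, 2), round(r, 2), round(f1, 2))  # 0.9 0.89 0.9
```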
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).