Review of Weed Detection Methods Based on Computer Vision
Abstract
1. Introduction
2. Public Image Datasets
3. Traditional Machine Learning Weed Detection Methods
3.1. Traditional Features and Their Advantages and Disadvantages for Common Weed Detection
3.1.1. Texture Features
3.1.2. Shape Features
3.1.3. Spectral Features
3.1.4. Color Features
3.2. Multi-Feature Fusion
3.3. Classifier
4. Weed Detection and Identification Methods Based on Deep Learning
4.1. Weed Detection and Identification Methods Based on CNNs
4.2. Weed Detection and Identification Methods Based on FCNs
4.3. Weed Detection and Identification Methods Based on Semi- and Unsupervised Feature Learning
4.4. Other Deep Learning Methods
5. Weeding Machinery
6. Discussion
6.1. Various Weed Detection Tasks
- (1)
- Different crops and weed species pose diverse detection problems. When a crop closely resembles its associated weeds, detection is difficult. Much of the relevant research has classified and identified the leaves of specific plants rather than actual field images with complex backgrounds, as shown in Figure 2. When such methods are applied to weed detection in the field, accuracy is low and stability is poor.
- (2)
- Different datasets and evaluation indicators. At present, few public datasets are available, so many studies have been conducted on self-built datasets. Even when several datasets feature the same crop, an algorithm ports poorly across different growth periods, illumination conditions, and actual field backgrounds. Because algorithms are developed on different datasets, their evaluation indicators are not directly comparable, and actual performance is difficult to determine.
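Part of the comparability problem above is that studies report different indicators on different datasets. One mitigation is to report a standard set of pixel-wise indicators alongside any dataset-specific ones; a minimal sketch (the function name and the particular metric selection are our own illustration, not drawn from any cited study):

```python
import numpy as np

def weed_detection_metrics(pred, truth):
    """Pixel-wise precision, recall, and IoU for a binary weed mask."""
    pred, truth = np.asarray(pred, bool), np.asarray(truth, bool)
    tp = np.logical_and(pred, truth).sum()    # weed pixels correctly detected
    fp = np.logical_and(pred, ~truth).sum()   # background flagged as weed
    fn = np.logical_and(~pred, truth).sum()   # weed pixels missed
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    iou = tp / (tp + fp + fn) if tp + fp + fn else 0.0
    return {"precision": precision, "recall": recall, "iou": iou}
```

Because these three values are defined per pixel rather than per dataset, they transfer across growth stages and field backgrounds more directly than a single accuracy figure.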
6.2. Multiple Complex Factors Affect Weed Detection
- (1)
- The influence of different growth stages. Most plants change their leaf morphology, texture, and spectral characteristics in different seasons or growth and development stages.
- (2)
- The influence of changing light conditions. Under different light conditions, shading within the plant canopy and the angle of the sun affect the apparent color of the vegetation. Some scholars have used the excess-green index and the Otsu algorithm to mitigate the problems caused by ambient light. In particular, Åstrand et al. [156] addressed these problems using camera filters and different types of cameras. The HSI color model has also been applied, generating grayscale images from the H component to reduce the impact of uneven lighting on color images [157].
- (3)
- Influence of overlapping leaves and occlusion. The accurate segmentation of plants is a challenging task. In complex field images, overlapping leaves, occlusions, leaf shadows, and dead or damaged leaves make it difficult to segment individual leaves effectively during image processing.
- (4)
- Bottleneck of weed detection. Factors such as hardware, algorithm complexity, and plant density limit the practical detection speed or accuracy. Hence, fast image processing and accurate weed identification remain extremely important challenges.
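The excess-green-plus-Otsu approach to changing light conditions mentioned above can be sketched as follows (a simplified illustration; the function name and the 8-bit rescaling step are our own assumptions, not code from the cited studies):

```python
import numpy as np

def exg_otsu_mask(rgb):
    """Segment vegetation: normalized excess-green index, then Otsu thresholding."""
    r, g, b = (rgb[..., i].astype(np.float64) for i in range(3))
    total = r + g + b + 1e-9
    exg = 2 * (g / total) - (r / total) - (b / total)  # illumination-normalized ExG
    # rescale to 8-bit so Otsu can search over 256 histogram bins
    scaled = (255 * (exg - exg.min()) / (np.ptp(exg) + 1e-9)).astype(np.uint8)
    hist = np.bincount(scaled.ravel(), minlength=256).astype(np.float64)
    hist /= hist.sum()
    bins = np.arange(256, dtype=np.float64)
    best_t, best_var = 0, 0.0
    for t in range(1, 256):
        w0, w1 = hist[:t].sum(), hist[t:].sum()  # class probabilities
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (hist[:t] * bins[:t]).sum() / w0   # class means
        mu1 = (hist[t:] * bins[t:]).sum() / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2  # Otsu between-class variance
        if var_between > best_var:
            best_var, best_t = var_between, t
    return scaled >= best_t  # True = vegetation
```

Normalizing ExG by the channel sum is what gives the method its partial robustness to overall brightness changes; the Otsu step then adapts the threshold to each image's own histogram rather than using a fixed cutoff.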
7. Summary and Outlook
- (1)
- Further research on semi- or unsupervised feature learning will be a hotspot of weed detection in the future. Researchers have obtained good results in diverse specific backgrounds, but the methods still lack generality and robustness. Deep learning-based methods show encouraging promise, but the large number of labeled samples they require increases manual effort. Verifying and comparing newly developed algorithms also requires a sufficient sample size and corresponding ground-truth datasets. Compared with images of various weeds, field crop images are relatively easy to obtain. For these reasons, weed detection methods based on semi- or unsupervised feature learning will continue to be a popular research topic.
- (2)
- By building on accumulated weed detection technology to develop automatic crop guidance systems, agricultural operations such as harvesting, weeding, spraying, and transportation can be automated. Automatically guided agricultural vehicles do not fatigue and reduce the labor intensity of the operator, thereby improving efficiency and safety. At present, however, few methods and devices meet the high requirements of practical applications; considerable work remains to develop equipment that combines high performance with cost efficiency.
- (3)
- Traditional and deep learning methods each have their own advantages, and future research should fully exploit both. To improve the level of weed detection and weeding, solutions must be found for difficult practical problems such as variable illumination, overlapping leaves, occlusion, and classifier or network structure optimization.
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Tian, H.; Wang, T.; Liu, Y.; Qiao, X.; Li, Y. Computer vision technology in agricultural automation—A review. Inf. Process. Agric. 2020, 7, 1–19. [Google Scholar] [CrossRef]
- Mavridou, E.; Vrochidou, E.; Papakostas, G.; Pachidis, T.; Kaburlasos, V. Machine Vision Systems in Precision Agriculture for Crop Farming. J. Imaging 2019, 5, 89. [Google Scholar] [CrossRef] [Green Version]
- Zhang, S.; Huang, W.; Wang, Z. Combing modified Grabcut, K-means clustering and sparse representation classification for weed recognition in wheat field. Neurocomputing 2021. [Google Scholar] [CrossRef]
- Bàrberi, P. Weed management in organic agriculture: Are we addressing the right issues? Weed Res. 2002, 42, 177–193. [Google Scholar] [CrossRef]
- Sabzi, S.; Abbaspour-Gilandeh, Y.; Arribas, J. An automatic visible-range video weed detection, segmentation and classification prototype in potato field. Heliyon 2020, 6, e03685. [Google Scholar] [CrossRef]
- Liakos, K.; Busato, P.; Moshou, D.; Pearson, S.; Bochtis, D. Machine Learning in Agriculture: A Review. Sensors 2018, 18, 2674. [Google Scholar] [CrossRef] [Green Version]
- Kamilaris, A.; Prenafeta-Boldu, F. Deep learning in agriculture: A survey. Comput. Electron. Agric. 2018, 147, 70–90. [Google Scholar] [CrossRef] [Green Version]
- Weng, Y.; Zeng, R.; Wu, C.; Wang, M.; Wang, M.; Liu, Y. A survey on deep-learning-based plant phenotype research in agriculture. Scientia Sinica Vitae 2019, 49, 698–716. [Google Scholar] [CrossRef] [Green Version]
- Su, W. Advanced Machine Learning in Point Spectroscopy, RGB- and Hyperspectral-Imaging for Automatic Discriminations of Crops and Weeds: A Review. Smart Cities 2020, 3, 767–792. [Google Scholar] [CrossRef]
- Koirala, A.; Walsh, K.B.; Wang, Z.; McCarthy, C. Deep learning Method overview and review of use for fruit detection and yield estimation. Comput. Electron. Agric. 2019, 162, 219–234. [Google Scholar] [CrossRef]
- Yuan, H.; Zhao, N.; Cheng, M. Review of Weeds Recognition Based on Image Processing. Trans. Chin. Soc. Agric. Mach. 2020, 51, 323–334. [Google Scholar] [CrossRef]
- Hasan, A.; Sohel, F.; Diepeveen, D.; Laga, H.; Jones, M. A survey of deep learning techniques for weed detection from images. Comput. Electron. Agric. 2021, 184, 106067. [Google Scholar] [CrossRef]
- Deng, J.; Dong, W.; Socher, R.; Li, L.; Li, K.; Li, F. ImageNet: A Large-Scale Hierarchical Image Database. In Proceedings of the 2009 IEEE Conference on Computer Vision and Pattern Recognition, Miami, FL, USA, 20–25 June 2009; pp. 248–255. [Google Scholar]
- Lin, T.; Maire, M.; Belongie, S.; Hays, J.; Perona, P.; Ramanan, D.; Dollár, P.; Zitnick, L. Microsoft COCO: Common Objects in Context; Springer: Zurich, Switzerland, 2014; pp. 740–755. [Google Scholar]
- Everingham, M.; Van Gool, L.; Williams, C.; Winn, J.; Zisserman, A. The PASCAL visual object classes (VOC) challenge. Int. J. Comput. Vis. 2010, 88, 303–338. [Google Scholar] [CrossRef] [Green Version]
- Kuznetsova, A.; Rom, H.; Alldrin, N.; Uijlings, J.; Krasin, I.; PontTuset, J.; Kamali, S.; Popov, S.; Malloci, M.; Kolesnikov, A.; et al. The Open Images Dataset V4. Int. J. Comput. Vis. 2020, 128, 1956–1981. [Google Scholar] [CrossRef] [Green Version]
- Lu, Y.; Young, S. A survey of public datasets for computer vision tasks in precision agriculture. Comput. Electron. Agric. 2020, 178, 105760. [Google Scholar] [CrossRef]
- Yu, J.; Schumann, A.; Cao, Z.; Sharpe, S.; Boyd, N. Weed detection in perennial ryegrass with deep learning convolutional neural network. Front. Plant Sci. 2019, 10, 1422. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Ferreira, A.; Freitas, D.; Silva, G.; Pistori, H.; Folhes, M. Weed detection in soybean crops using ConvNets. Comput. Electron. Agric. 2017, 143, 314–324. [Google Scholar] [CrossRef]
- Giselsson, T.; Jrgensen, R.; Jensen, P.; Dyrmann, M.; Midtiby, H. A Public Image Database for Benchmark of Plant Seedling Classification Algorithms. arXiv 2017, arXiv:1711.05458. [Google Scholar]
- Olsen, A.; Konovalov, D.; Philippa, B.; Ridd, P.; Wood, J.; Johns, J.; Banks, W.; Girgenti, B.; Kenny, O.; Whinney, J.; et al. Deepweeds: A multiclass weed species image dataset for deep learning. Sci. Rep. 2019, 9, 2058. [Google Scholar] [CrossRef]
- Madsen, S.; Mathiassen, S.; Dyrmann, M.; Laursen, M.; Paz, L.; Jørgensen, R. Open Plant Phenotype Database of Common Weeds in Denmark. Remote Sens. 2020, 12, 1246. [Google Scholar] [CrossRef] [Green Version]
- Sa, I.; Chen, Z.; Popović, M.; Khanna, R.; Liebisch, F.; Nieto, J.; Siegwart, R. Weednet: Dense semantic weed classification using multispectral images and mav for smart farming. IEEE Robot. Automat. Lett. 2017, 588–595. [Google Scholar] [CrossRef] [Green Version]
- Chebrolu, N.; Lottes, P.; Schaefer, A.; Winterhalter, W.; Burgard, B.; Stachniss, C. Agricultural robot dataset for plant classification, localization and mapping on sugar beet fields. Int. J. Robot. Res. 2017, 36, 1045–1052. [Google Scholar] [CrossRef] [Green Version]
- Ma, X.; Deng, X.; Qi, L.; Jiang, Y.; Li, H.; Wang, Y.; Xing, X. Fully convolutional network for rice seedling and weed image segmentation at the seedling stage in paddy fields. PLoS ONE 2019, 14, e0215676. [Google Scholar] [CrossRef] [PubMed]
- Sudars, K.; Jasko, J.; Namatevs, I.; Ozola, L.; Badaukis, N. Dataset of annotated food crops and weed images for robotic computer vision control. Data Brief 2020, 31, 105833. [Google Scholar] [CrossRef]
- Champ, J.; Mora-Fallas, A.; Goëau, H.; Mata-Montero, E.; Bonnet, P.; Joly, A. Instance segmentation for the fine detection of crop and weed plants by precision agricultural robots. Appl Plant Sci. 2020, 8, e11373. [Google Scholar] [CrossRef] [PubMed]
- Wu, S.; Bao, F.; Xu, E.; Wang, Y.; Chang, Y.; Xiang, Q. A leaf recognition algorithm for plant classification using probabilistic neural network. In Proceedings of the IEEE 7th International Symposium on Signal Processing and Information Technology, Giza, Egypt, 15–18 December 2007; pp. 11–16. [Google Scholar]
- Zheng, Y.; Kong, J.; Jin, X.; Wang, X.; Su, T.; Zuo, M. CropDeep: The Crop Vision Dataset for Deep-Learning-Based Classification and Detection in Precision Agriculture. Sensors 2019, 19, 1058. [Google Scholar] [CrossRef] [Green Version]
- Chavan, T.R.; Nandedkar, A.V. AgroAVNET for crops and weeds classification: A step forward in automatic farming. Comput. Electron. Agric. 2018, 154, 361–372. [Google Scholar] [CrossRef]
- Trong, V.H.; Hyun, Y.G.; Young, K.J.; Bao, P.T. Yielding Multi-Fold Training Strategy for Image Classification of Imbalanced Weeds. Appl. Sci. 2021, 11, 3331. [Google Scholar] [CrossRef]
- Xu, Y.; Zhai, Y.; Zhao, B.; Jiao, Y.; Kong, S. Weed recognition for depthwise separable network based on transfer learning. Intell. Autom. Soft Comput. 2021, 27, 669–682. [Google Scholar] [CrossRef]
- Ferreira, A.; Freitas, D.; Silva, G.; Pistori, H.; Folhes, M. Unsupervised deep learning and semi-automatic data labeling in weed discrimination. Comput. Electron. Agric. 2019, 165, 104963. [Google Scholar] [CrossRef]
- Hu, K.; Coleman, G.; Zeng, S.; Wang, Z.; Walsh, M. Graph weeds net: A graph-based deep learning method for weed recognition. Comput. Electron. Agric. 2020, 174, 105520. [Google Scholar] [CrossRef]
- Naresh, Y.; Nagendraswamy, H. Classification of medicinal plants: An approach using modified LBP with symbolic representation. Neurocomputing 2016, 173, 1789–1797. [Google Scholar] [CrossRef]
- Mahajan, S.; Raina, A.; Gao, X.-Z.; Kant Pandit, A. Plant Recognition Using Morphological Feature Extraction and Transfer Learning over SVM and AdaBoost. Symmetry 2021, 13, 356. [Google Scholar] [CrossRef]
- Yang, C. Plant leaf recognition by integrating shape and texture features. Pattern Recognit. 2021, 112, 107809. [Google Scholar] [CrossRef]
- Le, V.; Apopei, B.; Alameh, K. Effective plant discrimination based on the combination of local binary pattern operators and multiclass support vector machine methods. Inf. Process. Agric. 2019, 6, 116–131. [Google Scholar] [CrossRef]
- Chen, Y.; Zhao, B.; Li, S.; Liu, L.; Yuan, Y.; Zhang, Y. Weed Reverse Positioning Method and Experiment Based on Multi-feature. Trans. Chin. Soc. Agric. Mach. 2015, 46, 257–262. [Google Scholar] [CrossRef]
- Zhu, W.; Zhu, X. The Application of Support Vector Machine in Weed Classification. In Proceedings of the 2009 IEEE International Conference on Intelligent Computing and Intelligent Systems, Shanghai, China, 20–22 November 2009; pp. 532–536. [Google Scholar] [CrossRef]
- Zhang, X.; Xie, Z.; Zhang, N.; Cao, C. Weed recognition from pea seedling images and variable spraying control system. Trans. Chin. Soc. Agric. Mach. 2012, 43, 220–225+73. [Google Scholar]
- Wang, C.; Li, Z. Weed recognition using SVM model with fusion height and monocular image features. Trans. CSAE 2016, 32, 165–174. [Google Scholar] [CrossRef]
- Midtiby, H.; Astrand, B.; Jørgensen, O.; Jørgensen, R. Upper limit for context–based crop classification in robotic weeding applications. Biosyst. Eng. 2016, 146, 183–192. [Google Scholar] [CrossRef]
- Tang, J.; Chen, X.; Miao, R.; Wang, D. Weed detection using image processing under different illumination for site-specific areas spraying. Comput. Electron. Agric. 2016, 122, 103–111. [Google Scholar] [CrossRef]
- Huang, S.; Wu, S.; Sun, C.; Ma, X.; Jiang, Y.; Qi, L. Deep localization model for intra-row crop detection in paddy field. Comput. Electron. Agric. 2020, 169, 105203. [Google Scholar] [CrossRef]
- Wang, A.; Zhang, W.; Wei, X. A review on weed detection using ground-based machine vision and image processing techniques. Comput. Electron. Agric. 2019, 158, 226–240. [Google Scholar] [CrossRef]
- He, D.; Qiao, Y.; Li, P.; Gao, Z.; Li, H.; Tang, J. Weed Recognition Based on SVM-DS Multi-feature Fusion. Trans. Chin. Soc. Agric. Mach. 2013, 44, 182–187. [Google Scholar] [CrossRef]
- Deng, X.; Qi, L.; Ma, X.; Jiang, Y.; Chen, X.; Liu, H.; Chen, W. Recognition of weeds at seedling stage in paddy fields using multi-feature fusion and deep belief networks. Trans. CSAE 2018, 34, 165–172. [Google Scholar] [CrossRef]
- Torres-Sánchez, J.; López-Granados, F.; Peña, J.M. An automatic object-based method for optimal thresholding in UAV images: Application for vegetation detection in herbaceous crops. Comput. Electron. Agric. 2015, 114, 43–52. [Google Scholar] [CrossRef]
- Ma, Y.; Feng, Q.; Yang, M.; Li, M. Wine grape leaf detection based on HOG. Comput. Eng. Appl. 2016, 52, 158–161. [Google Scholar]
- Bakhshipour, A.; Jafari, A. Evaluation of support vector machine and artificial neural networks in weed detection using shape features. Comput. Electron. Agric. 2018, 145, 153–160. [Google Scholar] [CrossRef]
- Ishak, A.; Hussain, A.; Mustafa, M. Weed image classification using Gabor wavelet and gradient field distribution. Comput. Electron. Agric. 2009, 66, 53–61. [Google Scholar] [CrossRef]
- Chaki, J.; Parekh, R.; Bhattacharya, S. Plant leaf recognition using texture and shape features with neural classifiers. Pattern Recognit. Lett. 2015, 58, 61–68. [Google Scholar] [CrossRef]
- Zheng, Y.; Zhong, G.; Wang, Q.; Zhao, Y.; Zhao, Y. Method of Leaf Identification Based on Multi-feature Dimension Reduction. Trans. Chin. Soc. Agric. Mach. 2017, 48, 30–37. [Google Scholar] [CrossRef]
- Tang, Z.; Su, Y.; Er, M.; Qi, F.; Zhang, L.; Zhou, J. A local binary pattern based texture descriptors for classification of tea leaves. Neurocomputing 2015, 168, 1011–1023. [Google Scholar] [CrossRef]
- Zhai, Y.; Thomasson, J.; Boggess, J.; Sui, R. Soil texture classification with artificial neural networks operating on remote sensing data. Comput. Electron. Agric. 2006, 54, 53–68. [Google Scholar] [CrossRef]
- Wooten, J.; Filip-To, S.; Igathinathane, C.; Pordesimo, L. Discrimination of bark from wood chips through texture analysis by image processing. Comput. Electron. Agric. 2011, 79, 13–19. [Google Scholar] [CrossRef]
- Zhang, Y.; Wang, S.; Ji, G.; Phillips, P. Fruit classification using computer vision and feedforward neural network. J. Food Eng. 2014, 143, 167–177. [Google Scholar] [CrossRef]
- Bharati, M.; Liu, J.; MacGregor, J. Image texture analysis: Methods and comparisons. Chemom. Intell. Lab. Syst. 2004, 72, 57–71. [Google Scholar] [CrossRef]
- Haralick, R.; Shanmugam, K.; Dinstein, I. Textural features for image classification. IEEE Trans. Syst. Man. Cybern. 1973, 6, 610–621. [Google Scholar] [CrossRef] [Green Version]
- Wu, L.; Liu, J.; Wen, Y. Image Identification of Corn and Weed Based on Fractal Dimension. Trans. Chin. Soc. Agric. Mach. 2009, 40, 176–179. [Google Scholar]
- Dryden, I.; Scarr, M.; Taylor, C. Bayesian texture segmentation of weed and crop images using reversible jump Markov chain Monte Carlo methods. Appl. Statist. 2003, 52, 31–50. [Google Scholar] [CrossRef]
- Bakhshipour, A.; Jafari, A.; Nassiri, S.; Zare, D. Weed segmentation using texture features extracted from wavelet sub-images. Biosyst. Eng. 2017, 157, 1–12. [Google Scholar] [CrossRef]
- Mustapha, A.; Mustafa, M. Development of a real-time site sprayer system for specific weeds using gabor wavelets and neural networks model. In Proceedings of the Malaysia Science and Technology Congress, Kuala Lumpur, Malaysia, 20 April 2005; pp. 406–413. [Google Scholar]
- Hu, M.K. Visual pattern recognition by moment invariants. IEEE Trans. Inf. Theory 1962, 8, 179–187. [Google Scholar]
- Deng, L.; Tang, J.; Ma, W. Feature extraction and recognition system of maize leaf based on image processing. J. Chin. Agric. Mech. 2014, 35, 72–75, 79. [Google Scholar] [CrossRef]
- Long, M.; He, D. Weed identification from corn seedling based on computer vision. Trans. CSAE 2007, 23, 139–144. [Google Scholar]
- Agrawal, K.; Singh, K.; Bora, G.; Lin, D. Weed recognition using image processing technique based on leaf parameters. J. Agric. Sci. Technol. 2012, 2, 899. [Google Scholar]
- Pereira, L.; Nakamura, R.; Souza, G.; Martins, D.; Papa, J. Aquatic weed automatic classification using machine learning techniques. Comput. Electron. Agric. 2012, 87, 56–63. [Google Scholar] [CrossRef]
- Tang, J.; Miao, R.; Zhang, Z.; Xin, J.; Wang, D. Distance-based separability criterion of ROI in classification of farmland hyper-spectral images. Int. J. Agric. Biol. Eng. 2017, 10, 177–185. [Google Scholar] [CrossRef]
- Slaughter, D.; Giles, D.; Downey, D. Autonomous robotic weed control systems: A review. Comput. Electron. Agric. 2008, 61, 63–78. [Google Scholar] [CrossRef]
- Shapira, U.; Herrmann, I.; Karnieli, A.; Bonfil, D. Field spectroscopy for weed detection in wheat and chickpea fields. Int. J. Remote Sens. 2013, 34, 6094–6108. [Google Scholar] [CrossRef] [Green Version]
- Zwiggelaar, R. A review of spectral properties of plants and their potential use for crop/weed discrimination in row-crops. Crop Prot. 1998, 17, 189–206. [Google Scholar] [CrossRef]
- Huang, Y.; Lee, M.; Thomson, S.; Reddy, K. Ground-based hyperspectral remote sensing for weed management in crop production. Int. J. Agric. Biol. Eng. 2016, 9, 98–109. [Google Scholar] [CrossRef]
- Longchamps, L.; Panneton, B.; Samson, G.; Leroux, G.; Thériault, R. Discrimination of corn, grasses and dicot weeds by their UV-induced fluorescence spectral signature. Precis. Agric. 2010, 11, 181–197. [Google Scholar] [CrossRef]
- Pignatti, S.; Casa, R.; Harfouche, A.; Huang, W.; Palombo, A.; Pascucci, S. Maize crop and weeds species detection by using Uav Vnir Hyperpectral data. In Proceedings of the IGARSS 2019—2019 IEEE International Geoscience and Remote Sensing Symposium, Yokohama, Japan, 28 July–2 August 2019; pp. 7235–7238. [Google Scholar] [CrossRef]
- Berger, K.; Atzberger, C.; Danner, M.; D’Urso, G.; Mauser, W.; Vuolo, F.; Hank, T. Evaluation of the PROSAIL Model Capabilities for Future Hyperspectral Model Environments: A Review Study. Remote Sens. 2018, 10, 85. [Google Scholar] [CrossRef] [Green Version]
- Che’Ya, N. Site-Specific Weed Management Using Remote Sensing. Ph.D. Thesis, Universiti Putra Malaysia, Putrajaya, Malaysia, 2016. [Google Scholar]
- Dammer, K.; Intress, J.; Beuche, H.; Selbeck, J.; Dworak, V. Discrimination of Ambrosia artemisiifolia and Artemisia vulgaris by hyperspectral image analysis during the growing season. Weed Res. 2013, 53, 146–156. [Google Scholar] [CrossRef]
- Elstone, L.; How, K.; Brodie, S.; Ghazali, M.; Heath, W.; Grieve, B. High Speed Crop and Weed Identification in Lettuce Fields for Precision Weeding. Sensors 2020, 20, 455. [Google Scholar] [CrossRef] [Green Version]
- López-Granados, F. Weed detection for site-specific weed management: Mapping and real-time approaches. Weed Res. 2011, 51, 1–11. [Google Scholar] [CrossRef] [Green Version]
- Peteinatos, G.; Weis, M.; Andújar, D.; Ayala, V.; Gerhards, R. Potential use of ground-based sensor technologies for weed detection. Pest Manag. Sci. 2013, 70, 190–199. [Google Scholar] [CrossRef] [PubMed]
- Symonds, P.; Paap, A.; Alameh, K.; Rowe, J.; Miller, C. A real-time plant discrimination system utilising discrete reflectance spectroscopy. Comput. Electron. Agric. 2015, 117, 57–69. [Google Scholar] [CrossRef] [Green Version]
- Ahmed, F.; Al-Mamun, H.; Bari, A.; Hossain, E.; Kwan, P. Classification of crops and weeds from digital images: A support vector machine approach. Crop Prot. 2012, 40, 98–104. [Google Scholar] [CrossRef]
- Hamuda, E.; Glavin, M.; Jones, E. A survey of image processing techniques for plant extraction and segmentation in the field. Comput. Electron. Agric. 2016, 125, 184–199. [Google Scholar] [CrossRef]
- Tang, L.; Tian, L.; Steward, B. Color image segmentation with genetic algorithm for in field weed sensing. Trans. ASAE 2000, 43, 1019–1027. [Google Scholar] [CrossRef]
- Ghasab, M.; Khamis, S.; Mohammad, F.; Fariman, H. Feature decision-making ant colony optimization system for an automated recognition of plant species. Expert Syst. Appl. 2015, 42, 2361–2370. [Google Scholar] [CrossRef]
- Zhao, Z.; Ma, L.; Cheung, Y.; Wu, X.; Tang, Y.; Chen, C. ApLeaf: An efficient android-based plant leaf identification system. Neurocomputing 2015, 151, 1112–1119. [Google Scholar] [CrossRef]
- Rasmussen, J.; Nielsen, J.; Streibig, J.C.; Jensen, J.E.; Pedersen, K.S.; Olsen, S.I. Pre-harvest weed mapping of Cirsium arvense in wheat and barley with off-the-shelf UAVs. Precis. Agric. 2019, 20, 983–999. [Google Scholar] [CrossRef]
- Cheng, H.; Jiang, X.; Sun, Y.; Wang, J. Color image segmentation: Advances and prospects. Pattern Recognit. 2001, 34, 2259–2281. [Google Scholar] [CrossRef]
- Hamuda, E.; Ginley, B.; Glavin, M.; Jones, E. Automatic crop detection under field conditions using the HSV colour space and morphological operations. Comput. Electron. Agric. 2017, 133, 97–107. [Google Scholar] [CrossRef]
- Guo, W.; Rage, U.; Ninomiya, S. Illumination invariant segmentation of vegetation for time series wheat images based on decision tree model. Comput. Electron. Agric. 2013, 96, 58–66. [Google Scholar] [CrossRef]
- Knoll, F.; Czymmek, V.; Poczihoski, S.; Holtorf, T.; Hussmann, S. Improving efficiency of organic farming by using a deep learning classification approach. Comput. Electron. Agric. 2018, 153, 347–356. [Google Scholar] [CrossRef]
- Jin, F. Research of Feature Extraction and Recognition Method of Weed Image Based on Machine Vision. Master’s Thesis, Jiangsu University, Zhenjiang, China, 2007. [Google Scholar]
- Ghazali, K.; Mustafa, M.; Hussain, A. Machine vision system for automatic weeding strategy using image processing technique. Am. Eurasian J. Agric. Environ. Sci. 2008, 3, 451–458. [Google Scholar]
- Li, Y.; Zhang, L.; Yan, W.; Huang, C.; Tong, Q. Weed identification using imaging spectrometer data. J. Remote Sens. 2013, 17, 855–871. [Google Scholar] [CrossRef]
- Chowdhury, S.; Verma, B.; Stockwell, D. A novel texture feature based multiple classifier technique for roadside vegetation classification. Expert Syst. Appl. 2015, 42, 5047–5055. [Google Scholar] [CrossRef]
- Tang, Q. Research on Plant Leaves Recognition Based on Color and Texture Features. Master’s Thesis, Zhejiang University, Hangzhou, China, 2015. [Google Scholar]
- Chen, Y.; Wu, Z.; Zhao, B.; Fan, C.; Shi, S. Weed and Corn Seedling Detection in Field Based on Multi Feature Fusion and Support Vector Machine. Sensors 2021, 21, 212. [Google Scholar] [CrossRef]
- Nursuriati, J.; Hussin, N.; Nordin, S.; Awang, K. Automatic Plant Identification: Is Shape the Key Feature? Procedia Comput. Sci. 2015, 76, 436–442. [Google Scholar] [CrossRef] [Green Version]
- Lin, F.; Zhang, D.; Huang, Y.; Wang, X.; Chen, X. Detection of Corn and Weed Species by the Combination of Spectral, Shape and Textural Features. Sustainability 2017, 9, 1335. [Google Scholar] [CrossRef] [Green Version]
- Behmann, J.; Mahlein, A.; Rumpf, T.; Römer, C.; Plümer, L. A review of advanced machine learning methods for the detection of biotic stress in precision crop protection. Precis. Agric. 2015, 16, 239–260. [Google Scholar] [CrossRef]
- Tellaeche, A.; Pajares, G.; Burgos-Artizzu, X.; Ribeiro, A. A computer vision approach for weeds identification through Support Vector Machines. Appl. Soft Comput. 2011, 11, 908–915. [Google Scholar] [CrossRef] [Green Version]
- Kazmi, W.; Garcia-Ruiz, F.; Nielsen, J.; Rasmussen, J.; Andersen, H. Exploiting affine invariant regions and leaf edge shapes for weed detection. Comput. Electron. Agric. 2015, 118, 290–299. [Google Scholar] [CrossRef]
- Hall, D.; McCool, C.; Dayoub, F.; Sunderhauf, N.; Upcroft, B. Evaluation of Features for Leaf Classification in Challenging Conditions. In Proceedings of the 2015 IEEE Winter Conference on Applications of Computer Vision (WACV), Waikoloa, HI, USA, 5–9 January 2015; pp. 797–804. [Google Scholar] [CrossRef] [Green Version]
- Lottes, P.; Hoeferlin, M.; Sander, S.; Muter, M.; Schulze, P.; Stachniss, L. An effective classification system for separating sugar beets and weeds for precision farming applications. In Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–21 May 2016; pp. 5157–5163. [Google Scholar] [CrossRef]
- De Rainville, F.; Durand, A.; Fortin, F.; Tanguy, K.; Maldague, X.; Panneton, B.; Simard, M. Bayesian classification and unsupervised learning for isolating weeds in row crops. Pattern Anal. Appl. 2014, 17, 401–414. [Google Scholar] [CrossRef]
- Mursalin, M.; Mesbah-Ul-Awal, M. Towards Classification of Weeds through Digital Image. In Proceedings of the Fourth International Conference on Advanced Computing & Communication Technologies, Rohtak, India, 8–9 February 2014; pp. 1–4. [Google Scholar] [CrossRef]
- García-Santillán, I.; Pajares, G. On-line crop/weed discrimination through the Mahalanobis distance from images in maize fields. Biosyst. Eng. 2018, 166, 28–43. [Google Scholar] [CrossRef]
- Ahmad, J.; Muhammad, K.; Ahmad, I.; Ahmad, W.; Smith, M.; Smith, L.; Jain, D.; Wang, H.; Mehmood, I. Visual features based boosted classification of weeds for real-time selective herbicide sprayer systems. Comput. Ind. 2018, 98, 23–33. [Google Scholar] [CrossRef]
- Mathanker, S.; Weckler, P.; Taylor, R.; Fan, G. Adaboost and Support Vector Machine Classifiers for Automatic Weed Control: Canola and Wheat. In Proceedings of the 2010 ASABE Annual International Meeting, Pittsburgh, PA, USA, 20–23 June 2010; p. 1008834. [Google Scholar]
- Jeon, H.; Tian, L.; Zhu, H. Robust Crop and Weed Segmentation under Uncontrolled Outdoor Illumination. Sensors 2011, 11, 6270–6283. [Google Scholar] [CrossRef]
- Chen, Y.; Lin, P.; He, Y.; Xu, Z. Classification of broadleaf weed images using Gabor wavelets and Lie group structure of region covariance on Riemannian manifolds. Biosyst. Eng. 2011, 109, 220–227. [Google Scholar] [CrossRef]
- Rumpf, T.; Römer, C.; Weis, M.; Sökefeld, M.; Gerhards, R.; Plümer, L. Sequential support vector machine classification for small-grain weed species discrimination with special regard to Cirsium arvense and Galium aparine. Comput. Electron. Agric. 2012, 80, 89–96. [Google Scholar] [CrossRef]
- Miao, R.; Yang, H.; Wu, J.; Liu, H. Weed identification of overlapping spinach leaves based on image sub-block and reconstruction. Trans. CSAE 2020, 36, 178–184. [Google Scholar] [CrossRef]
- Ashraf, T.; Khan, Y. Weed density classification in rice crop using computer vision. Comput. Electron. Agric. 2020, 175, 105590. [Google Scholar] [CrossRef]
- Pantazi, X.; Moshou, D.; Bravo, C. Active learning system for weed species recognition based on hyperspectral sensing. Biosyst. Eng. 2016, 146, 193–202. [Google Scholar] [CrossRef]
- Fu, L.; Gao, F.; Wu, J.; Li, R.; Karkee, M.; Zhang, Q. Application of consumer RGB-D cameras for fruit detection and localization in field: A critical review. Comput. Electron. Agric. 2020, 177, 105687. [Google Scholar] [CrossRef]
- Aversano, L.; Bernardi, M.; Cimitile, M.; Iammarino, M.; Rondinella, S. Tomato diseases Classification Based on VGG and Transfer Learning. In Proceedings of the 2020 IEEE International Workshop on Metrology for Agriculture and Forestry (MetroAgriFor), Trento, Italy, 4–6 November 2020; pp. 129–133. [Google Scholar] [CrossRef]
- Edna, C.; Li, Y.; Sam, N.; Liu, Y. A comparative study of fine-tuning deep learning models for plant disease identification. Comput. Electron. Agric. 2019, 161, 272–279. [Google Scholar] [CrossRef]
- Tiwari, O.; Goyal, V.; Kumar, P.; Vij, S. An experimental set up for utilizing convolutional neural network in automated weed detection. In Proceedings of the 2019 4th International Conference on Internet of Things: Smart Innovation and Usages (IoT-SIU), Ghaziabad, India, 18–19 April 2019; pp. 1–6. [Google Scholar] [CrossRef]
- Gongal, A.; Amatya, S.; Karkee, M.; Zhang, Q.; Lewis, K. Sensors and systems for fruit detection and localization: A review. Comput. Electron. Agric. 2015, 116, 8–19. [Google Scholar] [CrossRef]
- Kounalakis, T.; Triantafyllidis, G.; Nalpantidis, L. Deep learning-based visual recognition of rumex for robotic precision farming. Comput. Electron. Agric. 2019, 165, 104973. [Google Scholar] [CrossRef]
- Dyrmann, M.; Karstoft, H.; Midtiby, H. Plant species classification using deep convolutional neural network. Biosyst. Eng. 2016, 151, 72–80. [Google Scholar] [CrossRef]
- Yu, J.; Sharpe, S.; Schumann, A.; Boyd, N. Deep learning for image-based weed detection in turfgrass. Eur. J. Agron. 2019, 104, 78–84. [Google Scholar] [CrossRef]
- Potena, C.; Nardi, D.; Pretto, A. Fast and accurate crop and weed identification with summarized train sets for precision agriculture. In Intelligent Autonomous Systems 14 (IAS 2016); Adv. Intell. Syst. Comput. 2017, 531, 105–121. [Google Scholar] [CrossRef] [Green Version]
- Beeharry, Y.; Bassoo, V. Performance of ANN and AlexNet for weed detection using UAV-based images. In Proceedings of the 2020 3rd International Conference on Emerging Trends in Electrical, Electronic and Communications Engineering (ELECOM), Balaclava, Mauritius, 25–27 November 2020; pp. 163–167. [Google Scholar] [CrossRef]
- Ramirez, W.; Achanccaray, P.; Mendoza, L.; Pacheco, M. Deep Convolutional Neural Networks For Weed Detection in Agricultural Crops using Optical Aerial Images. In Proceedings of the 2020 IEEE Latin American GRSS & ISPRS Remote Sensing Conference (LAGIRS), Santiago, Chile, 22–26 March 2020; pp. 133–137. [Google Scholar] [CrossRef]
- Patidar, S.; Singh, U.; Sharma, S.; Himanshu. Weed Seedling Detection Using Mask Regional Convolutional Neural Network. In Proceedings of the 2020 International Conference on Electronics and Sustainable Communication Systems (ICESC), Coimbatore, India, 2–4 July 2020; pp. 311–316. [Google Scholar] [CrossRef]
- You, J.; Liu, W.; Lee, J. A DNN-based semantic segmentation for detecting weed and crop. Comput. Electron. Agric. 2020, 178, 105750. [Google Scholar] [CrossRef]
- Peteinatos, G.; Reichel, P.; Karouta, J.; Andújar, D.; Gerhards, R. Weed Identification in Maize, Sunflower, and Potatoes with the Aid of Convolutional Neural Networks. Remote Sens. 2020, 12, 4185. [Google Scholar] [CrossRef]
- Asad, M.; Bais, A. Weed detection in canola fields using maximum likelihood classification and deep convolutional neural network. Inf. Process. Agric. 2020, 7, 535–545. [Google Scholar] [CrossRef]
- Quan, L.; Feng, H.; Lv, Y.; Wang, Q.; Zhang, C.; Liu, J.; Yuan, Z. Maize seedling detection under different growth stages and complex field environments based on an improved Faster R–CNN. Biosyst. Eng. 2019, 184, 1–23. [Google Scholar] [CrossRef]
- Suh, H.; IJsselmuiden, J.; Hofstee, J.; Henten, E. Transfer learning for the classification of sugar beet and volunteer potato under field conditions. Biosyst. Eng. 2018, 174, 50–65. [Google Scholar] [CrossRef]
- Chechliński, Ł.; Siemiątkowska, B.; Majewski, M. A System for Weeds and Crops Identification—Reaching over 10 FPS on Raspberry Pi with the Usage of MobileNets, DenseNet and Custom Modifications. Sensors 2019, 19, 3787. [Google Scholar] [CrossRef] [Green Version]
- Huang, H.; Lan, Y.; Deng, J.; Yang, A.; Deng, X.; Zhang, L.; Wen, S. A semantic labeling approach for accurate weed mapping of high resolution UAV Imagery. Sensors 2018, 18, 2113. [Google Scholar] [CrossRef] [Green Version]
- Peng, C.; Li, Y.; Jiao, L.; Chen, Y.; Shang, R. Densely Based Multi-Scale and Multi-Modal Fully Convolutional Networks for High-Resolution Remote-Sensing Image Semantic Segmentation. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2019, 12, 2612–2626. [Google Scholar] [CrossRef]
- Chen, F.; Wang, C.; Gu, M.; Zhao, Y. Spruce Image Segmentation Algorithm Based on Fully Convolutional Networks. Trans. Chin. Soc. Agric. Mach. 2018, 49, 188–194+210. [Google Scholar] [CrossRef]
- Dyrmann, M.; Jørgensen, R.; Midtiby, H. RoboWeedSupport-Detection of weed locations in leaf occluded cereal crops using a fully convolutional neural network. Adv. Anim. Biosci. 2017, 8, 842–847. [Google Scholar] [CrossRef]
- Huang, H.; Deng, J.; Lan, Y.; Yang, A.; Deng, X.; Wen, S.; Zhang, H.; Zhang, Y. Accurate Weed Mapping and Prescription Map Generation Based on Fully Convolutional Networks Using UAV Imagery. Sensors 2018, 18, 3299. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Fu, X.; Qu, H. Research on Semantic Segmentation of High-resolution Remote Sensing Image Based on Full Convolutional Neural Network. In Proceedings of the 2018 12th International Symposium on Antennas, Propagation and EM Theory (ISAPE), Hangzhou, China, 3–6 December 2018; pp. 1–4. [Google Scholar] [CrossRef]
- Hung, C.; Xu, Z.; Sukkarieh, S. Feature Learning Based Approach for Weed Classification Using High Resolution Aerial Images from a Digital Camera Mounted on a UAV. Remote Sens. 2014, 6, 12037–12054. [Google Scholar] [CrossRef] [Green Version]
- He, L. Research on Weeds Identification Based on k-Means Feature Learning. Master’s Thesis, Northwest AF University, Yangling, China, 2016. [Google Scholar]
- Jiang, H.; Zhang, C.; Qiao, Y.; Zhang, Z.; Zhang, W.; Song, C. CNN feature based graph convolutional network for weed and crop recognition in smart farming. Comput. Electron. Agric. 2020, 174, 105450. [Google Scholar] [CrossRef]
- Tang, J.; Wang, D.; Zhang, Z.; He, L.; Xin, J.; Xu, Y. Weed identification based on K-means feature learning combined with convolutional neural network. Comput. Electron. Agric. 2017, 135, 63–70. [Google Scholar] [CrossRef]
- Bah, M.; Hafiane, A.; Canals, R. Deep Learning with Unsupervised Data Labeling for Weed Detection in Line Crops in UAV Images. Remote Sens. 2018, 10, 1690. [Google Scholar] [CrossRef] [Green Version]
- Sadgrove, E.; Falzon, G.; Miron, D.; Lamb, D. Fast object detection in pastoral landscapes using a Colour Feature Extreme Learning Machine. Comput. Electron. Agric. 2017, 139, 204–212. [Google Scholar] [CrossRef]
- Abdalla, A.; Cen, H.; Wan, L.; Rashid, R.; Weng, H.; Zhou, W.; He, Y. Fine-tuning convolutional neural network with transfer learning for semantic segmentation of ground-level oilseed rape images in a field with high weed pressure. Comput. Electron. Agric. 2019, 167, 105091. [Google Scholar] [CrossRef]
- Raja, R.; Nguyen, T.; Slaughter, D.; Fennimore, S. Real-time weed-crop classification and localisation technique for robotic weed control in lettuce. Biosyst. Eng. 2020, 192, 257–274. [Google Scholar] [CrossRef]
- Khan, A.; Ilyas, T.; Umraiz, M.; Mannan, Z.; Kim, H. CED-Net: Crops and Weeds Segmentation for Smart Farming Using a Small Cascaded Encoder-Decoder Architecture. Electronics 2020, 9, 1602. [Google Scholar] [CrossRef]
- Liang, W.; Yang, Y.; Chao, C. Low-Cost Weed Identification System Using Drones. In Proceedings of the 2019 Seventh International Symposium on Computing and Networking Workshops (CANDARW), Nagasaki, Japan, 26–29 November 2019; pp. 260–263. [Google Scholar]
- Osorio, K.; Puerto, A.; Pedraza, C.; Jamaica, D.; Rodríguez, L. A Deep Learning Approach for Weed Detection in Lettuce Crops Using Multispectral Images. AgriEngineering 2020, 2, 471–488. [Google Scholar] [CrossRef]
- Pérez-Ruíz, M.; Slaughter, D.; Fathallah, F.; Gliever, C.; Miller, B. Co-robotic intra-row weed control system. Biosyst. Eng. 2014, 126, 45–55. [Google Scholar] [CrossRef]
- Raja, R.; Nguyen, T.; Vuong, V.; Slaughter, D.; Fennimore, S. RTD-SEPs: Real-time detection of stem emerging points and classification of crop-weed for robotic weed control in producing tomato. Biosyst. Eng. 2020, 195, 152–171. [Google Scholar] [CrossRef]
- Raja, R.; Nguyen, T.; Slaughter, D.; Fennimore, S. Real-time robotic weed knife control system for tomato and lettuce based on geometric appearance of plant labels. Biosyst. Eng. 2020, 194, 152–164. [Google Scholar] [CrossRef]
- Åstrand, B.; Baerveldt, A. A vision based row-following system for agricultural field machinery. Mechatronics 2005, 15, 251–269. [Google Scholar] [CrossRef]
- Meng, Q.; Qiu, R.; He, J.; Zhang, M.; Ma, X.; Liu, G. Development of agricultural implement system based on machine vision and fuzzy control. Comput. Electron. Agric. 2015, 112, 128–138. [Google Scholar] [CrossRef]
Reference | Dataset | Purpose | Plant Species | Image Size | No. of Images | Features |
---|---|---|---|---|---|---|
[18] | Perennial ryegrass and weed | Weed detection and control | Dandelion, ground ivy, spotted spurge, and ryegrass | 1920 × 1080 | 33,086 | It includes 17,600 positive images (containing target weeds) and 15,486 negative images (containing perennial ryegrass with no target weeds). |
[19] | Grass-Broadleaf | Weed detection by using ConvNets | Soil, soybean, broadleaf, and grass weeds | 4000 × 3000 | 15,336 | Images were captured using a UAV and segmented with the SLIC algorithm; the segments were annotated manually. The ratio of soil:soybean:grass:broadleaf weeds is roughly 3:7:3:1 (Figure 1a). |
[20] | Plant seedlings dataset | Identifying plant species and weeding in the early growth stage | 12 weed and crop species of Danish arable land | 5184 × 3456 | 407 | Each image is provided with an ID and associated with a single species. The dataset contains full images, automatically segmented plants, and single plants that are not segmented. |
[21] | DeepWeeds | Classification of multiple weed species based on deep learning | 8 nationally significant weed species native to 8 locations across northern Australia | 256 × 256 | 17,509 | Each class contains between 1009 and 1125 images of the corresponding species, with a total of over 8000 images of positive species classes. |
[22] | Open Plant Phenotype Database | Plant detection and classification algorithms | 47 species of common weeds in Denmark | 1000 × 1000 | 7590 | It includes 47 different species of monocotyledonous and dicotyledonous weeds in arable crops in Denmark. Several plant species were cultivated in a semifield setting to mimic natural growth conditions. |
[23] | WeedNet | Dense semantic classification, vegetation detection | Crops and weeds | / | 465 | Three multispectral image subsets are included: one contains 132 images of crops, another 243 images of weeds, and the third 90 images of crop–weed mixtures. |
[24] | Sugar beet | Plant classification, localization, and mapping | Sugar beets and 9 different types of weed | 1296 × 966 | >10,000 | Data were recorded 3 times per week until the field was no longer accessible to the machinery without damaging the crops. The robot carried a four-channel multispectral camera and an RGB-D sensor. |
[25] | Rice seedlings and weeds | Image segmentation of rice seedlings and weeds | Rice seedlings and weed background | 912 × 1024 | 224 | The images were taken in paddy fields, and all weeds were in early growth stages. The data samples include GT and RGB images (Figure 1c). |
[26] | Food crops and weed | Crop and weed identification | 6 food crops and 8 weed species | 720 × 1280 | 1118 | Images of 14 basic food crops and weeds in controlled-environment and field conditions at different growth stages, with manual annotations, are included (Figure 1d). |
[27] | Crop and weed | Instance segmentation for fine detection | Maize, the common bean, and a variety of weeds | 1200 × 2048 | 2489 | The crops include maize and the common bean. Weeds include cultivated and natural weeds. Each mask is annotated with the species name of the plant. |
[28] | Flavia | Plant leaf classification | Leaves of 32 plants | 1600 × 1200 | 1907 | Each plant has a minimum of 50 leaves and a maximum of 77. The background of the leaf images is white (Figure 1b). |
[29] | CropDeep | Crop classification and testing | 30 common vegetables and fruits | 1000 × 1000 | 31,147 | At least 1100 annotated samples per category, covering vegetables or fruits at different parts and periods of growth, are included. A high degree of similarity exists among certain categories in the dataset. |
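Several of the datasets above distinguish vegetation from a soil background before any species-level annotation. Purely as an illustration (this is not the pipeline of any listed dataset), the excess-green index commonly used for that first step can be sketched in numpy; the threshold and toy pixel values below are assumptions:

```python
import numpy as np

def excess_green(rgb):
    """Excess-green index ExG = 2g - r - b on chromaticity-normalised channels.

    rgb: float array of shape (H, W, 3) with values in [0, 1].
    Returns an (H, W) index map; larger values indicate greener pixels.
    """
    total = rgb.sum(axis=2)
    total[total == 0] = 1.0          # avoid division by zero on black pixels
    r, g, b = (rgb[..., i] / total for i in range(3))
    return 2.0 * g - r - b

# Toy 1x2 image: a green "plant" pixel next to a brown "soil" pixel.
img = np.array([[[0.1, 0.8, 0.1], [0.4, 0.3, 0.2]]])
exg = excess_green(img)
mask = exg > 0.2                     # arbitrary fixed threshold for illustration
print(mask)                          # only the green pixel exceeds the threshold
```

In practice the fixed threshold is usually replaced by an adaptive one (e.g., Otsu's method, discussed in Section 6.2), which is less sensitive to changing illumination.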
Reference | Dataset | Method | Evaluation Metrics |
---|---|---|---|
Chavan et al. (2018) [30] | Plant seedlings dataset [20] | AgroAVNET (a hybrid model of AlexNet and VGGNet) | Accuracy: 98.23% |
Trong et al. (2021) [31] | Plant seedlings dataset [20] | Yielding multi-fold training (YMufT) strategy and DNN; min-class-max-bound (MCMB) procedure; ResNet | Accuracy: 97.18% |
Xu et al. (2021) [32] | Plant seedlings dataset [20] | Depthwise separable convolutional neural network, Xception | Accuracy: 99.63% |
Olsen et al. (2019) [21] | DeepWeeds [21] | ResNet-50 and Inception-v3 CNN models, used to establish a baseline level of performance for comparison | Accuracy: 95.1% (Inception-v3); 95.7% (ResNet-50) |
Ferreira et al. (2019) [33] | DeepWeeds [21] | Joint Unsupervised Learning of Deep Representations and Image Clusters (JULE) and Deep Clustering for Unsupervised Learning of Visual Features (DeepCluster) | Precision: 95% |
Hu et al. (2020) [34] | DeepWeeds [21] | GWN (Graph Weeds Net) | Accuracy: 98.1% |
Naresh et al. (2016) [35] | Flavia [28] | MLBP (modified local binary patterns) | Accuracy: 97.55% |
Mahajan et al. (2021) [36] | Flavia [28] | Support vector machine with adaptive boosting | Precision: 95.85% |
Yang C. Z. (2021) [37] | Flavia [28] | MTD (multiscale triangle descriptor) and LBP-HF (local binary pattern histogram Fourier) | Accuracy: 99.1% |
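The table mixes accuracy and precision as headline figures. Both derive from the same confusion matrix but answer different questions, which is one reason the reported numbers are not directly comparable across studies. A minimal sketch, using a hypothetical two-class crop/weed confusion matrix:

```python
import numpy as np

def accuracy_and_precision(cm):
    """Overall accuracy and per-class precision from a confusion matrix.

    cm[i, j] = number of samples of true class i predicted as class j.
    Accuracy pools all classes; precision is TP / (TP + FP) per predicted class.
    """
    accuracy = np.trace(cm) / cm.sum()
    precision = np.diag(cm) / cm.sum(axis=0)
    return accuracy, precision

# Hypothetical crop/weed confusion matrix (values invented for illustration).
cm = np.array([[90, 10],
               [ 5, 95]])
acc, prec = accuracy_and_precision(cm)
print(acc)    # 0.925
print(prec)   # per-class precision: 90/95 and 95/105
```

On an imbalanced dataset the two metrics can diverge sharply, so a single headline percentage understates the comparison problem noted in Section 6.1.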
Reference | Year | Purpose | Accuracy | Problems |
---|---|---|---|---|
[50] | 2016 | Combining HOG features with a support vector machine (SVM) to identify grape leaves | 83.50% | Single-feature detection has poor stability and low accuracy. |
[35] | 2016 | Identifying different plant leaves on the basis of improved LBP | 79.35% | Single-feature detection has poor stability and low accuracy. |
[51] | 2018 | Using three shape features to compare the effect of SVM and artificial neural network (ANN) classifiers in detecting sugar beets and weeds | 93.33% | Analysis of feature selection is lacking. |
[52] | 2009 | Combining GW (Gabor wavelet) and GFD (gradient field distribution) to classify different weeds | 93.75% | Analysis of feature selection is lacking. |
[53] | 2015 | Combining Gabor features and the grey-level co-occurrence matrix (GLCM) to classify 31 plant leaves | 91.60% | No actual field images are included; the dataset is composed only of different plant leaves, without a complex background such as soil. |
[54] | 2017 | Extracting the shape and texture features of an image to classify and recognize plant leaves | 92.51% | No actual field images are included; the dataset is composed only of different plant leaves, without a complex background such as soil. |
[55] | 2015 | Using improved LBP and GLCM to categorize fresh tea on the production line | 94.80% | Whole plants are not detected and recognized; only leaves of the same kind are classified. |
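Several entries above ([35,55]) build on LBP texture codes. As an illustration of the basic (unimproved) operator, the 8-neighbour LBP code of the centre pixel of a single 3 × 3 patch can be computed as follows; the sample patch values are invented:

```python
import numpy as np

def lbp_code(patch):
    """Basic 8-neighbour LBP code for the centre pixel of a 3x3 patch.

    Each neighbour whose value is >= the centre contributes one bit,
    read clockwise from the top-left corner; the resulting 8-bit code
    indexes a 256-bin texture histogram over the whole image.
    """
    center = patch[1, 1]
    # clockwise neighbour order starting at the top-left corner
    order = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    code = 0
    for bit, (i, j) in enumerate(order):
        if patch[i, j] >= center:
            code |= 1 << bit
    return code

patch = np.array([[6, 5, 2],
                  [7, 6, 1],
                  [9, 8, 7]])
print(lbp_code(patch))   # 241: bits 0, 4, 5, 6, 7 are set
```

The "improved" LBP variants in [35] and [55] modify this thresholding and bit-weighting scheme; the sketch only shows the common starting point.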
Features | Advantages | Disadvantages |
---|---|---|
Texture | Has high accuracy, strong adaptability, and robustness | Computing the grey-level co-occurrence matrix (GLCM) takes a long time and does not meet real-time processing requirements. |
Shape | Invariant to geometric translation, scaling, and rotation; robust to noise | Shapes are deformed by disease, insect feeding, and man-made or mechanical damage, and are incomplete under overlap and occlusion. |
Color | Insensitive to the adjustment of proportion, size, and position | Crops and weeds with similar color will fail; leaf lesions and plant seasonality will change color. |
Spectral | Robust to partial occlusion | Spectral features vary in different growth stages of plants, are easily affected by the collection environment, and are unstable. |
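The GLCM cost noted in the first row comes from counting all co-occurring pixel pairs at each offset. A minimal numpy sketch for a single offset, with the contrast statistic, makes the per-pixel loop explicit; the toy image is invented for illustration:

```python
import numpy as np

def glcm(img, levels, dx=1, dy=0):
    """Normalised, symmetric grey-level co-occurrence matrix for offset (dx, dy).

    img: 2D integer array with values in [0, levels).
    Every pixel pair is visited once, which is why GLCM extraction is slow
    on large images and many offsets.
    """
    h, w = img.shape
    m = np.zeros((levels, levels))
    for y in range(h - dy):
        for x in range(w - dx):
            m[img[y, x], img[y + dy, x + dx]] += 1
    m += m.T                       # make the matrix symmetric
    return m / m.sum()             # normalise to joint probabilities

def contrast(p):
    """Haralick contrast: sum of (i - j)^2 * p(i, j)."""
    i, j = np.indices(p.shape)
    return ((i - j) ** 2 * p).sum()

img = np.array([[0, 0, 1],
                [0, 1, 1],
                [1, 1, 0]], dtype=int)
p = glcm(img, levels=2)
print(contrast(p))   # 0.5
```

Library implementations (e.g., scikit-image's `graycomatrix`) vectorise this loop, but the pair-counting cost per offset and angle remains, which is the drawback the table records.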
Ref. | Crop | Image Type | Architecture | Strengths | Comparison Group | Highest Accuracy |
---|---|---|---|---|---|---|
[151] (2019) | Not specified | RGB | Convolutional neural network | Proposes a low-cost Weed Identification System (WIS) that uses drone-captured RGB images as training data and applies a CNN to build the identification model. | 1. CNN-WIS; 2. LBP; 3. HOG | 98.8% (CNN-WIS) |
[152] (2020) | Lettuce | Multispectral | Region proposal network | Generates a false-green image, the union of the red, green, and near-infrared bands, to highlight the vegetation. | 1. Mask R-CNN; 2. HOG-SVM; 3. YOLOv3 | 98% precision (Mask R-CNN) |
[25] (2019) | Rice | RGB | Fully convolutional network | Proposes a SegNet semantic segmentation method based on FCN that effectively classifies the pixels of rice seedlings, background, and weeds in paddy-field images. | 1. SegNet; 2. FCN; 3. U-Net | 92.7% (SegNet) |
[144] (2020) | Corn, lettuce, radish | RGB | Graph convolutional network | Uses a GCN combined with state-of-the-art pre-trained networks (AlexNet, VGG16, and ResNet-101) for comparative analysis on four datasets. | 1. GCN-ResNet101; 2. GCN-VGG16; 3. GCN-AlexNet | 97.8% (GCN-ResNet101) |
[30] (2018) | Maize, common wheat, sugar beet | RGB | Hybrid network | AgroAVNET is a hybrid model of AlexNet and VGGNet; its performance is compared with AlexNet, VGGNet, their variants, and existing methods. | 1. Hybrid network (AgroAVNET); 2. VGGNet; 3. AlexNet | 98.23% (hybrid network) |
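For the segmentation architectures in this table (SegNet, FCN, U-Net, Mask R-CNN), per-pixel accuracy is often supplemented by mean intersection-over-union, which penalises missed weed regions more honestly on imbalanced field images. A minimal sketch on hypothetical soil/crop/weed label maps:

```python
import numpy as np

def mean_iou(pred, truth, n_classes):
    """Mean intersection-over-union across classes for integer label maps."""
    ious = []
    for c in range(n_classes):
        inter = np.logical_and(pred == c, truth == c).sum()
        union = np.logical_or(pred == c, truth == c).sum()
        if union:                    # skip classes absent from both maps
            ious.append(inter / union)
    return float(np.mean(ious))

# Hypothetical 2x4 label maps: 0 = soil, 1 = crop, 2 = weed.
truth = np.array([[0, 0, 1, 1],
                  [0, 2, 2, 1]])
pred  = np.array([[0, 0, 1, 1],
                  [0, 2, 1, 1]])
print(mean_iou(pred, truth, 3))   # 0.75
```

Here one weed pixel misclassified as crop costs the weed class half its IoU, even though seven of eight pixels (87.5% pixel accuracy) are correct.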
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Wu, Z.; Chen, Y.; Zhao, B.; Kang, X.; Ding, Y. Review of Weed Detection Methods Based on Computer Vision. Sensors 2021, 21, 3647. https://doi.org/10.3390/s21113647