Deep Learning-Based Weed–Crop Recognition for Smart Agricultural Equipment: A Review
Abstract
1. Introduction
2. Weed Detection Using Remote Sensing Technique
2.1. Image Data Collection
Dataset | Crop and Weed | Sensor | Number of Images | Reference |
---|---|---|---|---|
Dataset of annotated food crops and weed images | 6 crops, 8 weeds | RGB | 1176 | Sudars, K., et al. [48] |
DeepWeeds | 8 weeds | RGB and Ground-based weed control robot (AutoWeed) | 17,509 | Olsen, A., et al. [49] |
Weed-Corn/Lettuce/Radish | maize, lettuce, radish | RGB | 7200 | Jiang, H.H., et al. [50] |
Sugar Beet/Weed Dataset | sugar beet | Multispectral and Micro Aerial Vehicle (MAV) | 465 | Sa, I., et al. [51] |
Rumex and Urtica weed plants Dataset | Rumex and Urtica weed plants | RGB and Crawler robots | 10,000 | Binch, A. and C.W. Fox [52] |
Multispectral Lettuce Dataset | Lettuce and weeds | Multispectral bands and UAV | 100 | Osorio, K. et al. [53] |
Early crop weed | tomato, cotton, velvetleaf and black nightshade | RGB | 508 | Espejo-Garcia, B. et al. [54] |
AIWeeds | flax and 14 most common weeds | RGB | 10,000 | Du, Y., et al. [5] |
TobSet | Tobacco Crop and Weeds | RGB | 1700 | Alam, M.S., et al. [55] |
Crop and weed | Maize, the common bean, and a variety of weeds | RGB and Autonomous electrifier robot | 83 | Champ, J., et al. [56] |
Datasets for sugar beet crop/weed detection | Capsella bursa pastoris | RGB-NIR and BOSCH Bonirob farm robot | 8518 | Di Cicco, M., et al. [57] |
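Most of the RGB datasets listed above are distributed as labelled image collections. Assuming a hypothetical folder-per-class layout (`data/weeds/<class>/*.jpg`, not the layout of any specific dataset above) and the torchvision library, a minimal loading sketch looks like this:

```python
# Minimal sketch: load a folder-per-class weed/crop image dataset for training.
# The "data/weeds/<class_name>/*.jpg" layout is an assumption, not the layout
# of any particular dataset listed above.
import torch
from torchvision import datasets, transforms

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),                      # common CNN input size
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),    # ImageNet statistics
])

dataset = datasets.ImageFolder("data/weeds", transform=preprocess)
loader = torch.utils.data.DataLoader(dataset, batch_size=32, shuffle=True)

images, labels = next(iter(loader))
print(images.shape, dataset.classes)                    # e.g. torch.Size([32, 3, 224, 224])
```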
2.2. Preprocessing
2.2.1. Image Resizing
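As a minimal illustration of this step, assuming OpenCV and an arbitrary 224 × 224 network input size:

```python
# Minimal sketch: resize field images to a fixed network input size.
# The 224x224 target and the interpolation modes are illustrative choices.
import cv2

img = cv2.imread("field_image.jpg")                              # BGR, shape (H, W, 3)
resized = cv2.resize(img, (224, 224), interpolation=cv2.INTER_LINEAR)

# For strong downsampling, INTER_AREA tends to preserve fine leaf texture better.
downsampled = cv2.resize(img, (224, 224), interpolation=cv2.INTER_AREA)
print(resized.shape, downsampled.shape)
```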
2.2.2. Image Enhancement and Denoising
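Contrast-limited adaptive histogram equalization (CLAHE), as used for soybean weed detection in [27], combined with simple denoising can be sketched as follows; the parameter values are illustrative, not those of the cited work:

```python
# Minimal sketch: contrast enhancement (CLAHE) plus denoising of a field image.
# clipLimit, tileGridSize and the 5x5 median kernel are illustrative values.
import cv2

bgr = cv2.imread("field_image.jpg")
lab = cv2.cvtColor(bgr, cv2.COLOR_BGR2LAB)
l, a, b = cv2.split(lab)

clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
l_eq = clahe.apply(l)                                   # equalize only the lightness channel

enhanced = cv2.cvtColor(cv2.merge((l_eq, a, b)), cv2.COLOR_LAB2BGR)
denoised = cv2.medianBlur(enhanced, 5)                  # suppress salt-and-pepper noise
cv2.imwrite("field_image_enhanced.jpg", denoised)
```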
2.2.3. Background Removal
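A common baseline for removing the soil background is colour-index thresholding (surveyed in [2]); a minimal sketch using the excess-green (ExG) index and Otsu's method, with illustrative parameters:

```python
# Minimal sketch: separate vegetation from soil with the excess-green (ExG)
# index and Otsu thresholding. This is an illustrative baseline pipeline.
import cv2
import numpy as np

bgr = cv2.imread("field_image.jpg")
b, g, r = cv2.split(bgr.astype(np.float32) / 255.0)
exg = 2 * g - r - b                                     # excess-green index

exg_u8 = cv2.normalize(exg, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
_, mask = cv2.threshold(exg_u8, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

vegetation = cv2.bitwise_and(bgr, bgr, mask=mask)       # soil pixels set to zero
cv2.imwrite("vegetation_mask.png", mask)
```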
2.3. Feature Extraction of Weeds
2.3.1. Visual Texture Feature
Feature Combination | Methods | Crop | Weed | Test Accuracy | Reference |
---|---|---|---|---|---|
LBP | ANN with 15 units in ensemble | carrot | Weed | 83.5% | Lease, B.A., et al. [7] |
GLCM | SVM | Rice | Grasses | 73% | Ashraf, T. and Y.N. Khan [66] |
LBP | SVM | spinach | Ashwagandha of the quinoa family, prickly pear of the aster family | 83.78% | Miao, R., et al. [69] |
LBP | k-FLBPCM | broadleaf canola | wild carrot | >96.75% | Vi Nguyen Thanh Le et al. [70] |
Gabor | LDA | blueberry | Goldenrod, lamb’s-quarters, sheep sorrel, poplar spreading dogbane, mouse-eared hawkweed and a few black bulrushes | 81.4% | Ayalew, G., et al. [67] |
GLCM-M | DA-WDGN | Crop | Broadleaf weeds | 99.4% | Raja, G., et al. [71] |
Gabor | SVM | oil palm | broad weed, narrow weed | 95.0% | Zaman, M.H.M., et al. [72] |
Gabor | MLPNN | oil palm | broad weed, narrow weed | 94.5% | Zaman, M.H.M. et al. [72] |
LBP+GLCM | GA-SVM | lettuce | Chenopodium serotinum, Polygonum lapathifolium | 87.55% | Zhang, L., et al. [68] |
LBP+GLCM | SVM | lettuce | Chenopodium serotinum, Polygonum lapathifolium | 81.33% | Zhang, L., et al. [68] |
HOG+LBP+GLCM | GA-SVM | lettuce | Chenopodium serotinum, Polygonum lapathifolium | 86.02% | Zhang, L., et al. [68] |
HOG+GLCM | GA-SVM | lettuce | Chenopodium serotinum, Polygonum lapathifolium | 85.46% | Zhang, L., et al. [68] |
GGCM+RotLBP | SVM | Corn seedling | Cirsium setosum (Willd.) MB, Poa annua L., Eleusine indica (L.) Gaertn., and Chenopodium album L. | 97.50% | Chen, Y., et al. [4] |
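The LBP+GLCM/SVM combinations in the table can be prototyped with scikit-image and scikit-learn; the sketch below uses random patches and illustrative feature settings rather than data or parameters from the cited studies:

```python
# Minimal sketch: LBP + GLCM texture features fed to an SVM, in the spirit of
# the LBP+GLCM/SVM rows above. Feature settings are illustrative choices.
import numpy as np
from skimage.feature import local_binary_pattern, graycomatrix, graycoprops
from sklearn.svm import SVC

def texture_features(gray_u8):
    """Concatenate an LBP histogram with a few GLCM statistics."""
    lbp = local_binary_pattern(gray_u8, P=8, R=1, method="uniform")
    hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)
    glcm = graycomatrix(gray_u8, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    props = [graycoprops(glcm, p)[0, 0]
             for p in ("contrast", "homogeneity", "energy", "correlation")]
    return np.concatenate([hist, props])

# Illustrative stand-ins for real crop/weed image patches and labels.
rng = np.random.default_rng(0)
patches = rng.integers(0, 256, size=(40, 64, 64), dtype=np.uint8)
labels = rng.integers(0, 2, size=40)                    # 0 = crop, 1 = weed

X = np.array([texture_features(p) for p in patches])
clf = SVC(kernel="rbf").fit(X, labels)
print(clf.predict(X[:5]))
```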
2.3.2. Spatial Context Feature
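A widely used spatial-context cue is crop-row geometry, typically recovered with a Hough line transform (cf. [74,75]); plants lying far from the fitted rows can then be flagged as weed candidates. A minimal OpenCV sketch on a binary vegetation mask, with illustrative thresholds:

```python
# Minimal sketch: recover dominant crop-row lines from a binary vegetation mask
# with the probabilistic Hough transform; threshold values are illustrative.
import cv2
import numpy as np

mask = cv2.imread("vegetation_mask.png", cv2.IMREAD_GRAYSCALE)
edges = cv2.Canny(mask, 50, 150)

lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                        minLineLength=100, maxLineGap=20)

overlay = cv2.cvtColor(mask, cv2.COLOR_GRAY2BGR)
for x1, y1, x2, y2 in (lines.reshape(-1, 4) if lines is not None else []):
    cv2.line(overlay, (x1, y1), (x2, y2), (0, 0, 255), 2)   # draw detected rows
cv2.imwrite("crop_rows.png", overlay)
```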
2.3.3. Spectral Feature
Feature Combination | Methods | Crop | Weed | Test Accuracy | Reference |
---|---|---|---|---|---|
LBP | ANN with 15 units in ensemble | carrot | Weed | 83.5% | Lease, B.A. et al. [7] |
430 hyperspectral bands | RF | grasslands, meadows, and forests | Blackberry, various species of goldenrod, wood small-reed grass | >78.4% | Sabat-Tomala, A. et al. [83] |
30 MNF (minimum noise fraction) bands | SVM | grasslands, meadows, and forests | Blackberry, various species of goldenrod, wood small-reed grass | >85.0% | Sabat-Tomala, A. et al. [83] |
Thermal | ML | soybean | kochia, waterhemp, redroot pigweed, and common ragweed | 82% | Eide, A. et al. [44] |
RGB+NIR | cGAN | sunflower crops, Sugar beet | Weed | 94% | Fawakherji, M., et al. [78] |
RGB | RF and SVM | chilli | unwanted weed and parasites within crops | 96%, 94% | Islam, N. et al. [77] |
terahertz spectral | Wheat-V2 | wheat | wheat husk, wheat straw, wheat leaf, wheat grain, weed, and ladybugs | >96.7% | Shen, Y. et al. [84] |
hyperspectral | lightweight-3D-CNN | Crop seedlings | Weed | >97.4% | Diao, Z. et al. [35] |
Multispectral | U-Net and FPN | sunflower | Chenopodium album L., Convolvulus arvensis L., and Cyperus rotundus L. | 90%, 89% | Lopez, L.O. et al. [41] |
Multispectral | SFS-Top3 | Triticum aestivum L. | Alopecurus myosuroides | 93.8% | Su, J. et al. [38] |
Multispectral | RF and XGB | pasture lands and forest meadows of New Zealand | Hawkweeds (Pilosella spp.) | 97%, 98% | Amarasingam, N. et al. [40] |
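In the multispectral rows above, classification is usually performed per pixel on band reflectances and derived indices. A minimal sketch with a random forest on raw bands plus NDVI, using synthetic stand-in data and an illustrative band order:

```python
# Minimal sketch: pixel-wise spectral features (raw bands plus NDVI) classified
# with a random forest, in the spirit of the multispectral RF/SVM rows above.
# Band order, tile size and labels are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Illustrative stand-in for a co-registered 5-band multispectral tile
# (blue, green, red, red-edge, NIR), shape (H, W, 5), reflectance in [0, 1].
rng = np.random.default_rng(0)
tile = rng.random((64, 64, 5), dtype=np.float32)
labels = rng.integers(0, 3, size=(64, 64))              # 0 soil, 1 crop, 2 weed

red, nir = tile[..., 2], tile[..., 4]
ndvi = (nir - red) / (nir + red + 1e-6)                  # normalized difference vegetation index

X = np.concatenate([tile.reshape(-1, 5), ndvi.reshape(-1, 1)], axis=1)
y = labels.reshape(-1)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
pred_map = clf.predict(X).reshape(64, 64)                # per-pixel class map
print((pred_map == labels).mean())
```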
2.3.4. Biological Morphological Features
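Shape descriptors such as area, solidity, aspect ratio and Hu moments (cf. [87,88]) can be computed per segmented plant blob; a minimal OpenCV sketch with illustrative thresholds:

```python
# Minimal sketch: per-blob morphological descriptors (area, solidity, aspect
# ratio, Hu moments) from a binary vegetation mask; thresholds are illustrative.
import cv2
import numpy as np

mask = cv2.imread("vegetation_mask.png", cv2.IMREAD_GRAYSCALE)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

for c in contours:
    area = cv2.contourArea(c)
    if area < 50:                                        # ignore tiny noise blobs
        continue
    hull_area = cv2.contourArea(cv2.convexHull(c))
    solidity = area / hull_area if hull_area > 0 else 0.0
    x, y, w, h = cv2.boundingRect(c)
    aspect_ratio = w / h
    hu = cv2.HuMoments(cv2.moments(c)).flatten()         # 7 rotation-invariant moments
    print(area, solidity, aspect_ratio, hu[:2])
```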
3. Applications for Weed/Crop Discrimination
3.1. Learning Algorithm
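Many of the reviewed recognition pipelines fine-tune an ImageNet-pretrained backbone rather than training from scratch (e.g., the transfer-learning study in [54]); a minimal PyTorch sketch, where the ResNet-18 backbone and hyperparameters are illustrative choices:

```python
# Minimal sketch: fine-tune an ImageNet-pretrained backbone for crop/weed
# classification; the ResNet-18 backbone and hyperparameters are illustrative.
import torch
import torch.nn as nn
from torchvision import models

num_classes = 2                                          # e.g. crop vs. weed
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, num_classes)  # replace the classification head

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

# `loader` would be a DataLoader such as the one sketched in Section 2.1.
def train_one_epoch(loader, device="cpu"):
    model.to(device).train()
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images.to(device)), labels.to(device))
        loss.backward()
        optimizer.step()
```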
3.2. Recognition Applications
3.2.1. Spot Photographic Image Recognition
3.2.2. Satellite Photo Image Recognition
3.2.3. Application of Drone Weed Identification
3.2.4. Application of Agricultural Robotics for Weed Recognition
4. Discussion
5. Challenges for Weed Recognition in Smart Farming Equipment and Future Trends
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Murad, N.Y.; Mahmood, T.; Forkan, A.R.M.; Morshed, A.; Jayaraman, P.P.; Siddiqui, M.S. Weed Detection Using Deep Learning: A Systematic Literature Review. Sensors 2023, 23, 3670. [Google Scholar] [CrossRef] [PubMed]
- Hamuda, E.; Glavin, M.; Jones, E. A survey of image processing techniques for plant extraction and segmentation in the field. Comput. Electron. Agric. 2016, 125, 184–199. [Google Scholar] [CrossRef]
- Llewellyn, R.; Ronning, D.; Clarke, M.; Mayfield, A.; Walker, S.; Ouzman, J. Impact of Weeds in Australian Grain Production; Grains Research and Development Corporation: Canberra, Australia, 2016. [Google Scholar]
- Chen, Y.; Wu, Z.; Zhao, B.; Fan, C.; Shi, S. Weed and Corn Seedling Detection in Field Based on Multi Feature Fusion and Support Vector Machine. Sensors 2021, 21, 212. [Google Scholar] [CrossRef]
- Du, Y.; Zhang, G.; Tsang, D.; Jawed, M.K. Deep-CNN based Robotic Multi-Class Under-Canopy Weed Control in Precision Farming. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Philadelphia, PA, USA, 23–27 May 2022; pp. 2273–2279. [Google Scholar]
- Tufail, M.; Iqbal, J.; Tiwana, M.I.; Alam, M.S.; Khan, Z.A.; Khan, M.T. Identification of Tobacco Crop Based on Machine Learning for a Precision Agricultural Sprayer. IEEE Access 2021, 9, 23814–23825. [Google Scholar] [CrossRef]
- Lease, B.A.; Wong, W.K.; Gopal, L.; Chiong, W.R. Weed Pixel Level Classification Based on Evolving Feature Selection on Local Binary Pattern with Shallow Network Classifier. In Proceedings of the 2nd International Conference on Materials Technology and Energy (ICMTE), Curtin Univ Malaysia, Sarawak, Malaysia, 6–8 November 2020. [Google Scholar]
- Mogili, U.M.R.; Deepak, B.B.V.L. Review on Application of Drone Systems in Precision Agriculture. In Proceedings of the 1st International Conference on Robotics and Smart Manufacturing (RoSMa), Chennai, India, 19–21 July 2018; pp. 502–509. [Google Scholar]
- Tataridas, A.; Kanatas, P.; Chatzigeorgiou, A.; Zannopoulos, S.; Travlos, I. Sustainable Crop and Weed Management in the Era of the EU Green Deal: A Survival Guide. Agronomy 2022, 12, 589. [Google Scholar] [CrossRef]
- Jeanmart, S.; Edmunds, A.J.F.; Lamberth, C.; Pouliot, M. Synthetic approaches to the 2010-2014 new agrochemicals. Bioorganic Med. Chem. 2016, 24, 317–341. [Google Scholar] [CrossRef]
- Eyre, M.D.; Critchley, C.N.R.; Leifert, C.; Wilcockson, S.J. Crop sequence, crop protection and fertility management effects on weed cover in an organic/conventional farm management trial. Eur. J. Agron. 2011, 34, 153–162. [Google Scholar] [CrossRef]
- Ampatzidis, Y.; De Bellis, L.; Luvisi, A. iPathology: Robotic Applications and Management of Plants and Plant Diseases. Sustainability 2017, 9, 1010. [Google Scholar] [CrossRef]
- Aravind, K.R.; Raja, P.; Perez-Ruiz, M. Task-based agricultural mobile robots in arable farming: A review. Span. J. Agric. Res. 2017, 15, e02R01-01. [Google Scholar] [CrossRef]
- Su, W.-H. Advanced Machine Learning in Point Spectroscopy, RGB- and Hyperspectral-Imaging for Automatic Discriminations of Crops and Weeds: A Review. Smart Cities 2020, 3, 767–792. [Google Scholar] [CrossRef]
- Ringland, J.; Bohm, M.; Baek, S.-R. Characterization of food cultivation along roadside transects with Google Street View imagery and deep learning. Comput. Electron. Agric. 2019, 158, 36–50. [Google Scholar] [CrossRef]
- Zhu, H.B.; Zhang, Y.Y.; Mu, D.L.; Bai, L.Z.; Zhuang, H.; Li, H. YOLOX-based blue laser weeding robot in corn field. Front. Plant Sci. 2022, 13, 1017803. [Google Scholar] [CrossRef]
- Bah, M.D.; Hafiane, A.; Canals, R. Deep Learning with Unsupervised Data Labeling for Weed Detection in Line Crops in UAV Images. Remote Sens. 2018, 10, 1690. [Google Scholar] [CrossRef]
- Teimouri, N.; Dyrmann, M.; Nielsen, P.R.; Mathiassen, S.K.; Somerville, G.J.; Jorgensen, R.N. Weed Growth Stage Estimator Using Deep Convolutional Neural Networks. Sensors 2018, 18, 1580. [Google Scholar] [CrossRef]
- Oghaz, M.M.; Razaak, M.; Kerdegari, H.; Argyriou, V.; Remagnino, P. Scene and Environment Monitoring Using Aerial Imagery and Deep Learning. In Proceedings of the 15th Annual International Conference on Distributed Computing in Sensor Systems (DCOSS), Santorini, Greece, 29–31 May 2019; pp. 362–369. [Google Scholar]
- Zhu, S.; Deng, J.; Zhang, Y.; Yang, C.; Yan, Z.; Xie, Y. Study on distribution map of weeds in rice field based on UAV remote sensing. J. South China Agric. Univ. 2020, 41, 67–74. [Google Scholar] [CrossRef]
- Zualkernan, I.; Abuhani, D.A.; Hussain, M.H.; Khan, J.; ElMohandes, M. Machine Learning for Precision Agriculture Using Imagery from Unmanned Aerial Vehicles (UAVs): A Survey. Drones 2023, 7, 382. [Google Scholar] [CrossRef]
- Shi, J.Y.; Bai, Y.H.; Diao, Z.H.; Zhou, J.; Yao, X.B.; Zhang, B.H. Row Detection BASED Navigation and Guidance for Agricultural Robots and Autonomous Vehicles in Row-Crop Fields: Methods and Applications. Agronomy 2023, 13, 1780. [Google Scholar] [CrossRef]
- de Castro, A.I.; Shi, Y.; Maja, J.M.; Pena, J.M. UAVs for Vegetation Monitoring: Overview and Recent Scientific Contributions. Remote Sens. 2021, 13, 2139. [Google Scholar] [CrossRef]
- Khan, S.; Tufail, M.; Khan, M.T.; Khan, Z.A.; Anwar, S. Deep learning-based identification system of weeds and crops in strawberry and pea fields for a precision agriculture sprayer. Precis. Agric. 2021, 22, 1711–1727. [Google Scholar] [CrossRef]
- Kim, Y.H.; Park, K.R. MTS-CNN: Multi-task semantic segmentation-convolutional neural network for detecting crops and weeds. Comput. Electron. Agric. 2022, 199, 107146. [Google Scholar] [CrossRef]
- Deepa, S.N.; Rasi, D. FHGSO: Flower Henry gas solubility optimization integrated deep convolutional neural network for image classification. Appl. Intell. 2022, 53, 7278–7297. [Google Scholar] [CrossRef]
- Babu, V.S.; Ram, N.V. Deep Residual CNN with Contrast Limited Adaptive Histogram Equalization for Weed Detection in Soybean Crops. Trait. Du Signal 2022, 39, 717–722. [Google Scholar] [CrossRef]
- Reedha, R.; Dericquebourg, E.; Canals, R.; Hafiane, A. Transformer Neural Network for Weed and Crop Classification of High Resolution UAV Images. Remote Sens. 2022, 14, 592. [Google Scholar] [CrossRef]
- Yu, H.; Che, M.; Yu, H.; Zhang, J. Development of Weed Detection Method in Soybean Fields Utilizing Improved DeepLabv3+ Platform. Agronomy 2022, 12, 2889. [Google Scholar] [CrossRef]
- Sun, Y.; Chen, Y.; Jin, X.; Yu, J.; Chen, Y. AI differentiation of bok choy seedlings from weeds. Fujian J. Agric. Sci. 2021, 36, 1484–1490. [Google Scholar] [CrossRef]
- Wu, Z.N.; Chen, Y.J.; Zhao, B.; Kang, X.B.; Ding, Y.Y. Review of Weed Detection Methods Based on Computer Vision. Sensors 2021, 21, 3647. [Google Scholar] [CrossRef] [PubMed]
- Xu, X.; Wang, L.; Shu, M.; Liang, X.; Ghafoor, A.Z.; Liu, Y.; Ma, Y.; Zhu, J. Detection and Counting of Maize Leaves Based on Two-Stage Deep Learning with UAV-Based RGB Image. Remote Sens. 2022, 14, 5388. [Google Scholar] [CrossRef]
- Fan, K.-J.; Su, W.-H. Applications of Fluorescence Spectroscopy, RGB- and MultiSpectral Imaging for Quality Determinations of White Meat: A Review. Biosensors 2022, 12, 76. [Google Scholar] [CrossRef]
- Li, Y.; Al-Sarayreh, M.; Irie, K.; Hackell, D.; Bourdot, G.; Reis, M.M.; Ghamkhar, K. Identification of Weeds Based on Hyperspectral Imaging and Machine Learning. Front. Plant Sci. 2021, 11, 611622. [Google Scholar] [CrossRef]
- Diao, Z.; Yan, J.; He, Z.; Zhao, S.; Guo, P. Corn seedling recognition algorithm based on hyperspectral image and lightweight-3D-CNN. Comput. Electron. Agric. 2022, 201, 107343. [Google Scholar] [CrossRef]
- Dashti, H.; Glenn, N.F.; Ustin, S.; Mitchell, J.J.; Qi, Y.; Ilangakoon, N.T.; Flores, A.N.; Luis Silvan-Cardenas, J.; Zhao, K.; Spaete, L.P.; et al. Empirical Methods for Remote Sensing of Nitrogen in Drylands May Lead to Unreliable Interpretation of Ecosystem Function. IEEE Trans. Geosci. Remote Sens. 2019, 57, 3993–4004. [Google Scholar] [CrossRef]
- Lou, Z.; Quan, L.; Sun, D.; Li, H.; Xia, F. Hyperspectral remote sensing to assess weed competitiveness in maize farmland ecosystems. Sci. Total Environ. 2022, 844, 157071. [Google Scholar] [CrossRef] [PubMed]
- Su, J.; Yi, D.; Coombes, M.; Liu, C.; Zhai, X.; McDonald-Maier, K.; Chen, W.-H. Spectral analysis and mapping of blackgrass weed by leveraging machine learning and UAV multispectral imagery. Comput. Electron. Agric. 2022, 192, 106621. [Google Scholar] [CrossRef]
- Su, J.; Coombes, M.; Liu, C.; Zhu, Y.; Song, X.; Fang, S.; Guo, L.; Chen, W.H. Machine Learning-Based Crop Drought Mapping System by UAV Remote Sensing RGB Imagery. Unmanned Syst. 2020, 8, 71–83. [Google Scholar] [CrossRef]
- Amarasingam, N.; Hamilton, M.; Kelly, J.E.; Zheng, L.; Sandino, J.; Gonzalez, F.; Dehaan, R.L.; Cherry, H. Autonomous Detection of Mouse-Ear Hawkweed Using Drones, Multispectral Imagery and Supervised Machine Learning. Remote Sens. 2023, 15, 1633. [Google Scholar] [CrossRef]
- Lopez, L.O.; Ortega, G.; Aguera-Vega, F.; Carvajal-Ramirez, F.; Martinez-Carricondo, P.; Garzon, E.M. Multispectral Imaging for Weed Identification in Herbicides Testing. Informatica 2022, 33, 771–793. [Google Scholar] [CrossRef]
- Aguera-Vega, F.; Aguera-Puntas, M.; Aguera-Vega, J.; Martinez-Carricondo, P.; Carvajal-Ramirez, F. Multi-sensor imagery rectification and registration for herbicide testing. Measurement 2021, 175, 109049. [Google Scholar] [CrossRef]
- Allred, B.; Martinez, L.; Fessehazion, M.K.; Rouse, G.; Williamson, T.N.; Wishart, D.; Koganti, T.; Freeland, R.; Eash, N.; Batschelet, A.; et al. Overall results and key findings on the use of UAV visible-color, multispectral, and thermal infrared imagery to map agricultural drainage pipes. Agric. Water Manag. 2020, 232, 106036. [Google Scholar] [CrossRef]
- Eide, A.; Koparan, C.; Zhang, Y.; Ostlie, M.; Howatt, K.; Sun, X. UAV-Assisted Thermal Infrared and Multispectral Imaging of Weed Canopies for Glyphosate Resistance Detection. Remote Sens. 2021, 13, 4606. [Google Scholar] [CrossRef]
- Pineda, M.; Baron, M.; Perez-Bueno, M.L. Thermal Imaging for Plant Stress Detection and Phenotyping. Remote Sens. 2021, 13, 68. [Google Scholar] [CrossRef]
- Wang, X.; Pan, H.; Guo, K.; Yang, X.; Luo, S. The evolution of LiDAR and its application in high precision measurement. IOP Conf. Ser. Earth Environ. Sci. 2020, 502, 012008. [Google Scholar] [CrossRef]
- Moreno, H.; Valero, C.; Bengochea-Guevara, J.M.; Ribeiro, A.; Garrido-Izard, M.; Andujar, D. On-Ground Vineyard Reconstruction Using a LiDAR-Based Automated System. Sensors 2020, 20, 1102. [Google Scholar] [CrossRef] [PubMed]
- Sudars, K.; Jasko, J.; Namatevs, I.; Ozola, L.; Badaukis, N. Dataset of annotated food crops and weed images for robotic computer vision control. Data Brief 2020, 31, 105833. [Google Scholar] [CrossRef]
- Olsen, A.; Konovalov, D.A.; Philippa, B.; Ridd, P.; Wood, J.C.; Johns, J.; Banks, W.; Girgenti, B.; Kenny, O.; Whinney, J.; et al. DeepWeeds: A Multiclass Weed Species Image Dataset for Deep Learning. Sci. Rep. 2019, 9, 2058. [Google Scholar] [CrossRef]
- Jiang, H.H.; Zhang, C.Y.; Qiao, Y.L.; Zhang, Z.; Zhang, W.J.; Song, C.Q. CNN feature based graph convolutional network for weed and crop recognition in smart farming. Comput. Electron. Agric. 2020, 174, 105450. [Google Scholar] [CrossRef]
- Sa, I.; Chen, Z.T.; Popovic, M.; Khanna, R.; Liebisch, F.; Nieto, J.; Siegwart, R. weedNet: Dense Semantic Weed Classification Using Multispectral Images and MAV for Smart Farming. IEEE Robot. Autom. Lett. 2018, 3, 588–595. [Google Scholar] [CrossRef]
- Binch, A.; Fox, C.W. Controlled comparison of machine vision algorithms for Rumex and Urtica detection in grassland. Comput. Electron. Agric. 2017, 140, 123–138. [Google Scholar] [CrossRef]
- Osorio, K.; Puerto, A.; Pedraza, C.; Jamaica, D.; Rodriguez, L. A Deep Learning Approach for Weed Detection in Lettuce Crops Using Multispectral Images. Agriengineering 2020, 2, 471–488. [Google Scholar] [CrossRef]
- Espejo-Garcia, B.; Mylonas, N.; Athanasakos, L.; Fountas, S.; Vasilakoglou, I. Towards weeds identification assistance through transfer learning. Comput. Electron. Agric. 2020, 171, 105306. [Google Scholar] [CrossRef]
- Alam, M.S.; Alam, M.; Tufail, M.; Khan, M.U.; Guenes, A.; Salah, B.; Nasir, F.E.; Saleem, W.; Khan, M.T. TobSet: A New Tobacco Crop and Weeds Image Dataset and Its Utilization for Vision-Based Spraying by Agricultural Robots. Appl. Sci. 2022, 12, 1308. [Google Scholar] [CrossRef]
- Champ, J.; Mora-Fallas, A.; Goeau, H.; Mata-Montero, E.; Bonnet, P.; Joly, A. Instance segmentation for the fine detection of crop and weed plants by precision agricultural robots. Appl. Plant Sci. 2020, 8, e11373. [Google Scholar] [CrossRef]
- Di Cicco, M.; Potena, C.; Grisetti, G.; Pretto, A. Automatic Model Based Dataset Generation for Fast and Accurate Crop and Weeds Detection. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)/Workshop on Machine Learning Methods for High-Level Cognitive Capabilities in Robotics, Vancouver, BC, Canada, 24–28 September 2017; pp. 5188–5195. [Google Scholar]
- Hasan, A.; Sohel, F.; Diepeveen, D.; Laga, H.; Jones, M.G.K. A survey of deep learning techniques for weed detection from images. Comput. Electron. Agric. 2021, 184, 106067. [Google Scholar] [CrossRef]
- Wang, A.; Xu, Y.; Wei, X.; Cui, B. Semantic Segmentation of Crop and Weed using an Encoder-Decoder Network and Image Enhancement Method under Uncontrolled Outdoor Illumination. IEEE Access 2020, 8, 81724–81734. [Google Scholar] [CrossRef]
- Ramirez, W.; Achanccaray, P.; Mendoza, L.F.; Pacheco, M.A.C. Deep Convolutional Neural Networks For Weed Detection in Agricultural Crops Using Optical Aerial Images. In Proceedings of the IEEE Latin American GRSS and ISPRS Remote Sensing Conference (LAGIRS), Santiago, Chile, 21–26 March 2020; pp. 133–137. [Google Scholar]
- Vypirailenko, D.; Kiseleva, E.; Shadrin, D.; Pukalchik, M. Deep learning techniques for enhancement of weeds growth classification. In Proceedings of the IEEE International Instrumentation and Measurement Technology Conference (I2MTC), Glasgow, UK, 17–20 May 2021. [Google Scholar]
- Gee, C.; Denimal, E. RGB Image-Derived Indicators for Spatial Assessment of the Impact of Broadleaf Weeds on Wheat Biomass. Remote Sens. 2020, 12, 2982. [Google Scholar] [CrossRef]
- Slaughter, D.C. The Biological Engineer: Sensing the Difference Between Crops and Weeds. In Automation: The Future of Weed Control in Cropping Systems; Young, S.L., Pierce, F.J., Eds.; Springer: Dordrecht, The Netherlands, 2014; pp. 71–95. [Google Scholar] [CrossRef]
- Al-Badri, A.H.; Ismail, N.A.; Al-Dulaimi, K.; Salman, G.A.; Khan, A.R.; Al-Sabaawi, A.; Salam, M.S.H. Classification of weed using machine learning techniques: A review-challenges, current and future potential techniques. J. Plant Dis. Prot. 2022, 129, 745–768. [Google Scholar] [CrossRef]
- Cimpoi, M.; Maji, S.; Kokkinos, I.; Vedaldi, A. Deep Filter Banks for Texture Recognition, Description, and Segmentation. Int. J. Comput. Vis. 2016, 118, 65–94. [Google Scholar] [CrossRef] [PubMed]
- Ashraf, T.; Khan, Y.N. Weed density classification in rice crop using computer vision. Comput. Electron. Agric. 2020, 175, 105590. [Google Scholar] [CrossRef]
- Ayalew, G.; Zaman, Q.U.; Schumann, A.W.; Percival, D.C.; Chang, Y. An investigation into the potential of Gabor wavelet features for scene classification in wild blueberry fields. Artif. Intell. Agric. 2021, 5, 72–81. [Google Scholar] [CrossRef]
- Zhang, L.; Zhang, Z.; Wu, C.; Sun, L. Segmentation algorithm for overlap recognition of seedling lettuce and weeds based on SVM and image blocking. Comput. Electron. Agric. 2022, 201. [Google Scholar] [CrossRef]
- Miao, R.; Yang, H.; Wu, J.; Liu, H. Weed identification of overlapping spinach leaves based on image sub-block and reconstruction. Trans. Chin. Soc. Agric. Eng. 2020, 36, 178–184. [Google Scholar]
- Vi Nguyen Thanh, L.; Ahderom, S.; Alameh, K. Performances of the LBP Based Algorithm over CNN Models for Detecting Crops and Weeds with Similar Morphologies. Sensors 2020, 20, 2193. [Google Scholar] [CrossRef]
- Raja, G.; Dev, K.; Philips, N.D.; Suhaib, S.A.M.; Deepakraj, M.; Ramasamy, R.K. DA-WDGN: Drone-Assisted Weed Detection using GLCM-M features and NDIRT indices. In Proceedings of the IEEE Conference on Computer Communications Workshops (IEEE INFOCOM), Vancouver, BC, Canada, 9–12 May 2021. [Google Scholar]
- Zaman, M.H.M.; Mustaza, S.M.; Ibrahim, M.F.; Zulkifley, M.A.; Mustafa, M.M. Weed Classification Based on Statistical Features from Gabor Transform Magnitude. In Proceedings of the International Conference on Decision Aid Sciences and Application (DASA), Sakheer, Bahrain, 7–8 December 2021. [Google Scholar]
- Wang, A.; Zhang, W.; Wei, X. A review on weed detection using ground-based machine vision and image processing techniques. Comput. Electron. Agric. 2019, 158, 226–240. [Google Scholar] [CrossRef]
- Bailey, D.; Chang, Y.; Le Moan, S. Analysing Arbitrary Curves from the Line Hough Transform. J. Imaging 2020, 6, 26. [Google Scholar] [CrossRef] [PubMed]
- Teplyakov, L.; Kaymakov, K.; Shvets, E.; Nikolaev, D. Line detection via a lightweight CNN with a Hough Layer. In Proceedings of the 13th International Conference on Machine Vision, Rome, Italy, 2–6 November 2021. [Google Scholar]
- Qi, M.; Wang, Y.; Chen, Y.; Xin, H.; Xu, Y.; Meng, H.; Wang, A. Center detection algorithm for printed circuit board circular marks based on image space and parameter space. J. Electron. Imaging 2023, 32, 011002. [Google Scholar] [CrossRef]
- Islam, N.; Rashid, M.M.; Wibowo, S.; Xu, C.-Y.; Morshed, A.; Wasimi, S.A.; Moore, S.; Rahman, S.M. Early Weed Detection Using Image Processing and Machine Learning Techniques in an Australian Chilli Farm. Agriculture 2021, 11, 387. [Google Scholar] [CrossRef]
- Fawakherji, M.; Potena, C.; Pretto, A.; Bloisi, D.D.; Nardi, D. Multispectral Image Synthesis for Crop/Weed Segmentation in Precision Farming. Robot. Auton. Syst. 2021, 146, 103861. [Google Scholar] [CrossRef]
- Ustin, S.L.; Jacquemoud, S. How the Optical Properties of Leaves Modify the Absorption and Scattering of Energy and Enhance Leaf Functionality. Remote Sens. Plant Biodivers. 2020, 14, 349–384. [Google Scholar]
- Zhu, W.; Sun, Z.; Huang, Y.; Yang, T.; Li, J.; Zhu, K.; Zhang, J.; Yang, B.; Shao, C.; Peng, J.; et al. Optimization of multi-source UAV RS agro-monitoring schemes designed for field-scale crop phenotyping. Precis. Agric. 2021, 22, 1768–1802. [Google Scholar] [CrossRef]
- Calderon, R.; Montes-Borrego, M.; Landa, B.B.; Navas-Cortes, J.A.; Zarco-Tejada, P.J. Detection of downy mildew of opium poppy using high-resolution multispectral and thermal imagery acquired with an unmanned aerial vehicle. Precis. Agric. 2014, 15, 639–661. [Google Scholar] [CrossRef]
- Bellvert, J.; Zarco-Tejada, P.J.; Girona, J.; Fereres, E. Mapping crop water stress index in a ‘Pinot-noir’ vineyard: Comparing ground measurements with thermal remote sensing imagery from an unmanned aerial vehicle. Precis. Agric. 2014, 15, 361–376. [Google Scholar] [CrossRef]
- Sabat-Tomala, A.; Raczko, E.; Zagajewski, B. Comparison of Support Vector Machine and Random Forest Algorithms for Invasive and Expansive Species Classification Using Airborne Hyperspectral Data. Remote Sens. 2020, 12, 516. [Google Scholar] [CrossRef]
- Shen, Y.; Yin, Y.; Li, B.; Zhao, C.; Li, G. Detection of impurities in wheat using terahertz spectral imaging and convolutional neural networks. Comput. Electron. Agric. 2021, 181, 105931. [Google Scholar] [CrossRef]
- Guo, X.; Ge, Y.; Liu, F.; Yang, J. Identification of maize and wheat seedlings and weeds based on deep learning. Front. Earth Sci. 2023, 11, 1146558. [Google Scholar] [CrossRef]
- Wang, Y.; Zhang, X.; Ma, G.; Du, X.; Shaheen, N.; Mao, H. Recognition of weeds at asparagus fields using multi-feature fusion and backpropagation neural network. Int. J. Agric. Biol. Eng. 2021, 14, 190–198. [Google Scholar] [CrossRef]
- Tannouche, A.; Sbai, K.; Rahmoune, M.; Zoubir, A.; Agounoune, R.; Saadani, R.; Rahmani, A. A Fast and Efficient Shape Descriptor for an Advanced Weed Type Classification Approach. Int. J. Electr. Comput. Eng. 2016, 6, 1168–1175. [Google Scholar]
- Bakhshipour, A.; Jafari, A. Evaluation of support vector machine and artificial neural networks in weed detection using shape features. Comput. Electron. Agric. 2018, 145, 153–160. [Google Scholar] [CrossRef]
- Zhuang, J.; Jin, X.; Chen, Y.; Meng, W.; Wang, Y.; Yu, J.; Muthukumar, B. Drought stress impact on the performance of deep convolutional neural networks for weed detection in Bahiagrass. Grass Forage Sci. 2023, 78, 214–223. [Google Scholar] [CrossRef]
- Li, D.; Shi, G.; Li, J.; Chen, Y.; Zhang, S.; Xiang, S.; Jin, S. PlantNet: A dual-function point cloud segmentation network for multiple plant species. Isprs J. Photogramm. Remote Sens. 2022, 184, 243–263. [Google Scholar] [CrossRef]
- Schmidhuber, J. Deep learning in neural networks: An overview. Neural Netw 2015, 61, 85–117. [Google Scholar] [CrossRef]
- Zhu, Y.; Wang, M.; Yin, X.; Zhang, J.; Meijering, E.; Hu, J. Deep Learning in Diverse Intelligent Sensor Based Systems. Sensors 2023, 23, 62. [Google Scholar] [CrossRef]
- Garibaldi-Marquez, F.; Flores, G.; Mercado-Ravell, D.A.; Ramirez-Pedraza, A.; Valentin-Coronado, L.M. Weed Classification from Natural Corn Field-Multi-Plant Images Based on Shallow and Deep Learning. Sensors 2022, 22, 3021. [Google Scholar] [CrossRef]
- Alzubaidi, L.; Zhang, J.; Humaidi, A.J.; Al-Dujaili, A.Q.; Duan, Y.; Al-Shamma, O.; Santamaría, J.; Fadhel, M.A.; Al-Amidie, M.; Farhan, L. Review of deep learning: Concepts, CNN architectures, challenges, applications, future directions. J. Big Data 2021, 8, 53. [Google Scholar] [CrossRef]
- Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, Ł.; Polosukhin, I. Attention is all you need. In Proceedings of the 31st International Conference on Neural Information Processing Systems, Long Beach, CA, USA, 4–9 December 2017; pp. 6000–6010. [Google Scholar]
- Dosovitskiy, A.; Beyer, L.; Kolesnikov, A.; Weissenborn, D.; Houlsby, N. An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale. arXiv 2020, arXiv:2010.11929. [Google Scholar]
- Jiang, K.; Afzaal, U.; Lee, J. Transformer-Based Weed Segmentation for Grass Management. Sensors 2023, 23, 65. [Google Scholar] [CrossRef]
- Khan, S.; Naseer, M.; Hayat, M.; Zamir, S.W.; Khan, F.S.; Shah, M. Transformers in Vision: A Survey. ACM Comput. Surv. 2022, 54, 200. [Google Scholar] [CrossRef]
- Tao, T.; Wei, X. A hybrid CNN-SVM classifier for weed recognition in winter rape field. Plant Methods 2022, 18, 29. [Google Scholar] [CrossRef]
- Zhang, H.; Wang, Z.; Guo, Y.; Ma, Y.; Cao, W.; Chen, D.; Yang, S.; Gao, R. Weed Detection in Peanut Fields Based on Machine Vision. Agriculture 2022, 12, 1541. [Google Scholar] [CrossRef]
- Jin, X.; Sun, Y.; Che, J.; Bagavathiannan, M.; Yu, J.; Chen, Y. A novel deep learning-based method for detection of weeds in vegetables. Pest Manag. Sci. 2022, 78, 1861–1869. [Google Scholar] [CrossRef]
- Abouzahir, S.; Sadik, M.; Sabir, E. Paper Bag-of-visual-words-augmented Histogram of Oriented Gradients for efficient weed detection. Biosyst. Eng. 2021, 202, 179–194. [Google Scholar] [CrossRef]
- Haq, M.A. CNN Based Automated Weed Detection System Using UAV Imagery. Comput. Syst. Sci. Eng. 2022, 42, 837–849. [Google Scholar] [CrossRef]
- Milioto, A.; Lottes, P.; Stachniss, C. Real-Time Blob-Wise Sugar Beets vs. Weeds Classification for Monitoring Fields Using Convolutional Neural Networks. In Proceedings of the International Conference on Unmanned Aerial Vehicles in Geomatics, Bonn, Germany, 4–7 September 2017; pp. 41–48. [Google Scholar]
- Ong, P.; Teo, K.S.; Sia, C.K. UAV-based weed detection in Chinese cabbage using deep learning. Smart Agric. Technol. 2023, 4, 100181. [Google Scholar] [CrossRef]
- Quan, L.; Feng, H.; Li, Y.; Wang, Q.; Zhang, C.; Liu, J.; Yuan, Z. Maize seedling detection under different growth stages and complex field environments based on an improved Faster R-CNN. Biosyst. Eng. 2019, 184, 1–23. [Google Scholar] [CrossRef]
- Sanchez, P.R.; Zhang, H. Evaluation of a CNN-Based Modular Precision Sprayer in Broadcast-Seeded Field. Sensors 2022, 22, 9723. [Google Scholar] [CrossRef]
- Zhang, W.H.; Hansen, M.F.; Volonakis, T.N.; Smith, M.; Smith, L.; Wilson, J.; Ralston, G.; Broadbent, L.; Wright, G. Broad-Leaf Weed Detection in Pasture. In Proceedings of the 3rd IEEE International Conference on Image, Vision and Computing (ICIVC), Chongqing, China, 27–29 June 2018; pp. 101–105. [Google Scholar]
- McCool, C.; Perez, T.; Upcroft, B. Mixtures of Lightweight Deep Convolutional Neural Networks: Applied to Agricultural Robotics. IEEE Robot. Autom. Lett. 2017, 2, 1344–1351. [Google Scholar] [CrossRef]
- Asseng, S.; Asche, F. Future farms without farmers. Sci. Robot. 2019, 4, eaaw1875. [Google Scholar] [CrossRef]
- Wang, D.S.; Cao, W.J.; Zhang, F.; Li, Z.L.; Xu, S.; Wu, X.Y. A Review of Deep Learning in Multiscale Agricultural Sensing. Remote Sens. 2022, 14, 559. [Google Scholar] [CrossRef]
- Zhang, H.D.; Wang, L.Q.; Tian, T.; Yin, J.H. A Review of Unmanned Aerial Vehicle Low-Altitude Remote Sensing (UAV-LARS) Use in Agricultural Monitoring in China. Remote Sens. 2021, 13, 1221. [Google Scholar] [CrossRef]
- Waldner, F.; Diakogiannis, F.I. Deep learning on edge: Extracting field boundaries from satellite images with a convolutional neural network. Remote Sens. Environ. 2020, 245, 1221. [Google Scholar] [CrossRef]
- Yuan, X.H.; Shi, J.F.; Gu, L.C. A review of deep learning methods for semantic segmentation of remote sensing imagery. Expert Syst. Appl. 2021, 169, 114417. [Google Scholar] [CrossRef]
- Liu, J.; Xiang, J.J.; Jin, Y.J.; Liu, R.H.; Yan, J.N.; Wang, L.Z. Boost Precision Agriculture with Unmanned Aerial Vehicle Remote Sensing and Edge Intelligence: A Survey. Remote Sens. 2021, 13, 4387. [Google Scholar] [CrossRef]
- Khan, S.; Tufail, M.; Khan, M.T.; Khan, Z.A.; Iqbal, J.; Wasim, A. Real-time recognition of spraying area for UAV sprayers using a deep learning approach. PLoS ONE 2021, 16, e0249436. [Google Scholar] [CrossRef] [PubMed]
- De Castro, A.I.; Ehsani, R.; Ploetz, R.; Crane, J.H.; Abdulridha, J. Optimum spectral and geometric parameters for early detection of laurel wilt disease in avocado. Remote Sens. Environ. 2015, 171, 33–44. [Google Scholar] [CrossRef]
- Xie, C.Q.; Yang, C. A review on plant high-throughput phenotyping traits using UAV-based sensors. Comput. Electron. Agric. 2020, 178, 105731. [Google Scholar] [CrossRef]
- Allred, B.; Eash, N.; Freeland, R.; Martinez, L.; Wishart, D. Effective and efficient agricultural drainage pipe mapping with UAS thermal infrared imagery: A case study. Agric. Water Manag. 2018, 197, 132–137. [Google Scholar] [CrossRef]
- Guo, A.T.; Huang, W.J.; Dong, Y.Y.; Ye, H.C.; Ma, H.Q.; Liu, B.; Wu, W.B.; Ren, Y.; Ruan, C.; Geng, Y. Wheat Yellow Rust Detection Using UAV-Based Hyperspectral Technology. Remote Sens. 2021, 13, 123. [Google Scholar] [CrossRef]
- Radoglou-Grammatikis, P.; Sarigiannidis, P.; Lagkas, T.; Moscholios, I. A compilation of UAV applications for precision agriculture. Comput. Netw. 2020, 172, 107148. [Google Scholar] [CrossRef]
- Sivakumar, A.N.; Modi, S.; Gasparino, M.V.; Ellis, C.; Velasquez, A.E.B.; Chowdhary, G.; Gupta, S. Learned Visual Navigation for Under-Canopy Agricultural Robots. In Proceedings of the Conference on Robotics—Science and Systems, Electr Network, Virtual, 12–16 July 2021. [Google Scholar]
- Subeesh, A.; Mehta, C.R. Automation and digitization of agriculture using artificial intelligence and internet of things. Artif. Intell. Agric. 2021, 5, 278–291. [Google Scholar] [CrossRef]
- Andreasen, C.; Scholle, K.; Saberi, M. Laser Weeding With Small Autonomous Vehicles: Friends or Foes? Front. Agron. 2022, 4, 841086. [Google Scholar] [CrossRef]
- Tran, D.; Schouteten, J.J.; Degieter, M.; Krupanek, J.; Jarosz, W.; Areta, A.; Emmi, L.; De Steur, H.; Gellynck, X. European stakeholders’ perspectives on implementation potential of precision weed control: The case of autonomous vehicles with laser treatment. Precis. Agric. 2023, 24, 2200–2222. [Google Scholar] [CrossRef]
- Hussain, A.; Fatima, H.S.; Zia, S.M.; Hasan, S.; Khurram, M.; Stricker, D.; Afzal, M.Z. Development of Cost-Effective and Easily Replicable Robust Weeding Machine-Premiering Precision Agriculture in Pakistan. Machines 2023, 11, 287. [Google Scholar] [CrossRef]
- Xu, S.Y.; Wu, J.J.; Zhu, L.; Li, W.H.; Wang, Y.T.; Wang, N. A novel monocular visual navigation method for cotton-picking robot based on horizontal spline segmentation. In Proceedings of the 9th International Symposium on Multispectral Image Processing and Pattern Recognition (MIPPR)—Automatic Target Recognition and Navigation, Enshi, China, 31 October–1 November 2015. [Google Scholar]
- Jia, W.K.; Zhang, Y.; Lian, J.; Zheng, Y.J.; Zhao, D.; Li, C.J. Apple harvesting robot under information technology: A review. Int. J. Adv. Robot. Syst. 2020, 17, 1729881420925310. [Google Scholar] [CrossRef]
- Jiang, W.; Quan, L.Z.; Wei, G.Y.; Chang, C.; Geng, T.Y. A conceptual evaluation of a weed control method with post-damage application of herbicides: A composite intelligent intra-row weeding robot. Soil Tillage Res. 2023, 234, 105837. [Google Scholar] [CrossRef]
- Mohamed, E.S.; Belal, A.; Abd-Elmabod, S.K.; El-Shirbeny, M.A.; Gad, A.; Zahran, M.B. Smart farming for improving agricultural management. Egypt. J. Remote Sens. Space Sci. 2021, 24, 971–981. [Google Scholar] [CrossRef]
- Darwin, B.; Dharmaraj, P.; Prince, S.; Popescu, D.E.; Hemanth, D.J. Recognition of Bloom/Yield in Crop Images Using Deep Learning Models for Smart Agriculture: A Review. Agronomy 2021, 11, 646. [Google Scholar] [CrossRef]
- Jin, S.; Dai, H.; Peng, J.; He, Y.; Zhu, M.; Yu, W.; Li, Q. An Improved Mask R-CNN Method for Weed Segmentation. In Proceedings of the 17th Conference on Industrial Electronics and Applications (ICIEA), Chengdu, China, 16–19 December 2022. [Google Scholar]
Crop | Methods | Enhancement | Input Representation | MIoU | Reference |
---|---|---|---|---|---|
Sugar beet | An encoder-decoder deep learning network | HE / PS-AC / DPE | RGB | 92.75% / 94.29% / 93.50% | Wang, A. et al. [59] |
Oilseed | An encoder-decoder deep learning network | HE / PS-AC / DPE | RGB | 94.80% / 95.80% / 96.12% | Wang, A. et al. [59] |
Soybean | DeepLabv3+ / Swin+DeepLabv3+ / Swin-DeepLab | Random rotation, random flipping, random cropping, adding Gaussian noise, and increasing contrast | RGB | 88.59% / 91.10% / 91.53% | Yu, H. et al. [29] |
Sunflower | Algorithm of Lopez, L.O. et al. [41] / U-Net / FPN | Perspective deformity correction program | Multispectral | 89% / 90% / 89% | Lopez, L.O. et al. [41] |
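MIoU in the table above denotes the mean intersection-over-union across classes; a minimal sketch of the computation from predicted and ground-truth label maps (the three-class soil/crop/weed setup is illustrative):

```python
# Minimal sketch: mean intersection-over-union (MIoU) between a predicted and a
# ground-truth segmentation map; the 3-class soil/crop/weed setup is illustrative.
import numpy as np

def mean_iou(pred, target, num_classes):
    ious = []
    for c in range(num_classes):
        inter = np.logical_and(pred == c, target == c).sum()
        union = np.logical_or(pred == c, target == c).sum()
        if union > 0:                        # skip classes absent from both maps
            ious.append(inter / union)
    return float(np.mean(ious))

rng = np.random.default_rng(0)
target = rng.integers(0, 3, size=(256, 256))
pred = target.copy()
pred[:64] = rng.integers(0, 3, size=(64, 256))   # corrupt part of the prediction
print(round(mean_iou(pred, target, num_classes=3), 4))
```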
Methods | Crop | Weed | Sensor | Accuracy | Reference |
---|---|---|---|---|---|
Swin-DeepLab | Soybean | Graminoid weeds such as Digitaria sanguinalis (L.) Scop and Setaria viridis (L.) Beauv; broadleaf weeds such as Chenopodium glaucum L., Acalypha australis L., and Amaranthus retroflexus L. | RGB | 91.53% | Yu, H. et al. [29] |
lightweight-3D-CNN | Crop seedlings | Weed | Hyperspectral | >97.4% | Diao, Z. et al. [35] |
A combination of fine-tuned Densenet and Support Vector Machine | Tomato (Solanum lycopersicum L.) and Cotton (Gossypium hirsutum L.) | Black nightshade (Solanum nigrum L.) and Velvetleaf (Abutilon theophrasti Medik.) | RGB | 99.29% | Espejo-Garcia, B. et al. [54] |
VGG16, VGG19, Xception | Corn | Narrow-leaf weeds (NLW), broadleaf weeds (BLW) | RGB | 97.83%, 97.44%, 97.24% | Garibaldi-Marquez, F. et al. [93] |
GA-SVM | Lettuce | Chenopodium serotinum, Polygonum lapathifolium | RGB | 87.55% | Zhang, L. et al. [68] |
ViT | Maize and Wheat | Black-grass, Charlock, Cleaver, Common Chickweed, Common wheat, Fat Hen, Loose Silky-bent, Maize, Scentless Mayweed, Shepherds Purse, Small-flowered Cranesbill, and Sugar beet | RGB | 98.1% | Guo, X. et al. [85] |
AlexNet, GoogleNet, VGG | Florida pusley | Bahiagrass | RGB | 95%, 96%, 95% | Zhuang, J. et al. [89] |
PlantNet | Tobacco, Tomato, and Sorghum | Monocotyledonous weed | High-precision 3D laser | 95.04%, 96.44%, 98.03% | Li, D. et al. [90] |
VGG-SVM | Winter rape seedlings | Rape seedlings associated weeds | RGB | 92.1% | Tao, T. and X. Wei [99] |
YOLOv4-Tiny | Peanut | Portulaca oleracea, Eleusine indica, Chenopodium album, Amaranthus blitum, Abutilon theophrasti, and Calystegia | RGB | 96.7% | Zhang, H. et al. [100] |
U-Net | Sunflower | Chenopodium album L., Convolvulus arvensis L., and Cyperus rotundus L. | Multispectral | 90% | Lopez, L.O. et al. [41] |
YOLO-v3, CenterNet, Faster R-CNN | Bok choy | Weeds | RGB | 98.4%, 98.3%, 97.5% | Jin, X. et al. [101] |
SVM, YOLOV3, Mask R-CNN | Lettuce Crops | Weeds | Multispectral and UAV | 88%, 94%, 94% | Osorio, K. et al. [53] |
ML | Wheat | blackgrass weeds | Multispectral and UAV | 93.8% | Su, J. et al. [38] |
SVM, KNN, AdaBoost and CNN | Rice | Leptochloa Chinensis, Sedges | RGB and UAV | 89.75%, 85.58%, 90.25%, 92.41% | Zhu, S. et al. [20] |
BPNN | Soybean, Sugar Beet and Carrot | Broad-leaf, Grass, Pig-weed, Lambs-quarter, Hares-ear Mustard, Turnip Weed, Wild Carrot, Corsican Mint | RGB, UAV and BONIROB Robot | 96.6%, 97.7%, 93% | Abouzahir, S. et al. [102] |
RF, SVM and KNN | Chilli | Unwanted weed and Parasites within crops | RGB and UAV | 94%, 96%, 63% | Islam, N. et al. [77] |
Improved Faster R-CNN | Pea and Strawberry | Annual Goosegrass (Eleusine indica) weeds | RGB and UAV | average of 95.3% | Khan, S. et al. [24] |
RF | Bean and Spinach | Thistles and young potato sprouts | RGB and DJI Phantom 3 Pro drone | 96.99% | Bah, M.D. et al. [17] |
CNN-LVQ | Soybean | Grassy weeds and broadleaf weeds | RGB and UAV | 99.44% | Haq, M.A. [103] |
CNN | Soybean | Weeds | RGB+NIR and UAV | 99.66% | Milioto, A. et al. [104] |
CNN | Chinese cabbage | Weeds | RGB and UAV | 92.41% | Ong, P. et al. [105] |
ViT | Beet, Parsley and Spinach | Weeds | RGB and UAV | >98.63% | Reedha, R. et al. [28] |
MobileNetV2 | Flax | 14 most common weeds | RGB and SAMBot | 90% | Du, Y. et al. [5] |
SVM | Tobacco | Weeds | RGB and A tractor-mounted boom sprayer | 96% | Tufail, M. et al. [6] |
YOLOX | Corn seedlings | Weeds | RGB | 92.45% | Zhu, H.B. et al. [16] |
Faster R-CNN and YOLOv5 | Tobacco | Bare soil and weeds growing in tobacco fields | RGB and Pesticide spraying robot | 98.43%, 94.45% | Alam, M.S. et al. [55] |
Faster R-CNN with a VGG backbone | Maize seedling | Weeds | Industrial USB cameras and Field robot platform (FRP) | 98.2% | Quan, L. et al. [106] |
An encoder-decoder network with atrous separable convolution | Sugar Beet and Oilseed | Weed | RGB and BoniRob robot | 96.12% | Wang, A. et al. [60] |
ANN with 15 units in ensemble | Carrot | Weeds | Multispectral and Bonirob | 83.5% | Lease, B.A. et al. [7] |
cGAN | sunflower crops, Sugar beet | Weeds | Multispectral and BOSCH Bonirob farm robot | 94% | Fawakherji, M. et al. [78] |
CNN | Soybeans | Weeds | RGB and Precision Sprayer | Spray volume reduced by up to 48.89% in the experiment | Sanchez, P.R. and H. Zhang [107] |
CNN | Grass | Broad-leaf weed | High resolution camera and Quadbike | 96.88% | Zhang, W.H. et al. [108] |
lightweight DCNN | Organic carrot | Weeds | RGB and AgBot II | 93.9% | McCool, C. et al. [109] |