Artificial Intelligence Applied to Support Agronomic Decisions for the Automatic Aerial Analysis Images Captured by UAV: A Systematic Review
Abstract
1. Introduction
2. Materials and Methods
2.1. Planning
2.2. Selection
- (1) Precision indicates whether the deep learning models were able to correctly detect or classify the objects contained in the dataset: Precision = TP/(TP + FP).
- (2) Accuracy (Acc) calculates the proportion of correctly predicted observations among all observations in the set: Acc = (TP + TN)/(TP + TN + FP + FN).
- (3) mAP calculates the area under the precision–recall curve, averaged over the object classes: mAP = (1/N) Σᵢ APᵢ, where APᵢ = ∫₀¹ pᵢ(r) dr is the area under the precision–recall curve of class i.
- (4) Recall calculates the number of correctly detected objects divided by all ground-truth objects: Recall = TP/(TP + FN).
- (5) F1-score is the weighted harmonic mean of precision and recall: F1 = 2 · (Precision · Recall)/(Precision + Recall).

In these expressions, TP, TN, FP, and FN denote true positives, true negatives, false positives, and false negatives, respectively. A minimal computational sketch of these metrics is given after this list.
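To make these definitions concrete, the following minimal sketch (not taken from any of the reviewed studies) computes the same metrics with scikit-learn; the label vectors and detection scores are illustrative assumptions only.

```python
# Minimal sketch: the five evaluation metrics defined above, computed with scikit-learn.
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             f1_score, average_precision_score)

# Hypothetical labels for a binary weed (1) / non-weed (0) task.
y_true = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0, 1, 0]
# Hypothetical detection scores, needed to build the precision-recall curve.
y_score = [0.9, 0.2, 0.8, 0.4, 0.1, 0.7, 0.6, 0.3, 0.95, 0.05]

print("Precision:", precision_score(y_true, y_pred))  # TP / (TP + FP)
print("Recall:   ", recall_score(y_true, y_pred))     # TP / (TP + FN)
print("Accuracy: ", accuracy_score(y_true, y_pred))   # (TP + TN) / all observations
print("F1-score: ", f1_score(y_true, y_pred))         # harmonic mean of precision and recall
# Average precision is the area under the precision-recall curve; object detectors
# average it over classes (and often over IoU thresholds) to report mAP.
print("AP:       ", average_precision_score(y_true, y_score))
```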
3. Results and Discussion
3.1. Weed
3.2. Nutritional Deficiency
3.3. Water Stress
3.4. Plant Disease
3.5. Agricultural Pest
3.6. Yield Estimation
3.7. Percentage of SLR Research According to Techniques That Automate Agricultural Processes
3.8. Answers to Research Questions
3.9. Limitations
4. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- O Brasil na Revolução 4.0. CEPEA-ESALQ/USP. 2019. Available online: https://www.cepea.esalq.usp.br/br/opiniao-cepea/o-brasil-na-revolucao-4-0.aspx (accessed on 3 April 2023).
- Rossetto, R.; Santiago, A.D. Cana: Plantas Daninhas; Embrapa: Parque Estação Biológica, PqEB: Brasília, Brazil, 2022. Available online: https://www.embrapa.br/agencia-de-informacao-tecnologica/cultivos/cana/producao/manejo/plantas-daninhas (accessed on 5 April 2023).
- Bah, M.D.; Hafiane, A.; Canals, R. CRowNet: Deep network for Crop row detection in UAV images. IEEE Access 2019, 8, 5189–5200. [Google Scholar] [CrossRef]
- Castro, A.I.; Torres-Sánchez, J.; Peña, J.M.; Jiménez-Brenes, F.M.; Csillik, O.; López-Granados, F. An automatic random forest-OBIA algorithm for early weed mapping between and within crop rows using UAV imagery. Remote Sens. 2018, 10, 285. [Google Scholar] [CrossRef]
- Gill, S.S.; Tuli, S.; Xu, M.; Singh, I.; Singh, K.V.; Lindsay, D.; Tuli, S.; Smirnova, D.; Singh, M.; Jain, U.; et al. Transformative effects of IoT, Blockchain and Artificial Intelligence on cloud computing: Evolution, vision, trends and open challenges. Internet Things 2019, 8, 100118. [Google Scholar] [CrossRef]
- Lotito, V.; Zambelli, T. Pattern detection in colloidal assembly: A mosaic of analysis techniques. Adv. Colloid Interface Sci. 2020, 284, 102252. [Google Scholar] [CrossRef]
- Etienne, A.; Ahmad, A.; Aggarwal, V.; Saraswat, D. Deep learning-based object detection system for identifying weeds using UAS imagery. Remote Sens. 2021, 13, 5182. [Google Scholar] [CrossRef]
- Inteligência Artificial Torna Mais Preciso o Mapeamento da Intensificação Agrícola no Cerrado. Embrapa. 2023. Available online: https://www.embrapa.br/busca-de-noticias/-/noticia/83327528/inteligencia-artificial-torna-mais-preciso-o-mapeamento-da-intensificacao-agricola-no-cerrado (accessed on 6 April 2023).
- Shankar, H.R.; Veeraraghavan, A.K.; Uvais; Sivaraman, K.U.; Ramachandran, S.S. Application of UAV for Pest, Weeds and Disease Detection using Open Computer Vision. In Proceedings of the 2018 International Conference on Smart Systems and Inventive Technology (ICSSIT), Tirunelveli, India, 13–14 December 2018; pp. 287–292. [Google Scholar] [CrossRef]
- Hassler, S.C.; Baysal-Gurel, F. Unmanned aircraft system (UAS) technology and applications in agriculture. Agronomy 2019, 9, 618. [Google Scholar] [CrossRef]
- Hamylton, S.M.; Morris, R.H.; Carvalho, R.C.; Roder, N.; Barlow, P.; Mills, K.; Wang, L. Evaluating techniques for mapping island vegetation from unmanned aerial vehicle (UAV) images: Pixel classification, visual interpretation and machine learning approaches. Int. J. Appl. Earth Obs. Geoinf. 2020, 89, 102085. [Google Scholar] [CrossRef]
- Haq, M.A. CNN Based Automated Weed Detection System Using UAV Imagery. Comput. Syst. Sci. Eng. 2021, 42, 837–849. [Google Scholar] [CrossRef]
- Islam, N.; Rashid, M.M.; Wibowo, S.; Xu, C.Y.; Morshed, A.; Wasimi, S.A.; Moore, S.; Rahman, S.M. Early weed detection using image processing and machine learning techniques in an Australian chili farm. Agriculture 2021, 11, 387. [Google Scholar] [CrossRef]
- Belete, N.A.S.; Tetila, E.C.; Astolfi, G.; Pistori, H. Classification of weed in soybean crops using unmanned aerial vehicle images. In Proceedings of the XV Workshop de Visão Computacional, Sao Bernardo do Campo, Brazil, 9–11 September 2019; pp. 121–125. [Google Scholar] [CrossRef]
- Salazar, J.; Sánchez-De La Cruz, E.; Ochoa-Zezzatti, A.; Rivera, M.M. Diagnosis of Collateral Effects in Climate Change Through the Identification of Leaf Damage Using a Novel Heuristics and Machine Learning Framework. In Metaheuristics in Machine Learning: Theory and Applications; Springer: Cham, Switzerland, 2021; pp. 61–75. [Google Scholar] [CrossRef]
- Ferreira, C.M.; Barrigossi, J.A.F. Embrapa Rice and Beans: Tradition and Food Security. Technical Editors. 2021, 16. Available online: http://www.cnpaf.embrapa.br/languages/ricebeans.php (accessed on 6 April 2023).
- Sanders, J.T.; Jones, E.A.L.; Austin, R.; Roberson, G.T.; Richardson, R.J.; Everman, W.J. Remote sensing for palmer amaranth (Amaranthus palmeri s. wats.) detection in soybean (Glycine max (L.) Merr.). Agronomy 2021, 11, 1909. [Google Scholar] [CrossRef]
- Valente, J.; Doldersum, M.; Roers, C.; Kooistra, L. Detecting Rumex Obtusifolius weed plants in grasslands from UAV RGB imagery using deep learning. Remote Sens. Spat. Inf. Sci. 2019, 4, 179–185. [Google Scholar] [CrossRef]
- Peña, J.M.; Torres-Sánchez, J.; Castro, A.I.; Kelly, M.; López-Granados, F. Weed Mapping in Early-Season Maize Fields Using Object-Based Analysis of Unmanned Aerial Vehicle (UAV) Images. PLoS ONE 2013, 8, e77151. [Google Scholar] [CrossRef] [PubMed]
- Pham, F.; Raheja, A.; Bhandari, S. Machine learning models for predicting lettuce health using UAV imagery. In Proceedings of the SPIE 11008, Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV, Baltimore, MD, USA, 14–18 April 2019; p. 110080Q. [Google Scholar] [CrossRef]
- Preethi, C.; Brintha, N.C.; Yogesh, C.K. A comprehensive survey on applications of precision agriculture in the context of weed classification, leave disease detection, yield prediction and UAV Image analysis. Adv. Parallel Comput. 2021, 39, 296–306. [Google Scholar] [CrossRef]
- Sun, G.; Xie, H.; Sinnott, R.O. A Crop Water Stress Monitoring System Utilizing a Hybrid e-Infrastructure. In Proceedings of the 10th International Conference on Utility and Cloud Computing, Austin, TX, USA, 5–8 December 2017; pp. 161–170. [Google Scholar] [CrossRef]
- Siqueira, V.S.; Borges, M.M.; Furtado, R.G.; Dourado, C.N.; Costa, R.M. Artificial intelligence applied to support medical decisions for the automatic analysis of echocardiogram images: A systematic review. Artif. Intell. Med. 2021, 120, 102165. [Google Scholar] [CrossRef] [PubMed]
- Kitchenham, B. Procedures for Performing Systematic Reviews; Keele University: Keele, UK, 2004; pp. 1–26. [Google Scholar]
- Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. BMJ 2021, 372, 71. [Google Scholar] [CrossRef]
- ACM Digital Library. Available online: https://dl.acm.org/search/advanced (accessed on 3 April 2023).
- IEEE Xplore Digital Library. Available online: http://ieeexplore.ieee.org/Xplore/home.jsp (accessed on 3 April 2023).
- Science Direct—Elsevier. Available online: https://www.sciencedirect.com/search/advanced (accessed on 3 April 2023).
- MDPI—Publisher of Open Access Journals. Available online: https://www.mdpi.com/ (accessed on 5 April 2023).
- Web of Science. Available online: https://apps.webofknowledge.com (accessed on 5 April 2023).
- Jackulin, C.; Murugavalli, S. A comprehensive review on detection of plant disease using machine learning and deep learning approaches. Meas. Sens. 2022, 24, 100441. [Google Scholar] [CrossRef]
- Mohidem, N.A.; Che’ya, N.N.; Juraimi, A.S.; Ilahi, W.F.F.; Roslim, M.H.M.; Sulaiman, N.; Saberioon, M.; Noor, N.M. How can unmanned aerial vehicles be used for detecting weeds in agricultural fields? Agriculture 2021, 11, 1004. [Google Scholar] [CrossRef]
- Rai, N.; Zhang, Y.; Ram, B.G.; Schumacher, L.; Yellavajjala, R.K.; Bajwa, S.; Sun, X. Applications of deep learning in precision weed management: A review. Comput. Electron. Agric. 2023, 206, 107698. [Google Scholar] [CrossRef]
- Shahi, T.B.; Xu, C.-Y.; Neupane, A.; Guo, W. Recent Advances in Crop Disease Detection Using UAV and Deep Learning Techniques. Remote Sens. 2023, 15, 2450. [Google Scholar] [CrossRef]
- Kuswidiyanto, L.W.; Noh, H.H.; Han, X.Z. Plant disease diagnosis using deep learning based on aerial hyperspectral images: A review. Remote Sens. 2022, 14, 6031. [Google Scholar] [CrossRef]
- Varah, A.; Ahodo, K.; Coutts, S.R.; Hicks, H.L.; Comont, D.; Crook, L.; Hull, R.; Neve, P.; Childs, D.Z.; Freckleton, R.P. The costs of human-induced evolution in an agricultural system. Nat. Sustain. 2020, 3, 63–71. [Google Scholar] [CrossRef] [PubMed]
- Hoeser, T.; Bachofer, F.; Kuenzer, C. Object detection and image segmentation with deep learning on Earth observation data: A review—Part II: Applications. Remote Sens. 2020, 12, 3053. [Google Scholar] [CrossRef]
- Fraccaro, P.; Butt, J.; Edwards, B.; Freckleton, R.P.; Childs, D.Z.; Reusch, K.; Comont, D. A Deep Learning Application to Map Weed Spatial Extent from Unmanned Aerial Vehicles Imagery. Remote Sens. 2022, 14, 4197. [Google Scholar] [CrossRef]
- Bah, M.D.; Hafiane, A.; Canals, R. Deep Learning with Unsupervised Data Labeling for Weed Detection in Line Crops in UAV Images. Remote Sens. 2018, 10, 1690. [Google Scholar] [CrossRef]
- Huang, H.; Lan, Y.; Yang, A.; Zhang, Y.; Wen, S.; Deng, J. Deep learning versus Object-based Image Analysis (OBIA) in weed mapping of UAV imagery. Int. J. Remote Sens. 2020, 41, 3446–3479. [Google Scholar] [CrossRef]
- Beeharry, Y.; Bassoo, V. Performance of ANN and AlexNet for weed detection using UAV-based images. In Proceedings of the 2020 3rd International Conference on Emerging Trends in Electrical, Electronic and Communications Engineering (ELECOM), Balaclava, Mauritius, 25–27 November 2020; pp. 163–167. [Google Scholar] [CrossRef]
- Reedha, R.; Dericquebourg, E.; Canals, R.; Hafiane, A. Transformer Neural Network for Weed and Crop Classification of High-Resolution UAV Images. Remote Sens. 2022, 14, 592. [Google Scholar] [CrossRef]
- Genze, N.; Ajekwe, R.; Güreli, Z.; Haselbeck, F.; Grieb, M.; Grimm, D.G. Deep learning-based early weed segmentation using motion blurred UAV images of sorghum fields. Comput. Electron. Agric. 2022, 202, 168. [Google Scholar] [CrossRef]
- Gallo, I.; Rehman, A.U.; Dehkord, R.H.; Landro, N.; La Grassa, R.; Boschetti, M. Deep Object Detection of Crop Weeds: Performance of YOLOv7 on a Real Case Dataset from UAV Images. Remote Sens. 2023, 15, 539. [Google Scholar] [CrossRef]
- Ajayi, O.G.; Ashi, J.; Guda, B. Performance evaluation of YOLO v5 model for automatic crop and weed classification on UAV images. Smart Agric. Technol. 2023, 5, 100231. [Google Scholar] [CrossRef]
- Pei, H.; Sun, Y.; Huang, H.; Zhang, W.; Sheng, J.; Zhang, Z. Weed Detection in Maize Fields by UAV Images Based on Crop Row Preprocessing and Improved YOLOv4. Agriculture 2022, 12, 975. [Google Scholar] [CrossRef]
- Su, J.; Yi, D.; Coombes, M.; Liu, C.; Zhai, X.; McDonald-Maier, K.; Chen, W.-H. Spectral analysis and mapping of blackgrass weed by leveraging machine learning and UAV multispectral imagery. Comput. Electron. Agric. 2022, 192, 106621. [Google Scholar] [CrossRef]
- Barrero, O.; Perdomo, S.A. RGB and multispectral UAV image fusion for Gramineae weed detection in rice fields. Precis. Agric. 2018, 19, 809–822. [Google Scholar] [CrossRef]
- Bah, M.D.; Hafiane, A.; Canals, R.; Emile, B. Deep features and One-class classification with unsupervised data for weed detection in UAV images. In Proceedings of the 2019 9th International Conference on Image Processing Theory, Tools and Applications IPTA, Istanbul, Turkey, 6–9 November 2019; pp. 1–5. [Google Scholar] [CrossRef]
- Naveed, A.; Muhammad, W.; Irshad, M.J.; Aslam, M.J.; Manzoor, S.M.; Kauser, T.; Lu, Y. Saliency-Based Semantic Weeds Detection and Classification Using UAV Multispectral Imaging. IEEE Access 2023, 11, 11991–12003. [Google Scholar] [CrossRef]
- Chegini, H.; Beltran, F.; Mahanti, A. Designing and Developing a Weed Detection Model for California Thistle. ACM Trans. Internet Technol. 2023, 48, 29. [Google Scholar] [CrossRef]
- Xu, W.; Chen, P.; Zhan, Y.; Chen, S.; Zhang, L.; Lan, Y. Cotton yield estimation model based on machine learning using time series UAV remote sensing data. Int. J. Appl. Earth Obs. Geoinf. 2021, 104, 102511. [Google Scholar] [CrossRef]
- Nagothu, S.K.; Anitha, G.; Siranthini, B.; Anandi, V.; Prasad, P.S. Weed detection in agricultural crops using unmanned aerial vehicles and machine learning. Mater. Proc. 2023, in press. [Google Scholar] [CrossRef]
- Nasiri, A.; Omid, M.; Taheri-Garavand, A.; Jafari, A. Deep learning-based precision agriculture through weed recognition in sugar beet fields. Sustain. Comput. Inform. Syst. 2022, 35, 100759. [Google Scholar] [CrossRef]
- Ajayi, O.G.; Ashi, J. Effect of varying training epochs of a Faster Region-Based Convolutional Neural Network on the Accuracy of an Automatic Weed Classification Scheme. Smart Agric. Technol. 2022, 3, 100128. [Google Scholar] [CrossRef]
- Rahman, A.; Lu, Y.; Wang, H. Performance evaluation of deep learning object detectors for weed detection for cotton. Smart Agric. Technol. 2022, 3, 100126. [Google Scholar] [CrossRef]
- Diao, Z.; Guo, P.; Zhang, B.; Yan, J.; He, Z.; Zhao, S.; Zhao, C.; Zhang, J. Navigation line extraction algorithm for corn spraying robot based on improved YOLOv8s network. Comput. Electron. Agric. 2023, 212, 108049. [Google Scholar] [CrossRef]
- Mekhalfa, F.; Yacef, F.; Belhocine, M. Pre-trained Deep Learning Models for UAV-based Weed Recognition. In Proceedings of the 2023 Signal Processing: Algorithms, Architectures, Arrangements, and Applications (SPA), Poznan, Poland, 20–22 September 2023; IEEE: Piscataway, NJ, USA, 2023. [Google Scholar] [CrossRef]
- Taha, M.F.; Abdalla, A.; ElMasry, G.; Gouda, M.; Zhou, L.; Zhao, N.; Liang, N.; Niu, Z.; Hassanein, A.; Al-Rejaie, S.; et al. Using Deep Convolutional Neural Network for Image-Based Diagnosis of Nutrient Deficiencies in Plants Grown in Aquaponics. Chemosensors 2022, 10, 45. [Google Scholar] [CrossRef]
- Fischer, H.; Romano, N.; Jones, J.; Howe, J.; Renukdas, N.; Sinha, A.K. Comparing water quality/bacterial composition and productivity of largemouth bass Micropterus salmoides juveniles in a recirculating aquaculture system versus aquaponics as well as plant growth/mineral composition with or without media. Aquaculture 2021, 538, 736554. [Google Scholar] [CrossRef]
- Barzin, R.; Lotfi, H.; Varco, J.J.; Bora, G.C. Machine Learning in Evaluating Multispectral Active Canopy Sensor for Prediction of Corn Leaf Nitrogen Concentration and Yield. Remote Sens. 2022, 14, 120. [Google Scholar] [CrossRef]
- Sathyavani, R.; JaganMohan, K.; Kalaavathi, B. Detection of plant leaf nutrients using convolutional neural network based Internet of Things data acquisition. Int. J. Nonlinear Anal. 2021, 2, 1175–1186. [Google Scholar] [CrossRef]
- Yang, T.; Kim, H.J. Characterizing Nutrient Composition and Concentration in Tomato-, Basil-, and Lettuce-Based Aquaponic and Hydroponic Systems. Water 2020, 12, 1259. [Google Scholar] [CrossRef]
- Sabzi, S.; Pourdarbani, R.; Rhoban, M.H.; García-Mateos, G.; Arribas, J.I. Estimation of nitrogen content in cucumber plant (Cucumis sativus L.) leaves using hyperspectral imaging data with neural network and partial least squares regressions. Chemom. Intell. Lab. Syst. 2021, 217, 104404. [Google Scholar] [CrossRef]
- Zhang, L.; Niu, Y.; Han, W.; Liu, Z. Establishing Method of Crop Water Stress Index Empirical Model of Field Maize. Nongye Jixie Xuebao/Trans. Chin. Soc. Agric. Mach. 2018, 49, 233–239. [Google Scholar] [CrossRef]
- Zhang, Z.; Bian, J.; Han, W.; Fu, Q.; Chen, S.; Cui, T. Diagnosis of Cotton Water Stress Using Unmanned Aerial Vehicle Thermal Infrared Remote Sensing after Removing Soil. Nongye Jixie Xuebao/Trans. Chin. Soc. Agric. Mach. 2018, 49, 250–260. [Google Scholar] [CrossRef]
- Li, Y.; Yan, H.; Cai, D.; Gu, T.; Sui, R.; Chen, D. Evaluating the water application uniformity of center pivot irrigation systems in Northern China. Int. Agric. Eng. J. 2018. [Google Scholar] [CrossRef]
- Bhandari, S.; Raheja, A.; Do, D.; Pham, F. Machine learning techniques for the assessment of citrus plant health using UAV-based digital images. In Proceedings of the Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping III, Orlando, FL, USA, 15–19 April 2018. [Google Scholar] [CrossRef]
- Sankararao, U.G.; Priyanka, G.; Rajalakshmi, P.; Choudhary, S. CNN Based Water Stress Detection in Chickpea Using UAV Based Hyperspectral Imaging. In Proceedings of the 2021 IEEE International India Geoscience and Remote Sensing Symposium (InGARSS), Ahmedabad, India, 6–10 December 2021; pp. 145–148. [Google Scholar] [CrossRef]
- Sankararao, U.G.; Rajalakshmi, P.; Kaliamoorthy, S.; Choudhary, S. Water Stress Detection in Pearl Millet Canopy with Selected Wavebands using UAV Based Hyperspectral Imaging and Machine Learning. In Proceedings of the IEEE Sensors Applications Symposium (SAS), Sundsvall, Sweden, 1–3 August 2022; pp. 1–6. [Google Scholar] [CrossRef]
- Tunca, E.; Köksal, E.S.; Taner, S.Ç. Calibrating UAV Thermal Sensors using Machine Learning Methods for Improved Accuracy in Agricultural Applications. Infrared Phys. Technol. 2023, 133, 104804. [Google Scholar] [CrossRef]
- Bertalan, L.; Holb, I.; Pataki, A.; Négyesi, G.; Szabó, G.; Szalóki, A.K.; Szabó, S. UAV-based multispectral and thermal cameras to predict soil water content—A machine learning approach. Comput. Electron. Agric. 2022, 200, 107262. [Google Scholar] [CrossRef]
- Niu, Y.; Han, W.; Zhang, H.; Zhang, L.; Chen, H. Estimating fractional vegetation cover of maize under water stress from UAV multispectral imagery using machine learning algorithms. Comput. Electron. Agric. 2021, 189, 106414. [Google Scholar] [CrossRef]
- Das, S.; Christopher, J.; Apan, A.; Choudhury, M.R.; Chapman, S.; Menzies, N.W.; Dang, Y.P. Evaluation of water status of wheat genotypes to aid prediction of yield on sodic soils using UAV-thermal imaging and machine learning. Agric. For. Meteorol. 2021, 307, 108477. [Google Scholar] [CrossRef]
- Wang, J.; Lou, Y.; Wang, W.; Liu, S.; Zhang, H.; Hui, X.; Wang, Y.; Yan, H.; Maes, W.H. A robust model for diagnosing water stress of winter wheat by combining UAV multispectral and thermal remote sensing. Agric. Water Manag. 2024, 291, 108616. [Google Scholar] [CrossRef]
- Sumesh, K.C.; Ninsawat, S.; Som-ard, J. Integration of RGB-based vegetation index, crop surface model and object-based image analysis approach for sugarcane yield estimation using unmanned aerial vehicle. Comput. Electron. Agric. 2021, 180, 105903. [Google Scholar] [CrossRef]
- Sanseechan, P.; Saengprachathanarug, K.; Posom, J.; Wongpichet, S.; Chea, C.; Wongphati, M. Use of vegetation indices in monitoring sugarcane white leaf disease symptoms in sugarcane field using multispectral UAV aerial imagery. IOP Conf. Ser. Earth Environ. Sci. 2019, 301, 12025. [Google Scholar] [CrossRef]
- Pan, Q.; Gao, M.; Wu, P.; Yan, J.; Li, S. A Deep-Learning-Based Approach for Wheat Yellow Rust Disease Recognition from Unmanned Aerial Vehicle Images. Sensors 2021, 21, 6540. [Google Scholar] [CrossRef]
- Wu, B.; Liang, A.; Zhang, H.; Zhu, T.; Zou, Z.; Yang, D.; Tang, W.; Li, J.; Su, J. Application of conventional UAV-based high-throughput object detection to the early diagnosis of pine wilt disease by deep learning. For. Ecol. Manag. 2021, 486, 15. [Google Scholar] [CrossRef]
- Selvaraj, M.G.; Vergara, A.; Montenegro, F.; Ruiz, H.A.; Safari, N.; Raymaekers, D.; Ocimati, W.; Ntamwira, J.; Tits, L.; Omondi, A.B.; et al. Detection of banana plants and their major diseases through aerial images and machine learning methods: A case study in DR Congo and Republic of Benin. ISPRS J. Photogramm. Remote Sens. 2020, 169, 110–124. [Google Scholar] [CrossRef]
- Amarasingam, N.; Gonzalez, F.; Salgadoe, A.S.A.; Sandino, J.; Powell, K. Detection of White Leaf Disease in Sugarcane Crops Using UAV-Derived RGB Imagery with Existing Deep Learning Models. Remote Sens. 2022, 14, 6137. [Google Scholar] [CrossRef]
- Yu, R.; Luo, Y.; Zhou, Q.; Zhang, X.; Wu, D.; Ren, L. Early detection of pine wilt disease using deep learning algorithms and UAV-based multispectral imagery. For. Ecol. Manag. 2021, 497, 119493. [Google Scholar] [CrossRef]
- Shi, Y.; Han, L.; Kleerekoper, A.; Chang, S.; Hu, T. Novel CropdocNet Model for Automated Potato Late Blight Disease Detection from Unmanned Aerial Vehicle-Based Hyperspectral Imagery. Remote Sens. 2022, 14, 396. [Google Scholar] [CrossRef]
- Kerkech, M.; Hafiane, A.; Canals, R. VddNet: Vine Disease Detection Network Based on Multispectral Images and Depth Map. Remote Sens. 2020, 12, 3305. [Google Scholar] [CrossRef]
- Delgado, C.; Benitez, H.; Cruz, M.; Selvaraj, M. Digital Disease Phenotyping. In Proceedings of the IGARSS 2019—2019 IEEE International Geoscience and Remote Sensing Symposium, Yokohama, Japan, 28 July–2 August 2019; pp. 5702–5705. [Google Scholar] [CrossRef]
- Khan, F.S.; Khan, S.; Mohd, M.N.H.; Waseem, A.; Khan, M.N.A.; Ali, S.; Ahmed, R. Federated learning-based UAVs for the diagnosis of Plant Diseases. In Proceedings of the International Conference on Engineering and Emerging Technologies (ICEET), Kuala Lumpur, Malaysia, 27–28 October 2022; pp. 1–6. [Google Scholar] [CrossRef]
- Oide, A.H.; Nagasaka, Y.; Tanaka, K. Performance of machine learning algorithms for detecting pine wilt disease infection using visible color imagery by UAV remote sensing. Remote Sens. Appl. Soc. Environ. 2022, 28, 100869. [Google Scholar] [CrossRef]
- Deng, J.; Zhang, X.; Yang, Z.; Zhou, C.; Wang, R.; Zhang, K.; Lv, X.; Yang, L.; Wang, Z.; Li, P.; et al. Pixel-level regression for UAV hyperspectral images: Deep learning-based quantitative inverse of wheat stripe rust disease index. Comput. Electron. Agric. 2023, 215, 108434. [Google Scholar] [CrossRef]
- Casas, E.; Arbelo, M.; Moreno-Ruiz, J.A.; Hernández-Leal, P.A.; Reyes-Carlos, J.A. UAV-Based Disease Detection in Palm Groves of Phoenix canariensis Using Machine Learning and Multispectral Imagery. Remote Sens. 2023, 15, 3584. [Google Scholar] [CrossRef]
- Amorim, W.P.; Tetila, E.C.; Pistori, H.; Papa, J.P. Semi-supervised learning with convolutional neural networks for UAV images automatic recognition. Comput. Electron. Agric. 2019, 164, 104932. [Google Scholar] [CrossRef]
- Brodbeck, C.; Sikora, E.; Delaney, D.; Pate, G.; Johnson, J. Using Unmanned Aircraft Systems for Early Detection of Soybean Diseases. Precis. Agric. 2017, 8, 802–806. [Google Scholar] [CrossRef]
- da Silva, F.L.; Sella, M.L.G.; Francoy, T.M.; Costa, A.H.R. Evaluating classification and feature selection techniques for honeybee subspecies identification using wing images. Comput. Electron. Agric. 2015, 114, 68–77. [Google Scholar] [CrossRef]
- Duarte, A.; Borralho, N.; Caetano, M. A Machine Learning Approach to Detect Dead Trees Caused by Longhorned Borer in Eucalyptus Stands Using UAV Imagery. In Proceedings of the 2021 IEEE International Geoscience and Remote Sensing Symposium IGARSS, Brussels, Belgium, 11–16 July 2021; pp. 5818–5821. [Google Scholar] [CrossRef]
- Tetila, E.C.; Machado, B.B.; Astolfi, G.; Belete, N.A.S.; Amorim, W.P.; Roel, A.R.; Pistori, H. Detection and classification of soybean pests using deep learning with UAV images. Comput. Electron. Agric. 2020, 179, 105836. [Google Scholar] [CrossRef]
- Retallack, A.; Finlayson, G.; Ostendorf, B.; Lewis, M. Using deep learning to detect an indicator arid shrub in ultra-high-resolution UAV imagery. Ecol. Indic. 2022, 145, 109698. [Google Scholar] [CrossRef]
- Li, X.; Chen, J.; He, Y.; Yang, G.; Li, Z.; Tao, Y.; Li, Y.; Li, Y.; Huang, L.; Feng, X. High-throughput counting of Chinese cabbage trichomes based on deep learning and trinocular stereo microscope. Comput. Electron. Agric. 2023, 212, 108134. [Google Scholar] [CrossRef]
- Lin, Q.; Huang, H.; Wang, J.; Chen, L.; Du, H.; Zhou, G. Early detection of pine shoot beetle attack using the vertical profile of plant traits through UAV-based hyperspectral, thermal, and lidar data fusion. Int. J. Appl. Earth Obs. Geoinf. 2023, 125, 103549. [Google Scholar] [CrossRef]
- Clevers, J.G.P.W.; Kooistra, L.; van den Brande, M.M.M. Using Sentinel-2 Data for Retrieving LAI and Leaf and Canopy Chlorophyll Content of a Potato Crop. Remote Sens. 2017, 9, 405. [Google Scholar] [CrossRef]
- Towers, P.C.; Strever, A.; Poblete-Echeverría, C. Comparison of Vegetation Indices for Leaf Area Index Estimation in Vertical Shoot Positioned Vine Canopies with and without Grenbiule Hail-Protection Netting. Remote Sens. 2019, 11, 1073. [Google Scholar] [CrossRef]
- Vélez, S.; Barajas, E.; Rubio, J.A.; Vacas, R.; Poblete-Echeverría, C. Effect of Missing Vines on Total Leaf Area Determined by NDVI Calculated from Sentinel Satellite Data: Progressive Vine Removal Experiments. Appl. Sci. 2020, 10, 3612. [Google Scholar] [CrossRef]
- Guo, H.; Xiao, Y.; Li, M.; Hao, F.; Zhang, X.; Sun, H.; Beurs, K.; Fu, Y.H.; He, Y. Identifying crop phenology using maize height constructed from multi-sources images. Int. J. Appl. Earth Obs. Geoinf. 2022, 115, 103121. [Google Scholar] [CrossRef]
- Xu, B.; Fan, J.; Chao, J.; Arsenijevic, N.; Werle, R.; Zhang, Z. Instance segmentation method for weed detection using UAV imagery in soybean fields. Comput. Electron. Agric. 2023, 211, 107994. [Google Scholar] [CrossRef]
- Ilniyaz, O.; Du, Q.; Shen, H.; He, W.; Feng, L.; Azadi, H.; Kurban, A.; Chen, X. Leaf area index estimation of pergola-trained vineyards in arid regions using classical and deep learning methods based on UAV-based RGB images. Comput. Electron. Agric. 2023, 207, 107723. [Google Scholar] [CrossRef]
- Peng, M.; Han, W.; Li, C.; Yao, X.; Shao, G. Modeling the daytime net primary productivity of maize at the canopy scale based on UAV multispectral imagery and machine learning. J. Clean. Prod. 2022, 367, 133041. [Google Scholar] [CrossRef]
- Barbosa, B.D.S.; Ferraz, G.A.E.S.; Costa, L.; Ampatzidis, Y.; Vijayakumar, V.; Santos, L.M.D. UAV-based coffee yield prediction utilizing feature selection and deep learning. Smart Agric. Technol. 2021, 1, 100010. [Google Scholar] [CrossRef]
- Alabi, T.R.; Abebe, A.T.; Chigeza, G.; Fowobaje, K.R. Estimation of soybean grain yield from multispectral high-resolution UAV data with machine learning models in West Africa. Remote Sens. Appl. Soc. Environ. 2022, 27, 100782. [Google Scholar] [CrossRef]
- Teshome, F.T.; Bayabil, H.K.; Hoogenboom, G.; Schaffer, B.; Singh, A.; Ampatzidis, Y. Unmanned aerial vehicle (UAV) imaging and machine learning applications for plant phenotyping. Comput. Electron. Agric. 2023, 212, 108064. [Google Scholar] [CrossRef]
- Ariza-Sentís, M.; Valente, J.; Kooistra, L.; Kramer, H.; Mücher, S. Estimation of spinach (Spinacia oleracea) seed yield with 2D UAV data and deep learning. Smart Agric. Technol. 2022, 3, 100129. [Google Scholar] [CrossRef]
- Niu, B.; Feng, Q.; Chen, B.; Ou, C.; Liu, Y.; Yang, J. HSI-TransUNet: A transformer-based semantic segmentation model for crop mapping from UAV hyperspectral images. Comput. Electron. Agric. 2022, 201, 107297. [Google Scholar] [CrossRef]
- Pandey, A.; Jain, K. An intelligent system for crop identification and classification from UAV images using conjugated dense convolutional neural network. Comput. Electron. Agric. 2021, 192, 106543. [Google Scholar] [CrossRef]
- Vong, N.; Conway, L.S.; Feng, A.; Zhou, J.; Kitchen, N.R.; Sudduth, K.A. Estimating and Mapping Corn Emergence Uniformity using UAV imagery and deep learning. Comput. Electron. Agric. 2022, 198, 107008. [Google Scholar] [CrossRef]
- Chen, R.; Zhang, C.; Xu, B.; Zhu, Y.; Zhao, F.; Han, S.; Yang, G.; Yang, H. Predicting Individual Apple Yield using remote sensing data from multiple UAV sources and ensemble learning. Comput. Electron. Agric. 2022, 201, 107275. [Google Scholar] [CrossRef]
- Wang, H.; Feng, J.; Yin, H. Improved Method for Apple Fruit Target Detection Based on YOLOv5s. Agriculture 2023, 13, 2167. [Google Scholar] [CrossRef]
- Xu, X.; Wang, L.; Liang, X.; Zhou, L.; Chen, Y.; Feng, P.; Yu, H.; Ma, Y. Maize Seedling Leave Counting Based on Semi-Supervised Learning and UAV RGB Images. Sustainability 2023, 15, 9583. [Google Scholar] [CrossRef]
- Feng, Y.; Chen, W.; Ma, Y.; Zhang, Z.; Gao, P.; Lv, X. Cotton Seedling Detection and Counting Based on UAV Multispectral Images and Deep Learning Methods. Remote Sens. 2023, 15, 2680. [Google Scholar] [CrossRef]
- Tunca, E.; Köksal, E.S.; Özturk, E.; Akayc, H.; Taner, S.Ç. Accurate leaf area index estimation in sorghum using high-resolution UAV data and machine learning models. Phys. Chem. Earth Pt A/B/C 2024, 133, 103537. [Google Scholar] [CrossRef]
- Ma, J.; Liu, B.; Ji, L.; Zhu, Z.; Wu, Y.; Jiao, W. Field-scale yield prediction of winter wheat under different irrigation regimes based on the dynamic fusion of multimodal UAV imagery. Int. J. Appl. Earth Obs. Geoinf. 2023, 118, 103292. [Google Scholar] [CrossRef]
- Liu, S.; Jin, X.; Bai, Y.; Wu, W.; Cui, N.; Cheng, M.; Liu, Y.; Meng, L.; Jia, X.; Nie, C.; et al. UAV multispectral images for accurate estimation of the maize LAI considering the effect of soil background. Int. J. Appl. Earth Obs. Geoinf. 2023, 121, 103383. [Google Scholar] [CrossRef]
- Demir, S.; Dedeoğlu, M.; Başayiğit, L. Yield prediction models of organic oil rose farming with agricultural unmanned aerial vehicles (UAVs) images and machine learning algorithms. Remote Sens. Appl. Soc. Environ. 2023, 33, 101131. [Google Scholar] [CrossRef]
- Jamali, M.; Bakhshandeh, E.; Yeganeh, B.; Özdoğan, M. Development of machine learning models for estimating wheat biophysical variables using satellite-based vegetation indices. Adv. Space Res. 2024, 73, 498–513. [Google Scholar] [CrossRef]
- Qu, H.; Zheng, C.; Ji, H.; Barai, K.; Zhang, Y. A fast and efficient approach to estimate wild blueberry yield using machine learning with drone photography: Flight altitude, sampling method and model effects. Comput. Electron. Agric. 2024, 216, 108543. [Google Scholar] [CrossRef]
- Sivakumar, A.N.V.; Li, J.; Scott, S.; Psota, E.; Jhala, A.J.; Luck, J.D.; Shi, Y. Comparison of object detection and patch-based classification deep learning models on mid- to late-season weed detection in UAV imagery. Remote Sens. 2020, 12, 1591. [Google Scholar] [CrossRef]
- Ghazali, W.N.W.B.; Zulkifli, C.N.B.; Ponrahono, Z. The Effect of Traffic Congestion on Quality of Community Life. ICRP 2019, 2, 759–766. [Google Scholar] [CrossRef]
- Jiber, M.; Mbarek, A.; Yahyaouy, A.; Sabri, M.A.; Boumhidi, J. Road Traffic Prediction Model Using Extreme Learning Machine: The Case Study of Tangier, Morocco. Information 2020, 11, 542. [Google Scholar] [CrossRef]
- Patro, K.K.; Allam, J.P.; Hammad, M.; Tadeusiewicz, R.; Pławiak, P. SCovNet: A skip connection-based feature union deep learning technique with statistical approach analysis for the detection of COVID-19. Biocybern. Biomed. Eng. 2023, 43, 352–368. [Google Scholar] [CrossRef] [PubMed]
- Pedada, K.R.; Rao, B.; Patro, K.K.; Allam, J.P.; Jamjoom, M.M.; Samee, N.A. A novel approach for brain tumour detection using deep learning based technique. Biomed. Signal Process. Control 2023, 82, 104549. [Google Scholar] [CrossRef]
- Shashirangana, J.; Padmasiri, H.; Meedeniya, D.; Perera, C.; Nayak, S.R.; Nayak, J.; Vimal, S.; Kadry, S. License plate recognition using neural architecture search for edge devices. Int. J. Intell. Syst. 2021, 37, 10211–10248. [Google Scholar] [CrossRef]
- Padmasiri, H.; Shashirangana, J.; Meedeniya, D.; Rana, O.; Perera, C. Automated License Plate Recognition for Resource-Constrained Environments. Sensors 2022, 22, 1434. [Google Scholar] [CrossRef] [PubMed]
- Mushtaq, M.; Akram, M.U.; Alghamdi, N.S.; Fatima, J.; Masood, R.F. Localization and Edge-Based Segmentation of Lumbar Spine Vertebrae to Identify the Deformities Using Deep Learning Models. Sensors 2022, 22, 1547. [Google Scholar] [CrossRef]
- Khatab, E.; Onsy, A.; Abouelfarag, A. Evaluation of 3D Vulnerable Objects’ Detection Using a Multi-Sensors System for Autonomous Vehicles. Sensors 2022, 22, 1663. [Google Scholar] [CrossRef]
- Fan, X.; Sun, T.; Chai, X.; Zhou, J. YOLO-WDNet: A lightweight and accurate model for weeds detection in cotton field. Comput. Electron. Agric. 2024, 225, 1093617. [Google Scholar] [CrossRef]
- de Oliveira, H.F.E.; de Castro, L.E.V.; Sousa, C.M.; Alves Júnior, L.R.; Mesquita, M.; Silva, J.A.O.S.; Faria, L.C.; da Silva, M.V.; Giongo, P.R.; de Oliveira Júnior, J.F.; et al. Geotechnologies in Biophysical Analysis through the Applicability of the UAV and Sentinel-2A/MSI in Irrigated Area of Common Beans: Accuracy and Spatial Dynamics. Remote Sens. 2024, 16, 1254. [Google Scholar] [CrossRef]
- de Melo, D.A.; Silva, P.C.; da Costa, A.R.; Delmond, J.G.; Ferreira, A.F.A.; de Souza, J.A.; de Oliveira-Júnior, J.F.; da Silva, J.L.B.; da Rosa Ferraz Jardim, A.M.; Giongo, P.R.; et al. Development and Automation of a Photovoltaic-Powered Soil Moisture Sensor for Water Management. Hydrology 2023, 10, 166. [Google Scholar] [CrossRef]
- Valverde-l, F.; Prados, J. Prevalence of Sarcopenia Determined by Computed Tomography in Pancreatic Cancer: A Systematic Review and Meta-Analysis of Observational Studies. Cancers 2024, 16, 3356. [Google Scholar] [CrossRef]
- Barsouk, A.; Elghawy, O.; Yang, A.; Sussman, J.H.; Mamtani, R.; Mei, L. Meta-Analysis of Age, Sex, and Race Disparities in the Era of Contemporary Urothelial Carcinoma Treatment. Cancers 2024, 16, 3338. [Google Scholar] [CrossRef] [PubMed]
- Pesch, M.H.; Mowers, J.; Huynh, A.; Schleiss, M.R. Intrauterine Fetal Demise, Spontaneous Abortion and Congenital Cytomegalovirus: A Systematic Review of the Incidence and Histopathologic Features. Viruses 2024, 16, 1552. [Google Scholar] [CrossRef] [PubMed]
- Benster, L.L.; Stapper, N.; Rodriguez, K.; Daniels, H.; Villodas, M.; Weissman, C.R.; Daskalakis, Z.J.; Appelbaum, L.G. Developmental Predictors of Suicidality in Schizophrenia: A Systematic Review. Brain Sci. 2024, 14, 995. [Google Scholar] [CrossRef] [PubMed]
- Simione, L.; Frolli, A.; Scia, F.; Chiarella, S.G. Mindfulness-Based Interventions for People with Autism Spectrum Disorder: A Systematic Literature Review. Brain Sci. 2024, 14, 1001. [Google Scholar] [CrossRef]
Publication Year | Publication Number (n) | Bibliographic Database |
---|---|---|
2018 | 2 | IEEE (n = 2)
2019 | 3 | IEEE (n = 1); MDPI (n = 1); Web of Science (n = 1)
2020 | 5 | IEEE (n = 1); Science Direct (n = 2); MDPI (n = 1); Web of Science (n = 1) |
2021 | 13 | IEEE (n = 2); Science Direct (n = 8); MDPI (n = 2); Web of Science (n = 1) |
2022 | 20 | ACM (n = 1); IEEE (n = 2); Science Direct (n = 12); MDPI (n = 5)
2023 | 22 | IEEE (n = 2); Science Direct (n = 15); MDPI (n = 5) |
2024 | 5 | Science Direct (n = 5) |
Total | 70 |
Id | Ref. | Crop | LT | MTD/TNQ | Sensor Type | Metrics | Precision |
---|---|---|---|---|---|---|---|
1 | Fraccaro et al. [38] | Wheat | SP 1 | UNET-ResNet | RGB 9 | r 5, Acc | 90.0% |
2 | Bah et al. [39] | - | SP | CNN | RGB | Acc | 93.58% |
3 | Huang et al. [40] | Rice | SP | Back Propagation, RF 10, AlexNet, VGGNet, GoogleNet and ResNet | RGB | Acc | 80.2% |
4 | Beeharry and Bassoo [41] | Soybean | SP | ANN and AlexNet | RGB | Acc | 99.81%
5 | Reedha et al. [42] | Spinach | SP | EfficientNet and ResNet | RGB | r | 98.63% |
6 | Bah et al. [49] | Spinach/Bean | NSP 2 | CNN | RGB | Acc | 1.50% and 6% |
7 | Genze et al. [43] | Sorghum | SP | UNET-ResNet | RGB | Hold-out test | 89%
8 | Gallo et al. [44] | Sugar beet | SP | YOLOv7 | RGB | mAP | 74% |
9 | Ajayi et al. [45] | Spinach/Sugarcane | SP | CNN and YOLOv5 | RGB | Acc | 78%, recall of 65%
10 | Pei et al. [46] | Corn | SP | YOLOv4 | RGB | mAP | 87% |
11 | Su et al. [47] | Wheat | SP | RF | Multispectral | Acc | 94% |
12 | Barrero and Perdomo [48] | Rice | SP | NN 8 | RGB and Multispectral | M/MGT(%) and MP(%) 7 | 80 to 108% |
13 | Naveed et al. [50] | - | SP | Neural modulation Network (PC/BC-DIM) | Multispectral | Acc | 94% |
14 | Chegini et al. [51] | Pasture | SP | Mask RCNN | Multispectral | mAP and Acc | 93% and 95% |
15 | Xu et al. [52] | Soybean | SP | ResNet101_v and DSASPP | RGB | Acc | 91% |
16 | Nagothu et al. [53] | Cotton | SP | SSD Mobilenet | RGB and Multispectral | Acc | 95% |
17 | Nasiri et al. [54] | Sugar beet | SP | U-Net and CNN | RGB | IoU 6 | The scores of 96.06% and 84.23%, respectively. |
18 | Ajayi and Ashi [55] | Sugarcane, banana, spinach and pepper | SP | RCNN | RGB | Acc | 97%, recall of 99%, and F1 score of 99% on 242.000 epochs. |
19 | Rahman et al. [56] | Cotton | SP | YOLOv5, RetinaNet, EfficientDet, Fast RCNN, and Faster RCNN | RGB | Acc 11 and mAP 12 | 79.98% and mAP@0.5
20 | Diao et al. [57] | Corn | SP | YOLOv8s | RGB | mAP and F1 13 | 86.4% and 86%
21 | Mekhalfa et al. [58] | Soybean | SP | AlexNet, VGG16, GoogLeNet, ResNet50, SqueezeNet and MobileNet | RGB | Acc | 98% |
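Most of the weed detection studies summarized in the table above rely on convolutional or single-stage detectors, with the YOLO family (YOLOv4 to YOLOv8) appearing repeatedly. As a generic reference only, the sketch below shows the usual train/validate/predict pattern with the Ultralytics YOLO implementation; the dataset configuration file, image path, and hyperparameters are hypothetical and do not reproduce any cited study.

```python
# Minimal sketch, assuming a hypothetical UAV weed dataset described in "weed_uav.yaml"
# (images plus weed/crop bounding-box annotations in YOLO format).
from ultralytics import YOLO

model = YOLO("yolov8s.pt")                     # pretrained weights as the starting point
model.train(data="weed_uav.yaml", epochs=100, imgsz=640)
metrics = model.val()                          # precision, recall and mAP on the validation split

# Inference on a hypothetical orthomosaic tile captured by UAV.
results = model.predict("uav_orthomosaic_tile.jpg", conf=0.25)
for box in results[0].boxes:
    print(box.cls, box.conf, box.xyxy)         # class id, confidence, bounding-box corners
```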
Id | Ref. | Crop | LT | MTD/TNQ | Sensor Type | Metrics | Precision |
---|---|---|---|---|---|---|---|
1 | Barzin et al. [61] | Corn | SP | SVR 5, RF, GBM 6, XGBoost 4 | Multispectral | r and MAPE 2 | 75% and 4.4%, respectively |
2 | Sathyavani et al. [62] | Tomato/Pepper | SP | CNN 3 | RGB | Acc | 79.11% |
3 | Sabzi et al. [64] | Cucumber | SP | CNN | Hyperspectral | R 1 | 96.50% |
Id | Ref. | Crop | LT | MTD/TNQ | Sensor Type | Metrics | Precision |
---|---|---|---|---|---|---|---|
1 | Bhandari et al. [68] | Lettuce | SP | CNN | RGB and Multispectral | R | 62.30% |
2 | Sankararao et al. [69] | Millet | NSP | SVM | Hyperspectral | Acc | 81% |
3 | Sankararao et al. [70] | Chickpea | SP | CNN | Hyperspectral | Acc | 95.44% |
4 | Tunca et al. [71] | - | SP | RF, SVM 1, KNN 2 and XGBoost | RGB | r | r = 89% to 96% Micasense Altum and 87% to 94% (FDP-R). |
5 | Bertalan et al. [72] | Corn | SP | RF, ENR 3, GLM 4 and RLM 5 | Thermal and Multispectral | r and NRMSE | r = 97% vs. 71%, NRMSE = 10% vs. 25%, respectively, RF (r = 97%) and ENR (r = 88%).
6 | Niu et al. [73] | Corn | SP | RF, ANN 6 and MLR 7 | RGB and Multispectral | r and NRMSE | FVC in corn (r of 89.2% and RMSE = 6.6%) |
7 | Das et al. [74] | Wheat | SP | CRT 8 | Thermal | r, RMSE 9 and NRMSE 10 | r = 86%; RMSE = 41.3 g/m2, r = 75%; RMSE = 47.7 g/m2, grain yield, where r = 78%; RMSE = 16.7 g/m2, r = 69%; RMSE = 23.2 g/m2.
8 | Wang et al. [75] | Wheat | SP | PLS 11, SVM and GBDT 12 | Multispectral and Thermal | r, RMSE and NRMSE | r = 88%, RMSE = 8%, NRMSE = 14.7%, and filling phase with r = 90%, RMSE = 5% and NRMSE = 15.9%. |
Id | Ref. | Crop | LT | MTD/TNQ | Sensor Type | Metrics | Precision |
---|---|---|---|---|---|---|---|
1 | Pan et al. [78] | Wheat | SP | PSPNet 2 | RGB | Acc | 98% |
2 | Wu et al. [79] | Pine | SP | Faster R-CNN 6 and YOLO 3 | RGB | Acc | 78% |
3 | Selvaraj et al. [80] | Banana | SP | RF and RetinaNet | RGB | Acc | Banana bunchy top disease (99.40%), Xanthomonas Wilt of Banana (92.80%), healthy banana cluster (93.30%), and individual banana plants (90.80%). |
4 | Amarasingam et al. [81] | Sugarcane | SP | YOLOv5, YOLOR 4, DETR 5 and Faster R-CNN | RGB | Acc | 95% |
5 | Yu et al. [82] | Pine | SP | Faster R-CNN and YOLOv4 | Multispectral | Acc | 66.70% (Faster R-CNN) and 63.55% (YOLOv4) |
6 | Shi et al. [83] | Potato | SP | CropdocNet | Hyperspectral | Acc | 98% |
7 | Kerkech et al. [84] | Grape | SP | VddNet 1 | Multispectral | Acc | 92% of vine-level detection and 87% of leaf level |
8 | Shankar et al. [9] | - | SP | ANN | RGB | max(R,G,B) + min(R,G,B)/2 | >25% in initial stage |
9 | Delgado et al. [85] | Rice | NSP | SVM and RF | Multispectral | r | 74% (SVM) versus 71% (RF) |
10 | Khan et al. [86] | - | SP | EfficientNet | RGB | Acc | 99.55% |
11 | Oide et al. [87] | Pine | SP | SVM, RF, ANN | RGB | Acc | 99.50% |
12 | Deng et al. [88] | Wheat | SP | UNET, HRNet_W48 | Hyperspectral | r, MSE | r = 87.5% and MSE = 1.29% |
13 | Casas et al. [89] | Palm tree | SP | SVM, ANN, and RF | Multispectral | Acc | 96% |
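Several of the disease detection studies above fine-tune pretrained CNN backbones (e.g., EfficientNet and ResNet-based networks) on image patches extracted from UAV surveys. As an illustration of that transfer-learning pattern only, the sketch below fine-tunes a pretrained ResNet-50 with torchvision; the folder layout, class names, and hyperparameters are assumptions and do not reproduce any cited study.

```python
# Minimal sketch: transfer learning for healthy/diseased patch classification.
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import models, transforms, datasets

# Hypothetical folder layout: disease_patches/train/{healthy,diseased}/*.jpg
tfm = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
train_ds = datasets.ImageFolder("disease_patches/train", transform=tfm)
train_dl = DataLoader(train_ds, batch_size=32, shuffle=True)

model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(train_ds.classes))  # replace the classifier head

opt = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()
model.train()
for epoch in range(5):                          # short run for illustration only
    for x, y in train_dl:
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
```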
Id | Ref. | Crop | LT | MTD/TNQ | Sensor Type | Metrics | Precision |
---|---|---|---|---|---|---|---|
1 | Duarte et al. [93] | Eucalyptus | SP | SVM and RF | Multispectral | Acc | RF = 98.35%, SVM = 97.7% |
2 | Tetila et al. [94] | Soybean | SP | Inception-v3, Resnet-50, VGG-16 1, VGG-19, Xception | RGB | Acc | 93.82% |
3 | Retallack et al. [95] | Pasture | SP | CNN | RGB | Acc | 75% |
4 | Li et al. [96] | Cabbage | SP | YOLOv8 | RGB | AP50 | 94.40% |
5 | Lin et al. [97] | Pine | SP | RF | Hyperspectral and thermal | R2 and RMSE | R2 = 95% and RMSE = 1.15 µg/cm |
Id | Ref. | Crop | LT | MTD/TNQ | Sensor Type | Metrics | Precision |
---|---|---|---|---|---|---|---|
1 | Guo et al. [101] | Corn | SP | SLM 1 and HANTS | Multispectral and RGB | r | 93%. |
2 | Xu et al. [102] | Cotton | SP | U-Net, Back Propagation | Multispectral | r | 85.3% |
3 | Ilniyaz et al. [103] | Grape | SP | CNN, ResNet | Multispectral and RGB | r and RMSE | r = 89.8% and RMSE = 43.4%
4 | Peng et al. [104] | Corn | SP | RF, SVR, GBR 3 | Multispectral | r | 89.9% |
5 | Barbosa et al. [105] | Coffee | SP | SVM, RF, GBR, PLSR 2 | RGB | MAPE | 31.75% |
6 | Alabi et al. [106] | Soybean | SP | Cubist, XGBoost, GBM, SVM, and RF | Multispectral | r | 89% |
7 | Teshome et al. [107] | Corn | SP | SVM, RF, KNN, GLMNET | Multispectral | r, d 5 and MAPE | d = 99%; r = 99%; MAPE = 5 cm |
8 | Ariza-Sentís et al. [108] | Spinach | SP | Mask R-CNN | RGB | r | 80% |
9 | Niu et al. [109] | - | SP | TransUNet | Hyperspectral | Acc | 86.05% |
10 | Pandey and Jain [110] | - | SP | CD-CNN 4 | RGB | Acc | 96.20% |
11 | Vong et al. [111] | Corn | SP | CNN ResNet18 | RGB | Acc | 97%, 73% and 95% |
12 | Chen et al. [112] | Apple | SP | KNN, SVR | Multispectral | Acc, r | Acc = 75.80% and r = 81.3% |
13 | Wang et al. [113] | Apple | SP | YOLOv5s | RGB | Acc, mAP, recall | Acc = 95.4%, mAP = 86.1% and recall = 91.8% |
14 | Xu et al. [114] | Corn | SSP | SOLOv2 and YOLOv5x | RGB | mAP | 93.6%, 89.6% and 57.4% |
15 | Feng et al. [115] | Cotton | SP | YOLOv7, YOLOv5 and CenterNet | Multispectral | r, RMSE, and RRMSE 13 | 0.94, 3.83 and 2.72%, respectively |
16 | Tunca et al. [116] | Sorghum | SP | K-NN, ETR 6, XGBoost, RF and SVR | Multispectral and thermal | r, RMSE, and MAE | 97%, 46% and 19.7%, respectively |
17 | Ma et al. [117] | Wheat | SP | MultimodalNet | Multispectral and thermal | r and MAE | r = 74.11% and MAE = 6.05%. |
18 | Liu et al. [118] | Corn | SP | RF and GBDT | Multispectral | r and RRMSE | r = 94% and rRMSE = 9.35% in the leaf stage V14 |
19 | Demir et al. [119] | Rose | SP | MLR 7, MARS 8, CHAID 9, ExCHAID 10, CART 11, RF and ANN | RGB | r | r = 90.7% (MARS), r = 88.8% (ExCHAID), r = 93.1% (CART) and r = 90.9% (RF).
20 | Jamali et al. [120] | Wheat | SP | DNN 12, ANN, SVM and MLR | Multispectral | r, RMSE, and MAE | Plant height: r = 82%, 66%, 71% and 53%; RMSE = 9.61 cm, 17.54 cm, 16.26 cm, and 17.70 cm; MAE = 7.13 cm, 14.91 cm, 14.37 cm, and 14.83 cm
21 | Qu et al. [121] | Blueberry | SP | RF and XGBoost | RGB | r, RMSE, and MAE | r = 89%, RMSE = 542 g/m2 and MAE = 380 g/m2 |
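Most yield estimation studies listed above couple UAV-derived predictors (vegetation indices, canopy height, thermal features) with ensemble regressors such as RF or XGBoost and report r, RMSE, and MAE. The sketch below illustrates that generic pattern with scikit-learn; the file name, column names, and hyperparameters are assumptions for illustration and are not drawn from any cited study.

```python
# Minimal sketch: plot-level yield regression from hypothetical UAV-derived features.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_absolute_error, mean_squared_error

df = pd.read_csv("plot_features.csv")           # hypothetical table: one row per plot
X = df[["ndvi", "gndvi", "canopy_height"]]      # hypothetical UAV-derived predictors
y = df["yield_kg_ha"]                           # measured yield per plot

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)
model = RandomForestRegressor(n_estimators=500, random_state=42)
model.fit(X_train, y_train)
pred = model.predict(X_test)

# The same metrics reported throughout the yield estimation table above.
print("R2:  ", r2_score(y_test, pred))
print("MAE: ", mean_absolute_error(y_test, pred))
print("RMSE:", mean_squared_error(y_test, pred) ** 0.5)
```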