Drone-Based Smart Weed Localization from Limited Training Data and Radiometric Calibration Parameters
Abstract
1. Introduction
2. Method
3. Results and Discussion
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).