Proceeding Paper

Drone-Based Smart Weed Localization from Limited Training Data and Radiometric Calibration Parameters †

by Mehdi Khoshboresh-Masouleh * and Reza Shah-Hosseini *
School of Surveying and Geospatial Engineering, College of Engineering, University of Tehran, Tehran 11155-4563, Iran
* Authors to whom correspondence should be addressed.
Presented at the 5th International Electronic Conference on Remote Sensing, 7–21 November 2023; Available online: https://ecrs2023.sciforum.net.
Environ. Sci. Proc. 2024, 29(1), 39; https://doi.org/10.3390/ECRS2023-15854
Published: 14 November 2023
(This article belongs to the Proceedings of ECRS 2023)

Abstract

Small object localization from drone images is currently one of the most effective tools for practical applications such as weed monitoring in smart farming. While most object detection models localize well only when trained on large datasets, a few-shot learning technique can improve scene comprehension even with limited training data. This study introduces a few-shot model for localizing weed grasses in multispectral drone images. The model incorporates a reflectance calibration factor, enabling it to perform well on tasks for which it has not been specifically trained, and an inductive transfer scheme improves its ability to generalize and localize weeds accurately. The results demonstrate the potential of the proposed approach, which combines drone-based multispectral imagery with a reflectance calibration factor, achieving a mIoU score of 71.45% and an accuracy of 84.3% despite several practical implementation challenges.

1. Introduction

Continued world population growth will demand more high-quality food production, which can only be achieved through sustainable methods for increasing crop yields. The FAO points out that weed grasses increase the environmental and economic costs of pesticide use by spreading across farm boundaries, and their competition with agricultural crops reduces both the quantity and quality of output [1]. Among pests, weed grasses are considered a crucial biotic constraint on food production [1,2]. Most farm fields are spatially variable in grass weed infestation to some degree, yet traditional weed management applies herbicides under the assumption that grass weeds are distributed uniformly across the field [3]. A smart weed localization system that optimizes herbicide dosage in the agricultural field is therefore a crucial step toward smart farming and remains an open problem in pest control [4].
A drone-based smart weed localization system is an effective method for real-time, precise grass weed control and optimized herbicide dosage selection [5]. Because weed grasses closely resemble crops, visual grass weed localization in drone-based multispectral images is an important task in precision farming [6]. Designing separate approaches for different target types at different scales is ineffective, however, because spectral and spatial similarities grow across scales. For practical applications, including weed monitoring in smart farming, small object localization from drone images has emerged as a distinctive technique [7]. A few-shot learning method can enhance scene comprehension with minimal or no training data, whereas most object detection models learn localization only from large amounts of training data. The diversity of geospatial data from agricultural regions can make it challenging to identify the most effective technique for dataset generation and processing. Real-world applications of deep learning with few training images are often challenging; however, the resulting high-value output is valuable commercially and technologically. A drone-based smart weed localization system with limited training data can therefore provide significant value to farmers and researchers.

2. Method

This research combines a reflectance calibration factor with weed grass localization from drone-based multispectral images to pinpoint weeds in large-scale imagery. While several weed detector models claim to handle single-time tracking given extensive training data, weed grass localization using few-shot learning on drone-based multispectral images can improve multispectral scene knowledge with limited training data. Few-shot learning is a transfer model whose main objective is to enhance generalization capacity across numerous tasks; it can perform unseen tasks after training on a small set of annotated data, and it considers various tasks when developing a predictive model. Although the trained model's weed grass localization may appear accurate, accuracy alone is not sufficient for decision making. For instance, timely weed grass localization in agricultural fields is essential for producing high-quality crops, since the presence of weed grasses significantly decreases the area available for growing crops. Despite recent advances in deep learning and drone imaging, weed grass localization remains an open issue for smart farming. The suggested approach uses three sets of multispectral images: a training set $train = \{l_j, m_j\}_{j=1}^{K}$, where $l_j$ is a multispectral input and $m_j$ the corresponding mask; a support set $support = \{l_j, m_j\}_{j=1}^{K_{support}}$; and a test set $test = \{l_j\}_{j=1}^{K_{test}}$ [8]. A limited training set of 80 image patches (each 480 × 360 pixels) was used to train the proposed network. Our network overcomes the limitations of convolution blocks and binary mask creation for weed grass localization, since its meta-feature extraction based on a convolutional neural network distinguishes it from comparable models. The proposed model consists of a set of encoders and decoders [9]. The encoder consists of five attention modules [10], while the decoder comprises a single transpose convolution module and an interpolation technique, which enables the learning of spatial–spectral representations. We apply five attention layers to the multi-scale outputs of the network to aggregate multistep representations, improving the boundaries of weed grasses. The proposed layer ($a_{out}$) for meta-feature extraction of the input image is defined as follows (Figure 1):
$$a_{out} = \mathrm{relu}\left(a_j\left[\mathrm{pooling}(I)\right]\left(l_j \cdot m_j\right)\right)$$
where $\mathrm{relu}$ is the rectified linear activation function, $a_j$ is an attention function with three convolutional layers, $l_j$ is the observed data, and $m_j$ is the predicted map.
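To make the meta-feature extraction concrete, the following is a minimal PyTorch sketch of the layer defined above. The channel counts, the use of global average pooling for $\mathrm{pooling}(I)$, and the exact design of the three-convolution attention function are assumptions for illustration; the paper does not specify these details.

```python
import torch
import torch.nn as nn

class MetaFeatureExtraction(nn.Module):
    """Sketch of a_out = relu(a_j[pooling(I)](l_j * m_j)).

    The hidden width, sigmoid gating, and global average pooling
    are assumed here, not taken from the paper.
    """

    def __init__(self, in_channels: int = 5, hidden: int = 32):
        super().__init__()
        # a_j: attention function with three convolutional layers
        self.attention = nn.Sequential(
            nn.Conv2d(in_channels, hidden, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(hidden, hidden, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(hidden, in_channels, kernel_size=1),
            nn.Sigmoid(),
        )
        # pooling(I): global context of the input image
        self.pool = nn.AdaptiveAvgPool2d(1)

    def forward(self, image: torch.Tensor, pred_map: torch.Tensor) -> torch.Tensor:
        # l_j * m_j: mask the observed multispectral data with the predicted map
        masked = image * pred_map
        # condition the attention weights on the pooled global context of I
        context = self.pool(image)                  # (B, C, 1, 1)
        weights = self.attention(masked + context)  # context broadcast over the patch
        return torch.relu(weights * masked)         # a_out

# Example: a 5-band 480 x 360 multispectral patch and a single-channel weed map
x = torch.randn(1, 5, 360, 480)
m = torch.sigmoid(torch.randn(1, 1, 360, 480))
print(MetaFeatureExtraction()(x, m).shape)  # torch.Size([1, 5, 360, 480])
```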
The empirical line model [11] is a frequently used method for converting digital numbers into surface reflectance based on before/after calibration drone-based images. The technique assumes a linear correlation between the digital number of each pixel in an image and the surface reflectance. Typically, one or more reflectance calibration panels with known reflectance values are used to estimate this correlation [12]. Following [13,14], the reflectance calibration factor for band $j$ is:
$$f_j = \frac{\rho_j}{\mathrm{average}(L_j)}$$
where $\rho_j$ is the calibrated reflectance value for the $j$th spectral channel and $L_j$ is the radiance value for the calibrated reflectance panel.
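As an illustration of this calibration step, the following NumPy sketch computes $f_j$ from a panel region and applies it to a radiance band. The panel reflectance value, region size, and radiance ranges are hypothetical; in practice the panel pixels would be sampled from the drone image itself.

```python
import numpy as np

def reflectance_calibration_factor(panel_radiance: np.ndarray,
                                   panel_reflectance: float) -> float:
    """f_j = rho_j / average(L_j) for one spectral band.

    panel_radiance: radiance pixels sampled over the calibration panel (band j).
    panel_reflectance: the panel's known calibrated reflectance rho_j.
    """
    return panel_reflectance / float(np.mean(panel_radiance))

def to_reflectance(band_radiance: np.ndarray, f_j: float) -> np.ndarray:
    """Convert a radiance band to surface reflectance via the calibration factor."""
    return band_radiance * f_j

# Hypothetical example: one band with a ~49% reflectance panel in view
rng = np.random.default_rng(0)
panel_roi = rng.normal(loc=0.31, scale=0.01, size=(40, 40))  # radiance over the panel
f_j = reflectance_calibration_factor(panel_roi, panel_reflectance=0.49)
band = rng.uniform(0.0, 0.35, size=(360, 480))               # radiance image (band j)
reflectance = np.clip(to_reflectance(band, f_j), 0.0, 1.0)   # keep values physical
print(f"f_j = {f_j:.3f}, reflectance range = "
      f"[{reflectance.min():.2f}, {reflectance.max():.2f}]")
```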

3. Results and Discussion

The performance of the proposed technique was evaluated using a single effective measure for pixel-based object localization: the ratio between pixels accurately classified as weed grasses and the total count of ground reference pixels, commonly referred to as the intersection over union (IoU) [15]; a minimal sketch of this metric is given below. We used the WeedMap dataset [13] for training and evaluating the proposed network. Its RedEdge-M drone-based multispectral images, 480 × 360 pixels in size and assigned for weed detection, contain considerable crop and weed variation across a variety of settings located in Rheinbach, Germany. These images were captured on 18 September 2017, at an estimated one-month stage of development, with crops measuring 15 to 20 cm and weeds 5 to 10 cm.
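The sketch below shows the pixel-based IoU computation in NumPy, assuming binary weed/background masks; the mask sizes and overlap are illustrative only.

```python
import numpy as np

def iou(pred_mask: np.ndarray, ref_mask: np.ndarray) -> float:
    """Pixel-based intersection over union for binary weed masks."""
    pred = pred_mask.astype(bool)
    ref = ref_mask.astype(bool)
    intersection = np.logical_and(pred, ref).sum()
    union = np.logical_or(pred, ref).sum()
    return float(intersection) / float(union) if union > 0 else 1.0

# Toy example: two overlapping 480 x 360 masks
pred = np.zeros((360, 480), dtype=bool); pred[100:200, 100:300] = True
ref = np.zeros((360, 480), dtype=bool);  ref[120:220, 150:350] = True
print(f"IoU = {iou(pred, ref):.3f}")  # ~0.429 for this overlap
```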
Figure 2 illustrates the weed grass localization results from the test stage. For the tested images, the proposed model achieves a mean IoU and corresponding kappa value of 69.7% and 76.4% for 1-shot, and 73.2% and 80.7% for 10-shot, respectively.

4. Conclusions

In this manuscript, a few-shot approach is designed for weed grass localization from a small training dataset of multispectral images. The results highlight the capabilities of the proposed approach for weed grass localization from drone-based multispectral images with a reflectance calibration factor, achieving a mIoU score of 71.45% and an accuracy of 84.3% despite various challenges in real-world application. The goal of this paper is to investigate the features of one-shot learning and reflectance calibration factor estimation for precision farming.

Author Contributions

Conceptualization, M.K.-M. and R.S.-H.; methodology, M.K.-M.; software, M.K.-M.; validation, M.K.-M. and R.S.-H.; writing—review and editing, M.K.-M.; visualization, M.K.-M. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The WeedMap dataset (Remote Sensing, 2018) is available at [13].

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Food and Agriculture Organization of the United Nations. Plant Production and Protection Division: Weeds. Available online: https://www.fao.org/agriculture/crops/thematic-sitemap/theme/biodiversity/weeds/en/ (accessed on 12 August 2023).
  2. Monteiro, A.; Santos, S. Sustainable Approach to Weed Management: The Role of Precision Weed Management. Agronomy 2022, 12, 118.
  3. Soloneski, S.; Larramendy, M. Weed and Pest Control: Conventional and New Challenges; IntechOpen: London, UK, 2013; ISBN 978-953-51-0984-6.
  4. Khoshboresh-Masouleh, M.; Akhoondzadeh, M. Improving Weed Segmentation in Sugar Beet Fields Using Potentials of Multispectral Unmanned Aerial Vehicle Images and Lightweight Deep Learning. J. Appl. Remote Sens. 2021, 15, 034510.
  5. Roslim, M.H.M.; Juraimi, A.S.; Che’Ya, N.N.; Sulaiman, N.; Manaf, M.N.H.A.; Ramli, Z.; Motmainna, M. Using Remote Sensing and an Unmanned Aerial System for Weed Management in Agricultural Crops: A Review. Agronomy 2021, 11, 1809.
  6. Khoshboresh-Masouleh, M.; Shah-Hosseini, R. Uncertainty Estimation in Deep Meta-Learning for Crop and Weed Detection from Multispectral UAV Images. In Proceedings of the 2022 IEEE Mediterranean and Middle-East Geoscience and Remote Sensing Symposium (M2GARSS), Istanbul, Turkey, 7–9 March 2022; pp. 165–168.
  7. Khoshboresh-Masouleh, M.; Shah-Hosseini, R. Multimodal Few-Shot Target Detection Based on Uncertainty Analysis in Time-Series Images. Drones 2023, 7, 66.
  8. Khoshboresh-Masouleh, M.; Shah-Hosseini, R. Real-Time Multiple Target Segmentation with Multimodal Few-Shot Learning. Front. Comput. Sci. 2022, 4, 1062792.
  9. Badrinarayanan, V.; Kendall, A.; Cipolla, R. SegNet: A Deep Convolutional Encoder-Decoder Architecture for Image Segmentation. IEEE Trans. Pattern Anal. Mach. Intell. 2017, 39, 2481–2495.
  10. Guo, M.-H.; Xu, T.-X.; Liu, J.-J.; Liu, Z.-N.; Jiang, P.-T.; Mu, T.-J.; Zhang, S.-H.; Martin, R.R.; Cheng, M.-M.; Hu, S.-M. Attention Mechanisms in Computer Vision: A Survey. Comp. Vis. Media 2022, 8, 331–368.
  11. Smith, G.M.; Milton, E.J. The Use of the Empirical Line Method to Calibrate Remotely Sensed Data to Reflectance. Int. J. Remote Sens. 1999, 20, 2653–2662.
  12. Olsson, P.-O.; Vivekar, A.; Adler, K.; Garcia Millan, V.E.; Koc, A.; Alamrani, M.; Eklundh, L. Radiometric Correction of Multispectral UAS Images: Evaluating the Accuracy of the Parrot Sequoia Camera and Sunshine Sensor. Remote Sens. 2021, 13, 577.
  13. Sa, I.; Popović, M.; Khanna, R.; Chen, Z.; Lottes, P.; Liebisch, F.; Nieto, J.; Stachniss, C.; Walter, A.; Siegwart, R. WeedMap: A Large-Scale Semantic Weed Mapping Framework Using Aerial Multispectral Imaging and Deep Neural Network for Precision Farming. Remote Sens. 2018, 10, 1423.
  14. Use of Calibrated Reflectance Panels For MicaSense Data. Available online: https://support.micasense.com/hc/en-us/articles/115000765514-Use-of-Calibrated-Reflectance-Panels-For-MicaSense-Data (accessed on 12 August 2023).
  15. Khoshboresh-Masouleh, M.; Alidoost, F.; Arefi, H. Multiscale Building Segmentation Based on Deep Learning for Remote Sensing RGB Images from Different Sensors. J. Appl. Remote Sens. 2020, 14, 034503.
Figure 1. Meta-feature extraction module.
Figure 2. The proposed model predictions on weed grass localization. Image samples 1 to 5 belong to different patches in four fields.