Article

Weed Detection in Rainfed Maize Crops Using UAV and PlanetScope Imagery

by Colette de Villiers 1,2, Cilence Munghemezulu 2, Zinhle Mashaba-Munghemezulu 2,*, George J. Chirima 1,2 and Solomon G. Tesfamichael 3

1 Department of Geography, Geoinformatics and Meteorology, University of Pretoria, Pretoria 0028, South Africa
2 Agricultural Research Council—Natural Resources and Engineering, Private Bag X79, Pretoria 0001, South Africa
3 Department of Geography, Environmental Management and Energy Studies, University of Johannesburg, Johannesburg 2006, South Africa
* Author to whom correspondence should be addressed.
Sustainability 2023, 15(18), 13416; https://doi.org/10.3390/su151813416
Submission received: 31 July 2023 / Revised: 2 September 2023 / Accepted: 3 September 2023 / Published: 7 September 2023

Abstract

Weed invasion of crop fields, such as maize, is a major threat leading to yield reductions or crop write-offs for smallholder farming, especially in developing countries. A synoptic view and timeous detection of weed invasions can save the crop. The sustainable development goals (SDGs) have identified food security as a major focus point. The objectives of this study were to: (1) assess the precision of mapping maize-weed infestations using multi-temporal unmanned aerial vehicle (UAV) and PlanetScope data with machine learning algorithms, and (2) determine the optimal timing during the maize growing season for effective weed detection. UAV and PlanetScope satellite imagery were used to map weeds using two machine learning algorithms: random forest (RF) and support vector machine (SVM). The input features included spectral bands, color space channels, and various vegetation indices derived from the datasets. Furthermore, principal component analysis (PCA) was used to produce principal components (PCs) that served as inputs for the classification. Eight experiments were conducted, four each for the UAV and PlanetScope datasets, spanning four months. Experiment 1 utilized all bands with the RF classifier, experiment 2 used all bands with SVM, experiment 3 employed PCs with RF, and experiment 4 utilized PCs with SVM. The results reveal that PlanetScope achieved accuracies below 49% in all four experiments. The best overall performance was observed for experiment 1 using the UAV data, based on the highest mean accuracy score (>0.88) across the overall accuracy, precision, recall, F1 score, and cross-validation scores. The findings highlight the critical role of spectral information, color spaces, and vegetation indices in accurately identifying weeds during the mid-to-late stages of maize crop growth, with the higher spatial resolution of the UAV yielding higher classification accuracy than the PlanetScope imagery. The most optimal stage for weed detection was found to be the reproductive stage of the crop cycle, based on the highest F1 scores for the maize and weed classes. This study provides pivotal information about the spatial distribution of weeds in maize fields, information that is essential for sustainable weed management in agricultural activities.

1. Introduction

Weeds are often the cause of major crop losses, and their management is therefore an integral part of crop production. In Sub-Saharan Africa, pathogens and pests (e.g., weeds) are responsible for an estimated 30% of maize crop yield losses or, in certain cases, total crop failure [1]. This shows the significant impact of weeds on food security and the livelihoods of rural communities that depend on smallholder farming. According to Nyambo, et al. [2], smallholder farmers are ill-equipped for weed management due to a lack of knowledge, inadequate management practices (including delayed interventions), a lack of mechanization, and climate change. Additionally, limited access to herbicides and other weed control resources is another challenge faced by smallholder farmers [3,4,5].
The presence of weeds can be detrimental to crop health and growth. The interaction between weeds and the maize crop varies considerably from the early to late stages of maize growth during the season. Distinguishing between maize and weeds at different life-cycle stages is essential for understanding weed-crop competition in farms. Competition for resources is a major issue that influences maize growth, as valuable environmental factors, such as soil, water, light, and space, are under pressure at various levels of severity during the maize life cycle [6,7]. The detection of weeds in the early stages of maize growth can be more difficult when the plants have little foliage and the spectral signatures are difficult to detect [8]. Early weed detection is ideal for rapid and cost-effective measures to fight the invasion. To decrease the amount of herbicides sprayed, Nikolić, et al. [9] used a site- and time-specific early weed detection approach; their study identified specific areas and timings to reduce the spraying of herbicides, thereby ensuring a more cost-effective and precise weed management regime. Weed identification in early season maize crops is necessary to identify the distribution of specific weed species and treat affected areas to prevent crop yield losses and improve crop health [10]. Karimmojeni, et al. [11] showed that maize physiology can be severely affected by the presence of weeds in the early stages of growth. The presence of weeds remains a problem until the mid-to-late season in crops [12]. During the mid-to-late season of crop growth, the weeds are more likely to produce larger quantities of seeds. Additionally, weed infestations during the latter part of the growing season exhibit a high resistance to herbicide treatments [13], necessitating more costly weed management strategies [12,14]. In summary, these studies outline the potential of unmanned aerial vehicle (UAV) technology in weed research and management within maize fields, highlight the importance of selecting appropriate sensor technology and leveraging deep learning for weed mapping, and emphasize the need for monitoring weeds at different crop growth stages.
Traditionally, farmers rely on manual ground methods for the identification and removal of weeds in agricultural fields; these techniques prove to be costly, time-consuming, and unreliable. Weed detection has improved over time with the availability of remote sensing systems, such as satellites, drones, and ground-based vehicles, enabling farmers to implement weed management strategies over large areas of their farms. Accurate and timely weed-maize crop identification has become a vital part of crop management in modern precision agriculture. One method of achieving this is by using high-spatial-resolution remotely sensed data, such as UAV imagery that captures centimeter-level details of crop fields. Such a level of detail allows for the mapping of individual plants, thereby contributing to precise estimations of weed-infested areas in the field [15,16]. UAV data acquisition is also more flexible than imagery obtained from satellite platforms since UAVs can be deployed on demand during the growing season. Furthermore, UAVs can be equipped with sensors carrying enhanced spectral bands to reduce the spectral mixing of vegetation types during classification [17]. Such capabilities of UAVs were, for instance, exploited by Yang, et al. [18] in the classification of green vegetation, including weeds, maize, and peach trees.
UAV-derived vegetation indices (VIs) and color spaces have proved beneficial for crop-weed discrimination [15,19]. For instance, the visible atmospherically resistant index (VARI) has been directly linked to improvements in maize-weed detection, serving as a practical classification method to identify areas requiring targeted herbicide spraying [9]. Furthermore, color-based VIs, such as the excess green index (EGI/ExG) and the triangular greenness index (TGI), have demonstrated success in crop-weed discrimination studies [20,21]. A study by Yang, et al. [22] proposed a classification approach based on the hue, saturation, and value (HSV) color space to differentiate the greenness of maize and weeds across diverse environmental conditions. Additionally, Xu, et al. [23] found that HSV derived from UAV imagery assisted in distinguishing between maize crop, soil, and shadows. Another color space, L*a*b*, includes a lightness value (L*), as well as values from red to green (a*) and from blue to yellow (b*). Chen, et al. [24] suggested that the L*a*b* color space can effectively distinguish between weed and maize seedlings in the foreground and soil in the background. While these color spaces are often utilized for image segmentation to detect weeds in the early stages of maize growth, there is potential for their application in improving mid-to-late-stage weed-maize classification.
Principal component analysis (PCA) is a technique widely used in remote sensing and data analysis for dimensionality reduction. The process transforms a set of correlated variables into a smaller set of uncorrelated variables known as principal components (PCs), allowing redundant variables to be excluded from the model. PCs derived from remotely sensed data have proven successful in crop classification, as PCs tend to carry unique and diagnostic information about each plant while reducing redundant information across plants [25]. Jiang, et al. [26] used PCA to reduce 87 vegetation indices into PCs to identify the most influential variables, allowing for the identification of various factors that contribute to maize growth. UAV-based PCA for maize-weed classification has been used in various studies [27,28,29]; however, these authors did not fully explore the application of PCA to various spectral bands, vegetation indices, and color spaces for weed detection. A notable study conducted by Xu, et al. [30] identified vital spectral information to assist in weed mapping. The authors found that textural, structural, and thermal signatures exhibited the best performance using SVM (96.4%) in weed mapping. The authors also applied PCA to the textural features; the PCs were used alone as input features for the SVM classification, leading to a notable accuracy of 88.6%. Generally, there is still a need to expand on the use of PCA for the detection of weeds in maize crops on smallholder farms.
Weed detection requires the exploration of remote sensing techniques suitable for optimally and accurately identifying weed distribution in crops. High-spatial-resolution platforms have the ability to improve weed detection [15,16,17,18]. The spectral data obtained from these systems provide valuable information for discriminating between weeds and crops. The enhancement of spectral data, such as the generation of VIs and color spaces, can greatly improve the input feature dataset and yield high-accuracy classification models [9,15,19,22,23,24]. The implementation of these high spatial and spectral resolution features requires complex modeling techniques. The aim of this study is to assess the use of spectral-temporal characteristics in the classification of maize in a weed-infested field. Considering the complexity of the field in terms of the terrain and weed infestation, UAV imagery and PlanetScope data with spectral indices, color spaces, and high spatial-temporal resolutions are used in this study. The specific objectives of the study are: (i) to explore the accuracy of maize-weed mapping using multi-temporal data and vegetation indices from UAV and PlanetScope data and machine learning algorithms, and (ii) to determine the optimal time of the maize growing period for weed detection. The expected outcomes of this study are as follows: firstly, the use of UAV and PlanetScope imagery alongside machine learning algorithms is expected to yield a robust model for accurate weed detection, with the fusion of these data sources and techniques enhancing the precision of the maize-weed classification. Secondly, the expected improvement in accuracy through remote sensing techniques would enable the identification of the optimal time window during the maize growing season for effective weed detection.

2. Materials and Methods

2.1. Site Description

The farm at Bronkhorstspruit is located next to the Vlakfontein village in the Gauteng Province, South Africa. The study area receives spring/summer rainfall, with annual rainfall levels between 600 and 650 mm and mean temperatures between 18 and 27 °C [31]. The farm is considered a medium-sized smallholder farm that also employs members of the community. The maize was planted on 15 November 2021. The maize field (Figure 1) chosen for this study was obstructed by various grasses and weed species; although it had been sprayed, the chemicals were ineffective. Areas of waterlogging were also observed in some parts of the field. The weeds in this field ultimately hindered crop growth, most likely by consuming soil resources needed by the maize plants. The UAV imagery for this field was acquired between January and May 2022 during the field campaign by the Agricultural Research Council (ARC) of South Africa. Field data collection occurred within a few days (2-3 days) of each UAV image collection.

2.2. Methodology Overview

The overview of the methodology is illustrated using a flowchart (Figure 2), which summarizes the four experiments performed for this study. Data collection for both the image acquisition and the ground truthing of weeds and crop was conducted from January to May 2022, during the mid-to-late growth period of the maize. The subsequent step involved the pre-processing of the UAV images to acquire ortho-mosaicked red-green-blue (RGB) imagery using Pix4D software [32], version 4.8.4. PlanetScope images were acquired from Planet, already pre-processed as reflectance images. The available spectral bands, color space images, and VIs were generated from the UAV and PlanetScope datasets. Lastly, four experiments were conducted using the imagery feature datasets, feature-derived PCs, and two machine learning classification techniques (RF and SVM). These experiments were designed to explore how different spectral and temporal information affects the maize-weed classification. Further details of these procedures are summarized in the following sections of this paper.

2.3. Data Collection, Pre-Processing, and Derived Variables

2.3.1. UAV-Based Image Acquisition in the Field

We used a MicaSense RedEdge-MX multispectral camera (MicaSense, Inc., Seattle, WA, USA) mounted on a DJI Matrice 600 Pro UAV (DJI Technology Co., Ltd., Shenzhen, China), a professional drone system designed for aerial photography and mapping, supplied by the ARC, South Africa. The 5-band multispectral camera (RedEdge-MX) was used for image acquisition based on the band characteristics of the camera (Table 1). A two-step calibration method was used to prepare the data. First, a pre-flight image was captured of the reflectance calibration panel. Second, the upward-facing downwelling light sensor (DLS) was used with an integrated GPS sensor for the geotagging of the imagery, with the DLS kept motionless until the calibration was completed. All the flight missions for this study were conducted at an altitude of 120 m above ground level (AGL), yielding an 8 cm spatial resolution. All the UAV images were collected within 2 h of solar noon, and the acquisitions were completed as shown in Table 2. The pre-processing of the UAV images was performed in Pix4D version 4.8.4, a highly efficient photogrammetry software package. The process included importing each image acquired during the data collection, image calibration, image alignment, and generating outputs such as point clouds, orthomosaics, surface reflectance, and digital surface/digital terrain models.

2.3.2. PlanetScope Satellite Image Acquisition

Table 3 presents the bands for the PlanetScope surface reflectance (SR) satellite product [19] that was obtained from the Planet Explorer platform accessed on 9 June 2022 at https://www.planet.com/explorer/. The data consisted of orthorectified images with surface reflectance values that were multiplied by 10,000 [34]. These bands consisted of blue, green, red, and NIR bands with a 3 m spatial resolution. The constellation of PlanetScope satellites provided almost daily coverage worldwide. The raw satellite images were pre-processed to account for atmospheric corrections using the 6S radiative transfer model [35]. This correction involved converting the top-of-atmosphere reflectance values to bottom-of-atmosphere reflectance values, thereby accounting for the effects of various atmospheric conditions, such as water vapor, ozone, and aerosols. The purpose of this correction was to enhance the stability and accuracy of the imagery by minimizing the influence of atmospheric interference. For each month (Table 3), all available images from the Planet archive and covering the study area were downloaded. Images with high cloud cover (>10%) were omitted, and the remaining images were used to create monthly mean composites for each month.
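To make the scaling and compositing steps concrete, the following minimal Python sketch (not the authors' code; the file names are hypothetical and the scenes are assumed to be co-registered with identical extents) rescales PlanetScope SR values and builds a monthly mean composite:

```python
# Minimal sketch of the PlanetScope pre-processing described above.
# Assumptions: scenes are co-registered with identical extents, and the
# file names are hypothetical placeholders.
import numpy as np
import rasterio

def load_reflectance(path):
    """Read a PlanetScope SR scene and rescale to 0-1 reflectance."""
    with rasterio.open(path) as src:
        data = src.read().astype("float32")  # shape: (bands, rows, cols)
    return data / 10000.0  # SR values are stored multiplied by 10,000 [34]

def monthly_composite(paths):
    """Per-pixel, per-band mean of all low-cloud scenes for one month."""
    stack = np.stack([load_reflectance(p) for p in paths], axis=0)
    return np.nanmean(stack, axis=0)

feb = monthly_composite(["scene_20220203.tif", "scene_20220217.tif"])
```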

2.3.3. In Situ Data Collection Using Garmin GPS and UAV-Processed Data

In this study, in situ data points were collected for the maize, weeds, and soil using both a handheld Garmin GPS and UAV-processed false-color composites. The study employed a grid format to select plots for identifying the maize; this resulted in the selection of samples that had weeds under the maize canopies or in-between maize rows throughout the field. The maize field identified for this study was heavily infested, with weeds growing freely throughout (Figure 3). The UAV-processed data therefore provided high-resolution images with which to distinguish between the maize, weeds, and soil. A total of 100 data points per class were identified for all four months from the GPS and UAV imagery.

2.3.4. Band Reflectance, Color Spaces, and Vegetation Indices for Classification

A total of 18 features were used for image processing of the UAV data, and 16 features for PlanetScope (Table 4). Three groups of features are presented in this table. The first group consists of the spectral bands directly derived from the UAV (five bands) and PlanetScope (four bands) data. The second group presents the color space channels computed from the RGB spectral bands. The last group consists of the vegetation indices calculated from the available spectral bands (seven for the UAV; NDRE is excluded for PlanetScope, which lacks a red-edge band).
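To illustrate the feature construction, the sketch below (a hedged illustration, not the authors' code; the band arrays and variable names are assumptions) derives the HSV and L*a*b* channels and a subset of the VIs in Table 4 from reflectance bands:

```python
# Hedged sketch of the Table 4 feature stack. Band arrays are assumed to be
# 2D reflectance rasters scaled to [0, 1]; only some of the seven VIs are shown.
import numpy as np
from skimage import color

def build_features(blue, green, red, nir):
    eps = 1e-10                                   # guard against division by zero
    rgb = np.dstack([red, green, blue])           # (rows, cols, 3) for skimage
    hsv = color.rgb2hsv(np.clip(rgb, 0, 1))       # hue, saturation, value
    lab = color.rgb2lab(np.clip(rgb, 0, 1))       # L*, a*, b*
    ndvi = (nir - red) / (nir + red + eps)        # NDVI [39]
    exg = 2 * green - red - blue                  # excess green index [36]
    vari = (green - red) / (green + red - blue + eps)  # VARI [38]
    # Stack each feature as one layer of the classification input cube.
    return np.dstack([blue, green, red, nir,
                      hsv[..., 0], hsv[..., 1], hsv[..., 2],
                      lab[..., 0], lab[..., 1], lab[..., 2],
                      ndvi, exg, vari])
```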

2.4. Image Classification Approach

Random forest is a popular machine learning approach that uses an ensemble of decision trees to produce robust and unbiased classification outputs [43,44]. As a typical supervised classification method, it uses known classes, such as those collected through field surveys, as response variables (variables to predict) and explanatory variables, such as remotely sensed products, as predictors. The approach uses bagging to randomly split the samples: roughly two-thirds, known as the "in-bag" data, are used for training the classification model, and the remaining third, called the out-of-bag (OOB) data, is set aside to validate the trained model. The in-bag data are used to construct independent decision trees that determine the class of each sample by using the remotely sensed features as explanatory inputs. The prediction of a single decision tree is given by Equation (1) [45]:
$$m_n(\mathbf{x}; \Theta_j, \mathcal{D}_n) = \sum_{i \in \mathcal{D}_n^*(\Theta_j)} \frac{\mathbf{1}_{\mathbf{X}_i \in A_n(\mathbf{x}; \Theta_j, \mathcal{D}_n)}\, Y_i}{N_n(\mathbf{x}; \Theta_j, \mathcal{D}_n)}, \quad (1)$$

where $\mathcal{D}_n^*(\Theta_j)$ denotes the data samples selected prior to constructing the decision tree, $A_n(\mathbf{x}; \Theta_j, \mathcal{D}_n)$ is the cell that encompasses $\mathbf{x}$, and $N_n(\mathbf{x}; \Theta_j, \mathcal{D}_n)$ is the number of preselected points that fall within $A_n(\mathbf{x}; \Theta_j, \mathcal{D}_n)$. Once the specified number of decision trees has been created to constitute the forest, a majority vote is used to decide the class of a sample. The accuracy of the classification created using the abovementioned procedure is evaluated using the OOB samples that are set aside.
The RF algorithm was identified as a suitable method for image classification, as it can effectively handle the high dimensionality of satellite data and identify the relationships between different features and class labels [46]. The variable importance metric is a tool used in RF to quantify the contribution of each predicting variable to the classification [47]. This is useful for improving model performance and enhancing classification accuracy by selecting the most relevant features [48]. The scikit-learn package in Python was used for the implementation of the RF image classification [49]. Five-fold cross-validation and grid search were used for hyperparameter tuning to identify the optimal parameters. These included the number of trees (n_estimators), the maximum number of features to consider when determining the best split of the trees (max_features), and the maximum depth of the trees (max_depth).
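A minimal sketch of this tuning step is given below (the grid values and the placeholder training data are illustrative assumptions, not the authors' exact search space):

```python
# Hedged sketch of the RF hyperparameter tuning described above.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

param_grid = {
    "n_estimators": [100, 300, 500],     # number of trees
    "max_features": ["sqrt", "log2"],    # features tried at each split
    "max_depth": [5, 10, None],          # maximum tree depth
}
search = GridSearchCV(
    RandomForestClassifier(random_state=1),
    param_grid,
    cv=5,                                # 5-fold cross-validation
    scoring="accuracy",
)
# Placeholder data: 300 pixels x 18 features, 3 classes (maize/weed/soil).
X = np.random.rand(300, 18)
y = np.random.randint(0, 3, 300)
search.fit(X, y)
rf = search.best_estimator_
print(search.best_params_)
print(rf.feature_importances_)           # RF variable importance scores
```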
The support vector machine (SVM) is a well-known supervised learning algorithm ideal for classification [50]. The SVM algorithm identifies the optimal hyperplane that maximizes the distance, or margin, between classes. Given a training dataset with input features $X$ and corresponding class labels $y_i \in \{-1, +1\}$, the goal is to find a hyperplane defined by a weight vector $\mathbf{w}$ and a bias term $b$ such that

$$\mathbf{w} \cdot \mathbf{x} + b = 0, \quad (2)$$

where $\mathbf{x}$ is a data point. The margin is the distance between the hyperplane and the closest data points from each class. In the linear SVM, the optimal hyperplane is the one that maximizes the margin subject to

$$y_i(\mathbf{w} \cdot \mathbf{x}_i + b) \geq 1, \quad (3)$$

where $y_i$ is the class label of the sample $\mathbf{x}_i$; constraint (3) ensures the correct classification of the samples to specific classes. For non-linearly separable data, a kernel function is needed to transform the training data into a high-dimensional feature space [51]. The polynomial kernel is

$$K(\mathbf{x}, \mathbf{x}') = (\mathbf{x} \cdot \mathbf{x}' + c)^d, \quad (4)$$

where $d$ is the degree of the polynomial and $c$ is a constant term. The second non-linear kernel function, known as the radial basis function (RBF), is

$$K(\mathbf{x}, \mathbf{x}') = \exp\!\left(-\frac{\lVert \mathbf{x} - \mathbf{x}' \rVert^2}{2\sigma^2}\right), \quad (5)$$

where $\sigma$ is a parameter that controls the spread of the Gaussian kernel.
The SVM model has proven more time-efficient with a small training sample size compared to other models [52]. For this algorithm, 5-fold cross-validation and grid search were also performed for the hyperparameter tuning. The tuning parameters included the regularization parameter (C), which controls the tolerance to noise in the dataset, and the type of kernel function, namely linear, polynomial, or RBF. Lastly, the gamma hyperparameter was tuned, which influences the smoothness of the decision boundary by controlling the reach of individual data points within the feature space. The modeling of the data was conducted using the support vector classification module of the Python library scikit-learn [49].
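A comparable sketch for the SVM tuning (again hedged: the C/gamma grids and the use of feature scaling are assumptions, not the authors' exact settings) is:

```python
# Hedged sketch of the SVM hyperparameter tuning described above.
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

param_grid = {
    "svc__C": [0.1, 1, 10, 100],                # regularization strength
    "svc__kernel": ["linear", "poly", "rbf"],   # kernels considered in the study
    "svc__gamma": ["scale", 0.01, 0.1, 1],      # decision-boundary smoothness
}
pipe = make_pipeline(StandardScaler(), SVC())   # scaling stabilizes SVM margins
search = GridSearchCV(pipe, param_grid, cv=5, scoring="accuracy")
search.fit(X, y)                                # X, y as in the RF sketch above
svm = search.best_estimator_
print(search.best_params_)
```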

2.5. Experimental Design

To fulfill the objectives of the study, experiments were designed and conducted utilizing the data from both UAV and PlanetScope sources. Various spectral bands, color spaces, and vegetation indices were evaluated for their accuracy in detecting weeds in maize crops during the classification. The design of the four main experiments is described in Table 5. The first two experiments incorporated all the features outlined in Table 4, employing the RF and SVM classifiers, respectively, for each dataset. The last two experiments aimed to reduce data dimensionality through PCA. By examining the cumulative variance and explained variance ratio, the features that exhibited the highest variance in relation to the training dataset were determined. Subsequently, the PCs that together explained more than 98% of the variance were selected as inputs for the classification, again employing both the RF and SVM algorithms.
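The PC selection rule can be sketched as follows (a hedged illustration; standardizing the features before PCA is an assumption):

```python
# Hedged sketch of the PCA reduction used in experiments 3 and 4: keep the
# smallest number of components whose cumulative explained variance > 98%.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

X_std = StandardScaler().fit_transform(X)        # X as in the sketches above
pca = PCA().fit(X_std)
cumvar = np.cumsum(pca.explained_variance_ratio_)
n_pcs = int(np.searchsorted(cumvar, 0.98)) + 1   # first PC crossing 98%
X_pcs = PCA(n_components=n_pcs).fit_transform(X_std)
print(f"{n_pcs} PCs explain {cumvar[n_pcs - 1]:.1%} of the variance")
```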

2.6. Accuracy Assessment

In this study, the RepeatedStratifiedKFold cross-validation method was employed using the scikit-learn Python library [49] to evaluate the performance of the various experimental models. Repeating the cross-validation multiple times increases the chance that the variation in the dataset is well represented during the evaluation. The cross-validation was conducted using 10 splits and repeated 3 times, with reproducibility maintained using a fixed random state of 1. The accuracy scoring metric was calculated to determine each model's performance, ensuring confidence in the reliability of the model for a successful classification.
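This protocol maps directly onto the scikit-learn utilities named above; a minimal sketch (reusing the rf estimator and placeholder data from the earlier sketches) is:

```python
# Hedged sketch of the repeated stratified k-fold evaluation.
from sklearn.model_selection import RepeatedStratifiedKFold, cross_val_score

cv = RepeatedStratifiedKFold(n_splits=10, n_repeats=3, random_state=1)
scores = cross_val_score(rf, X, y, cv=cv, scoring="accuracy")
print(f"mean CV accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```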
For each classification performed in this investigation, the confusion matrix and accuracy metrics were generated using the 20% testing samples to assess the accuracy of the outcomes. The "classification report", "accuracy score", and "confusion matrix" utilities were used to determine the model's performance [49]. The precision, recall, overall accuracy, and F1 score were calculated from the confusion matrix for the accuracy assessment. The classification performance of each class (maize, weed, and soil) was evaluated with precision and recall. These metrics were calculated using the following equations:

$$\mathrm{Overall\ Accuracy} = \frac{TP + TN}{N}, \quad (6)$$

$$\mathrm{Precision} = \frac{TP}{TP + FP}, \quad (7)$$

$$\mathrm{Recall} = \frac{TP}{TP + FN}, \quad (8)$$

$$F1\ \mathrm{Score} = 2 \times \frac{\mathrm{Precision} \times \mathrm{Recall}}{\mathrm{Precision} + \mathrm{Recall}}, \quad (9)$$

where TP is true positive; FP, false positive; TN, true negative; FN, false negative; and N is the total number of samples in the study. Overall accuracy measures the proportion of correctly classified samples within the entire test set. Precision (also known as user's accuracy) is the proportion of samples predicted as a class that truly belong to it, while recall (also known as producer's accuracy) is the proportion of samples of a class that are correctly identified. The F1 score, the harmonic mean of precision and recall, summarizes the agreement between the reference labels and those assigned by the classifier. Classification performance improves as the F1 score approaches 1.
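A short sketch of this assessment on the 20% hold-out set (the class names and the stratified split are assumptions consistent with the text) is:

```python
# Hedged sketch of the hold-out accuracy assessment described above.
from sklearn.metrics import (accuracy_score, classification_report,
                             confusion_matrix)
from sklearn.model_selection import train_test_split

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=1)
rf.fit(X_tr, y_tr)
y_pred = rf.predict(X_te)
print(confusion_matrix(y_te, y_pred))
print(classification_report(                      # per-class precision/recall/F1
    y_te, y_pred, target_names=["maize", "weed", "soil"]))
print("overall accuracy:", accuracy_score(y_te, y_pred))
```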

3. Results

3.1. Principal Components Analysis

The 18 UAV-derived and 16 PlanetScope-derived PCs and their variances are depicted in Figure 4. As presented in the figure, the cumulative variance depicts the total variance explained by the different PCs. It can be noted from Figure 4 that the cumulative variance levels off at around PC4. The red lines on the figures indicate that 98% of the variance is explained by approximately the first four PCs for all the observations. Consequently, incorporating more than these four PCs would not significantly improve the variability captured from the dataset. Limiting the number of PCs simplifies the model analysis and mitigates model overfitting. Based on these variance-explained graphs, four PCs were identified as input features for each month in the third and fourth experiments.

3.2. Maize Crop Classification Accuracy Using UAV and PlanetScope Data

The high spatial resolution of the UAV products was expected to considerably improve weed detection in maize crops, especially in the mid-to-late season of maize growth. This was confirmed, with the UAV yielding accuracy scores above 0.70 for most models (Figure 5a). The results for experiment 1 (all bands + RF) using UAV data in May 2022 depict the highest accuracy scores, with a median above 0.90. The accuracies for experiment 2 with the UAV data were consistently below those of the other experiments. The cross-validation results of the PlanetScope models produced accuracy scores below 0.60 in most cases (Figure 5b).
In order to find the best-performing model, the accuracy metrics were obtained for each model. The overall accuracy of each model is plotted in Figure 6 to show the distribution of the accuracy of each model over time. The highest overall accuracy was produced by experiment 2 using the UAV data in February 2022, although this model produced UAV accuracies below 80% during the other months. Experiment 1 showed the most consistent overall accuracies, above 86.7% for each month. The graph also depicts that experiments 1 to 4 using PlanetScope data produced overall accuracy values below 48% throughout the study period. The highest overall accuracies produced by experiment 3 (PCs + RF) using PlanetScope data were achieved for February, April, and May.
Figure 7 displays the mean composite metrics computed from the accuracy metrics (overall accuracy, precision, recall, F1, and mean cross-validation scores) between January and May 2022. The graph presents the models in descending order of mean accuracy using the UAV data. The best model is revealed to be experiment 1 (all bands + RF) with UAV data, achieving the highest mean accuracy (>0.88), followed by experiment 4 (PCs + SVM). It is worth noting that all the mean accuracies for the experiments using the PlanetScope data were below 0.4, as expected. Furthermore, the accuracies were generally comparable across the four experiments when using the PlanetScope data.
Figure 8 illustrates the variable importance for the two experiments using the random forest machine learning algorithm. The graphs show that the band importance varies with the experiment and the growth stage date. In experiment 1, the NDVI has the highest importance for both April and May 2022, while the brightness value (V) has the highest importance in January 2022. The most important bands in experiment 1 also included b* in May 2022 with a score of 0.13, and red in January 2022 (0.099) and February 2022 (0.23). High variable importance values were observed for the VIs SAVI, NDRE, and EVI in April and May. These results indicate that spectral reflectance bands, color spaces, and VIs are significant inputs for classifying weeds against maize crops and bare soils. The variable importance scores for experiment 3 include the scores for PCs 1 to 4 (mostly above 0.1). As expected, PC1 was the most important variable in January, February, and April, and the second most important in May 2022.

3.3. Optimal Growth Stages for Weed Detection in Maize

The detection of weeds in maize crop fields is needed for weed management and for understanding weed-crop dynamics for future farming activities. The study investigated the effectiveness of mapping weeds at the mid-to-late maize growth stages. Maize was first observed in January 2022 during the booting stages of growth, and the last month of observations was at the maturity growth stage in May 2022. The results for the best-performing model were used to investigate this optimal weed detection period. This model included all 18 UAV-derived features classified with the RF algorithm. Figure 9 shows the temporal evolution of the overall accuracy and F1 score for maize crops during the mid-to-late stages of the growth cycle. The overall accuracy remains almost constant during crop growth, with only a 3% variation (Figure 9a). The accuracy increased slightly from 0.87 in January to 0.90 in February, then decreased to 0.87 in April before increasing again to 0.90 in May 2022. This consistency in the overall accuracy showed the good performance of the model in distinguishing weeds, maize crop, and bare soils throughout the mid-to-late growth season. Figure 9b illustrates the F1 scores of the classification. The F1 score of the maize reached a peak of 0.89 in February 2022, most likely due to the maximum greenness and plant height of the maize during the reproductive stage. After this date, there was a decline in the F1 score to 0.8 in April and a slight increase to 0.85 in May 2022. This contrasts with the weed class, where there was a consistent increase in the F1 score during the mid-to-late growth season. Specifically, the F1 score for the weeds increased markedly from 0.83 in January to 0.88 in February, reached a peak of 0.95 in April, and ended at 0.91 in May 2022.

3.4. Mapping Weeds in Maize Fields

The maps presented in Figure 10 and Figure 11 depict the spatial distributions of weeds within a maize crop using the data from PlanetScope and UAV sources. Figure 10a and Figure 11a display the true color composites for February 2022 alongside the classified maps for January, February, April, and May 2022. The classification maps using PlanetScope (Figure 10b-e) clearly show that the imagery is unable to produce highly detailed information about the weeds. Instead, the data offer a broader view of the spatial extent of the crop area, obscuring feature variations in the field. In contrast, the spatial distribution of the three classes in the UAV-classified images (Figure 11b-e) proved successful in showing the fine-scale identification of the maize and weeds. These maps can provide more comprehensive detail of crop dynamics.

4. Discussion

The present study evaluated the effectiveness of UAV and PlanetScope data, and the derived spectral information, for detecting weeds on a rainfed maize farm in Bronkhorstspruit, South Africa. The input features consisted of all the spectral bands available for both the UAV (red, green, blue, NIR, and red-edge) and PlanetScope (red, green, blue, and NIR) data, color spaces (HSV, L*a*b*), and seven VIs (EGI, TGI, VARI, NDVI, EVI, SAVI, and NDRE). Four experiments were conducted to assess the classification accuracy using two machine learning algorithms, RF and SVM, incorporated with different input scenarios. The first two experiments used all the spectral input features, and the third and fourth experiments applied PCA to reduce the dimensionality of the spectral features into PCs. The RF and SVM classifiers were utilized in the experiments, with accuracies compared using cross-validation scores and statistical tests. The best-performing experiments were then applied to determine the best time for mapping weeds in maize crop fields during the mid-to-late maize growing season. The study found that the high spatial resolution data offered by UAVs, along with machine learning classification, are ideal for the detection of weeds in maize crop fields.
In recent years, the emergence of UAVs with their high spatial resolution has greatly improved the accuracy of classifying plant species on agricultural farms. Several studies have successfully employed this technology for monitoring weed infestations in maize areas [53,54]. Pei, et al. [55] differentiated between weeds and maize using high-resolution UAV images (0.685 cm resolution) with a mean accuracy of 86.89%. That study proposed an improved convolutional neural network (CNN) model and found that precision increased to 95.5% and 93.98% for maize and weed detection, respectively. In the present study, the best-performing model (experiment 1 using the UAV) produced comparable results, with precision values between 74% and 91% for maize and between 85% and 95% for weeds throughout the study period. It should be noted that this study used lower-spatial-resolution images (8 cm/pixel) and imaged the maize much later in its life cycle compared to the study by Pei, et al. [55]. The detection of mid-to-late season weeds has the potential to exhibit greater variations in the spectral characteristics of both crops and weeds [14], thereby enhancing the accuracy of the classification results. This study corroborated the findings of Xu, et al. [30], emphasizing the significance of VIs and spectral features in achieving high classification accuracy for detecting weeds during the mid-to-late growth stages in maize crops.
In this study, the main focus was on examining the benefits of multispectral information for crop monitoring with regard to weed management strategies. Jurišić, et al. [56] used a UAV system equipped with RGB, near-infrared, and red-edge cameras to collect high-resolution imagery, enabling the differentiation between maize plants and soil. By leveraging these spectral bands, RF algorithms were applied to identify the crops, achieving a high kappa value of 0.998. Notably, significant differences were observed due to the greater spectral signature differentiation between the soil and vegetation types [57]. The advantage of acquiring fine-bandwidth UAV images over coarse-bandwidth PlanetScope satellite images is the more detailed information obtained about crop variability [58]. However, when accounting for the presence of weeds, relying solely on these individual spectral bands becomes insufficient, particularly when distinguishing between vegetation types. In crop-weed studies, the extraction of color spaces and VIs from spectral reflectance images has consistently yielded positive results [20,21,59,60]. This trend was evident in the current study, where the inclusion of these variables improved the classification accuracies. Variable importance analysis highlighted the significance of VIs such as NDVI, EVI, SAVI, and NDRE. Notably, the high importance assigned to the NDRE and the red-edge band was significant in some of the models in this study. The red-edge band, along with NIR, has been noted to be crucial in the prediction of maize chlorophyll content, an important indicator of crop health [61]. Additionally, the red-edge band can serve as an indicator of weed infestation levels in maize crops [62]. Other features with high variable importance were the V (brightness value) and b* channels of the HSV and L*a*b* color spaces. These findings concur with the research conducted by Agarwal, et al. [63], who similarly reported that NDVI demonstrates high accuracy in measuring weeds in maize crops, while the TGI color VI does not improve the model's performance.
The current study examined four experiments to assess the practicality of using various input features. The PCA data dimensionality reduction approach identified four PCs from both the UAV and PlanetScope data, which accounted for over 98% of the variance. This method was implemented to address the challenge of reduced classification accuracy resulting from the high dimensionality of several predicting variables. Interestingly, the findings contrast with the expected outcomes of the experiments in this study. Experiment 1, which utilized all the UAV-derived bands with RF, yielded the highest accuracies. Experiment 4 (PCs + SVM) using the UAV dataset was the second-highest-performing model, although the accuracies varied between 73% and 93%. In a related study, Gao, et al. [28] explored hyperspectral imagery in a laboratory setting and extracted 185 features in total, including reflectance indices and VIs. Contrary to the results of this study, Gao, et al. [28] found that the 30 most important features achieved higher accuracy than using all the features. The findings of this study do align with the conclusions drawn by Duke, et al. [27], who observed that UAV crop classification produced lower-accuracy results using RF when PCA was applied to the data (0.799) compared to using vegetation indices without PCA (0.845). Based on the findings of these studies, the performance difference can be attributed to the spectral bands of the input data used. The current study utilized multispectral data, which have broader wavelengths and a limited number of bands compared to hyperspectral data with narrower, continuous bands. As a result, applying PCA to the multispectral data led to the loss of important spectral information, resulting in lower accuracies in experiments 3 and 4. In Xu, et al.'s study [30], incorporating only the PCs derived from spectral features, such as vegetation indices, resulted in lower classification accuracies compared to combining these input features with PCs derived from textural features and fusing them with the canopy temperature and/or plant height.
A major advantage of this study was the availability of multi-temporal images acquired from four months of UAV sampling and monthly PlanetScope composites. The results indicate that PlanetScope produced a lower overall accuracy in the earlier stage of the maize growing season (January 2022). Weed detection in the early stages of maize growth is essential for effective eradication planning; unfortunately, during this time, weed foliage is difficult to detect. This is attributed to the lower spatial resolution of the sensor compared to the UAV data, and to possible spectral similarities between weeds and maize crops during this early stage. The UAV images contributed to the high overall classification accuracy, especially during the reproductive stages of maize crop growth; in later months, closer to the harvest period, the maize became more difficult to distinguish from the weeds. The ability to classify maize more readily in the reproductive stages (F1 score = 0.89) was due to the phenological changes it experienced during this period: the maize spectral signatures became more unique and distinguishable from the weeds. This is in agreement with Kumar, et al. [64], who found that Sentinel-2 NDVI values increased significantly in the peak vegetative stage of maize growth, before decreasing closer to maturity and harvesting. A better understanding of the temporal relationship between maize growth stages and spectral signatures is essential for weed detection. Ultimately, temporal information is most important for identifying the optimal time for weed detection in maize classification. This allows proper weed management strategies to be developed to ensure the promotion of cost-effective and sustainable agricultural practices. Farmers can maximize yield outputs by managing weeds better during the growth stages of the crop. Knowing where the weeds are can assist the farmer in applying appropriate chemicals or physically removing the weeds in a targeted manner that saves costs [10]. The benefit of detecting weeds in crops using digital imagery, such as UAV imagery, is that important information about the spatial patterns, spectral features, phenology, and biology of weeds can be obtained. This information is needed for developing strategies that not only eradicate weeds but also reduce or eliminate the use of herbicides [65].
The current research study is important in various aspects of sustainability. Notably, it aligns with several sustainable development goals (SDGs) [66], including SDG 2 (Zero Hunger) and SDG 12 (Responsible Consumption and Production). The investigation contributes to enhancing sustainable agriculture. The connection between this study and the SDGs is somewhat indirect. The study’s outcomes have the potential to positively impact local rural farmers by providing them with valuable insights into weed distribution by using mapping technologies involving UAVs and satellite data. These farmers play a crucial role in the supply of food to the rural community, thereby aligning with the objectives of both SDG 2 and SDG 12. By enhancing agricultural practices, the farmers can make informed decisions about weed management strategies, choosing cost-effective and environmentally sustainable options that maximize profitability and contribute to the SDGs. For instance, mapping weeds using UAVs helps identify areas with high weed pressure, enabling farmers to implement site-specific weed control measures [67]. This can include manual weeding, crop rotation, cover cropping, mulching, and integrated weed management practices [68,69]. The potential implications of refining the precision of agricultural production are substantial, particularly in the context of weed prevention for crops [68], a goal aligning closely with the objectives of SDG 12. The strategic mapping of weed distribution in maize crops can reduce the usage of herbicides and preserve valuable resources, such as water and soil. Additionally, the adoption of sustainable crop management strategies, such as crop rotation, can mitigate herbicide resistance while reducing weed density [69].
It is important to highlight the main limitations of this study, on which future work can improve. First, the study only used machine learning classifiers, namely RF and SVM, following evidence of their good performance in related studies. There is a definite need to examine the use of more recently developed deep learning methods on a larger dataset and also to explore model fusion techniques for improved classification. Second, the study used only a single image per month, and no UAV imagery was available for March 2022, meaning the temporal information consisted of isolated snapshots. Continuous multi-temporal data would have identified the optimal time for weed detection more precisely [70,71]. Overcoming the abovementioned limitations can certainly improve the results obtained in the present study.
The findings indicate that the combination of spectral information, vegetation indices, and color spaces produces high-accuracy maps for weed detection. The very high spatial resolution of the UAV imagery was essential for model accuracy, especially in this field-level study, compared to the 3 m spatial resolution of the PlanetScope data. Weed detection over several months in the mid-to-late maize growth cycle showed that there was a definite optimal window in the reproductive stage, when the highest weed classification accuracy was observed. Further research is still needed to improve our understanding of crop yield, health, and vigor in mid-to-late season weed-maize interactions. The recommendation for future research is to focus on applying the models and workflows used in this study to intercropped smallholder farms. This is necessary due to the lack of South African studies using UAVs for weed detection in smallholder maize farming, which can be highly beneficial to decision-makers and for allocating resources to rural farmers.

5. Conclusions

The present study aimed to demonstrate the ability of UAV-based and PlanetScope remote sensing approaches to detect weeds in a maize field in Bronkhorstspruit during the mid-to-late crop growing season in 2022. The results show that PlanetScope satellite data do not produce sufficiently detailed maps for identifying weeds in maize crops (overall accuracy below 47%). Utilizing high-resolution UAV-based data produced much better accuracies, ranging from 66% to 93%. The study produced results with both RF and SVM, with the former identified as the best-performing model, with F1 scores above 0.8 and 0.83 for maize and weeds, respectively. A variety of spectral reflectance bands, color spaces, and VIs were significant in discriminating between the weeds, maize, and soil. The results of the best-performing experiment (experiment 1 with UAV data) identified the brightness value, the red spectral band, and NDVI as some of the features that best distinguished the spectral variability between classes. Experiment 1 with UAV data performed better than the other three experiments, as all the spectral input features were included in the model, with the RF classifier reaching overall accuracy values between 87% and 90%. Based on the mean accuracy scores, the findings reveal that the PCA data reduction in experiments 3 and 4 did not result in a significant improvement in accuracy; however, it is worth noting that experiment 4 emerged as the second-best-performing model. The available images ranged from the mid-to-late stages of the maize growing season, and the results indicate that optimal weed and maize classification is achieved during the reproductive stage. This is the time of the season when maize plants have unique spectral signatures that distinguish them from weed species. A limitation of this study is that the full life cycles of maize and weeds were not studied continuously; a better understanding of maize-weed interactions from early to late-season maize growth would be beneficial for weed detection. Maps produced in this study demonstrate the spatial distribution of the maize crops and weeds, an essential tool for the development of weed management strategies. The findings identify UAVs as useful sources of data for detecting weeds in maize to assist rural farmers and promote sustainable agricultural techniques. Future research on weed detection should focus on the development of models for intercropping using spectral bands, vegetation indices, and color spaces in intercropped fields on smallholder farms. This will be immensely challenging, as farmers plant other crops between the maize that will be difficult to distinguish from weeds.

Author Contributions

C.d.V. conceptualized and developed the original draft of the manuscript. C.M. developed the research methodology and data analysis, and reviewed, supervised, and provided financial assistance to the project. Z.M.-M. developed the research methodology and supervised and reviewed the manuscript. G.J.C. revised the manuscript and supervised the project. S.G.T. revised the manuscript and supervised the project. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Agricultural Research Council-Natural Resources and Engineering (ARC-NRE), Department of Science and Innovation, Council for Scientific and Industrial Research, grant number P07000198; the National Research Foundation (NRF-Thuthuka, grant number TTK200221506319); the Department of Agriculture, Land Reform and Rural Development (DALRRD); and the University of Pretoria.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The PlanetScope data were obtained from the Planet website for academic research.

Acknowledgments

The researchers express their gratitude to the postgraduate students who provided valuable assistance to the Agricultural Research Council in collecting the field data: Phathutshedzo E. Maluma, Shaun Muirhead, Lwandile Nduku, Vuwani Makuya, Annie Koalane, Tshimangadzo Rasifudi, Nombuso Parkies, Siyabonga SR. Gasa, and Siboniso Nkambule. Additionally, the researchers extend their appreciation to Eric Economon for piloting the UAV and facilitating the image collection process.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Savary, S.; Willocquet, L.; Pethybridge, S.J.; Esker, P.; McRoberts, N.; Nelson, A. The global burden of pathogens and pests on major food crops. Nat. Ecol. Evol. 2019, 3, 430–439.
  2. Nyambo, P.; Nyambo, P.; Mavunganidze, Z.; Nyambo, V. Sub-Saharan Africa Smallholder Farmers Agricultural Productivity: Risks and Challenges. In Food Security for African Smallholder Farmers; Springer: Berlin/Heidelberg, Germany, 2022; pp. 47–58.
  3. Gugissa, D.A.; Abro, Z.; Tefera, T. Achieving a climate-change resilient farming system through push–pull technology: Evidence from maize farming systems in Ethiopia. Sustainability 2022, 14, 2648.
  4. Laizer, H.C.; Chacha, M.N.; Ndakidemi, P.A. Farmers' knowledge, perceptions and practices in managing weeds and insect pests of common bean in Northern Tanzania. Sustainability 2019, 11, 4076.
  5. Mabuza, M.; Ndoro, J.T. Borich's Needs Model Analysis of Smallholder Farmers' Competence in Irrigation Water Management: Case Study of Nkomazi Local Municipality, Mpumalanga Province in South Africa. Sustainability 2023, 15, 4935.
  6. Rajcan, I.; Swanton, C.J. Understanding maize–weed competition: Resource competition, light quality and the whole plant. Field Crops Res. 2001, 71, 139–150.
  7. Lou, Z.; Quan, L.; Sun, D.; Li, H.; Xia, F. Hyperspectral remote sensing to assess weed competitiveness in maize farmland ecosystems. Sci. Total Environ. 2022, 844, 157071.
  8. Gao, F.; Anderson, M.; Daughtry, C.; Karnieli, A.; Hively, D.; Kustas, W. A within-season approach for detecting early growth stages in corn and soybean using high temporal and spatial resolution imagery. Remote Sens. Environ. 2020, 242, 111752.
  9. Nikolić, N.; Rizzo, D.; Marraccini, E.; Gotor, A.A.; Mattivi, P.; Saulet, P.; Persichetti, A.; Masin, R. Site and time-specific early weed control is able to reduce herbicide use in maize—a case study. Ital. J. Agron. 2021, 16, 1780.
  10. Peteinatos, G.G.; Reichel, P.; Karouta, J.; Andújar, D.; Gerhards, R. Weed identification in maize, sunflower, and potatoes with the aid of convolutional neural networks. Remote Sens. 2020, 12, 4185.
  11. Karimmojeni, H.; Rahimian, H.; Alizadeh, H.; Yousefi, A.R.; Gonzalez-Andujar, J.L.; Sweeney, E.M.; Mastinu, A. Competitive ability effects of Datura stramonium L. and Xanthium strumarium L. on the development of maize (Zea mays) seeds. Plants 2021, 10, 1922.
  12. Veeranampalayam Sivakumar, A.N.; Li, J.; Scott, S.; Psota, E.J.; Jhala, A.; Luck, J.D.; Shi, Y. Comparison of object detection and patch-based classification deep learning models on mid- to late-season weed detection in UAV imagery. Remote Sens. 2020, 12, 2136.
  13. Landau, C.A.; Hager, A.G.; Williams, M.M. Diminishing weed control exacerbates maize yield loss to adverse weather. Glob. Change Biol. 2021, 27, 6156–6165.
  14. López-Granados, F. Weed detection for site-specific weed management: Mapping and real-time approaches. Weed Res. 2011, 51, 1–11.
  15. Xia, F.; Quan, L.; Lou, Z.; Sun, D.; Li, H.; Lv, X. Identification and comprehensive evaluation of resistant weeds using unmanned aerial vehicle-based multispectral imagery. Front. Plant Sci. 2022, 13, 938604.
  16. Gao, M.; Yang, F.; Wei, H.; Liu, X. Individual Maize Location and Height Estimation in Field from UAV-Borne LiDAR and RGB Images. Remote Sens. 2022, 14, 2292.
  17. Louargant, M.; Villette, S.; Jones, G.; Vigneau, N.; Paoli, J.-N.; Gée, C. Weed detection by UAV: Simulation of the impact of spectral mixing in multispectral images. Precis. Agric. 2017, 18, 932–951.
  18. Yang, D.; Zhao, J.; Lan, Y.; Wen, Y.; Pan, F.; Cao, D.; Hu, C.; Guo, J. Research on farmland crop classification based on UAV multispectral remote sensing images. Int. J. Precis. Agric. Aviat. 2021, 4, 29–35.
  19. Sánchez-Sastre, L.F.; Casterad, M.A.; Guillén, M.; Ruiz-Potosme, N.M.; Veiga, N.M.A.d.; Navas-Gracia, L.M.; Martín-Ramos, P. UAV Detection of Sinapis arvensis Infestation in Alfalfa Plots Using Simple Vegetation Indices from Conventional Digital Cameras. AgriEngineering 2020, 2, 206–212.
  20. Sapkota, B.; Singh, V.; Neely, C.; Rajan, N.; Bagavathiannan, M. Detection of Italian ryegrass in wheat and prediction of competitive interactions using remote-sensing and machine-learning techniques. Remote Sens. 2020, 12, 2977.
  21. Kawamura, K.; Asai, H.; Yasuda, T.; Soisouvanh, P.; Phongchanmixay, S. Discriminating crops/weeds in an upland rice field from UAV images with the SLIC-RF algorithm. Plant Prod. Sci. 2021, 24, 198–215.
  22. Yang, W.; Wang, S.; Zhao, X.; Zhang, J.; Feng, J. Greenness identification based on HSV decision tree. Inf. Process. Agric. 2015, 2, 149–160.
  23. Xu, X.; Fan, L.; Li, Z.; Meng, Y.; Feng, H.; Yang, H.; Xu, B. Estimating leaf nitrogen content in corn based on information fusion of multiple-sensor imagery from UAV. Remote Sens. 2021, 13, 340.
  24. Chen, Y.; Wu, Z.; Zhao, B.; Fan, C.; Shi, S. Weed and corn seedling detection in field based on multi feature fusion and support vector machine. Sensors 2020, 21, 212.
  25. Ma, Z.; Liu, Z.; Zhao, Y.; Zhang, L.; Liu, D.; Ren, T.; Zhang, X.; Li, S. An unsupervised crop classification method based on principal components isometric binning. ISPRS Int. J. Geo-Inf. 2020, 9, 648.
  26. Jiang, Y.; Wei, H.; Hou, S.; Yin, X.; Wei, S.; Jiang, D. Estimation of Maize Yield and Protein Content under Different Density and N Rate Conditions Based on UAV Multi-Spectral Images. Agronomy 2023, 13, 421.
  27. Duke, O.P.; Alabi, T.; Neeti, N.; Adewopo, J. Comparison of UAV and SAR performance for crop type classification using machine learning algorithms: A case study of humid forest ecology experimental research site of West Africa. Int. J. Remote Sens. 2022, 43, 4259–4286.
  28. Gao, J.; Nuyttens, D.; Lootens, P.; He, Y.; Pieters, J.G. Recognising weeds in a maize crop using a random forest machine-learning algorithm and near-infrared snapshot mosaic hyperspectral imagery. Biosyst. Eng. 2018, 170, 39–50.
  29. Che'Ya, N.N.; Dunwoody, E.; Gupta, M. Assessment of weed classification using hyperspectral reflectance and optimal multispectral UAV imagery. Agronomy 2021, 11, 1435.
  30. Xu, B.; Meng, R.; Chen, G.; Liang, L.; Lv, Z.; Zhou, L.; Sun, R.; Zhao, F.; Yang, W. Improved weed mapping in corn fields by combining UAV-based spectral, textural, structural, and thermal measurements. Pest Manag. Sci. 2023, 79, 2591–2602.
  31. Pasi, J.M. Modelling the Impacts of Increased Air Temperature on Maize Yields in Selected Areas of the South African Highveld Using the CropSyst Model. Ph.D. Thesis, University of KwaZulu-Natal, Pietermaritzburg, South Africa, 2014.
  32. PIX4Dmapper: Professional Photogrammetry Software for Drone Mapping. Available online: https://www.pix4d.com/product/pix4dmapper-photogrammetry-software (accessed on 20 March 2023).
  33. MicaSense. MicaSense RedEdge-MX™ and DLS 2 Integration Guide. Available online: https://support.micasense.com/hc/article_attachments/1500011727381/RedEdge-MX-integration-guide.pdf (accessed on 31 August 2023).
  34. Planet Team. Planet Surface Reflectance Product v2; Planet Labs, Inc.: San Francisco, CA, USA, 2020. Available online: https://assets.planet.com/marketing/PDF/Planet_Surface_Reflectance_Technical_White_Paper.pdf (accessed on 30 June 2023).
  35. Marta, S. Planet Imagery Product Specifications; Planet Labs: San Francisco, CA, USA, 2018; Volume 91.
  36. Woebbecke, D.M.; Meyer, G.E.; Von Bargen, K.; Mortensen, D.A. Color indices for weed identification under various soil, residue, and lighting conditions. Trans. ASAE 1995, 38, 259–269.
  37. Hunt, E.R., Jr.; Daughtry, C.; Eitel, J.U.; Long, D.S. Remote sensing leaf chlorophyll content using a visible band index. Agron. J. 2011, 103, 1090–1099.
  38. Gitelson, A.A.; Kaufman, Y.J.; Stark, R.; Rundquist, D. Novel algorithms for remote estimation of vegetation fraction. Remote Sens. Environ. 2002, 80, 76–87.
  39. Tucker, C.J. Red and photographic infrared linear combinations for monitoring vegetation. Remote Sens. Environ. 1979, 8, 127–150.
  40. Huete, A.; Justice, C.; Van Leeuwen, W. MODIS vegetation index (MOD13). Algorithm Theor. Basis Doc. 1999, 3, 295–309.
  41. Huete, A.R. A soil-adjusted vegetation index (SAVI). Remote Sens. Environ. 1988, 25, 295–309.
  42. Barnes, E.; Clarke, T.; Richards, S.; Colaizzi, P.; Haberland, J.; Kostrzewski, M.; Waller, P.; Choi, C.; Riley, E.; Thompson, T. Coincident detection of crop water stress, nitrogen status and canopy density using ground based multispectral data. In Proceedings of the Fifth International Conference on Precision Agriculture, Bloomington, MN, USA, 16–19 July 2000; p. 6.
  43. Breiman, L. Random Forests. Mach. Learn. 2001, 45, 5–32.
  44. Cutler, A.; Cutler, D.R.; Stevens, J.R. Random forests. In Ensemble Machine Learning: Methods and Applications; Springer: New York, NY, USA, 2012; pp. 157–175. [Google Scholar]
  45. Biau, G.; Scornet, E. A random forest guided tour. Test 2016, 25, 197–227. [Google Scholar] [CrossRef]
  46. Rodriguez-Galiano, V.F.; Ghimire, B.; Rogan, J.; Chica-Olmo, M.; Rigol-Sanchez, J.P. An assessment of the effectiveness of a random forest classifier for land-cover classification. ISPRS J. Photogramm. Remote Sens. 2012, 67, 93–104. [Google Scholar] [CrossRef]
  47. Guan, H.; Li, J.; Chapman, M.; Deng, F.; Ji, Z.; Yang, X. Integration of orthoimagery and lidar data for object-based urban thematic mapping using random forests. Int. J. Remote Sens. 2013, 34, 5166–5186. [Google Scholar] [CrossRef]
  48. Belgiu, M.; Drăguţ, L. Random forest in remote sensing: A review of applications and future directions. ISPRS J. Photogramm. Remote Sens. 2016, 114, 24–31. [Google Scholar] [CrossRef]
  49. Pedregosa, F.; Varoquaux, G.; Gramfort, A.; Michel, V.; Thirion, B.; Grisel, O.; Blondel, M.; Prettenhofer, P.; Weiss, R.; Dubourg, V. Scikit-learn: Machine learning in Python. J. Mach. Learn. Res. 2011, 12, 2825–2830. [Google Scholar]
  50. Cortes, C.; Vapnik, V. Support-vector networks. Mach. Learn. 1995, 20, 273–297. [Google Scholar] [CrossRef]
51. Awad, M.; Khanna, R. Support vector machines for classification. In Efficient Learning Machines: Theories, Concepts, and Applications for Engineers and System Designers; Apress: Berkeley, CA, USA, 2015. [Google Scholar] [CrossRef]
  52. Shao, Y.; Lunetta, R.S. Comparison of support vector machine, neural network, and CART algorithms for the land-cover classification using limited training data points. ISPRS J. Photogramm. Remote Sens. 2012, 70, 78–87. [Google Scholar] [CrossRef]
  53. Castaldi, F.; Pelosi, F.; Pascucci, S.; Casa, R. Assessing the potential of images from unmanned aerial vehicles (UAV) to support herbicide patch spraying in maize. Precis. Agric. 2017, 18, 76–94. [Google Scholar] [CrossRef]
  54. Peña, J.M.; Torres-Sánchez, J.; de Castro, A.I.; Kelly, M.; López-Granados, F. Weed mapping in early-season maize fields using object-based analysis of unmanned aerial vehicle (UAV) images. PLoS ONE 2013, 8, e77151. [Google Scholar] [CrossRef] [PubMed]
  55. Pei, H.; Sun, Y.; Huang, H.; Zhang, W.; Sheng, J.; Zhang, Z. Weed Detection in Maize Fields by UAV Images Based on Crop Row Preprocessing and Improved YOLOv4. Agriculture 2022, 12, 975. [Google Scholar] [CrossRef]
  56. Jurišić, M.; Radočaj, D.; Plaščak, I.; Galić Subašić, D.; Petrović, D. The Evaluation of the RGB and multispectral camera on the unmanned aerial vehicle (UAV) for the machine learning classification of maize. Poljoprivreda 2022, 28, 74–80. [Google Scholar] [CrossRef]
  57. Torres-Sánchez, J.; Peña-Barragán, J.; Gómez-Candón, D.; De Castro, A.; López-Granados, F. Imagery from unmanned aerial vehicles for early site specific weed management. In Precision Agriculture’13; Wageningen Academic Publishers: Wageningen, The Netherlands, 2013; pp. 193–199. [Google Scholar]
  58. Munghemezulu, C.; Mashaba-Munghemezulu, Z.; Ratshiedana, P.E.; Economon, E.; Chirima, G.; Sibanda, S. Unmanned Aerial Vehicle (UAV) and Spectral Datasets in South Africa for Precision Agriculture. Data 2023, 8, 98. [Google Scholar] [CrossRef]
  59. Ranđelović, P.; Đorđević, V.; Milić, S.; Balešević-Tubić, S.; Petrović, K.; Miladinović, J.; Đukić, V. Prediction of soybean plant density using a machine learning model and vegetation indices extracted from RGB images taken with a UAV. Agronomy 2020, 10, 1108. [Google Scholar] [CrossRef]
  60. Anderegg, J.; Tschurr, F.; Kirchgessner, N.; Treier, S.; Schmucki, M.; Streit, B.; Walter, A. On-farm evaluation of UAV-based aerial imagery for season-long weed monitoring under contrasting management and pedoclimatic conditions in wheat. Comput. Electron. Agric. 2023, 204, 107558. [Google Scholar] [CrossRef]
  61. Brewer, K.; Clulow, A.; Sibanda, M.; Gokool, S.; Naiken, V.; Mabhaudhi, T. Predicting the chlorophyll content of maize over phenotyping as a proxy for crop health in smallholder farming systems. Remote Sens. 2022, 14, 518. [Google Scholar] [CrossRef]
  62. Quan, L.; Lou, Z.; Lv, X.; Sun, D.; Xia, F.; Li, H.; Sun, W. Multimodal remote sensing application for weed competition time series analysis in maize farmland ecosystems. J. Environ. Manag. 2023, 344, 118376. [Google Scholar] [CrossRef] [PubMed]
  63. Agarwal, R.; Hariharan, S.; Rao, M.N.; Agarwal, A. Weed Identification using K-Means Clustering with Color Spaces Features in Multi-Spectral Images Taken by UAV. In Proceedings of the 2021 IEEE International Geoscience and Remote Sensing Symposium IGARSS, Virtual, 12–16 July 2021; pp. 7047–7050. [Google Scholar]
64. Kumar, D.A.; Srikanth, P.; Neelima, T.; Devi, M.U.; Suresh, K.; Murthy, C. Monitoring of spectral signatures of maize crop using temporal SAR and optical remote sensing data. Int. J. Bio-Resour. Stress Manag. 2021, 12, 745–750. [Google Scholar] [CrossRef]
  65. Liu, B.; Bruch, R. Weed detection for selective spraying: A review. Curr. Robot. Rep. 2020, 1, 19–26. [Google Scholar] [CrossRef]
  66. United Nations. The Sustainable Development Goals: Report 2022; UN: New York, NY, USA, 2022. [Google Scholar]
  67. Gokool, S.; Mahomed, M.; Kunz, R.; Clulow, A.; Sibanda, M.; Naiken, V.; Chetty, K.; Mabhaudhi, T. Crop monitoring in smallholder farms using unmanned aerial vehicles to facilitate precision agriculture practices: A scoping review and bibliometric analysis. Sustainability 2023, 15, 3557. [Google Scholar] [CrossRef]
  68. Roslim, M.H.M.; Juraimi, A.S.; Che’Ya, N.N.; Sulaiman, N.; Manaf, M.N.H.A.; Ramli, Z.; Motmainna, M. Using remote sensing and an unmanned aerial system for weed management in agricultural crops: A review. Agronomy 2021, 11, 1809. [Google Scholar] [CrossRef]
  69. Sharma, G.; Shrestha, S.; Kunwar, S.; Tseng, T.-M. Crop diversification for improved weed management: A review. Agriculture 2021, 11, 461. [Google Scholar] [CrossRef]
  70. Noujdina, N.V.; Ustin, S.L. Mapping downy brome (Bromus tectorum) using multidate AVIRIS data. Weed Sci. 2008, 56, 173–179. [Google Scholar] [CrossRef]
  71. Torres-Sánchez, J.; Peña, J.M.; de Castro, A.I.; López-Granados, F. Multi-temporal mapping of the vegetation fraction in early-season wheat fields using images from UAV. Comput. Electron. Agric. 2014, 103, 104–113. [Google Scholar] [CrossRef]
Figure 1. Location of the experimental field at the Bronkhorstspruit farm (on the left) and the UAV image with the farm boundary (on the right) on 18 May 2022.
Figure 2. Flowchart of the overall methodology for detecting weeds in maize fields.
Figure 3. Photos taken during different phenological stages of maize development. Photo (a), taken on 27 January 2022, shows the early stage of booting and weed growth. Photo (b), taken on 27 February 2022, depicts the maize during the heading stage, while photo (c), taken on 18 May 2022, shows the maize at maturity, close to harvest. The presence of weeds is evident in all three images.
Figure 4. The variance explained by the principal components (PCs) derived from the UAV and PlanetScope input features for January, February, April, and May 2022.
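For readers wishing to reproduce the kind of per-component variance curves shown in Figure 4, the following minimal Python sketch obtains explained variance ratios with scikit-learn's PCA. The random feature stack, its dimensions, and the standardization step are illustrative assumptions, not the exact settings of this study.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Illustrative input: one row per sampled pixel, one column per feature
# (spectral bands, color space channels, and vegetation indices).
rng = np.random.default_rng(42)
X = rng.random((5000, 16))  # placeholder for the real feature stack

# Standardize so that features on different scales do not dominate the PCs.
X_std = StandardScaler().fit_transform(X)

pca = PCA()
pca.fit(X_std)

# Proportion of total variance explained by each principal component,
# the quantity plotted in Figure 4.
for i, ratio in enumerate(pca.explained_variance_ratio_, start=1):
    print(f"PC{i}: {ratio:.3f}")
```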
Figure 5. Box-and-whisker plots of the cross-validation scores for (a) PlanetScope and (b) UAV data across all sampling dates, using all bands and PCA feature selection with the RF and SVM algorithms.
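The per-fold score distributions summarized in Figure 5 can be generated along the following lines; the synthetic data, the five-fold setting, and the default RF/SVM hyperparameters are assumptions for illustration only.

```python
import matplotlib.pyplot as plt
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Synthetic stand-in for the real feature matrix (all bands or PCs)
# and class labels (e.g., maize, weeds, soil).
X, y = make_classification(n_samples=600, n_features=16, n_classes=3,
                           n_informative=6, random_state=0)

models = {"RF": RandomForestClassifier(random_state=0),
          "SVM": SVC(kernel="rbf")}

# Five-fold cross-validation accuracy per model; the per-fold scores are
# what the box-and-whisker plots in Figure 5 summarize.
scores = {name: cross_val_score(m, X, y, cv=5) for name, m in models.items()}

plt.boxplot(list(scores.values()))
plt.xticks([1, 2], list(scores.keys()))
plt.ylabel("Cross-validation accuracy")
plt.show()
```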
Figure 6. Overall accuracy for all the models over time.
Figure 7. Comparison of the mean accuracy, computed from the overall accuracy, precision, recall, F1, and cross-validation scores, for each model in the four experiments (1 to 4) using UAV and PlanetScope data.
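The aggregate score plotted in Figure 7 is the mean of five metrics. A minimal sketch of this aggregation is given below; the "weighted" averaging mode for the multi-class precision, recall, and F1 scores is an assumption, as the exact averaging scheme is not restated here.

```python
import numpy as np
from sklearn.metrics import (accuracy_score, f1_score, precision_score,
                             recall_score)

def mean_accuracy_score(y_true, y_pred, cv_scores):
    """Mean of overall accuracy, precision, recall, F1, and the mean
    cross-validation score, as compared across models in Figure 7.
    The 'weighted' multi-class averaging is an illustrative choice."""
    metrics = [
        accuracy_score(y_true, y_pred),
        precision_score(y_true, y_pred, average="weighted"),
        recall_score(y_true, y_pred, average="weighted"),
        f1_score(y_true, y_pred, average="weighted"),
        float(np.mean(cv_scores)),
    ]
    return float(np.mean(metrics))
```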
Figure 8. Variable importance plots for January to May 2022 for experiments 1 (a–d) and 3 (e–h) using the UAV dataset.
Figure 9. Dynamics of overall accuracy (a) and F1 score (b) for maize crops during the mid-to-late crop growth stage.
Figure 10. PlanetScope data and derived classification results: (a) the true color composite for the mean imagery of February 2022. The classification results are displayed for the best-performing model (experiment 3) from January (b), February (c), April (d), and May 2022 (e).
Figure 11. UAV data and derived classification results: (a) the true color composite for February 2022. The classification results are displayed for the best-performing model (experiment 1) from January (b), February (c), April (d), and May 2022 (e).
Table 1. Band characteristics for the RedEdge-MX multispectral camera.

Band Name             Central Band (nm)   Width (nm)   Reference
Blue                  475                 32           Micasense [33]
Green                 560                 27
Red                   668                 14
Near Infrared (NIR)   842                 57
Red-Edge              717                 12
Table 2. UAV image acquisition dates and corresponding maize phenological stages for each month of fieldwork.

UAV Flight      Acquisition Date    Phenological Stage
January 2022    26 January 2022     Booting
February 2022   23 February 2022    Heading
April 2022      6 April 2022        Dough
May 2022        18 May 2022         Maturity
Table 3. Band characteristics for the PlanetScope Dove satellite.

Band Name             Band (nm)   Width (nm)   Reference
Blue                  455–515     60           Planet Team [34]
Green                 500–590     90
Red                   590–670     80
Near Infrared (NIR)   780–860     80
Table 4. List of features used for maize/weed image classification.

Features                                          Formula                               UAV   PlanetScope   Reference
Blue                                              –                                     x     x
Green                                             –                                     x     x
Red                                               –                                     x     x
Near Infrared                                     –                                     x     x
Red-Edge                                          –                                     x
Hue, Saturation, Brightness Value (HSV)           –                                     x     x
L*a*b*                                            –                                     x     x
Excess Green Index (EGI/EXG)                      (2G - R - B)/(R + G + B)              x     x             Woebbecke et al. [36]
Triangular Greenness Index (TGI)                  [190(R - G) - 120(R - B)]/2           x     x             Hunt Jr. et al. [37]
Visible Atmospherically Resistant Index (VARI)    (G - R)/(G + R - B)                   x     x             Gitelson et al. [38]
Normalized Difference Vegetation Index (NDVI)     (NIR - R)/(NIR + R)                   x     x             Tucker [39]
Enhanced Vegetation Index (EVI)                   2.5(NIR - R)/(NIR + 6R - 7.5B + 1)    x     x             Huete et al. [40]
Soil Adjusted Vegetation Index (SAVI)             (1 + 0.5)(NIR - R)/(NIR + R + 0.5)    x     x             Huete [41]
Normalized Difference Red-Edge Index (NDRE)       (NIR - REdge)/(NIR + REdge)           x                   Barnes et al. [42]

Abbreviations: R, G, B, NIR, and REdge represent the red, green, blue, near-infrared, and red-edge bands, respectively.
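As an illustration of Table 4, the vegetation indices can be computed from reflectance band arrays as in the following NumPy sketch; the function name, the epsilon guard against division by zero, and the assumption of float reflectance inputs are implementation details not taken from the paper.

```python
import numpy as np

def vegetation_indices(B, G, R, NIR, REdge=None):
    """Compute the Table 4 vegetation indices from reflectance arrays.
    All inputs are float arrays of the same shape; eps guards divisions."""
    eps = 1e-10
    idx = {
        "EXG":  (2 * G - R - B) / (R + G + B + eps),
        "TGI":  (190 * (R - G) - 120 * (R - B)) / 2,
        "VARI": (G - R) / (G + R - B + eps),
        "NDVI": (NIR - R) / (NIR + R + eps),
        "EVI":  2.5 * (NIR - R) / (NIR + 6 * R - 7.5 * B + 1 + eps),
        "SAVI": (1 + 0.5) * (NIR - R) / (NIR + R + 0.5),
    }
    if REdge is not None:  # red-edge band is available on the UAV sensor only
        idx["NDRE"] = (NIR - REdge) / (NIR + REdge + eps)
    return idx
```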
Table 5. Description of the four experiments conducted with UAV and PlanetScope data.

Experiment      Description
Experiment 1    All bands + RF
Experiment 2    All bands + SVM
Experiment 3    PCs + RF
Experiment 4    PCs + SVM
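The four configurations of Table 5 map naturally onto scikit-learn pipelines, as sketched below; the hyperparameters, the standardization step before PCA and the SVM, and the number of retained PCs are illustrative assumptions rather than the study's exact settings.

```python
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def build_experiments(n_components=3):
    """Return the four Table 5 configurations as scikit-learn pipelines.
    Hyperparameters and the number of retained PCs are illustrative."""
    return {
        "Experiment 1 (all bands + RF)": make_pipeline(
            RandomForestClassifier(random_state=0)),
        "Experiment 2 (all bands + SVM)": make_pipeline(
            StandardScaler(), SVC(kernel="rbf")),
        "Experiment 3 (PCs + RF)": make_pipeline(
            StandardScaler(), PCA(n_components=n_components),
            RandomForestClassifier(random_state=0)),
        "Experiment 4 (PCs + SVM)": make_pipeline(
            StandardScaler(), PCA(n_components=n_components),
            SVC(kernel="rbf")),
    }

# Each pipeline can be passed to cross_val_score as in the Figure 5 sketch.
```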