Article

Assessment of Mango Canopy Water Content Through the Fusion of Multispectral Unmanned Aerial Vehicle (UAV) and Sentinel-2 Remote Sensing Data

1 School of Geography and Planning, Nanning Normal University, Nanning 530001, China
2 Key Laboratory of Environment Change and Resources Use in Beibu Gulf, Ministry of Education, Nanning Normal University, Nanning 530001, China
3 Faculty of Environment and Resource Studies, Mahidol University, Nakhon Pathom 73170, Thailand
* Author to whom correspondence should be addressed.
Forests 2025, 16(1), 167; https://doi.org/10.3390/f16010167
Submission received: 12 November 2024 / Revised: 14 January 2025 / Accepted: 15 January 2025 / Published: 17 January 2025

Abstract:
This study proposes an Additive Wavelet Transform (AWT)-based method to fuse Multispectral UAV (MS UAV, 5 cm resolution) and Sentinel-2 satellite imagery (10–20 m resolution), generating 5 cm resolution fused images with a focus on near-infrared and shortwave infrared bands to enhance the accuracy of mango canopy water content monitoring. The fused Sentinel-2 and MS UAV data were validated and calibrated using field-collected hyperspectral data to construct vegetation indices, which were then used with five machine learning (ML) models to estimate Fuel Moisture Content (FMC), Equivalent Water Thickness (EWT), and canopy water content (CWC). The results indicate that the addition of fused Sentinel-2 data significantly improved the estimation accuracy of all parameters compared to using MS UAV data alone, with the Genetic Algorithm Backpropagation Neural Network (GABP) model performing best (R2 = 0.745, 0.859, and 0.702 for FMC, EWT, and CWC, respectively), achieving R2 improvements of 0.066, 0.179, and 0.210. Slope, canopy coverage, and human activities were identified as key factors influencing the spatial variability of FMC, EWT, and CWC, with CWC being the most sensitive to environmental changes, providing a reliable representation of mango canopy water status.

1. Introduction

Crop water content reflects the physiological state of plants under water stress [1], so it is crucial for drought diagnosis [2]. The timely and accurate monitoring of the crop canopy’s water content is essential for developing precise irrigation strategies and improving crop yield and quality [3].
Recent studies have demonstrated the importance of Fuel Moisture Content (FMC) [4], leaf Equivalent Water Thickness (EWT) [5], and canopy water content (CWC) [6] in accurately delineating the moisture condition of vegetation. FMC focuses on analyzing the flammability of dry matter, making it suitable for fire risk modeling. EWT is a key parameter for assessing plant water stress at the individual level, while CWC provides insights into ecosystem- or large-scale water dynamics. A combined analysis of these parameters can assist agricultural and forestry managers in optimizing irrigation strategies and formulating drought mitigation measures. The monitoring of vegetation water status primarily relies on water absorption characteristic bands (970, 1200, 1450, 1950, and 2250 nm [7]). To capture spatiotemporal variations in vegetation water content on broad scales, spectral reflectance is typically acquired using remote sensing satellites and Unmanned Aerial Vehicles (UAVs) [8], alongside the construction of related vegetation indices (VIs) [9,10,11]. Developing efficient and simple methods for accurate vegetation water monitoring has long been a key research focus.
Multispectral UAVs (MS UAVs), equipped with multispectral sensors, are simple to operate, produce manageable data volumes, and offer high spatial resolution, making them widely used in vegetation monitoring studies. However, MS UAVs primarily capture spectral information in the visible and near-infrared bands and lack the spectral data most sensitive to vegetation water content (such as additional near-infrared and shortwave infrared bands) [7,12,13]. Therefore, additional auxiliary data (e.g., surface temperature [14]) are often required to improve the accuracy of vegetation water content monitoring.
Multispectral satellite imagery, on the other hand, covers water-sensitive spectral bands and is also frequently used for monitoring vegetation water content [15,16]. However, compared to UAV data, satellite imagery generally has a lower spatial resolution, which may limit its ability to capture fine-scale variations within vegetation canopies, particularly for heterogeneous crops like mango. Given the complementary advantages of UAVs and satellites, data fusion techniques have gained prominence. By combining the high spatial resolution of UAV imagery with the broader spectral range offered by satellite data, it is possible to leverage the advantages of both sources to generate products with enhanced spatial and spectral resolution, thereby improving the accuracy of vegetation water content estimation. This approach not only overcomes the spatial limitations of satellite data but also addresses the spectral limitations of UAV sensors, resulting in more precise assessments of vegetation water content.
Image fusion technology in remote sensing has been widely applied on satellite platforms [17]. The Additive Wavelet Transform (AWT) has demonstrated clear advantages in remote sensing image fusion, particularly when compared to traditional methods such as Principal Component Analysis (PCA) [18] and Gram–Schmidt (GS) [19]. Unlike PCA, which may introduce spectral distortion due to its reliance on the statistical properties of images, AWT preserves the spectral integrity of the original inputs by utilizing the Intensity–Hue–Saturation (IHS) transformation as a foundation for fusion [20]. Compared to GS, which often enhances spatial resolution at the expense of spectral fidelity, AWT achieves a balance, enhancing both spatial and spectral details [21]. Studies have shown that the use of AWT to fuse multispectral images (e.g., MS UAV) with satellite images (e.g., Landsat-8) significantly improves the accuracy of biophysical parameter estimation, such as evapotranspiration, due to AWT’s capability to retain essential spectral characteristics [22]. This robust approach underscores AWT’s feasibility and applicability as an advanced fusion method for integrating UAV and satellite data.
In recent years, machine learning (ML) methods have gained widespread attention for their powerful predictive capabilities and data mining potential in estimating crop water content. For instance, the Random Forest (RF) model has been successfully applied to the inversion of vegetation physiological parameters due to its robustness in handling high-dimensional data [23]. The Partial Least Squares (PLS) model has demonstrated exceptional performance in processing multivariate spectral data with high collinearity [24]. The Support Vector Machine (SVM) model is widely used in hyperspectral data classification and regression tasks due to its strong performance in small-sample learning scenarios [25]. Additionally, the Genetic-Algorithm-optimized Backpropagation Neural Network (GABP) model has been employed to analyze complex agricultural datasets, effectively enhancing inversion accuracy and model stability [26].
Mango is a highly important fruit, widely cultivated in tropical and subtropical regions. Although mango trees are drought-tolerant, their growth and fruit production are still influenced by water stress [27]. This study focuses on addressing three primary objectives: (1) to assess the efficacy of the AWT method in integrating MS UAV data with Sentinel-2 satellite imagery; (2) to evaluate whether the inclusion of fused Sentinel-2 data, particularly water-sensitive shortwave infrared and red-edge bands, can significantly enhance the estimation accuracy of vegetation water content parameters (FMC, EWT, and CWC) compared to using MS UAV data alone when applied to various machine learning methods; and (3) to investigate the potential spatial heterogeneity in the distribution patterns of different vegetation water content parameters using the optimal estimation models.

2. Materials and Methods

2.1. Site Description

The study area is located at the mango demonstration base in Baise City, Guangxi Zhuang Autonomous Region, in southern China (23.687° N, 107.088° E), covering an area of 0.069 km2. The base mainly cultivates a mix of Taiwan and Guiqi mango varieties. The region lies in the South Asian Tropical Monsoon climate zone and benefits from abundant sunlight, long daylight hours, warm summers, mild winters, and a long frost-free period. The flat terrain of the study site offers optimal conditions for MS UAV operations.
The field experiment was conducted from 23 September to 25 September 2023. Leaf area index (LAI) data and leaf samples were collected at 57 sampling points, and MS UAV and Sentinel-2 satellite images were acquired (Figure 1). Canopy hyperspectral data from two sampling points were excluded due to cloud cover, leaving 55 usable data points.

2.2. Field Measurements

The measurements were carried out at a height of 1–1.3 m above the top of the mango canopy using an Analytical Spectral Devices FieldSpec 4 spectroradiometer (ASD) (Analytical Spectral Devices, Inc., Boulder, CO, USA) equipped with a 25° Field of View (FOV) fiber optic. The sensor was positioned vertically downward during measurements, covering an area approximately equivalent to a circular region with a radius of 0.25 m. Prior to collecting spectral data from the mango canopy, a dark current correction and white reference panel calibration were performed on the ASD, and at least nine scans were recorded at each sampling point. All measurements were carried out between 10:30 a.m. and 2:30 p.m. local time under clear or nearly clear weather conditions. The positions of the spectral data collection were recorded with a Real-Time Kinematic (RTK) instrument. To improve the data quality, we applied Savitzky–Golay filtering with a window size of 3 × 3 to smooth the raw spectral data [28]. This process reduced the noise introduced by the instrument, yielding smoother spectral signals and providing a more stable foundation for subsequent analysis.
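For reference, this smoothing step can be reproduced with SciPy; the sketch below assumes the ASD spectra are stored as a samples × wavelengths NumPy array, and the window and polynomial order shown are illustrative stand-ins rather than the paper's exact settings.

```python
import numpy as np
from scipy.signal import savgol_filter

def smooth_spectra(spectra: np.ndarray, window: int = 3, polyorder: int = 1) -> np.ndarray:
    """Savitzky-Golay smoothing along the wavelength axis.

    spectra: (n_samples, n_bands) raw ASD reflectance, e.g. 55 canopy
    spectra sampled at 2151 wavelengths (350-2500 nm at 1 nm steps).
    """
    # polyorder must be smaller than the window length
    return savgol_filter(spectra, window_length=window, polyorder=polyorder, axis=1)

smoothed = smooth_spectra(np.random.rand(55, 2151))  # placeholder data
```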
A day with clear weather was chosen, and a DJI Phantom 4 MS UAV (DJI, Shenzhen, China) was used to capture real-time images of the study area. This UAV was outfitted with six sensors, comprising one color sensor and five monochrome sensors (Table 1). The MS UAV images were acquired at 12:00 local time on 23 September 2023. During the capture, the multispectral camera lens was oriented vertically downwards to take orthomosaic images of the mango canopy. The MS UAV was flown at a height of 50 m above the ground at a speed of 7 m/s; the spatial resolution was approximately 3 cm (resampled to 5 cm to meet subsequent data fusion requirements), and the longitudinal and lateral image overlaps were both set at 80%. DJI Terra software (version 4.2.0) was utilized for stitching and pre-processing the MS UAV images, and radiometric calibration of the remote sensing images was performed using standard reference panels.
(1) Leaf area index
To avoid underestimating LAI caused by direct sunlight, we used the LI-COR LAI-2200 Plant Canopy Analyzer (LI-COR Biosciences, Lincoln, NE, USA) to assess the LAI of mango trees under cloudy and twilight conditions. During the measurements, a 270° view cap was applied to prevent direct exposure to sunlight and to ensure that neither the operator nor the tree trunk was captured within the sensor's field of view. The sensor probe was held horizontally with a steady orientation. The above-canopy reading (parameter A) was first taken in an open area, followed by below-canopy readings (parameter B) in four different directions; the average of the four B values was used to determine the LAI of the entire tree. We employed the same methodology to measure the LAI at the spectral acquisition positions.
(2) Leaf sampling and water content measurement
The mango canopy was partitioned into four directions (east, west, south, and north) for collecting healthy and intact leaves. Three age groups (new, mature, and aged leaves) were sampled from each direction to ensure that the water content parameters accurately reflected leaves of various ages in the survey area. After collecting the leaves, we immediately weighed their fresh weight (FW) and placed them in an incubator to prevent water loss during transport back to the laboratory. In the lab, the leaves were subjected to blanching in an oven at 105 °C for half an hour, followed by drying at 80 °C until a constant weight was reached, at which point we measured their dry weight (DW). Finally, we calculated the Fuel Moisture Content (FMC) using the following formula:
$$\mathrm{FMC} = \frac{\mathrm{FW} - \mathrm{DW}}{\mathrm{DW}} \times 100\%$$
In the formula, FW denotes the fresh weight of the leaf (g), DW represents the dry weight of the leaf (g), and FMC represents the Fuel Moisture Content (%). To calculate the Equivalent Water Thickness (EWT), we placed the leaves flat on a flatbed scanner, used the Support Vector Machine (SVM) method to identify the leaf, and calculated the leaf area by multiplying the number of leaf pixels by the pixel size:
$$\mathrm{EWT} = \frac{\mathrm{FW} - \mathrm{DW}}{A}$$
where A represents the leaf area (cm2), and EWT is the equivalent water thickness (g/cm2).
The formula for calculating canopy water content is as follows:
$$\mathrm{CWC} = \mathrm{EWT} \times \mathrm{LAI}$$
where LAI represents the leaf area index (cm2·cm−2), and CWC is the canopy water content (g/cm2).
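The three water content parameters follow directly from the weighed and scanned quantities defined above. The sketch below restates the formulas in Python with illustrative, hypothetical leaf values (not measured data):

```python
def leaf_area(n_leaf_pixels: int, pixel_size_cm: float) -> float:
    """Leaf area (cm2) from the scanned image: pixel count x per-pixel area."""
    return n_leaf_pixels * pixel_size_cm ** 2

def fmc(fw_g: float, dw_g: float) -> float:
    """Fuel Moisture Content (%): leaf water mass relative to dry mass."""
    return (fw_g - dw_g) / dw_g * 100.0

def ewt(fw_g: float, dw_g: float, area_cm2: float) -> float:
    """Equivalent Water Thickness (g/cm2): water mass per unit leaf area."""
    return (fw_g - dw_g) / area_cm2

def cwc(ewt_g_cm2: float, lai: float) -> float:
    """Canopy Water Content (g/cm2): leaf-level EWT scaled to the canopy by LAI."""
    return ewt_g_cm2 * lai

# Hypothetical leaf: FW = 2.5 g, DW = 1.0 g, A = 60 cm2, canopy LAI = 2.0
print(fmc(2.5, 1.0))        # 150.0 (%)
print(ewt(2.5, 1.0, 60.0))  # 0.025 (g/cm2)
print(cwc(0.025, 2.0))      # 0.05 (g/cm2)
```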
A single direction of the mango canopy was selected as the spectral sampling point (n = 55), while the average values of FMC, EWT, and CWC from different directions were used to represent the water content parameters for the entire mango canopy (n = 53). To enhance the generalizability of the model, all samples (n = 108) were subsequently used for model training (Table 2).
(3) Sentinel-2 remote sensing data
The Multispectral Instrument (MSI) carried by the Sentinel-2 satellites comprises 13 spectral bands ranging from visible light to shortwave infrared, with spatial resolutions of 10 to 60 m. The revisit period of each of the Sentinel-2A and Sentinel-2B satellites is 10 days, giving the combined constellation a revisit frequency of 5 days, which enables effective monitoring of vegetation growth. We acquired the satellite image closest to the date of the UAV experiment, from 22 September 2023. The Sentinel-2 Level-1C products are freely accessible through the European Space Agency (ESA) Data Center (https://scihub.copernicus.eu/dhus/, accessed on 22 September 2023). To eliminate atmospheric interference, we applied the Sen2Cor plugin in SNAP for radiometric calibration and atmospheric correction. Finally, we employed the sharpening method of the Sen2Res plugin to reconstruct the 20 m resolution bands to 10 m resolution [29].

2.3. Image Fusion

To better preserve the spectral and spatial details of the original images, we employed the AWT method to fuse the two types of images. The AWT method operates by weighting and summing the detail images at each scale. We drew inspiration from the Fast Intensity–Hue–Saturation (FIHS) image fusion algorithm, with the following formula [28]:
$$\mathrm{Sentinel\text{-}2}_{high} = \mathrm{Sentinel\text{-}2}_{low} + (\mathrm{UAV} - \mathrm{UAV}_{low})$$
where Sentinel-2_high is the high-resolution Sentinel-2 image obtained through AWT fusion; Sentinel-2_low is the original Sentinel-2 image, resampled to the same resolution as the original UAV image; UAV is the original high-resolution UAV image; and UAV_low is the UAV image down-sampled using the B3 cubic spline function to a spatial resolution consistent with the original Sentinel-2 image [20].
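To make the fusion step concrete, the sketch below implements this additive scheme for a single pair of co-registered bands, assuming the à trous decomposition is driven by the standard B3 spline kernel [1, 4, 6, 4, 1]/16; the number of decomposition levels and the cubic interpolation used to resample Sentinel-2 onto the UAV grid are our assumptions, not settings reported here.

```python
import numpy as np
from scipy import ndimage

# 1-D B3 cubic-spline kernel used as the a-trous wavelet low-pass filter
B3 = np.array([1, 4, 6, 4, 1], dtype=float) / 16.0

def b3_lowpass(img: np.ndarray, levels: int) -> np.ndarray:
    """Iterated, dilated B3-spline smoothing (a-trous scheme).

    Each level doubles the spacing between kernel taps, so `levels` should
    roughly span the gap between the 5 cm UAV and 10 m Sentinel-2 grids.
    """
    out = img.astype(float)
    for level in range(levels):
        kernel = np.zeros(4 * 2**level + 1)
        kernel[:: 2**level] = B3                       # insert the "holes"
        out = ndimage.convolve1d(out, kernel, axis=0, mode="reflect")
        out = ndimage.convolve1d(out, kernel, axis=1, mode="reflect")
    return out

def awt_fuse(s2_band: np.ndarray, uav_band: np.ndarray, levels: int = 7) -> np.ndarray:
    """FIHS-style additive fusion: S2_high = S2_low + (UAV - UAV_low)."""
    # Resample the Sentinel-2 band onto the UAV pixel grid
    zoom = [u / s for u, s in zip(uav_band.shape, s2_band.shape)]
    s2_low = ndimage.zoom(s2_band.astype(float), zoom, order=3)
    uav_low = b3_lowpass(uav_band, levels)             # UAV with detail removed
    return s2_low + (uav_band - uav_low)               # inject UAV spatial detail
```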

2.4. Selection of Spectral Indices for Vegetation Water Content Retrieval

Vegetation indices (VIs) are typically calculated linearly or nonlinearly from the reflectance of the vegetation-sensitive multispectral or hyperspectral bands in remote sensing data [30]. In this study, we used the AWT method to fuse the MS UAV and Sentinel-2 satellite images and constructed 12 VIs, summarized in Table 3.
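Table 3 lists the full set of indices. As an indication of their form, a few of the water-sensitive indices discussed later (SR, NDVI, NDWI, NMDI, and NDTI) are written out below using their standard published formulations; the mapping of arguments to particular UAV or Sentinel-2 bands is left to the caller and is our illustrative assumption.

```python
def sr(nir, red):
    """Simple Ratio (Birth and McVey, 1968)."""
    return nir / red

def ndvi(nir, red):
    """Normalized Difference Vegetation Index (Tucker, 1979)."""
    return (nir - red) / (nir + red)

def ndwi(nir, swir1):
    """Normalized Difference Water Index (Gao, 1996), e.g. S2 B8A and B11."""
    return (nir - swir1) / (nir + swir1)

def nmdi(nir, swir1, swir2):
    """Normalized Multi-band Drought Index (Wang and Qu, 2007)."""
    return (nir - (swir1 - swir2)) / (nir + (swir1 - swir2))

def ndti(swir1, swir2):
    """Normalized Difference Tillage Index (van Deventer et al., 1997)."""
    return (swir1 - swir2) / (swir1 + swir2)
```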

2.5. Machine Learning and Data Simulation

Based on the reflectance of the MS UAV bands, the fused Sentinel-2 satellite bands, and the vegetation indices, we employed five commonly used machine learning algorithms to estimate the canopy water content parameters (FMC, EWT, and CWC). During the estimation of canopy water content across the mango orchard, the spatial distribution of mango trees was extracted using the Mask-RCNN instance segmentation model to exclude the influence of other land features [42].
The Random Forest (RF) [43], Partial Least Squares (PLS) [44], and Support Vector Machine (SVM) [45] models were implemented using Python's (version 3.7) Scikit-learn library (version 1.0.2). For the RF model, the parameters were set as follows: n_estimators = 20, max_features = 8, bootstrap = True, max_depth = 2, min_samples_leaf = 3, and min_samples_split = 5. For the PLS model, the parameters were configured as n_components = 11, scale = False, max_iter = 100, tol = 0.0001, and copy = True. For the SVM model, the parameters were set to C = 20, kernel = 'poly', gamma = 0.1, degree = 3, and coef0 = 2. The Extreme Gradient Boosting (XGBoost) (version 1.6.2) [46] algorithm was implemented in Python with the following parameters: n_estimators = 500, max_depth = 5, and learning_rate = 0.1. The Genetic Algorithm–Backpropagation Neural Network (GABP) [47] was developed using MATLAB's (version 2023a) Deep Learning Toolbox (version 14.4). The parameters for this model were set as follows: hiddennum = 9, training_epochs = 1000, learning_rate = 0.0001, PopulationSize_Data = 30, maxGenerations_Data = 50, crossover_probabilities = 0.8, and mutation_probabilities = 0.2.
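For the four Python models, the stated settings translate into the following constructor calls; this is a sketch of our reading of the configuration (using the regression variants of each estimator), and the MATLAB-based GABP network is not reproduced here.

```python
from sklearn.ensemble import RandomForestRegressor
from sklearn.cross_decomposition import PLSRegression
from sklearn.svm import SVR
from xgboost import XGBRegressor

rf = RandomForestRegressor(n_estimators=20, max_features=8, bootstrap=True,
                           max_depth=2, min_samples_leaf=3, min_samples_split=5)
pls = PLSRegression(n_components=11, scale=False, max_iter=100, tol=1e-4, copy=True)
svm = SVR(C=20, kernel="poly", gamma=0.1, degree=3, coef0=2)
xgb = XGBRegressor(n_estimators=500, max_depth=5, learning_rate=0.1)
```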
Shapley Additive exPlanation (SHAP) values [48] were used in this study to quantify the impact of model input variables on predictions, including both the magnitude and direction (positive or negative) of their influence. This approach is relatively intuitive and easy to interpret, particularly for tree-based ensemble methods. Consequently, we measured the contribution of each index to model predictions by calculating the average absolute SHAP values for variables in both RF and XGBoost models, and the effective variables were then input into the five aforementioned models to reduce model complexity.
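A minimal sketch of this selection step is shown below, assuming fitted tree models `rf` and `xgb` (as configured above) and a feature matrix `X` of band reflectances and vegetation indices; the variable names are placeholders.

```python
import numpy as np
import shap

def mean_abs_shap(model, X: np.ndarray) -> np.ndarray:
    """Average absolute SHAP value per input variable (tree-based models)."""
    explainer = shap.TreeExplainer(model)
    return np.abs(explainer.shap_values(X)).mean(axis=0)

# Keep a variable if it contributes to either tree-based model
contrib = np.maximum(mean_abs_shap(rf, X), mean_abs_shap(xgb, X))
selected = np.flatnonzero(contrib > 0)   # drop zero-contribution variables
X_selected = X[:, selected]              # input to RF, XGBoost, SVM, PLS, GABP
```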

2.6. Accuracy Validation

This study used the coefficient of determination (R2) to quantify the differences between the MS UAV band reflectance, the fused Sentinel-2 satellite band reflectance, and the ASD-measured reflectance. Model accuracy was evaluated using R2, the Root Mean Square Error (RMSE), and the percent Relative Error (RE%). The formulas for these metrics are as follows:
$$R^2 = 1 - \frac{\sum_{i=1}^{n} \left( \hat{y}_i - y_i \right)^2}{\sum_{i=1}^{n} \left( y_i - \bar{y} \right)^2}$$

$$\mathrm{RMSE} = \sqrt{\frac{1}{n} \sum_{i=1}^{n} \left( \hat{y}_i - y_i \right)^2}$$

$$\mathrm{RE}(\%) = \frac{1}{n} \sum_{i=1}^{n} \frac{\left| \hat{y}_i - y_i \right|}{y_i} \times 100\%$$
where $n$ represents the total number of samples; $y_i$ and $\hat{y}_i$ denote the observed and predicted values of sample $i$, respectively; and $\bar{y}$ is the mean of the observed values.
All models were evaluated using 10-fold cross-validation to assess their performance.
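These metrics and the cross-validation loop are straightforward to implement; the sketch below is a minimal NumPy/scikit-learn version, with the shuffling and random seed as our assumptions.

```python
import numpy as np
from sklearn.model_selection import KFold

def r2(y, y_hat):
    return 1 - np.sum((y_hat - y) ** 2) / np.sum((y - y.mean()) ** 2)

def rmse(y, y_hat):
    return np.sqrt(np.mean((y_hat - y) ** 2))

def re_pct(y, y_hat):
    return np.mean(np.abs(y_hat - y) / y) * 100

def cross_validate(model, X, y, n_splits=10):
    """Mean R2, RMSE, and RE% over a 10-fold cross-validation."""
    scores = []
    for train, test in KFold(n_splits=n_splits, shuffle=True, random_state=0).split(X):
        model.fit(X[train], y[train])
        y_hat = model.predict(X[test])
        scores.append((r2(y[test], y_hat), rmse(y[test], y_hat), re_pct(y[test], y_hat)))
    return np.mean(scores, axis=0)
```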

3. Results

3.1. Accuracy Assessment of MS UAV Band Reflectance

Although the MS UAV imagery had undergone radiometric calibration using a diffuser panel to minimize systematic errors, we still used the spectral reflectance measured by the ASD as a reference in this study. Generally, the reflectance accuracy of the different multispectral UAV and satellite bands is validated by resampling the spectral reflectance measured by the ASD spectroradiometer according to the sensors' spectral response functions. However, since the DJI Phantom 4 does not have standardized spectral response functions, we conducted spectral resampling by calculating the average reflectance values. The ASD spectrometer measures spectral reflectance over a circular area of approximately 0.2 m2. Hence, we utilized ArcGIS software (version 10.8) to delineate a circular buffer zone with an area of 0.2 m2 around each measured point for extracting the average MS UAV band reflectance. Subsequently, the multispectral UAV band reflectance was subjected to correlation analysis with the ASD spectrometer measurements. The results (Figure 2) demonstrated a high level of consistency between the two datasets' spectral reflectance. The green band of the UAV exhibited the highest accuracy (R2 = 0.825), followed by the red edge (R2 = 0.722), red (R2 = 0.72), NIR (R2 = 0.601), and blue (R2 = 0.598) bands. All UAV bands were calibrated based on regression equations derived from the ASD spectroradiometer band reflectance to ensure the accuracy of subsequent vegetation index calculations.
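A simplified version of the buffer-averaging and regression calibration is sketched below, assuming the UAV band is already loaded as a NumPy array and the RTK sampling points have been converted to pixel coordinates; at the 5 cm pixel size, a 0.2 m2 circular buffer corresponds to a radius of about 5 pixels.

```python
import numpy as np
from scipy.stats import linregress

def buffer_mean(band: np.ndarray, row: int, col: int, radius_px: int = 5) -> float:
    """Mean reflectance inside a circular buffer around a sampling point."""
    rr, cc = np.ogrid[: band.shape[0], : band.shape[1]]
    mask = (rr - row) ** 2 + (cc - col) ** 2 <= radius_px ** 2
    return float(band[mask].mean())

def calibrate(uav_vals: np.ndarray, asd_vals: np.ndarray):
    """Linear calibration of UAV band reflectance against ASD measurements."""
    fit = linregress(uav_vals, asd_vals)
    return (lambda x: fit.slope * x + fit.intercept), fit.rvalue ** 2  # model, R2
```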

3.2. Accuracy Assessment of Sentinel-2 Satellite Image Fusion

Since the Green band of the MS UAV exhibited the highest precision, we fused the Sen2Res-sharpened Sentinel-2 satellite image with this band using the AWT fusion method and conducted a correlation analysis between the fused result and the measured band reflectance (Figure 3). The AWT fusion showed a good correlation with the spectral reflectance measured by the ASD, with the highest accuracy observed in the B3 band (R2 = 0.73), followed by the B5 (R2 = 0.667), B2 (R2 = 0.598), B8 (R2 = 0.435), B4 (R2 = 0.428), B12 (R2 = 0.428), B11 (R2 = 0.42), B6 (R2 = 0.362), B7 (R2 = 0.265), and B8A (R2 = 0.25) bands.
The Sentinel-2 satellite data not sharpened by Sen2Res were resampled from 20 m to 10 m resolution and subsequently correlated with the spectral reflectance measured by the ASD spectroradiometer (Figure 4). The results indicated that the Sen2Res-sharpened Sentinel-2 image outperforms the unsharpened version (Figure 3): sharpening raised the AWT fusion accuracy (R2) of the red-edge bands (B5, B6, B7, and B8A) and shortwave infrared bands (B11 and B12) by 0.049, 0.133, 0.127, 0.132, 0.165, and 0.166, respectively. Consequently, the subsequent analyses of vegetation indices utilized data processed through Sen2Res sharpening and AWT fusion.

3.3. Estimation of Mango Canopy Water Content Based on Spectral Reflectance and Vegetation Indices

As shown in Figure 5, the total number of contributing variables is higher for the RF model than for the XGBoost model, although the sets of contributing variables are broadly consistent. In the FMC estimation models (a) and (d), almost all variables contribute, with S2_NMDI and UAV_SR being the most significant contributors. In the EWT estimation models (b) and (e), the XGBoost model has more effective variables than the RF model; the major contributing variables in the XGBoost model are S2_NDTI and S2_NMDI, while in the RF model they are UAV_B3 and S2_SWID, a noticeable difference in the key contributing variables between the two models. In the CWC estimation models (c) and (f), S2_NDTI and S2_NDVI have the highest contribution rates. In the subsequent model construction, variables with zero contribution were removed, and the retained variables selected through SHAP values from the RF and XGBoost models were input into the RF, XGBoost, SVM, PLS, and GABP models for further accuracy evaluation.
Based on the above analysis, this study further evaluated the modeling accuracy of standalone MS UAV data versus the combined MS UAV and Sentinel-2 (S2) data. When using only MS UAV data, all available variables were included in the models due to the limited variable count. The results (Table 4) show that, among the five methods, the GABP model achieved the highest R2 values for all canopy moisture parameters, reaching 0.745, 0.837, and 0.702 for FMC, EWT, and CWC, respectively, with the combined UAV + S2 data. Relative to using MS UAV data alone, the addition of the fused S2 data improved the GABP model's R2 values by 0.066, 0.179, and 0.210, with the largest gain for CWC. These findings indicate that the fused Sentinel-2 data enhance the predictive performance of the model, particularly when combined with the GABP model.

3.4. Assessment of Mango Canopy Water Content

The optimal GABP model was utilized to estimate the spatial distributions of FMC, EWT, and CWC, revealing distinct distribution patterns within the study area (Figure 6). The mean FMC was 163%, ranging from 90% to 210%, with higher values concentrated in the northwestern region, reflecting the influence of vegetation canopy structure. The mean EWT was 0.0161 g/cm2, primarily falling between 0.0125 and 0.0225 g/cm2. Higher EWT values were observed in the northern and western areas, indicating denser or more water-rich canopies in these regions. In contrast, the mean CWC was 0.0514 g/cm2, predominantly ranging between 0.03 and 0.08 g/cm2. Lower CWC values were identified near buildings in the northwest and along roadsides in the southwest, potentially due to environmental disturbances or human activities. The spatial variation of FMC was relatively small, with its distribution pattern in the northwest contrasting with that of EWT. Meanwhile, the spatial distribution of EWT closely resembled that of CWC, although CWC exhibited more pronounced spatial variability. These findings highlight the effectiveness of the GABP model in capturing fine-scale variations in mango canopy water content, which is crucial for understanding plant health and water stress in heterogeneous landscapes.

4. Discussion

4.1. Accuracy Assessment of MS UAV Band Reflectance

In this study, to obtain accurate mango canopy reflectance, we compared the spectral data measured by the ASD FieldSpec 4 spectroradiometer with the canopy spectra collected by the MS UAV (Figure 2). Although the overall results indicated good consistency, the Blue (R2 = 0.598) and NIR (R2 = 0.601) bands showed relatively lower consistency. One possible reason for this is that the blue band (450 ± 16 nm) is significantly affected by atmospheric scattering (especially Rayleigh scattering), particularly under conditions with high levels of atmospheric dust and water vapor [49]. The NIR band (840 ± 26 nm) is heavily influenced by atmospheric water vapor absorption [50]. Another reason might be that the calibration parameters for the multispectral camera, determined under laboratory conditions, may not correspond to the correct values for field use as the equipment ages, as indicated by previous studies [51]. Additionally, the DJI Phantom 4 multispectral UAV does not have a standard spectral response function. Future research should consider the impact of atmospheric scattering and water vapor on the MS UAV band reflectance and conduct regular inspections and calibrations of the equipment.

4.2. Evaluation of Image Fusion Methods

The AWT is a robust image fusion technique that can be used to merge satellite images with significant differences in spatial resolution [52]. By comparing the MS UAV images to the AWT fusion images (Figure 7), we observed significant disparities in the color of the images surrounding roads, houses, and cultivated land, as well as in the color of the grid boundaries. Therefore, it is typically necessary to ensure the homogeneity of pixel content between high-resolution and coarse-resolution images before AWT fusion, as this is a key factor in maintaining the consistency of AWT results [22].
In this study, we sharpened the 20 m resolution bands to 10 m and observed that the Sen2Res sharpening method improved the accuracy of the Sentinel-2 satellite imagery after AWT fusion. Notably, the acquisition times of the Sentinel-2 and UAV images were not synchronized, differing by one day. However, the performance of the Sentinel-2 satellite image fusion (Figure 3) demonstrates a good consistency between the reflectance values of all bands and the spectral reflectance obtained by the ASD spectroradiometer. These results further validate the feasibility and effectiveness of the AWT image fusion method under similar image acquisition times, consistent with the findings of Chen et al. [52] and Dehkordi et al. [22]. Since data were collected for only one fusion date, future research should include additional dates and datasets to draw stronger conclusions regarding the AWT fusion of UAV and satellite images. Additionally, future research should continue to improve similar image fusion techniques, such as IHS, to reduce overprediction and underprediction when fusing UAV and satellite images.

4.3. Sensitivities of the Spectral Reflectance, Vegetation Indices, and Canopy Moisture Indicators

In this study, the MS UAV imagery was combined with the AWT-fused Sentinel-2 imagery to calculate the corresponding vegetation indices (Table 3), followed by a sensitivity analysis of the canopy water parameters (CWC, EWT, and FMC) (Figure 8). As shown in Figure 8a, the reflectances of the UAV_B1 and UAV_B5 bands from the multispectral UAV imagery exhibited relatively low correlations with the reflectances of their corresponding bands, S2_B2 and S2_B8, from the AWT-fused imagery, while the other bands showed good consistency. In addition to the aforementioned effects of atmospheric scattering and water vapor, a comparison of bandwidths reveals that UAV_B1 and UAV_B5 have bandwidths of 32 nm and 52 nm, respectively, whereas S2_B2 and S2_B8 have bandwidths of 98 nm and 145 nm; the bandwidths of the other bands range from 18 nm to 45 nm. Since sensors with wider bandwidths capture more spectral information and noise, spectral resampling using spectral response functions, or taking averages, can overlook additional spectral features [53]; thus, the significant difference in bandwidths may be one reason for the relatively low correlation between the reflectance of the multispectral UAV imagery and the AWT-fused imagery. For these reasons, the vegetation indices constructed using the visible and near-infrared bands from the multispectral UAV and AWT-fused imagery (NDVI–SAVI) exhibited relatively low correlations.
Both FMC and EWT were significantly correlated with the visible light bands of the multispectral UAV imagery (R ≥ 0.4, p < 0.001), such as UAV_B1, but were not significantly correlated with the visible light bands of the AWT-fused imagery. CWC exhibited a relatively significant correlation with the red-edge bands of the AWT-fused imagery (0.3 < R < 0.4, p < 0.001). A certain correlation was observed among the FMC, EWT, and shortwave infrared bands, further highlighting the necessity of combining multispectral UAV imagery with AWT-fused data to jointly construct vegetation indices (Table 3). From the correlation graphs of the various parameters with vegetation indices, the FMC, EWT, and CWC show the most significant correlation with vegetation indices constructed using the shortwave infrared bands (B11, B12) (R ≥ 0.4, p < 0.001), such as NMDI and NDTI vegetation indices, which strongly demonstrates the great potential of combining UAV and satellite fusion for estimating vegetation water content.

4.4. Analysis of the Model Prediction Result

In this study, predictive models were established for characterizing the vegetation canopy water parameters (CWC, EWT, FMC) using five prediction models: RF, XGBoost, SVM, PLS, and GABP. To reduce model complexity, the irrelevant input variables were removed based on the SHAP value ranking. Among the predicted results for CWC, EWT, and FMC, the accuracy of using only UAV reflectance data was the lowest, while significant improvements in prediction accuracy were observed when combining UAV data with fused reflectance data. The differences in prediction accuracy after removing irrelevant variables based on RF and XGBoost SHAP value rankings were not significant, possibly because both RF and XGBoost are ensemble learning models based on decision trees, resulting in the consistent selection of contributing variables [46,54]. Among the five machine learning methods, GABP yielded the best predictive results; this could be attributed to the inherent flexibility of GABP neural networks, which can adapt to various data types and complex nonlinear relationships, enabling effective prediction even in the presence of noise and outliers. In contrast, the RF, XGBoost, SVM, and PLS models may face limitations in handling nonlinear relationships; although they exhibit some robustness to noise and outliers, they are prone to overfitting. Compared to the other four machine learning methods, the GABP model requires more time for training and prediction. Therefore, further exploration and optimization of complex neural network algorithms are needed to improve the computational efficiency of the model.
The SHAP analysis in Figure 5 demonstrates that vegetation indices derived from near-infrared bands (UAV’s NIR) and shortwave infrared bands (Sentinel-2’s B11 and B12) contribute significantly to the estimation of FMC, EWT, and CWC. This indicates that the prediction of canopy-moisture-related parameters relies heavily on the water-sensitive spectral bands in multispectral data, particularly the strong responsiveness of SWIR bands to variations in water content. These findings highlight the critical role of incorporating Sentinel-2 data in improving prediction accuracy.
From the modeling results in Table 4, it is evident that compared to using only MS UAV data, the integration of the fused Sentinel-2 data significantly enhances the estimation accuracy of FMC, EWT, and CWC. This underscores the importance of data fusion for the modeling of canopy moisture parameters. Specifically, the complementary nature of multisource data effectively leverages the strengths of Sentinel-2 in the NIR and SWIR bands, as well as the high spatial resolution of UAV data in capturing fine-scale terrestrial features, thereby addressing the limitations of single data sources.
Overall, the fusion of multispectral and high-resolution data allows for a more comprehensive capture of the spectral characteristics of canopy moisture.

4.5. Spatial Distribution Differences of Mango Canopy Moisture Indicators

The relatively small study area, characterized by homogenous topography, elevation, climate, and soil texture, provided favorable conditions for investigating the factors influencing the spatial distribution of FMC, EWT, and CWC. The analysis of the spatial patterns of these variables and their influencing factors enhances our understanding of the dynamic regulation of vegetation water characteristics, thus providing a scientific basis for precision ecological management and optimized agricultural water use.
The spatial distribution of Fuel Moisture Content (FMC) exhibited minimal variability (Figure 6a), predominantly ranging from 200% to 300% across the study area. This uniformity likely reflects the homogeneity of vegetation cover and the canopy’s water regulation capacity. However, a localized inverse relationship was observed between the spatial distributions of FMC and Equivalent Water Thickness (EWT). In areas with lower EWT values (e.g., steeper slopes), FMC remained relatively high. This phenomenon may be attributed to vegetation canopy cover and leaf characteristics. Specifically, with lower canopy cover or sparse foliage, although total water content per unit area (EWT) is reduced, the lower dry matter content results in a relatively higher water content per unit mass (FMC). This further suggests that FMC is primarily regulated by inherent vegetation properties, such as canopy cover and leaf area index [55]. Furthermore, previous studies have indicated that the relationship between FMC and EWT is indirect, mediated by factors such as vegetation biomass and dry matter content [56]. Future research incorporating additional vegetation structural parameters, such as biomass and dry matter content, could further validate this hypothesis.
The spatial distribution of EWT exhibited significant variability (Figure 6b), with lower values predominantly concentrated in areas with steeper slopes. This is likely attributable to increased soil erosion and runoff on steeper slopes, consequently reducing soil water storage capacity. Furthermore, lower EWT values observed around roads and buildings may be associated with anthropogenic activities, such as soil compaction, which decrease soil porosity and infiltration rates, leading to reduced water storage. Overall, the distribution of EWT is influenced by the combined effects of topography and soil water dynamics, a finding consistent with other studies [57,58]. Therefore, slope and anthropogenic activities are likely key factors influencing EWT distribution within the study area.
The spatial distribution of canopy water content (CWC) showed significant variability (Figure 6c), exhibiting a distribution pattern similar to EWT, suggesting that topographic slope and anthropogenic activities also significantly influence CWC [59]. Furthermore, the CWC distribution pattern reveals its sensitivity to vegetation canopy cover (i.e., Leaf Area Index, LAI). In areas with lower canopy cover, sparse foliage results in weaker canopy water storage capacity, leading to a significant reduction in CWC. The lower CWC values around buildings and roads were even more pronounced, further indicating that CWC distribution is subject to multiple, interacting controls. Notably, although CWC exhibits a high correlation with EWT, the inclusion of LAI allows CWC to more comprehensively reflect actual canopy water content.

5. Conclusions

In this study, the AWT method was employed to fuse MS UAV and Sentinel-2 data, and relevant vegetation indices were constructed based on parameters representing the vegetation canopy water (FMC, EWT, and CWC). Subsequently, five machine learning methods (RF, PLS, SVM, XGBoost, and GABP) were employed to estimate these parameters. Based on the research findings, the following conclusions can be drawn:
(1) The feasibility of the AWT method for fusing MS UAV and Sentinel-2 satellite data was demonstrated by comparing them with the measured spectral data, highlighting the effectiveness of the fused data. Additionally, this study confirmed the effectiveness of the Sen2Res plugin in SNAP software for reconstructing Sentinel-2 imagery from 20 m to 10 m resolution, further enhancing its utility in high-resolution vegetation monitoring.
(2) Among the five machine learning methods, the GABP model exhibited the best estimation performance. The addition of the fused Sentinel-2 data to MS UAV data improved the R2 values for estimating the FMC, EWT, and CWC by 0.066, 0.179, and 0.210, respectively.
(3) The spatial distribution analysis of FMC, EWT, and CWC indicates that FMC exhibits limited spatial variability, while EWT and CWC display pronounced spatial heterogeneity. Factors such as slope, canopy coverage, and human activities are identified as the primary drivers of these differences. Among the three, CWC is particularly sensitive to environmental changes, providing a more comprehensive reflection of the actual canopy water content.
This study underscores the value of AWT-based data fusion in improving canopy water parameter estimation and offers a novel approach for precise vegetation monitoring, with implications for enhanced irrigation and water management practices. However, this study also has certain limitations. The primary aim of this research was to explore whether the fused Sentinel-2 data could improve the accuracy of estimating mango canopy water parameters without delving into the differences between the different varieties of mango trees. Due to constraints in personnel and resources, this study was performed at specific locations and vegetation types. In future research, efforts will be directed toward selecting the most effective fusion methods, expanding the coverage area of the sampling points, increasing the number of sampling points and different tree species, and conducting studies in areas with different climates and on different time scales to obtain more comprehensive conclusions regarding the spatial distribution and temporal trends of the vegetation water content.

Author Contributions

J.L. was responsible for data processing, statistical analysis, interpretation of results, data validation, and the initial drafting of the manuscript. J.H., J.J., M.W. and N.P. contributed to the conceptualization, study design, and development of the methodology. H.J. oversaw the acquisition and processing of multispectral images. J.L., T.Q. and S.H. conducted field investigations and managed data curation. Y.H. provided critical revisions to the manuscript, ensuring significant intellectual content and approving the final version for publication. All authors have read and agreed to the published version of the manuscript.

Funding

This work was partly supported by the National Natural Science Foundation of China (42201420), the Guangxi Natural Science Foundation of China (AB23026044) and the Guangxi Key Research and Development Program (2023AB06010).

Data Availability Statement

This study utilized the following datasets: Sentinel-2 imagery, which is accessible from the European Space Agency (ESA) Data Center (https://scihub.copernicus.eu/dhus/, accessed on 22 September 2023). The multispectral UAV data, along with the field-collected mango tree LAI and leaf physiological data (including dry weight, fresh weight, and leaf area), are available from the corresponding author, Y.H., upon reasonable request.

Acknowledgments

We would like to express our gratitude to the Guangxi Institute of Botany, Chinese Academy of Sciences, for their provision of equipment support, and to the European Space Agency (ESA) for providing the Sentinel-2 satellite imagery used in this study.

Conflicts of Interest

The authors declare no conflicts of interest related to this article.

References

  1. Mwinuka, P.R.; Mbilinyi, B.P.; Mbungu, W.B.; Mourice, S.K.; Mahoo, H.F.; Schmitter, P. The feasibility of hand-held thermal and UAV-based multispectral imaging for canopy water status assessment and yield prediction of irrigated African eggplant (Solanum aethopicum L). Agric. Water Manag. 2021, 245, 106584. [Google Scholar] [CrossRef]
  2. Juenger, T.E.; Verslues, P.E. Time for a drought experiment: Do you know your plants’ water status? Plant Cell 2022, 35, 10–23. [Google Scholar] [CrossRef] [PubMed]
  3. Liu, X.; Peng, Y.; Yang, Q.; Wang, X.; Cui, N. Determining optimal deficit irrigation and fertilization to increase mango yield, quality, and WUE in a dry hot environment based on TOPSIS. Agric. Water Manag. 2021, 245, 106650. [Google Scholar] [CrossRef]
  4. Yi, Q.-X.; Bao, A.-M.; Wang, Q.; Zhao, J. Estimation of leaf water content in cotton by means of hyperspectral indices. Comput. Electron. Agric. 2013, 90, 144–151. [Google Scholar] [CrossRef]
  5. Danson, F.M.; Steven, M.D.; Malthus, T.J.; Clark, J.A. High-spectral resolution data for determining leaf water content. Int. J. Remote Sens. 1992, 13, 461–470. [Google Scholar] [CrossRef]
  6. Clevers, J.G.P.W.; Kooistra, L.; Schaepman, M.E. Using spectral information from the NIR water absorption features for the retrieval of canopy water content. Int. J. Appl. Earth Obs. Geoinf. 2008, 10, 388–397. [Google Scholar] [CrossRef]
  7. Sims, D.A.; Gamon, J.A. Estimation of vegetation water content and photosynthetic tissue area from spectral reflectance: A comparison of indices based on liquid water and chlorophyll absorption features. Remote Sens. Environ. 2003, 84, 526–537. [Google Scholar] [CrossRef]
  8. Jin, X.; Zarco-Tejada, P.J.; Schmidhalter, U.; Reynolds, M.P.; Hawkesford, M.J.; Varshney, R.K.; Yang, T.; Nie, C.; Li, Z.; Ming, B.; et al. High-Throughput Estimation of Crop Traits: A Review of Ground and Aerial Phenotyping Platforms. IEEE Geosci. Remote Sens. Mag. 2021, 9, 200–231. [Google Scholar] [CrossRef]
  9. Ruffault, J.; Limousin, J.-M.; Pimont, F.; Dupuy, J.-L.; De Càceres, M.; Cochard, H.; Mouillot, F.; Blackman, C.J.; Torres-Ruiz, J.M.; Parsons, R.A.; et al. Plant hydraulic modelling of leaf and canopy fuel moisture content reveals increasing vulnerability of a Mediterranean forest to wildfires under extreme drought. New Phytol. 2023, 237, 1256–1269. [Google Scholar] [CrossRef]
  10. Wu, Y.; Jiang, J.; Zhang, X.; Zhang, J.; Cao, Q.; Tian, Y.; Zhu, Y.; Cao, W.; Liu, X. Combining machine learning algorithm and multi-temporal temperature indices to estimate the water status of rice. Agric. Water Manag. 2023, 289, 108521. [Google Scholar] [CrossRef]
  11. Cheng, M.; Sun, C.; Nie, C.; Liu, S.; Yu, X.; Bai, Y.; Liu, Y.; Meng, L.; Jia, X.; Liu, Y.; et al. Evaluation of UAV-based drought indices for crop water conditions monitoring: A case study of summer maize. Agric. Water Manag. 2023, 287, 108442. [Google Scholar] [CrossRef]
  12. Allen, W.A.; Gausman, H.W.; Richardson, A.J.; Cardenas, R. Water and Air Changes in Grapefruit, Corn, and Cotton Leaves with Maturation1. Agron. J. 1971, 63, 392–394. [Google Scholar] [CrossRef]
  13. Ceccato, P.; Gobron, N.; Flasse, S.; Pinty, B.; Tarantola, S. Designing a spectral index to estimate vegetation water content from remote sensing data: Part 1: Theoretical approach. Remote Sens. Environ. 2002, 82, 188–197. [Google Scholar] [CrossRef]
  14. Chen, S.; Chen, Y.; Chen, J.; Zhang, Z.; Fu, Q.; Bian, J.; Cui, T.; Ma, Y. Retrieval of cotton plant water content by UAV-based vegetation supply water index (VSWI). Int. J. Remote Sens. 2020, 41, 4389–4407. [Google Scholar] [CrossRef]
  15. Miao, J.; Wang, J.; Zhao, D.; Shen, Z.; Xiang, H.; Gao, C.; Li, W.; Cui, L.; Wu, G. Modeling strategies and influencing factors in retrieving canopy equivalent water thickness of mangrove forest with Sentinel-2 image. Ecol. Indic. 2024, 158, 111497. [Google Scholar] [CrossRef]
  16. Hassanpour, R.; Majnooni-Heris, A.; Fard, A.F.; Verrelst, J. Monitoring Biophysical Variables (FVC, LAI, LCab, and CWC) and Cropland Dynamics at Field Scale Using Sentinel-2 Time Series. Remote Sens. 2024, 16, 2284. [Google Scholar] [CrossRef]
  17. Vivone, G. Multispectral and hyperspectral image fusion in remote sensing: A survey. Inf. Fusion 2023, 89, 405–417. [Google Scholar] [CrossRef]
  18. Yang, S.; Wang, M.; Jiao, L. Fusion of multispectral and panchromatic images based on support value transform and adaptive principal component analysis. Inf. Fusion 2012, 13, 177–184. [Google Scholar] [CrossRef]
  19. Huang, Z.; Chen, Q.; Chen, Q.; Liu, X. Variational Pansharpening for Hyperspectral Imagery Constrained by Spectral Shape and Gram–Schmidt Transformation. Sensors 2018, 18, 4330. [Google Scholar] [CrossRef]
  20. Nunez, J.; Otazu, X.; Fors, O.; Prades, A.; Pala, V.; Arbiol, R. Multiresolution-based image fusion with additive wavelet decomposition. IEEE Trans. Geosci. Remote Sens. 1999, 37, 1204–1211. [Google Scholar] [CrossRef]
  21. Bama, B.S.; Sankari, S.G.S.; Kamalam, R.E.J.; Kumar, P.S. New Additive Wavelet Image Fusion Algorithm for Satellite Images. In Pattern Recognition and Machine Intelligence; Springer: Berlin/Heidelberg, Germany, 2013; pp. 313–318. [Google Scholar]
  22. Dehkordi, R.H.; Pelgrum, H.; Meersmans, J. High spatio-temporal monitoring of century-old biochar effects on evapotranspiration through the ETLook model: A case study with UAV and satellite image fusion based on additive wavelet transform (AWT). GIScience Remote Sens. 2022, 59, 111–141. [Google Scholar] [CrossRef]
  23. Yang, B.; Zhang, H.; Lu, X.; Wan, H.; Zhang, Y.; Zhang, J.; Jin, Z. Inversion of Leaf Water Content of Cinnamomum camphora Based on Preferred Spectral Index and Machine Learning Algorithm. Forests 2023, 14, 2285. [Google Scholar] [CrossRef]
  24. Verrelst, J.; Malenovský, Z.; Van der Tol, C.; Camps-Valls, G.; Gastellu-Etchegorry, J.-P.; Lewis, P.; North, P.; Moreno, J. Quantifying Vegetation Biophysical Variables from Imaging Spectroscopy Data: A Review on Retrieval Methods. Surv. Geophys. 2019, 40, 589–629. [Google Scholar] [CrossRef] [PubMed]
  25. Zhong, G.; Chen, J.; Huang, R.; Yi, S.; Qin, Y.; You, H.; Han, X.; Zhou, G. High Spatial Resolution Fractional Vegetation Coverage Inversion Based on UAV and Sentinel-2 Data: A Case Study of Alpine Grassland. Remote Sens. 2023, 15, 4266. [Google Scholar] [CrossRef]
  26. Pan, Y.; Ren, C.; Liang, Y.; Zhang, Z.; Shi, Y. Inversion of surface vegetation water content based on GNSS-IR and MODIS data fusion. Satell. Navig. 2020, 1, 21. [Google Scholar] [CrossRef]
  27. Liu, W.; Zhong, Q.; Luo, L. Effects of different irrigation volume co-applied with super absorbent polymers on soil moisture and fruit of mango orchards. Southwest China J. Agric. Sci. 2019, 32, 2378–2382. [Google Scholar] [CrossRef]
  28. Tu, T.-M.; Su, S.-C.; Shyu, H.-C.; Huang, P.S. A new look at IHS-like image fusion methods. Inf. Fusion 2001, 2, 177–186. [Google Scholar] [CrossRef]
  29. Brodu, N. Super-Resolving Multiresolution Images with Band-Independent Geometry of Multispectral Pixels. IEEE Trans. Geosci. Remote Sens. 2017, 55, 4610–4617. [Google Scholar] [CrossRef]
  30. Barnes, E.M.; Clarke, T.; Richards, S.E.; Colaizzi, P.D.; Haberland, J.; Kostrzewski, M.; Waller, P.M.; Choi, C.Y.; Riley, E.; Thompson, T.L.; et al. Coincident detection of crop water stress, nitrogen status and canopy density using ground-based multispectral data. In Proceedings of the 5th International Conference on Precision Agriculture, Bloomington, MN, USA, 16–19 July 2000. [Google Scholar]
  31. Tucker, C.J. Red and photographic infrared linear combinations for monitoring vegetation. Remote Sens. Environ. 1979, 8, 127–150. [Google Scholar] [CrossRef]
  32. Birth, G.S.; Mcvey, G.R. Measuring the Color of Growing Turf with a Reflectance Spectrophotometer. Agron. J. 1968, 60, 640–643. [Google Scholar] [CrossRef]
  33. Huete, A.; Didan, K.; Miura, T.; Rodriguez, E.P.; Gao, X.; Ferreira, L.G. Overview of the radiometric and biophysical performance of the MODIS vegetation indices. Remote Sens. Environ. 2002, 83, 195–213. [Google Scholar] [CrossRef]
  34. Huete, A.R. A soil-adjusted vegetation index (SAVI). Remote Sens. Environ. 1988, 25, 295–309. [Google Scholar] [CrossRef]
  35. Gao, B.-C. NDWI—A normalized difference water index for remote sensing of vegetation liquid water from space. Remote Sens. Environ. 1996, 58, 257–266. [Google Scholar] [CrossRef]
  36. Merzlyak, M.N.; Gitelson, A.A.; Chivkunova, O.B.; Rakitin, V.Y. Non-destructive optical detection of pigment changes during leaf senescence and fruit ripening. Physiol. Plant. 1999, 106, 135–141. [Google Scholar] [CrossRef]
  37. Hardisky, M.; Klemas, V.; Smart, A. The Influence of Soil Salinity, Growth Form, and Leaf Moisture on the Spectral Radiance of Spartina alterniflora Canopies. Photogramm. Eng. Remote Sens. 1983, 48, 77–84. [Google Scholar]
  38. Wang, L.; Qu, J.J. NMDI: A normalized multi-band drought index for monitoring soil and vegetation moisture with satellite remote sensing. Geophys. Res. Lett. 2007, 34, L20405. [Google Scholar] [CrossRef]
  39. Xu, H. Modification of normalised difference water index (NDWI) to enhance open water features in remotely sensed imagery. Int. J. Remote Sens. 2006, 27, 3025–3033. [Google Scholar] [CrossRef]
  40. Rock, B.N.; Vogelmann, J.E.; Williams, D.L.; Vogelmann, A.F.; Hoshizaki, T. Remote Detection of Forest Damage: Plant responses to stress may have spectral “signatures” that could be used to map, monitor, and measure forest damage. BioScience 1986, 36, 439–445. [Google Scholar] [CrossRef]
  41. Deventer, A.v.; Ward, A.D.; Gowda, P.H.; Lyon, J.G. Using Thematic Mapper Data to Identify Contrasting Soil Plains and Tillage Practices. Photogramm. Eng. Remote Sens. 1997, 63, 87–93. [Google Scholar]
  42. Yu, Y.; Zhang, K.; Yang, L.; Zhang, D. Fruit detection for strawberry harvesting robot in non-structural environment based on Mask-RCNN. Comput. Electron. Agric. 2019, 163, 104846. [Google Scholar] [CrossRef]
  43. Zhang, Y.; Sui, B.; Shen, H.; Ouyang, L. Mapping stocks of soil total nitrogen using remote sensing data: A comparison of random forest models with different predictors. Comput. Electron. Agric. 2019, 160, 23–30. [Google Scholar] [CrossRef]
  44. Furlanetto, R.H.; Nanni, M.R.; Crusiol, L.G.T.; Silva, G.F.C.; Junior, A.d.O.; Sibaldelli, R.N.R. Identification and quantification of potassium (K+) deficiency in maize plants using an unmanned aerial vehicle and visible/near-infrared semi-professional digital camera. Int. J. Remote Sens. 2021, 42, 8783–8804. [Google Scholar] [CrossRef]
  45. Cortes, C.; Vapnik, V. Support-vector networks. Mach. Learn. 1995, 20, 273–297. [Google Scholar] [CrossRef]
  46. Zhang, J.; Fu, P.; Meng, F.; Yang, X.; Xu, J.; Cui, Y. Estimation algorithm for chlorophyll-a concentrations in water from hyperspectral images based on feature derivation and ensemble learning. Ecol. Inform. 2022, 71, 101783. [Google Scholar] [CrossRef]
  47. Yang, J.; Hu, Y.; Zhang, K.; Wu, Y. An improved evolution algorithm using population competition genetic algorithm and self-correction BP neural network based on fitness landscape. Soft Comput. 2021, 25, 1751–1776. [Google Scholar] [CrossRef]
  48. Lundberg, S.M.; Lee, S.-I. A unified approach to interpreting model predictions. In Proceedings of the 31st International Conference on Neural Information Processing Systems, Long Beach, CA, USA, 4–9 December 2017; pp. 4768–4777. [Google Scholar]
  49. Zhen, Z.; Chen, S.; Yin, T.; Gastellu-Etchegorry, J.-P. Globally quantitative analysis of the impact of atmosphere and spectral response function on 2-band enhanced vegetation index (EVI2) over Sentinel-2 and Landsat-8. ISPRS J. Photogramm. Remote Sens. 2023, 205, 206–226. [Google Scholar] [CrossRef]
  50. Gao, B.-C.; Kaufman, Y.J. Water vapor retrievals using Moderate Resolution Imaging Spectroradiometer (MODIS) near-infrared channels. J. Geophys. Res. Atmos. 2003, 108, 4389. [Google Scholar] [CrossRef]
  51. Mamaghani, B.; Salvaggio, C. Multispectral Sensor Calibration and Characterization for sUAS Remote Sensing. Sensors 2019, 19, 4453. [Google Scholar] [CrossRef]
  52. Chen, S.; Wang, W.; Liang, H. Evaluating the effectiveness of fusing remote sensing images with significantly different spatial resolutions for thematic map production. Phys. Chem. Earth Parts A/B/C 2019, 110, 71–80. [Google Scholar] [CrossRef]
  53. Wang, H.; Yang, J.; Xue, B. A novel method for the definition of central wavelength and spectral bandwidth. Results Phys. 2020, 18, 103327. [Google Scholar] [CrossRef]
  54. Rajković, D.; Jeromela, A.M.; Pezo, L.; Lončar, B.; Grahovac, N.; Špika, A.K. Artificial neural network and random forest regression models for modelling fatty acid and tocopherol content in oil of winter rapeseed. J. Food Compos. Anal. 2023, 115, 105020. [Google Scholar] [CrossRef]
  55. Ghulam, A.; Li, Z.-L.; Qin, Q.; Tong, Q.; Wang, J.; Kasimu, A.; Zhu, L. A method for canopy water content estimation for highly vegetated surfaces-shortwave infrared perpendicular water stress index. Sci. China Ser. D Earth Sci. 2007, 50, 1359–1368. [Google Scholar] [CrossRef]
  56. Ceccato, P.; Flasse, S.; Tarantola, S.; Jacquemoud, S.; Grégoire, J.-M. Detecting vegetation leaf water content using reflectance in the optical domain. Remote Sens. Environ. 2001, 77, 22–33. [Google Scholar] [CrossRef]
  57. Gao, Z.; Wang, Q.; Cao, X.; Gao, W. The responses of vegetation water content (EWT) and assessment of drought monitoring along a coastal region using remote sensing. GIScience Remote Sens. 2014, 51, 1–16. [Google Scholar] [CrossRef]
  58. Scheliga, B.; Tetzlaff, D.; Nuetzmann, G.; Soulsby, C. Assessing runoff generation in riparian wetlands: Monitoring groundwater–surface water dynamics at the micro-catchment scale. Environ. Monit. Assess. 2019, 191, 116. [Google Scholar] [CrossRef]
  59. Liu, L.; Li, S.; Yang, W.; Wang, X.; Luo, X.; Ran, P.; Zhang, H. Forest Canopy Water Content Monitoring Using Radiative Transfer Models and Machine Learning. Forests 2023, 14, 1418. [Google Scholar] [CrossRef]
Figure 1. Study area map and sampling distribution: (a) National map of China (the dark green area represents Guangxi Zhuang Autonomous Region), (b) Guangxi Zhuang Autonomous Region (the green area represents Baise City), and (c) sampling points within the study area.
Figure 2. Relationships between spectral reflectance measured by the MS UAV and by the ASD spectroradiometer: (a) Blue, (b) Green, (c) Red, (d) Red Edge, and (e) Near-infrared bands of the MS UAV. Note: “UAV_Band” and “ASD_Band” represent the spectral reflectance obtained from the MS UAV and the ASD spectroradiometer, respectively.
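The per-band relationships in Figure 2 amount to simple linear regressions between paired UAV and ASD reflectance samples. A minimal sketch follows, with hypothetical values standing in for the field measurements:

```python
# A minimal sketch, assuming hypothetical paired samples: per-band linear
# regression of ASD spectroradiometer reflectance on MS UAV reflectance,
# as summarized in Figure 2.
import numpy as np
from scipy import stats

def calibrate_band(uav_refl, asd_refl):
    """Fit ASD reflectance as a linear function of UAV reflectance."""
    slope, intercept, r, _, _ = stats.linregress(uav_refl, asd_refl)
    return slope, intercept, r ** 2

# Hypothetical paired reflectance samples for one band (e.g., NIR).
uav_nir = np.array([0.42, 0.45, 0.39, 0.47, 0.44])
asd_nir = np.array([0.44, 0.46, 0.41, 0.49, 0.45])
slope, intercept, r2 = calibrate_band(uav_nir, asd_nir)
print(f"ASD = {slope:.3f} x UAV + {intercept:.3f} (R2 = {r2:.3f})")
```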
Figure 3. Relationships between the reflectance of the AWT-fused, sharpened Sentinel-2 satellite image and the spectral reflectance measured by the ASD spectroradiometer: (a) Blue, (b) Green, (c) Red, (d) Red Edge 1, (e) Red Edge 2, (f) Red Edge 3, (g) Near-infrared, (h) Red Edge 4, (i) Short-Wave Infrared 1, and (j) Short-Wave Infrared 2 bands of Sentinel-2. Note: “AWT_Band” and “ASD_Band” represent the spectral reflectance obtained from the AWT fusion of the Sentinel-2 satellite image and from the ASD spectroradiometer, respectively.
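The AWT fusion evaluated in Figures 3 and 4 can be outlined with the à trous ("with holes") wavelet decomposition: detail planes extracted from the high-resolution MS UAV band are added to the co-registered, upsampled Sentinel-2 band. The sketch below is an illustrative reconstruction under the common B3-spline formulation, not the authors' exact implementation; all names are placeholders:

```python
# A minimal sketch of Additive Wavelet Transform (AWT) fusion via the
# "a trous" algorithm. Both inputs are assumed to be co-registered
# arrays of identical shape (Sentinel-2 already upsampled to 5 cm).
import numpy as np
from scipy import ndimage

B3 = np.array([1, 4, 6, 4, 1], dtype=float) / 16.0  # B3 spline kernel

def atrous_planes(img, levels):
    """Decompose an image into `levels` wavelet detail planes."""
    planes, current = [], img.astype(float)
    for j in range(levels):
        kernel = np.zeros(4 * 2 ** j + 1)
        kernel[:: 2 ** j] = B3                      # dilate kernel by 2^j
        smooth = ndimage.convolve1d(current, kernel, axis=0, mode="reflect")
        smooth = ndimage.convolve1d(smooth, kernel, axis=1, mode="reflect")
        planes.append(current - smooth)             # detail at scale j
        current = smooth
    return planes

def awt_fuse(s2_upsampled, uav_band, levels=3):
    """Inject UAV spatial detail into the upsampled Sentinel-2 band."""
    return s2_upsampled + sum(atrous_planes(uav_band, levels))
```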
Figure 4. Relationships between the band reflectance of the unsharpened Sentinel-2 satellite image after AWT fusion and the band reflectance measured by the ASD spectroradiometer: (a) Red Edge 1, (b) Red Edge 2, (c) Red Edge 3, (d) Red Edge 4, (e) Short-Wave Infrared 1, and (f) Short-Wave Infrared 2 bands of Sentinel-2. Note: “20m_AWT_Band” and “ASD_Band” represent the spectral reflectance obtained from the 20 m resolution Sentinel-2 satellite image after AWT fusion and the spectral reflectance measured by the ASD spectroradiometer, respectively.
Figure 5. Average absolute SHAP values of input variables in the RF model: (a) input variables for estimating FMC, (b) input variables for estimating EWT, (c) input variables for estimating CWC, and in the XGBoost model: (d) input variables for estimating FMC, (e) input variables for estimating EWT, (f) input variables for estimating CWC.
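A minimal sketch of the SHAP ranking summarized in Figure 5 follows, using the shap package's TreeExplainer on a random forest; the features, data, and model settings are placeholders, not the study's configuration:

```python
# A minimal sketch: mean absolute SHAP values per input variable for a
# fitted tree ensemble, as plotted in Figure 5. Data are hypothetical.
import numpy as np
import shap
from sklearn.ensemble import RandomForestRegressor

X = np.random.rand(100, 4)                  # e.g., NDVI, NDWI, NDII, MSI
y = np.random.rand(100)                     # e.g., measured EWT
model = RandomForestRegressor(n_estimators=200).fit(X, y)

explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)      # shape (n_samples, n_features)
mean_abs = np.abs(shap_values).mean(axis=0) # importance per feature
for name, v in zip(["NDVI", "NDWI", "NDII", "MSI"], mean_abs):
    print(f"{name}: {v:.4f}")
```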
Figure 6. Distribution of the mango canopy water indicators: FMC (a), EWT (b), and CWC (c).
Figure 7. True-color (RGB) images from the MS UAV (a) and the AWT fusion (b).
Figure 8. Correlation analysis between the band reflectance, vegetation indices, and canopy moisture indicators: (a) correlation analysis between canopy moisture indicators (FMC, EWT, CWC) and the reflectance of UAV and AWT-fused Sentinel-2 satellite bands; (bd) correlation analysis between FMC, EWT, CWC, and vegetation indices. Note: “UAV_Band” and “UAV_Vegetation index” represent spectral bands and vegetation indices, respectively, obtained from the UAV. “S2_Band” and “S2_Vegetation index” represent the spectral bands and vegetation indices, respectively, obtained from the Sentinel-2 satellite images after AWT fusion.
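The correlations in Figure 8 are Pearson coefficients between each band or vegetation index and the moisture indicators; a minimal sketch with hypothetical values:

```python
# A minimal sketch of the Pearson correlation analysis behind Figure 8,
# relating one vegetation index to one canopy moisture indicator.
import numpy as np

ndwi = np.array([0.31, 0.28, 0.35, 0.22, 0.30])       # hypothetical index values
ewt = np.array([0.021, 0.018, 0.024, 0.015, 0.020])   # hypothetical EWT (g/cm2)
r = np.corrcoef(ndwi, ewt)[0, 1]
print(f"Pearson r = {r:.3f}")
```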
Table 1. Key parameters of the MS UAV sensors.

Spectral Band | Image Band | Central Wavelength | Bandwidth
Blue | 1 | 450 nm | 32 nm
Green | 2 | 560 nm | 32 nm
Red | 3 | 650 nm | 32 nm
Red-edge | 4 | 730 nm | 32 nm
Near-infrared | 5 | 840 nm | 52 nm
Lens | 5.74 mm focal length, 62.7° field of view
GNSS | GPS; BeiDou or GLONASS; Galileo
Weight | 1487 g
Table 2. Mathematical statistics of LAI, FMC, EWT, and CWC.

Sample Set | Parameter | Max | Min | Mean | Standard Deviation
Spectral sampling points | FMC (%) | 216.123 | 113.965 | 162.135 | 29.615
Spectral sampling points | EWT (g/cm²) | 0.026 | 0.012 | 0.019 | 0.003
Spectral sampling points | CWC (g/cm²) | 0.091 | 0.019 | 0.052 | 0.020
Canopy average | FMC (%) | 227.548 | 119.434 | 163.207 | 25.864
Canopy average | EWT (g/cm²) | 0.028 | 0.015 | 0.019 | 0.003
Canopy average | CWC (g/cm²) | 0.079 | 0.024 | 0.050 | 0.016
Table 3. Vegetation indices used for the estimation of vegetation water content.

No. | Index | Formula | S2 | MS UAV | MS UAV + S2 | Reference
1 | Normalized Difference Vegetation Index (NDVI) | (NIR − Red) / (NIR + Red) | | | | [31]
2 | Simple Ratio Index (SR) | Blue / Red | | | | [32]
3 | Enhanced Vegetation Index (EVI) | 2.5 × (NIR − Red) / (NIR + 6 × Red − 7.5 × Green + 1) | | | | [33]
4 | Soil-Adjusted Vegetation Index (SAVI) | (1 + 0.5) × (NIR − Red) / (NIR + Red + 0.5) | | | | [34]
5 | Normalized Difference Water Index (NDWI) | (NIR − SWIR1) / (NIR + SWIR1) | | | | [35]
6 | Plant Senescence Reflectance Index (PSRI) | (Red − Green) / RE_B6 | | | | [36]
7 | Normalized Difference Infrared Index (NDII) | (NIR − SWIR2) / (NIR + SWIR2) | | | | [37]
8 | Normalized Multiband Drought Index (NMDI) | [NIR − (SWIR1 − SWIR2)] / [NIR + (SWIR1 − SWIR2)] | | | | [38]
9 | Modified Normalized Difference Water Index (MNDWI) | (Red − SWIR2) / (Red + SWIR2) | | | | [39]
10 | Moisture Stress Index (MSI) | SWIR2 / NIR | | | | [40]
11 | Normalized Difference Tillage Index (NDTI) | (SWIR1 − SWIR2) / (SWIR1 + SWIR2) | | | | [41]
12 | Shortwave Infrared Difference (SWID) | SWIR1 − SWIR2 | | | |
Note: “S2” represents the VIs calculated from the fused Sentinel-2 satellite bands, “MS UAV” represents the VIs calculated from the MS UAV bands, and “MS UAV + S2” represents the VIs that combine MS UAV bands with fused Sentinel-2 bands whose wavelengths fall outside the MS UAV spectral range.
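For illustration, a minimal sketch of the per-pixel band arithmetic behind a subset of the Table 3 indices is given below; the reflectance values are hypothetical, and the SWIR bands are assumed to come from the AWT-fused Sentinel-2 image:

```python
# A minimal sketch of selected Table 3 indices as per-pixel band arithmetic.
import numpy as np

def ndvi(nir, red):
    return (nir - red) / (nir + red)

def nmdi(nir, swir1, swir2):
    return (nir - (swir1 - swir2)) / (nir + (swir1 - swir2))

def msi(swir2, nir):
    return swir2 / nir

def swid(swir1, swir2):
    return swir1 - swir2

# Hypothetical reflectance values for a single pixel.
nir, red = np.array([0.45]), np.array([0.08])
swir1, swir2 = np.array([0.22]), np.array([0.14])
print(ndvi(nir, red), nmdi(nir, swir1, swir2), msi(swir2, nir), swid(swir1, swir2))
```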
Table 4. Estimation results of canopy moisture indicators using five ML algorithms.

Method | Parameter | UAV (R2 / RMSE / RE %) | UAV + S2_RF_SHAP (R2 / RMSE / RE %) | UAV + S2_XG_SHAP (R2 / RMSE / RE %)
RF | FMC | 0.218 / 23.078 / 12.486 | 0.244 / 22.926 / 12.108 | 0.285 / 22.383 / 11.670
RF | EWT | 0.222 / 0.003 / 11.667 | 0.255 / 0.003 / 10.354 | 0.335 / 0.002 / 10.522
RF | CWC | −0.099 / 0.018 / 36.126 | 0.045 / 0.016 / 31.540 | 0.105 / 0.016 / 30.943
XGBoost | FMC | 0.192 / 23.435 / 12.088 | 0.196 / 23.741 / 11.295 | 0.265 / 22.807 / 11.151
XGBoost | EWT | −0.023 / 0.003 / 12.010 | 0.172 / 0.003 / 10.928 | 0.353 / 0.002 / 9.809
XGBoost | CWC | −0.332 / 0.020 / 41.353 | 0.110 / 0.017 / 30.852 | 0.054 / 0.017 / 32.902
SVM | FMC | 0.197 / 23.171 / 12.106 | 0.227 / 22.173 / 11.178 | 0.340 / 21.456 / 10.862
SVM | EWT | 0.199 / 0.003 / 10.879 | 0.465 / 0.002 / 8.707 | 0.467 / 0.002 / 8.432
SVM | CWC | −0.060 / 0.018 / 34.148 | 0.187 / 0.015 / 27.410 | 0.293 / 0.014 / 24.792
PLS | FMC | 0.256 / 22.087 / 11.664 | 0.393 / 19.768 / 10.298 | 0.405 / 19.925 / 10.034
PLS | EWT | 0.186 / 0.003 / 11.484 | 0.396 / 0.002 / 8.731 | 0.450 / 0.002 / 8.967
PLS | CWC | −0.046 / 0.018 / 35.162 | 0.139 / 0.234 / 27.430 | 0.270 / 0.015 / 22.661
GABP | FMC | 0.679 / 14.898 / 7.536 | 0.745* / 14.397 / 7.227 | 0.724 / 15.443 / 7.636
GABP | EWT | 0.680 / 0.002 / 7.519 | 0.837 / 0.001 / 5.052 | 0.859* / 0.001 / 4.454
GABP | CWC | 0.492 / 0.012 / 24.207 | 0.702* / 0.005 / 7.223 | 0.698 / 0.005 / 7.950
Note: * marks the highest accuracy for each parameter.
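A minimal sketch of how the three accuracy metrics in Table 4 can be computed follows, assuming RE (%) denotes the mean absolute relative error; the sample arrays are hypothetical, not the study's data:

```python
# A minimal sketch of the Table 4 accuracy metrics: R2, RMSE, and RE (%),
# with RE assumed to be the mean absolute relative error.
import numpy as np
from sklearn.metrics import r2_score, mean_squared_error

def evaluate(y_true, y_pred):
    r2 = r2_score(y_true, y_pred)
    rmse = np.sqrt(mean_squared_error(y_true, y_pred))
    re = np.mean(np.abs(y_pred - y_true) / y_true) * 100  # RE (%)
    return r2, rmse, re

y_true = np.array([162.1, 145.3, 180.7, 120.9])  # e.g., measured FMC (%)
y_pred = np.array([158.4, 150.2, 175.9, 128.3])  # e.g., model estimates
print("R2 = %.3f, RMSE = %.3f, RE = %.3f%%" % evaluate(y_true, y_pred))
```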
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
