Article

Estimation of Winter Wheat Yield Using Multiple Temporal Vegetation Indices Derived from UAV-Based Multispectral and Hyperspectral Imagery

1 State Key Laboratory of Efficient Utilization of Arid and Semi-Arid Arable Land in Northern China, Institute of Agricultural Resources and Regional Planning, Chinese Academy of Agricultural Sciences, Beijing 100081, China
2 Dryland Farming Institute, Hebei Academy of Agriculture and Forestry Sciences, Hengshui 053000, China
3 Key Laboratory of Crop Drought Tolerance Research of Hebei Province, Hengshui 053000, China
4 Institute of Environment and Sustainable Development in Agriculture, Chinese Academy of Agricultural Sciences, Beijing 100081, China
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Remote Sens. 2023, 15(19), 4800; https://doi.org/10.3390/rs15194800
Submission received: 29 July 2023 / Revised: 26 September 2023 / Accepted: 27 September 2023 / Published: 1 October 2023
(This article belongs to the Special Issue Crop Quantitative Monitoring with Remote Sensing II)

Abstract

Winter wheat is a major food source for the inhabitants of North China. However, its yield is affected by drought stress during the growing period. Hence, it is necessary to develop drought-resistant winter wheat varieties. For breeding researchers, yield measurement, a crucial breeding indicator, is costly, labor-intensive, and time-consuming. Therefore, in order to breed a drought-resistant variety of winter wheat in a short time, crop yield estimation at the field plot scale is essential. Unmanned aerial vehicles (UAVs) have developed into a reliable method for gathering crop canopy information in a non-destructive and time-efficient manner in recent years. This study aimed to evaluate strategies for estimating crop yield using multispectral (MS) and hyperspectral (HS) imagery derived from a UAV in single and multiple growth stages of winter wheat. To accomplish our objective, we constructed simple linear regression models based on the single growth stages of booting, heading, flowering, filling, and maturation and a multiple regression model that combined these five growth stages to estimate winter wheat yield using 36 vegetation indices (VIs) calculated from UAV-based MS and HS imagery, respectively. After comparing these regression models, we came to the following conclusions: (1) the flowering stage of winter wheat showed the highest correlation with crop yield for both MS and HS imagery; (2) the VIs derived from the HS imagery performed better in terms of estimation accuracy than the VIs from the MS imagery; (3) the regression model that combined the information of five growth stages presented better accuracy than the one that considered the growth stages individually.
The best regression model for winter wheat yield estimation in this study was the multiple linear regression model constructed using the VI ‘(b1 − b2)/(b3 − b4)’ derived from HS imagery, incorporating the five growth stages of booting, heading, flowering, filling, and maturation, with an r of 0.84 and an RMSE of 0.69 t/ha. The corresponding central wavelengths were 782 nm, 874 nm, 762 nm, and 890 nm, respectively. Our study indicates that multiple temporal VIs derived from UAV-based HS imagery are effective tools for breeding researchers to estimate winter wheat yield at the field plot scale.

1. Introduction

Wheat (Triticum aestivum L.), as the third largest cereal crop in the world, plays an important role in world food production and food security strategies. According to the Food and Agriculture Organization of the United Nations (FAO), more than 220 million ha are sown with wheat, with over 770 million tons of wheat being produced in 2021 [1]. China accounts for over 23 million ha of wheat crop and is responsible for nearly 18% (about 137 million tons) of all wheat produced worldwide [1]. Wheat is a major food source for the people in North China. The North China Plain is one of China’s primary wheat-growing areas (mainly winter wheat). However, in this region, winter wheat becomes more vulnerable to drought stress due to the continental monsoon climate, leading to decreased output. Groundwater irrigation is one main source of water supply during the winter wheat growth season of the North China Plain, but it is severely restricted according to the sustainable development policy. Thus, in order to fulfill the dual purpose of food security protection and water conservation, it has become necessary to breed drought-resistant varieties of winter wheat to maintain production. As the ultimate detection target, the crop yield becomes an important selection parameter of winter wheat breeding. However, in the empirical breeding process, yield can only be measured after the crop growth cycle, which is costly, labor-intensive, and time-consuming for the breeding researchers. Therefore, establishing a method that could estimate winter wheat yield on a field plot scale within a short time would help them in selecting drought-resistant varieties.
Satellite remote sensing data have been widely applied for non-destructive crop yield estimation over various large-scale regions (from local to national, continental, and global) since the 1970s [2,3]. The yield estimation models based on these data have demonstrated reasonable crop yield estimation accuracy in the large-scale regions based on satellite imagery and have been widely used due to their convenience and simplicity [2,3,4,5,6,7,8]. However, for breeding researchers, the application of satellite data in yield estimation is often hampered by the high spatial heterogeneity within the small areas of breeding fields, missing data during critical crop growth stages, and high costs due to the coarse spatial resolution and fixed passing time and band setting.
In recent years, the development of sensor technologies in unmanned aerial vehicles (UAVs) has promoted their application for data acquisition [9]. Compared to satellite remote sensing, UAV remote sensing has an improved spatial, spectral, and temporal resolution and is associated with lower costs and greater flexibility and versatility [10], making UAVs increasingly popular in precision agriculture [11,12]. Previous studies have reported the relationship between crop yield and crop phenotypic parameters such as plant height, leaf nitrogen content, leaf area index, above-ground biomass, and so on [13,14,15,16]. UAV platforms have exerted a beneficial effect in the retrieval of a wide array of crop characteristics that are associated with yield [17,18,19,20], and their use can help meet breeding researchers’ requirements regarding crop yield estimation on a small plot area scale within a short amount of time.
Furthermore, vegetation indices (VIs) derived from UAV-mounted multispectral (MS) and hyperspectral (HS) sensors have been widely used to estimate crop yield [21,22]. Duan et al. developed a method based on VIs derived from UAV multispectral data to correlate with rice phenotyping and estimate grain yield [13]. García-Martínez et al. estimated corn grain yield by combining vegetation indices, canopy cover, and plant density using multispectral and RGB images acquired via the use of UAVs [23]. Ramos et al. proposed a random forest algorithm that performed well in tests with a ranking-based strategy that focused on predicting maize crop yield using UAV-based multispectral vegetation indices [24]. However, the application of UAV-mounted hyperspectral sensors for agricultural monitoring is limited by the weight of imaging systems and the complexity of image processing [25,26]. As a result, they have rarely been employed by breeding researchers in the study of crop breeding. Moreover, the VIs used in previous studies have mostly been derived from limited growth stages within the crop growing season, which may increase the risk of missing critical spectral features in other growth stages [27,28]. Therefore, estimating crop yield by combining UAV-mounted HS sensor-based VIs across several critical crop growing stages could help integrate critical spectral features throughout the crop growing season and lead to a higher estimation accuracy, which would be beneficial to breeding researchers.
For this study, motivated by the need to boost the efficiency of winter wheat breeding, we aimed to evaluate yield estimation among various winter wheat breeding cultivars using multi-temporal vegetation indices derived from UAV-mounted multispectral and hyperspectral sensors. To accomplish this objective, we (1) investigated the correlation between the winter wheat yield and 19 UAV-based multispectral VIs and 17 band combination types of hyperspectral data at single growth stages and multiple growth stages, respectively, and (2) identified the best VI and the best time for estimating winter wheat yield using UAV data.

2. Materials and Methods

2.1. Experimental Setup

The experimental site was set up at the Dry-Land Farming Institute of Hebei Academy of Agricultural and Forestry Sciences (DFI) at Hengshui City, Hebei Province, China (37°54′15.63″N, 115°42′29.32″E, World Geodetic System 1984) (Figure 1). The area has a semi-arid temperate and monsoonal climate characterized by four distinct seasons, with an average yearly temperature of 13.3 °C and a yearly precipitation of 497.1 mm.
The experimental site design included eleven winter wheat cultivars: C1 (Chang8744), C2 (Shimai22), C3 (Luyuan472), C4 (Shimai15), C5 (HengH1603), C6 (Xinmai28), C7 (Jimai418), C8 (Shannong28), C9 (Nongda212), C10 (Heng4399), and C11 (Jimai22). Each cultivar was then split into seven different irrigation groups and three repeats (sub-plots of 1.5 m × 6 m) according to a randomized block design. All cultivars were planted on 15 October 2020 at a density of 375 plants/m². A base fertilizer of pure nitrogenous fertilizer (225 kg/ha), P2O5 (112.5 kg/ha), and K2O (112.5 kg/ha) was applied before sowing. No additional fertilizers were used for the growth of the winter wheat discussed in this study. The irrigation date of each irrigated sub-plot is shown in Table 1. Each irrigation event supplied 750 m³/ha. The total precipitation during the 2020–2021 growing season at the site was 43.9 mm.

2.2. Data Acquisition

2.2.1. Ground Truth Data

All winter wheat cultivars were harvested on 8 July 2021. The yield of each sub-plot was weighed, normalized to a moisture content of 13%, and expressed in t/ha. According to the experimental design, 231 samples were measured, and 17 measurements were eliminated as outliers, including 6 plots of Repeat 1, 4 plots of Repeat 2, and 7 plots of Repeat 3. The statistics of the measured winter wheat yield are outlined in Table 2. The mean measured grain yield values under the different irrigation groups and winter wheat cultivars are shown in Figure 2. Winter wheat yield differences across irrigation groups and cultivars were assessed using a one-way analysis of variance after checking the normality assumption at the 0.05 probability level (Table 3). There were significant differences among the irrigation groups and among the winter wheat cultivars. The grain yield of irrigation group B was lower than that of the other irrigation groups, and group A had the highest yield. The C6 cultivar showed the poorest yield, while C9 performed best.
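As a rough sketch of this significance-testing step, a one-way ANOVA can be run with SciPy; the group means, spreads, and sample sizes below are illustrative stand-ins, not the study's measurements:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Illustrative sub-plot yields (t/ha) for three hypothetical irrigation
# groups; the study itself compared seven groups and eleven cultivars.
group_a = rng.normal(8.5, 0.5, 30)   # well-irrigated
group_b = rng.normal(6.0, 0.5, 30)   # drought-stressed
group_c = rng.normal(7.5, 0.5, 30)

# One-way ANOVA: do the group means differ significantly?
f_stat, p_value = stats.f_oneway(group_a, group_b, group_c)
print(f"F = {f_stat:.1f}, p = {p_value:.2e}")
```

A p-value below the 0.05 probability level used in the paper would indicate significant yield differences among groups; as the authors note, the normality assumption should be checked first.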

2.2.2. Multi-Sensor UAV Data

UAV images derived from multispectral (MS) and hyperspectral (HS) sensors were employed in this study (Figure 3). The UAV campaign was conducted under low wind speed and clear sky conditions between 10:00 a.m. and 2:00 p.m. local time to reduce the influence of atmospheric and solar radiation. The overlap percentages in the forward and lateral flying directions of both UAVs were 80% and 70%, respectively. The acquisition dates and details corresponding to the growth stages regarding both UAVs are shown in Table 4.
The DJI P4 Multispectral (DJI Technology Co., Ltd., Shenzhen, China) was used to collect multispectral images with five sensors (blue, green, red, red-edge, and near-infrared) centered at wavelengths of 456 nm (±16 nm), 560 nm (±16 nm), 650 nm (±16 nm), 730 nm (±16 nm), and 840 nm (±26 nm), respectively. The sensors used a 1/2.9-inch complementary metal-oxide-semiconductor (CMOS). The field of view was 62.7°, the focal length was 5.74 mm, the f-number was f/2.2, and the focus was fixed at infinity (∞). The MS images were obtained at a flight height of 50 m with a flight altitude accuracy of 0.1 m, and the corresponding ground pixel resolution was 1.6 cm. The MS orthomosaic maps were generated using Pix4D mapper (Pix4D SA, Lausanne, Switzerland).
A DJI M600 Pro (DJI Technology Co., Ltd., Shenzhen, China) equipped with a Pika L hyperspectral camera (Resonon, Inc., Bozeman, MT, USA) was used to capture the hyperspectral images discussed in the study. The hyperspectral camera has 150 bands in a spectral range of 400–1000 nm with a spectral resolution of 4 nm. The HS images were acquired at a flight height of 50 m with a flight altitude accuracy of 0.5 m, and the corresponding ground pixel resolution was 3.0 cm. SpectrononPro (Resonon, Inc., Bozeman, MT, USA) and ENVI 5.3 (Harris Geospatial Solutions, Broomfield, CO, USA) were used to generate HS orthomosaic maps.

2.3. Vegetation Indices Calculation

The average of a 0.8 m × 4 m image area was used for band reflectance value calculation for each sub-plot. The area was approximately in the center of each sub-plot to eliminate the effect of the marginal areas of each sub-plot and other neighboring sub-plots. The extracted images are herein referred to as UAV images.
A large number of vegetation indices have been proposed for grain yield estimation. A total of 19 MS VIs previously used for crop yield and yield-related phenotypic characteristic estimation were calculated, with the average reflectance being derived from the UAV MS images [3,18]; the 19 indices are shown in Table 5.
Additionally, given the many spectral bands of the UAV HS sensor, the HS VIs minimize the spectral redundancy usually found in hyperspectral data and also promote computational optimization [29,30]. Therefore, 17 prevalent formulas, composed of two, three, or four spectral bands using functions of sum, difference, ratio, double difference, normalized difference, and hybrid forms, were regarded as the HS VIs and employed in the study [31]; the 17 formulas are shown in Table 6. Band iteration was applied to all hyperspectral bands in each formulation.
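The band-iteration idea can be sketched for one of the formula types, the four-band ‘(b1 − b2)/(b3 − b4)’. The function name, stride parameter, and structure below are ours, not the paper's; a coarse stride is used because the full 150-band iteration is O(n⁴):

```python
import numpy as np
from itertools import permutations

def best_four_band_vi(refl, yields, step=10):
    """Illustrative exhaustive search for the four-band type (b1 - b2)/(b3 - b4).

    refl:   (n_samples, n_bands) array of plot-mean hyperspectral reflectance
    yields: (n_samples,) measured grain yield (t/ha)
    step:   band-index stride; a full 150-band iteration is O(n^4),
            so a coarse stride keeps this sketch tractable
    """
    bands = range(0, refl.shape[1], step)
    best_r, best_combo = 0.0, None
    for b1, b2, b3, b4 in permutations(bands, 4):
        denom = refl[:, b3] - refl[:, b4]
        if np.any(np.abs(denom) < 1e-6):
            continue  # avoid division by (near-)zero
        vi = (refl[:, b1] - refl[:, b2]) / denom
        if np.std(vi) < 1e-9:
            continue  # a constant index carries no information
        r = abs(np.corrcoef(vi, yields)[0, 1])
        if r > best_r:
            best_r, best_combo = r, (b1, b2, b3, b4)
    return best_r, best_combo
```

In practice, a refined search (or the full iteration on suitable hardware) would then be run around the best coarse combination.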
Table 5. Multispectral vegetation indices used in the study.
Multispectral Vegetation Index | Formulation | Reference
Difference vegetation index | DVI = G − B | [32]
Ratio vegetation index | RVI = NIR/R | [33]
Green chlorophyll index | GCI = NIR/G − 1 | [34]
Red-edge chlorophyll index | RECI = NIR/RE − 1 | [34]
Normalized difference vegetation index | NDVI = (NIR − R)/(NIR + R) | [35]
Green normalized difference vegetation index | GNDVI = (NIR − G)/(NIR + G) | [36]
Green-red vegetation index | GRVI = (G − R)/(G + R) | [33]
Green-blue vegetation index | GBVI = (G − B)/(G + B) | [37]
Normalized difference red-edge | NDRE = (NIR − RE)/(NIR + RE) | [38]
Normalized difference red-edge index | NDREI = (RE − G)/(RE + G) | [39]
Simplified canopy chlorophyll content index | SCCCI = NDRE/NDVI | [40]
Enhanced vegetation index | EVI = 2.5 × (NIR − R)/(1 + NIR − 2.4 × R) | [41]
Two-band enhanced vegetation index | EVI2 = 2.5 × (NIR − R)/(NIR + 2.4 × R + 1) | [42]
Optimized soil adjusted vegetation index | OSAVI = (NIR − R)/(NIR + R + L), L = 0.16 | [43]
Modified chlorophyll absorption in reflectance index | MCARI = [(RE − R) − 0.2 × (RE − G)] × (RE/R) | [44]
Transformed chlorophyll absorption in reflectance index | TCARI = 3 × [(RE − R) − 0.2 × (RE − G) × (RE/R)] | [45]
MCARI/OSAVI | MCARI/OSAVI | [44]
TCARI/OSAVI | TCARI/OSAVI | [45]
Wide dynamic range vegetation index | WDRVI = (a × NIR − R)/(a × NIR + R), a = 0.12 | [46]
Note: ‘R’, ‘G’, ‘B’, ‘RE’, and ‘NIR’: the average value of the red, green, blue, red-edge, and near-infrared bands of the UAV-derived multispectral images, respectively.
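As a minimal sketch of how a subset of the Table 5 formulations translates into code (inputs are plot-mean band reflectances; the values used below are illustrative, not from the study):

```python
def multispectral_vis(R, G, B, RE, NIR):
    """Compute a subset of the Table 5 indices from plot-mean band reflectances."""
    return {
        "RVI":   NIR / R,
        "NDVI":  (NIR - R) / (NIR + R),
        "GNDVI": (NIR - G) / (NIR + G),
        "NDRE":  (NIR - RE) / (NIR + RE),
        "OSAVI": (NIR - R) / (NIR + R + 0.16),
        "WDRVI": (0.12 * NIR - R) / (0.12 * NIR + R),
    }

# Illustrative reflectances for a dense green wheat canopy
vis = multispectral_vis(R=0.05, G=0.10, B=0.04, RE=0.30, NIR=0.55)
```

Note that NDRE stays well below NDVI for a dense canopy because red-edge reflectance sits between the red and NIR levels, which is what delays its saturation relative to NDVI.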

2.4. Yield Estimation Model

Two repeats (Repeat 1 and Repeat 2) of all cultivars were employed to construct the winter wheat yield estimation model from each vegetation index. Pearson's correlation coefficient (r) was used to quantify the yield estimation accuracy by comparing the yield measured in the field with the yield estimated by the regression models below, with significance assessed using Student's t-test at a 95% confidence level. After excluding the outliers described in Section 2.2.1, 144 samples were used to establish the yield estimation model in the study.
The most commonly used remote-sensing based approaches for crop yield estimation involve the use of empirical statistical models, which demonstrate the relationship between yield and canopy spectrum characteristics in an intuitive way. Therefore, the simple linear regression function (SLR) was used to analyze the relationship between winter wheat yield and vegetation indices in individual growth stages (Equation (1)).
y = a × x + b1        (1)
where y represents the winter wheat yield; x represents the vegetation index value at the booting, heading, flowering, filling, or maturation stage; and a and b1 are parameters calculated using a least-squares fitting method.
In addition, the multiple linear regression function (MLR), which was used to combine the vegetation indices among the multiple growth stages to estimate crop yield, is presented in Equation (2).
y = a1 × x1 + a2 × x2 + a3 × x3 + a4 × x4 + a5 × x5 + b2        (2)
In this equation, y represents the winter wheat yield; x1, x2, x3, x4, and x5 represent the vegetation index values at the booting, heading, flowering, filling, and maturation stages, respectively; and a1, a2, a3, a4, a5, and b2 are parameters calculated using a least-squares fitting method.
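Both Equations (1) and (2) are ordinary least-squares fits; a sketch with NumPy follows (the helper names are ours, not the paper's):

```python
import numpy as np

def fit_mlr(vi_stages, y):
    """Least-squares fit of Equation (2): y = a1*x1 + ... + a5*x5 + b2.

    vi_stages: (n_samples, 5) VI values at the booting, heading, flowering,
               filling, and maturation stages (one column per stage)
    y:         (n_samples,) measured yield (t/ha)
    Returns the coefficients a1..a5 followed by the intercept b2.
    Equation (1) is the same fit with a single VI column.
    """
    X = np.column_stack([vi_stages, np.ones(len(y))])  # append intercept column
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def predict(coeffs, vi_stages):
    """Apply fitted coefficients to new VI observations."""
    X = np.column_stack([vi_stages, np.ones(vi_stages.shape[0])])
    return X @ coeffs
```

The same `lstsq` call handles the simple linear case by passing a single-column `vi_stages` matrix.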

2.5. Validation of the Crop Yield Estimation Model

After eliminating outliers, 70 samples of Repeat 3 for all cultivars were used to validate the regression model using root mean square error (RMSE) and mean absolute percentage error (MAPE). The RMSE and MAPE equations are presented in Equations (3) and (4).
RMSE = √((1/n) ∑ᵢ₌₁ⁿ (yᵢ − ŷᵢ)²)        (3)
MAPE = (100%/n) ∑ᵢ₌₁ⁿ |(ŷᵢ − yᵢ)/yᵢ|        (4)
In these equations, yᵢ is the crop yield measured in the field for sample i, ŷᵢ is the crop yield predicted by the estimation models for sample i, and n is the number of valid samples.
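Equations (3) and (4) transcribe directly into code, for example:

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean square error, Equation (3), in the yield's units (t/ha)."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

def mape(y_true, y_pred):
    """Mean absolute percentage error, Equation (4), in percent."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return 100.0 * np.mean(np.abs((y_pred - y_true) / y_true))
```

For measured yields of 8 and 10 t/ha predicted as 7 and 11 t/ha, RMSE is 1.0 t/ha and MAPE is 11.25%.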

3. Results

3.1. Winter Wheat Yield Estimation Using Vegetation Indices from Individual Growth Stages

The relationships between winter wheat yield and the 19 multispectral vegetation indices at five different growth stages are displayed based on correlation coefficients in Table 7. According to Table 7, although the vegetation indices showed various relationships with yield at different growth stages, there was no multispectral vegetation index that significantly correlated with winter wheat yield at a single stage (r > 0.70). Most multispectral vegetation indices showed lower correlations with winter wheat yield at the booting, heading, and maturation stages than at the flowering and filling stages. The ‘NDRE’ vegetation index at the flowering stage had the best correlation with winter wheat yield (r = 0.67; RMSE = 0.92 t/ha) (Figure 4a).
The highest correlation with winter wheat yield at each individual growth stage for each band combination type of the UAV hyperspectral images is shown in Table 8. Similar to the multispectral vegetation indices, the correlation coefficients between winter wheat yield and the hyperspectral vegetation indices were higher at the flowering and filling stages than at the booting, heading, and maturation stages. No VIs showed a significant correlation with grain yield at the early growth stages of booting and heading. Unlike the multispectral vegetation indices, the band combinations of the hyperspectral images at the flowering stage exhibited more significant correlations with yield than those at the filling stage. Moreover, all VIs, excluding ‘b1 + b2’, were significantly correlated with yield at the flowering stage (r > 0.70). The band combination type ‘(b1 − b2)/(b3 − b4)’ showed the best correlation with grain yield throughout the whole growth period among all VIs. Hence, the best performance in terms of yield estimation with hyperspectral-based VIs, according to the simple linear regression model used in the present study, was achieved by the VI ‘(b1 − b2)/(b3 − b4)’ at the flowering stage, with an r value of 0.80 and an RMSE of 0.75 t/ha (Figure 4b); the corresponding central wavelengths of ‘b1’, ‘b2’, ‘b3’, and ‘b4’ were 838 nm, 722 nm, 710 nm, and 970 nm, respectively.

3.2. Winter Wheat Yield Estimation with Vegetation Indices Combining Multiple Growth Stages

The multiple linear regression models (Equation (2)) were employed to further investigate the relationship between winter wheat yield and the multi-temporal spectral vegetation indices.
The correlation coefficients between winter wheat yield and the multispectral vegetation indices at the five growth stages were determined via the use of the multiple linear regression models, and the results are presented in Figure 5. The results show that the vegetation indices combining multiple growth stages presented a higher correlation than the indices did at the individual growth stages. According to Figure 5, the MLR models combining multispectral vegetation indices of ‘DVI’, ‘RECI’, ‘GBVI’, ‘NDRE’, ‘MCARI’, ‘TCARI’, ‘MCARI/OSAVI’, and ‘TCARI/OSAVI’ across the five growth stages showed a significant correlation with winter wheat yield (r > 0.70). The optimal fit of the MLR model for winter wheat yield estimation based on the multispectral vegetation indices in this study was found at the index of ‘NDRE’ (r = 0.75; RMSE = 0.83 t/ha) (Figure 6a).
The best correlation coefficients between winter wheat yield and each band combination type of the UAV hyperspectral images across the five growth stages are shown in Figure 7. Comparing Table 8 and Figure 7, all band combination types demonstrated better regression accuracy for grain yield estimation based on multiple growth stages than when the growth stages were considered singularly. According to Figure 7, all hyperspectral band combinations except ‘b1 + b2’ presented a significant correlation with winter wheat yield when combining the five growth stages. The optimal MLR model for winter wheat yield estimation was based on the combination type ‘(b1 − b2)/(b3 − b4)’, with an r of 0.84 and an RMSE of 0.69 t/ha (Figure 6b), which was significantly higher than that of the optimal multispectral MLR model above. The central wavelengths of the best hyperspectral band combination, ‘(b1 − b2)/(b3 − b4)’, were 782 nm, 874 nm, 762 nm, and 890 nm, respectively.

3.3. Validation of the Regression Models for Winter Wheat Yield Estimation

The validation of the regression models for winter wheat yield estimation was conducted by using the independent dataset of ‘Repeat 3’ (n = 70).

3.3.1. Validation of the Simple Linear Regression Model at a Single Growth Stage

The RMSE and MAPE of the simple linear regression models at individual growth stages based on the multispectral vegetation indices and hyperspectral band combination types are shown in Table 9 and Table 10, respectively. The VI ‘NDRE’ derived from the UAV multispectral images at the flowering stage, which achieved the best correlation with grain yield, showed the lowest RMSE of 0.84 t/ha and a MAPE of 8.38%. For the VIs calculated from the UAV hyperspectral images, the best RMSE (0.78 t/ha) and MAPE (7.24%) were achieved by ‘(b1 − b2)/(b3 − b4)’ at the flowering stage. Moreover, these two indices achieved the best correlation with crop yield for the UAV multispectral and hyperspectral images, respectively, over all individual growth stages considered in this study. The validation of the winter wheat yield estimated via the two simple linear regression models based on these two indices at the flowering stage is illustrated in Figure 8.

3.3.2. Validation of the Multiple Linear Regression Models for Winter Wheat Yield Estimation Combining Five Different Growth Stages

The RMSE of the multiple linear regression model for winter wheat yield estimation based on UAV-calculated multispectral and hyperspectral vegetation indices combining five growth stages and the corresponding MAPE are shown in Figure 9. According to Figure 9a, the lowest MAPE for the multispectral VIs was 8.44% (found at ‘NDRE’, with an RMSE of 0.90 t/ha). For the hyperspectral VIs, the lowest MAPE was 6.56%, achieved by ‘(b1 − b2)/(b3 − b4)’ with an RMSE of 0.70 t/ha (Figure 9b). The validation of the estimated winter wheat yield (derived from the use of the two multiple linear regression models) based on these two indices is illustrated in Figure 10. Similar to the validation of the simple linear regression models at the individual growth stages, the RMSE and MAPE were lower for the multiple linear regression model based on the hyperspectral vegetation index than for that based on the multispectral vegetation index. Moreover, compared with the regression models constructed using only the vegetation indices of an individual growth stage, the regression models that combined multi-temporal vegetation index information demonstrated their robustness in winter wheat yield estimation. The multiple linear regression model based on the band combination ‘(b1 − b2)/(b3 − b4)’ from the hyperspectral sensor performed reasonably well (achieving the highest correlation coefficient and the lowest RMSE and MAPE for winter wheat yield estimation) and could be regarded as the best grain yield estimation model, independent of the photography conditions, in this study.

4. Discussion

Currently, NDVI is the most widely used vegetation index for crop yield estimation. However, NDVI saturates markedly under high vegetation coverage, which affects estimation accuracy [47,48,49,50]. According to the spectral reflectance characteristics of plants, chlorophyll absorption in the red-edge waveband is weaker than in the red band, and the red-edge region has a stronger ability to penetrate the crop canopy [51]. Using a red-edge band rather than a red band in NDVI can reduce the saturation phenomenon, improving crop yield estimation accuracy [51,52]. Additionally, the spectrum of each irrigation group for the winter wheat cultivar ‘C9’ at the flowering stage is shown in Figure 11. According to the figure, there were significant differences among the irrigation groups in the near-infrared band. Therefore, in this study, vegetation indices composed of the red-edge and near-infrared bands, both for MS imagery (NDRE) and HS imagery (‘(b1 − b2)/(b3 − b4)’), demonstrated reasonable robustness in winter wheat yield estimation.
Remote crop yield estimation methods are commonly based on the high correlation between crop yield and a vegetation index taken at a specific crop growth stage [50]. The success of a vegetation index depends on the use of bands with different sensitivities to the key parameter to be monitored [51]. The potential of multispectral and hyperspectral data for winter wheat yield estimation was systematically compared in this study. The lower yield estimation accuracy based on multispectral vegetation indices is mostly due to the obvious shortcoming of a limited number of fixed bands with wide spectral resolution. The hyperspectral sensor captured much richer information and is more sensitive to crop canopy characteristics owing to the continuous acquisition of reflectance at narrow wavelengths [45,52]. Additionally, for all vegetation indices calculated from the UAV hyperspectral imagery used in this study, the yield estimation accuracy of winter wheat tended to rise with the number of bands included in the vegetation index. According to the study of Thenkabail et al. [53], optimum multiple narrow-band reflectance models based on four sensitive band combinations could explain up to 92% of the crop biophysical parameter variability. Hence, the four-band combination type based on hyperspectral imagery achieved a more significant correlation with winter wheat yield than the vegetation indices derived from multispectral imagery throughout the whole growth period of winter wheat in this study.
According to Qader et al., yield estimation models that use VIs from the crop's critical growth stage can obtain higher accuracy across remote sensing data [54]. Our results regarding winter wheat yield estimation based on single growth stages confirmed this and strongly indicated that the flowering stage is a critical period for winter wheat yield estimation. Some studies have shown that the accumulative vegetation index can improve the stability of yield estimation and that adding one or more growth stages beyond the critical growth stage can improve estimation accuracy [28,55,56]. In this study, the correlation coefficient between crop yield and the hyperspectral vegetation index ‘(b1 − b2)/(b3 − b4)’ increased from 0.80 to 0.84, and the RMSE decreased from 0.75 t/ha to 0.69 t/ha, when the booting, heading, filling, and maturation stages were added to the flowering stage of the simple linear model via the MLR model.
Although the MLR model combining multiple temporal hyperspectral vegetation indices calculated according to hyperspectral imagery acquired good yield estimation accuracy for winter wheat, machine learning algorithms have demonstrated the potential to retrieve crop characteristics using multispectral satellite data, aerial multispectral data, aerial hyperspectral data, and so on [57,58,59]. Therefore, future research should aim to explore machine learning regression models to strengthen crop estimation ability via multiple temporal UAV-derived hyperspectral datasets. Additionally, data fusion approaches which could integrate UAV data and satellite data-based vegetation index time-series curves together and improve temporal resolutions will also be investigated in the context of estimating crop yield in a future study.

5. Conclusions

This study assessed the accuracy of winter wheat yield estimation based on UAV-derived multispectral and hyperspectral images from single and multiple growth stages. The results suggest that the proposed multiple linear regression model, constructed using the vegetation index ‘(b1 − b2)/(b3 − b4)’ (central wavelengths of 782 nm, 874 nm, 762 nm, and 890 nm) calculated from UAV-based hyperspectral images across the growth stages from booting to maturation, can serve as a fast and reliable method of winter wheat yield estimation at the field plot scale and thereby contribute to the breeding of drought-resistant winter wheat varieties within a short amount of time. Moreover, the red-edge and near-infrared bands are recommended for use in crop yield estimation.
From a longer-term perspective, more in-depth investigations into crop yield estimation (including rice, maize, soybean, and other crops) based on UAV-mounted hyperspectral datasets (via not only linear regression models but also machine learning algorithms, data assimilation, and so on) are expected.

Author Contributions

Conceptualization, Y.L., L.S. and B.L.; methodology, Y.L., L.S. and B.L.; data acquisition, Y.L., L.S., B.L., Y.W., J.M., W.Z., B.W. and Z.C.; data analysis, Y.L.; writing, Y.L.; revision, L.S., B.L., Y.W. and J.M. L.S. and B.L. contributed equally to this work and should be considered co-corresponding authors. All authors have read and agreed to the published version of the manuscript.

Funding

This research study was funded by the National Key Research and Development Program of China (grant number 2022YFD2001102), the Science and Technology Innovation Project of the Chinese Academy of Agricultural Sciences (grant number GJ2023-20-4), and the Key Research and Development Program of Hebei province (grant number 20326406D).

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available due to privacy restrictions.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Food and Agriculture Organization of the United Nations. Available online: https://www.fao.org/faostat/en/#data/QCL/visualize (accessed on 28 July 2023).
2. Battude, M.; Al Bitar, A.; Morin, D.; Cros, J.; Huc, M.; Sicre, C.M.; Le Dantec, V.; Demarez, V. Estimating maize biomass and yield over large areas using high spatial and temporal resolution Sentinel-2 like remote sensing data. Remote Sens. Environ. 2016, 184, 668–681.
3. Maimaitijiang, M.; Sagan, V.; Sidike, P.; Hartling, S.; Esposito, F.; Fritschi, F.B. Soybean yield prediction from UAV using multimodal data fusion and deep learning. Remote Sens. Environ. 2020, 237, 111599.
4. Johnson, D.M. An assessment of pre-and within-season remotely sensed variables for forecasting corn and soybean yields in the United States. Remote Sens. Environ. 2014, 141, 116–128.
5. Holzman, M.E.; Carmona, F.; Rivas, R.; Niclòs, R. Early assessment of crop yield from remotely sensed water stress and solar radiation data. ISPRS J. Photogramm. Remote Sens. 2018, 145, 297–308.
6. Sun, L.; Gao, F.; Anderson, M.C.; Kustas, W.P.; Alsina, M.M.; Sanchez, L.; Sams, B.; McKee, L.; Dulaney, W.; White, W.A.; et al. Daily mapping of 30 m LAI and NDVI for grape yield prediction in California vineyards. Remote Sens. 2017, 9, 317.
7. Venancio, L.P.; Mantovani, E.C.; do Amaral, C.H.; Neale, C.M.U.; Gonçalves, I.Z.; Filgueiras, R.; Campos, I. Forecasting corn yield at the farm level in Brazil based on the FAO-66 approach and soil-adjusted vegetation index (SAVI). Agric. Water Manag. 2019, 225, 105779.
8. Hunt, M.L.; Blackburn, G.A.; Carrasco, L.; Redhead, J.W.; Rowland, C.S. High resolution wheat yield mapping using Sentinel-2. Remote Sens. Environ. 2019, 233, 111410.
9. Colomina, I.; Molina, P. Unmanned aerial systems for photogrammetry and remote sensing: A review. ISPRS J. Photogramm. Remote Sens. 2014, 92, 79–97.
10. Zhang, C.; Kovacs, J.M. The application of small unmanned aerial systems for precision agriculture: A review. Precis. Agric. 2012, 13, 693–712.
11. Hunt, E.R., Jr.; Daughtry, C.S.T. What good are unmanned aircraft systems for agricultural remote sensing and precision agriculture? Int. J. Remote Sens. 2018, 39, 5345–5376.
12. Sankaran, S.; Khot, L.R.; Espinoza, C.Z.; Jarolmasjed, S.; Sathuvalli, V.R.; Vandemark, G.J.; Miklas, P.N.; Carter, A.H.; Pumphrey, M.O.; Knowles, N.R. Low-altitude, high-resolution aerial imaging systems for row and field crop phenotyping: A review. Eur. J. Agron. 2015, 70, 112–123.
13. Duan, B.; Fang, S.; Gong, Y.; Peng, Y.; Wu, X.; Zhu, R. Remote estimation of grain yield based on UAV data in different rice cultivars under contrasting climatic zone. Field Crops Res. 2021, 267, 108148.
14. Han, X.; Wei, Z.; Chen, H.; Zhang, B.; Li, Y.; Du, T. Inversion of winter wheat growth parameters and yield under different water treatments based on UAV multispectral remote sensing. Front. Plant Sci. 2021, 12, 609876.
15. Yu, D.; Zha, Y.; Shi, L.; Jin, X.; Hu, S.; Yang, Q.; Huang, K.; Zeng, W. Improvement of sugarcane yield estimation by assimilating UAV-derived plant height observations. Eur. J. Agron. 2020, 121, 126159.
16. Zhang, X.; Zhang, K.; Sun, Y.; Zhao, Y.; Zhuang, H.; Ban, W.; Chen, Y.; Fu, E.; Chen, S.; Liu, J.; et al. Combining spectral and texture features of UAS-based multispectral images for maize leaf area index estimation. Remote Sens. 2022, 14, 331.
17. Lu, N.; Zhou, J.; Han, Z.; Li, D.; Cao, Q.; Yao, X.; Tian, Y.; Zhu, Y.; Cao, W.; Cheng, T. Improved estimation of aboveground biomass in wheat from RGB imagery and point cloud data acquired with a low-cost unmanned aerial vehicle system. Plant Methods 2019, 15, 1–16.
18. Liu, Y.; Hatou, K.; Aihara, T.; Kurose, S.; Akiyama, T.; Kohno, Y.; Lu, S.; Omasa, K. A robust vegetation index based on different UAV RGB images to estimate SPAD values of naked barley leaves. Remote Sens. 2021, 13, 686.
19. Yu, J.; Wang, J.; Leblon, B. Evaluation of soil properties, topographic metrics, plant height, and unmanned aerial vehicle multispectral imagery using machine learning methods to estimate canopy nitrogen weight in corn. Remote Sens. 2021, 13, 3105.
20. Zheng, H.; Cheng, T.; Li, D.; Zhou, X.; Yao, X.; Tian, Y.; Cao, W.; Zhu, Y. Evaluation of RGB, color-infrared and multispectral images acquired from unmanned aerial systems for the estimation of nitrogen accumulation in rice. Remote Sens. 2018, 10, 824.
21. Swain, K.C.; Thomson, S.J.; Jayasuriya, H.P.W. Adoption of an unmanned helicopter for low-altitude remote sensing to estimate yield and total biomass of a rice crop. Trans. ASABE 2010, 53, 21–27.
22. Teoh, C.C.; Nadzim, N.; Mohd Shahmihaizan, M.J.; Mohd Khairil Izani, I.; Faizal, K.; Mohd Shukry, H.B. Rice yield estimation using below cloud remote sensing images acquired by unmanned airborne vehicle system. Int. J. Adv. Sci. Eng. Inf. Technol. 2016, 6, 516–519.
23. García-Martínez, H.; Flores-Magdaleno, H.; Ascencio-Hernández, R.; Khalil-Gardezi, A.; Tijerina-Chávez, L.; Mancilla-Villa, O.R.; Vázquez-Peña, M.A. Corn grain yield estimation from vegetation indices, canopy cover, plant density, and a neural network using multispectral and RGB images acquired with unmanned aerial vehicles. Agriculture 2020, 10, 277.
24. Ramos, A.P.M.; Osco, L.P.; Furuya, D.E.G.; Gonçalves, W.N.; Santana, D.C.; Teodoro, L.P.R.; da Silva Junior, C.A.; Capristo-Silva, G.F.; Li, J.; Baio, F.H.R.; et al. A random forest ranking approach to predict yield in maize with UAV-based vegetation spectral indices. Comput. Electron. Agric. 2020, 178, 105791.
25. Liu, J.; Zhao, C.; Yang, G.; Yu, H.; Zhao, X.; Xu, B.; Niu, Q. Review of field-based phenotyping by unmanned aerial vehicle remote sensing platform. Trans. Chin. Soc. Agric. Eng. 2016, 32, 98–106.
26. Yue, J.; Yang, G.; Li, C.; Li, Z.; Wang, Y.; Feng, H.; Xu, B. Estimation of winter wheat above-ground biomass using unmanned aerial vehicle-based snapshot hyperspectral sensor and crop height improved models. Remote Sens. 2017, 9, 708.
27. Fan, J.; Zhou, J.; Wang, B.; de Leon, N.; Kaeppler, S.M.; Lima, D.C.; Zhang, Z. Estimation of Maize Yield and Flowering Time Using Multi-Temporal UAV-Based Hyperspectral Data. Remote Sens. 2022, 14, 3052.
28. Zhou, X.; Zheng, H.B.; Xu, X.Q.; He, J.Y.; Ge, X.K.; Yao, X.; Cheng, T.; Zhu, Y.; Cao, W.X.; Tian, Y.C. Predicting grain yield in rice using multi-temporal vegetation indices from UAV-based multispectral and digital imagery. ISPRS J. Photogramm. Remote Sens. 2017, 130, 246–255.
29. Crusiol, L.G.T.; Sun, L.; Sun, Z.; Chen, R.; Wu, Y.; Ma, J.; Song, C. In-season monitoring of maize leaf water content using ground-based and UAV-based hyperspectral data. Sustainability 2022, 14, 9039.
30. Shu, M.; Shen, M.; Zuo, J.; Yin, P.; Wang, M.; Xie, Z.; Tang, J.; Wang, R.; Li, B.; Yang, X.; et al. The application of UAV-based hyperspectral imaging to estimate crop traits in maize inbred lines. Plant Phenomics 2021, 2021, 9890745.
31. Ashourloo, D.; Nematollahi, H.; Huete, A.; Aghighi, H.; Azadbakht, M.; Shahrabi, H.S.; Goodarzdashti, S. A new phenology-based method for mapping wheat and barley using time-series of Sentinel-2 images. Remote Sens. Environ. 2022, 280, 113206.
32. Kawashima, S.; Nakatani, M. An algorithm for estimating chlorophyll content in leaves using a video camera. Ann. Bot. 1998, 81, 49–54.
33. Tucker, C.J. Red and photographic infrared linear combinations for monitoring vegetation. Remote Sens. Environ. 1979, 8, 127–150.
34. Gitelson, A.A.; Viña, A.; Ciganda, V.; Rundquist, D.C.; Arkebauer, T.J. Remote estimation of canopy chlorophyll content in crops. Geophys. Res. Lett. 2005, 32.
35. Rouse, J.W.; Haas, R.H.; Schell, J.A.; Deering, D.W. Monitoring vegetation systems in the Great Plains with ERTS. In Proceedings of the Third Earth Resources Technology Satellite-1 Symposium, Greenbelt, MD, USA, 10–14 December 1973; NASA SP-351: Greenbelt, MD, USA, 1973; pp. 309–317.
36. Gitelson, A.A.; Gritz, Y.; Merzlyak, M.N. Relationships between leaf chlorophyll content and spectral reflectance and algorithms for non-destructive chlorophyll assessment in higher plant leaves. J. Plant Physiol. 2003, 160, 271–282.
37. Hunt, E.R.; Cavigelli, M.; Daughtry, C.S.T.; Mcmurtrey, J.E.; Walthall, C.L. Evaluation of digital photography from model aircraft for remote sensing of crop biomass and nitrogen status. Precis. Agric. 2005, 6, 359–378.
38. Gitelson, A.A.; Merzlyak, M.N. Remote estimation of chlorophyll content in higher plant leaves. Int. J. Remote Sens. 1997, 18, 2691–2697.
39. Hassan, M.A.; Yang, M.; Rasheed, A.; Jin, X.; Xia, X.; Xiao, Y.; He, Z. Time-series multispectral indices from unmanned aerial vehicle imagery reveal senescence rate in bread wheat. Remote Sens. 2018, 10, 809.
40. Raper, T.B.; Varco, J.J. Canopy-scale wavelength and vegetative index sensitivities to cotton growth parameters and nitrogen status. Precis. Agric. 2015, 16, 62–76.
41. Huete, A.; Didan, K.; Miura, T.; Rodriguez, E.P.; Gao, X.; Ferreira, L.G. Overview of the radiometric and biophysical performance of the MODIS vegetation indices. Remote Sens. Environ. 2002, 83, 195–213.
42. Jiang, Z.; Huete, A.R.; Didan, K.; Miura, T. Development of a two-band enhanced vegetation index without a blue band. Remote Sens. Environ. 2008, 112, 3833–3845.
43. Rondeaux, G.; Steven, M.; Baret, F. Optimization of soil-adjusted vegetation indices. Remote Sens. Environ. 1996, 55, 95–107.
44. Daughtry, C.S.T.; Walthall, C.L.; Kim, M.S.; de Colstoun, E.B.; McMurtrey, J.E., III. Estimating corn leaf chlorophyll concentration from leaf and canopy reflectance. Remote Sens. Environ. 2000, 74, 229–239.
45. Haboudane, D.; Miller, J.R.; Tremblay, N.; Zarco-Tejada, P.J.; Dextraze, L. Integrated narrow-band vegetation indices for prediction of crop chlorophyll content for application to precision agriculture. Remote Sens. Environ. 2002, 81, 416–426.
46. Gitelson, A.A. Wide dynamic range vegetation index for remote quantification of biophysical characteristics of vegetation. J. Plant Physiol. 2004, 161, 165–173.
47. Beck, P.S.A.; Atzberger, C.; Høgda, K.A.; Johansen, B.; Skidmore, A.K. Improved monitoring of vegetation dynamics at very high latitudes: A new method using MODIS NDVI. Remote Sens. Environ. 2006, 100, 321–334.
48. Gnyp, M.L.; Miao, Y.; Yuan, F.; Ustin, S.L.; Yu, K.; Yao, Y.; Huang, S.; Bareth, G. Hyperspectral canopy sensing of paddy rice aboveground biomass at different growth stages. Field Crops Res. 2014, 155, 42–55.
49. Li, H.; Zhao, C.; Yang, G.; Feng, H. Variations in crop variables within wheat canopies and responses of canopy spectral characteristics and derived vegetation indices to different vertical leaf layers and spikes. Remote Sens. Environ. 2015, 169, 358–374.
50. Shaver, T.M.; Khosla, R.; Westfall, D.G. Evaluation of two ground-based active crop canopy sensors in maize: Growth stage, row spacing, and sensor movement speed. Soil Sci. Soc. Am. J. 2010, 74, 2101–2108.
51. Zhang, K.; Ge, X.; Shen, P.; Li, W.; Liu, X.; Cao, Q.; Zhu, Y.; Cao, W.; Tian, Y. Predicting rice grain yield based on dynamic changes in vegetation indexes during early to mid-growth stages. Remote Sens. 2019, 11, 387.
52. Thompson, L.J.; Ferguson, R.B.; Kitchen, N.; Frazen, D.W.; Mamo, M.; Yang, H.; Schepers, J.S. Model and sensor-based recommendation approaches for in-season nitrogen management in corn. Agron. J. 2015, 107, 2020–2030.
53. Thenkabail, P.S.; Smith, B.; De Pauw, E. Hyperspectral vegetation indices and their relationships with agricultural crop characteristics. Remote Sens. Environ. 2000, 71, 158–182.
54. Qader, S.H.; Dash, J.; Atkinson, P.M. Forecasting wheat and barley crop production in arid and semi-arid regions using remotely sensed primary productivity and crop phenology: A case study in Iraq. Sci. Total Environ. 2018, 613, 250–262.
55. Serrano, L.; Filella, I.; Penuelas, J. Remote sensing of biomass and yield of winter wheat under different nitrogen supplies. Crop Sci. 2000, 40, 723–731.
56. Wang, L.; Tian, Y.; Yao, X.; Zhu, Y.; Cao, W. Predicting grain yield and protein content in wheat by fusing multi-sensor and multi-temporal remote-sensing images. Field Crops Res. 2014, 164, 178–188.
57. Crusiol, L.G.T.; Sun, L.; Sibaldelli, R.N.R.; Junior, V.F.; Furlaneti, W.X.; Chen, R.; Sun, Z.; Wuyun, D.; Chen, C.; Nanni, M.R.; et al. Strategies for monitoring within-field soybean yield using Sentinel-2 Vis-NIR-SWIR spectral bands and machine learning regression methods. Precis. Agric. 2022, 23, 1093–1123.
58. Ma, J.; Liu, B.; Ji, L.; Zhu, Z.; Wu, Y.; Jiao, W. Field-scale yield prediction of winter wheat under different irrigation regimes based on dynamic fusion of multimodal UAV imagery. Int. J. Appl. Earth Obs. Geoinf. 2023, 118, 103292.
59. da Silva, E.E.; Baio, F.H.R.; Teodoro, L.P.R.; da Silva, C.A., Jr.; Borges, R.S.; Teodoro, P.E. UAV-multispectral and vegetation indices in soybean grain yield prediction based on in situ observation. Remote Sens. Appl. Soc. Environ. 2020, 18, 100318.
Figure 1. Experimental site field map. All cultivars were randomly distributed in each irrigation group.
Figure 2. Mean measured winter wheat yield value under different irrigation groups (a) and different cultivars (b).
Figure 3. UAVs employed in the study. Multispectral images were captured by the DJI P4 Multispectral (a). Hyperspectral images were captured by the DJI M600 Pro (through the use of a Pika L hyperspectral camera) (b).
Figure 4. Relationship between measured winter wheat yield and estimated winter wheat yield calculated using simple linear regression models based on the multispectral vegetation index of ‘NDRE’ (a) and the hyperspectral band combination of ‘(b1 - b2)/(b3 - b4)’ (b) at the flowering stage (n = 144). The red line is the fitted line between measured yield and estimated yield.
Figure 5. Correlation coefficients between winter wheat yield and multispectral vegetation indices combining five growth stages based on multiple linear regression models (n = 144).
Figure 6. Relationship between measured winter wheat yield and estimated winter wheat yield calculated using multiple linear regression models based on the multispectral vegetation index of ‘NDRE’ (a) and the hyperspectral band combination of ‘(b1 - b2)/(b3 - b4)’ (b) combining five growing stages (n = 144). The red line is the fitted line between measured yield and estimated yield.
Figure 7. The highest correlation coefficient between winter wheat yield and each band combination type of the hyperspectral sensor images when the five growth stages were combined (results based on the multiple linear regression models) (n = 144).
Figure 8. Validation of the simple linear regression model used to estimate winter wheat yield at the flowering stage based on ‘NDRE’ (a) and ‘(b1 - b2)/(b3 - b4)’ (b) (n = 70). The red line is the fitted line between measured yield and estimated yield.
Figure 9. RMSE and MAPE of the validation for winter wheat yield estimation using multispectral vegetation indices (a) and hyperspectral vegetation indices (b) combining five growth stages (n = 70).
Figure 10. Validation of the multiple linear regression model estimating winter wheat yield combining five different growth stages based on ‘NDRE’ (a) and ‘(b1 - b2)/(b3 - b4)’ (b) (n = 70). The red line is the fitted line between measured yield and estimated yield.
Figure 11. The spectrum of each irrigation group for the winter wheat cultivar ‘C9’ at the flowering stage.
Table 1. The irrigation date and corresponding growth stages of different irrigated sub-plots in the study.
| Irrigation Group | Irrigation Date (d/m/y) | Growth Stage | Total Irrigation Volume (m³/ha) |
|---|---|---|---|
| A | 3 April 2021; 3 May 2021 | Jointing stage; Flowering stage | 1500 |
| B | None | - | 0 |
| C | 29 November 2020 | Overwintering stage | 750 |
| D | 10 March 2021 | Regreen stage | 750 |
| E | 3 April 2021 | Jointing stage | 750 |
| F | 10 April 2021 | Jointing stage | 750 |
| G | 18 April 2021 | Booting stage | 750 |
Table 2. Descriptive statistics of measured winter wheat yield (t/ha).
| Parameter | Number of Samples | Minimum | Maximum | Mean | Standard Deviation | Coefficient of Variation |
|---|---|---|---|---|---|---|
| Grain yield | 214 | 6.46 | 11.24 | 8.58 | 1.28 | 14.97% |
Table 3. Results of our one-way analysis of variance of winter wheat yield for different irrigation groups and cultivars at 0.05 probability level.
| | F-Value | p-Value |
|---|---|---|
| Different irrigation groups | 13.73 | 0.00 |
| Different winter wheat cultivars | 9.84 | 0.00 |
Table 4. The details of UAV multispectral and hyperspectral imagery acquisition in the study.
| | Acquisition Date (d/m/y) | Growth Stage |
|---|---|---|
| UAV imagery | 18 April 2021 | Booting stage |
| | 28 April 2021 | Heading stage |
| | 12 May 2021 | Flowering stage |
| | 21 May 2021 | Filling stage |
| | 2 June 2021 | Maturation stage |
Table 6. Band combinations of the UAV-derived hyperspectral images used in the study.
| Number | Band Combination | Number | Band Combination |
|---|---|---|---|
| 1 | b1 + b2 | 10 | (b1 - b2) / b3 |
| 2 | b1 - b2 | 11 | (b1 + b2) / b3 |
| 3 | b1 / b2 | 12 | (b1 + b2) / (b3 + b2) |
| 4 | b1 / (b1 + b2) | 13 | (b1 - b2) / (b3 + b2) |
| 5 | b1 / (b1 - b2) | 14 | (b1 + b2) / (b3 - b2) |
| 6 | (b1 - b2) / (b1 + b2) | 15 | (b1 - b2) / (b3 - b2) |
| 7 | b1 + b2 - b3 | 16 | b1 - b2 + b3 - b4 |
| 8 | b1 / (b2 + b3) | 17 | (b1 - b2) / (b3 - b4) |
| 9 | b1 / (b2 - b3) | | |

Note: ‘b1’, ‘b2’, ‘b3’, and ‘b4’ denote the mean value of each band of the UAV-derived hyperspectral images.
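The band combinations in Table 6 imply an exhaustive screening over candidate bands. A brute-force sketch for the simplest ratio form, ‘b1 / b2’, is given below; the data are hypothetical and the study’s actual search procedure may differ:

```python
import numpy as np

def best_two_band_ratio(refl, yields):
    """Screen all ordered band pairs for the 'b1 / b2' ratio most correlated
    with yield.

    refl: (n_plots, n_bands) mean reflectance per plot. Returns the highest
    absolute correlation coefficient and the corresponding band indices.
    """
    best_r, best_pair = 0.0, None
    n_bands = refl.shape[1]
    for i in range(n_bands):
        for j in range(n_bands):
            if i == j:
                continue
            vi = refl[:, i] / refl[:, j]
            r = abs(np.corrcoef(vi, yields)[0, 1])
            if r > best_r:
                best_r, best_pair = r, (i, j)
    return best_r, best_pair
```

The same loop structure extends to the three- and four-band forms in Table 6 at the cost of more iterations.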
Table 7. Correlation coefficient (r) and RMSE between winter wheat yield and each multispectral vegetation index at the five different growth stages (n = 144).
| Multispectral Vegetation Index | Booting | Heading | Flowering | Filling | Maturation |
|---|---|---|---|---|---|
| DVI | 0.26 / 1.21 | 0.31 / 1.18 | 0.12 / 1.24 | 0.48 / 1.10 | 0.43 / 1.13 |
| RVI | 0.36 / 1.16 | 0.44 / 1.12 | 0.56 / 1.03 | 0.61 / 0.98 | 0.44 / 1.12 |
| GCI | 0.07 / 1.24 | 0.25 / 1.21 | 0.63 / 0.97 | 0.56 / 1.03 | 0.34 / 1.17 |
| RECI | 0.01 / 1.25 | 0.35 / 1.17 | 0.67 / 0.93 | 0.66 / 0.93 | 0.33 / 1.18 |
| NDVI | 0.37 / 1.16 | 0.46 / 1.11 | 0.57 / 1.02 | 0.64 / 0.96 | 0.49 / 1.09 |
| GNDVI | 0.10 / 1.24 | 0.28 / 1.20 | 0.64 / 0.96 | 0.61 / 0.98 | 0.36 / 1.16 |
| GRVI | 0.14 / 1.24 | 0.21 / 1.22 | 0.24 / 1.21 | 0.49 / 1.09 | 0.49 / 1.09 |
| GBVI | 0.21 / 1.22 | 0.22 / 1.22 | 0.26 / 1.20 | 0.39 / 1.15 | 0.39 / 1.15 |
| NDRE | 0.00 / 1.25 | 0.36 / 1.16 | 0.67 / 0.92 | 0.67 / 0.93 | 0.33 / 1.18 |
| NDREI | 0.16 / 1.23 | 0.14 / 1.24 | 0.48 / 1.10 | 0.52 / 1.07 | 0.28 / 1.20 |
| SCCCI | 0.09 / 1.24 | 0.26 / 1.20 | 0.48 / 1.09 | 0.37 / 1.16 | 0.16 / 1.23 |
| EVI | 0.38 / 1.15 | 0.47 / 1.10 | 0.16 / 1.23 | 0.12 / 1.24 | 0.04 / 1.25 |
| EVI2 | 0.37 / 1.16 | 0.45 / 1.11 | 0.57 / 1.02 | 0.63 / 0.96 | 0.47 / 1.10 |
| OSAVI | 0.16 / 1.24 | 0.33 / 1.20 | 0.44 / 1.13 | 0.54 / 1.06 | 0.16 / 1.23 |
| MCARI | 0.21 / 1.22 | 0.14 / 1.23 | 0.20 / 1.22 | 0.48 / 1.10 | 0.47 / 1.10 |
| TCARI | 0.03 / 1.25 | 0.09 / 1.24 | 0.08 / 1.24 | 0.19 / 1.22 | 0.53 / 1.06 |
| MCARI/OSAVI | 0.21 / 1.22 | 0.14 / 1.23 | 0.20 / 1.22 | 0.48 / 1.10 | 0.47 / 1.10 |
| TCARI/OSAVI | 0.03 / 1.25 | 0.09 / 1.24 | 0.08 / 1.24 | 0.19 / 1.22 | 0.53 / 1.06 |
| WDRVI | 0.37 / 1.16 | 0.45 / 1.12 | 0.57 / 1.03 | 0.63 / 0.97 | 0.45 / 1.11 |

Note: each cell gives r / RMSE (t/ha). The best results in terms of the r and RMSE values derived from the use of the simple linear regression functions for each growth stage are in bold typeface.
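For reference, NDRE, the strongest multispectral index at the flowering and filling stages in Table 7, is the standard normalised difference of the near-infrared and red-edge bands; a small sketch with illustrative (not measured) reflectance values:

```python
import numpy as np

def ndre(nir, red_edge):
    """Normalised Difference Red Edge index: (NIR - RE) / (NIR + RE)."""
    nir = np.asarray(nir, dtype=float)
    red_edge = np.asarray(red_edge, dtype=float)
    return (nir - red_edge) / (nir + red_edge)

# Example with illustrative reflectance values for three plots.
print(ndre([0.45, 0.50, 0.40], [0.25, 0.22, 0.30]))
```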
Table 8. The highest correlation coefficient between winter wheat yield and each band combination type of hyperspectral sensor images at five different growth stages (n = 144).
| Band Combination | Booting | Heading | Flowering | Filling | Maturation |
|---|---|---|---|---|---|
| b1 + b2 | 0.45 / 1.11 | 0.31 / 1.18 | 0.67 / 0.93 | 0.60 / 1.00 | 0.44 / 1.12 |
| b1 - b2 | 0.57 / 1.02 | 0.61 / 0.99 | 0.73 ** / 0.86 | 0.75 ** / 0.83 | 0.66 / 0.93 |
| b1 / b2 | 0.57 / 1.03 | 0.61 / 0.99 | 0.75 ** / 0.82 | 0.75 ** / 0.83 | 0.65 / 0.95 |
| b1 / (b1 + b2) | 0.57 / 1.03 | 0.61 / 0.99 | 0.75 ** / 0.82 | 0.74 ** / 0.83 | 0.65 / 0.95 |
| b1 / (b1 - b2) | 0.56 / 1.04 | 0.56 / 1.03 | 0.74 ** / 0.84 | 0.70 / 0.90 | 0.68 / 0.91 |
| (b1 - b2) / (b1 + b2) | 0.57 / 1.03 | 0.61 / 0.99 | 0.75 ** / 0.82 | 0.74 ** / 0.83 | 0.65 / 0.95 |
| b1 + b2 - b3 | 0.60 / 1.00 | 0.65 / 0.95 | 0.76 ** / 0.81 | 0.70 / 0.89 | 0.70 / 0.88 |
| b1 / (b2 + b3) | 0.61 / 0.99 | 0.67 / 0.93 | 0.79 ** / 0.77 | 0.76 ** / 0.82 | 0.72 ** / 0.86 |
| b1 / (b2 - b3) | 0.60 / 0.99 | 0.65 / 0.95 | 0.76 ** / 0.80 | 0.74 ** / 0.84 | 0.73 ** / 0.85 |
| (b1 - b2) / b3 | 0.60 / 1.00 | 0.65 / 0.95 | 0.78 ** / 0.78 | 0.75 ** / 0.83 | 0.73 ** / 0.86 |
| (b1 + b2) / b3 | 0.61 / 0.99 | 0.67 / 0.93 | 0.78 ** / 0.77 | 0.76 ** / 0.82 | 0.73 ** / 0.86 |
| (b1 + b2) / (b3 + b2) | 0.60 / 1.00 | 0.65 / 0.95 | 0.78 ** / 0.79 | 0.75 ** / 0.83 | 0.72 ** / 0.86 |
| (b1 - b2) / (b3 + b2) | 0.60 / 1.00 | 0.65 / 0.95 | 0.78 ** / 0.79 | 0.75 ** / 0.83 | 0.72 ** / 0.86 |
| (b1 + b2) / (b3 - b2) | 0.60 / 0.99 | 0.65 / 0.95 | 0.77 ** / 0.80 | 0.73 ** / 0.85 | 0.73 ** / 0.85 |
| (b1 - b2) / (b3 - b2) | 0.57 / 1.02 | 0.61 / 0.99 | 0.76 ** / 0.80 | 0.77 ** / 0.79 | 0.68 / 0.91 |
| b1 - b2 + b3 - b4 | 0.64 / 0.96 | 0.66 / 0.94 | 0.79 ** / 0.77 | 0.76 ** / 0.81 | 0.73 ** / 0.85 |
| (b1 - b2) / (b3 - b4) | 0.65 / 0.95 | 0.68 / 0.92 | 0.80 ** / 0.75 | 0.78 ** / 0.78 | 0.75 ** / 0.83 |

Note: each cell gives r / RMSE (t/ha). ‘b1’, ‘b2’, ‘b3’, and ‘b4’ denote the mean value of each band of the UAV hyperspectral images. ** indicates significance at p-value < 0.01. The best results in terms of the r and RMSE values derived from the use of the simple linear regression functions for each growth stage are in bold typeface.
Table 9. RMSE and MAPE for winter wheat yield estimation with different multispectral vegetation indices at each individual growth stage (n = 70).
| Multispectral Vegetation Index | Booting | Heading | Flowering | Filling | Maturation |
|---|---|---|---|---|---|
| DVI | 1.35 / 14.09% | 1.39 / 14.62% | 1.31 / 13.44% | 1.15 / 11.15% | 1.36 / 13.94% |
| RVI | 1.24 / 13.28% | 1.10 / 11.43% | 1.00 / 9.93% | 0.97 / 9.99% | 1.17 / 12.12% |
| GCI | 1.34 / 13.87% | 1.22 / 12.71% | 0.96 / 9.34% | 1.03 / 10.52% | 1.22 / 12.44% |
| RECI | 1.35 / 13.94% | 1.21 / 12.45% | 0.91 / 8.58% | 0.87 / 8.69% | 1.19 / 12.05% |
| NDVI | 1.24 / 13.22% | 1.08 / 11.32% | 0.99 / 9.91% | 0.95 / 9.74% | 1.15 / 12.01% |
| GNDVI | 1.33 / 13.83% | 1.20 / 12.48% | 0.95 / 9.23% | 0.96 / 9.78% | 1.21 / 12.36% |
| GRVI | 1.36 / 14.08% | 1.37 / 14.37% | 1.24 / 12.91% | 1.09 / 11.33% | 1.23 / 12.93% |
| GBVI | 1.36 / 14.10% | 1.40 / 14.58% | 1.25 / 12.72% | 1.19 / 11.82% | 1.36 / 13.92% |
| NDRE | 1.35 / 13.93% | 1.20 / 12.38% | 0.84 / 8.38% | 0.85 / 8.47% | 1.18 / 12.00% |
| NDREI | 1.34 / 13.88% | 1.28 / 13.30% | 1.17 / 11.97% | 1.12 / 11.52% | 1.31 / 13.57% |
| SCCCI | 1.38 / 14.11% | 1.28 / 13.13% | 1.13 / 11.22% | 1.13 / 11.54% | 1.35 / 13.96% |
| EVI | 1.23 / 13.11% | 1.10 / 11.37% | 1.37 / 13.94% | 1.35 / 13.93% | 1.35 / 13.93% |
| EVI2 | 1.24 / 13.24% | 1.09 / 11.33% | 0.99 / 9.90% | 0.95 / 9.79% | 1.16 / 12.03% |
| OSAVI | 1.34 / 14.35% | 1.22 / 12.63% | 1.12 / 11.34% | 1.03 / 10.55% | 1.35 / 13.95% |
| MCARI | 1.39 / 14.46% | 1.33 / 13.93% | 1.30 / 13.60% | 1.16 / 11.98% | 1.20 / 12.65% |
| TCARI | 1.35 / 13.89% | 1.33 / 13.67% | 1.35 / 13.92% | 1.29 / 13.55% | 1.19 / 12.90% |
| MCARI/OSAVI | 1.39 / 14.46% | 1.33 / 13.93% | 1.30 / 13.60% | 1.16 / 11.98% | 1.20 / 12.65% |
| TCARI/OSAVI | 1.35 / 13.89% | 1.33 / 13.67% | 1.35 / 13.92% | 1.29 / 13.55% | 1.19 / 12.90% |
| WDRVI | 1.24 / 13.26% | 1.09 / 11.37% | 0.99 / 9.90% | 0.96 / 9.88% | 1.17 / 12.07% |

Note: each cell gives RMSE (t/ha) / MAPE.
Table 10. RMSE and MAPE for winter wheat yield estimation with different band combination types of the UAV-derived hyperspectral sensor images at each individual growth stage (n = 70).
| Band Combination | Booting | Heading | Flowering | Filling | Maturation |
|---|---|---|---|---|---|
| b1 + b2 | 1.21 / 12.69% | 1.24 / 13.24% | 0.94 / 9.50% | 1.09 / 11.24% | 1.24 / 12.61% |
| b1 - b2 | 1.15 / 11.43% | 1.21 / 11.94% | 0.94 / 9.20% | 0.97 / 9.80% | 1.00 / 10.34% |
| b1 / b2 | 1.27 / 12.58% | 1.10 / 10.95% | 0.90 / 8.84% | 0.97 / 10.05% | 0.95 / 9.71% |
| b1 / (b1 + b2) | 1.27 / 12.59% | 1.10 / 10.94% | 0.89 / 8.77% | 0.97 / 10.04% | 0.96 / 9.65% |
| b1 / (b1 - b2) | 2.21 / 17.83% | 1.16 / 11.81% | 0.87 / 8.79% | 1.01 / 10.03% | 0.91 / 9.26% |
| (b1 - b2) / (b1 + b2) | 1.27 / 12.59% | 1.10 / 10.94% | 0.89 / 8.77% | 0.97 / 10.04% | 0.96 / 9.65% |
| b1 + b2 - b3 | 1.09 / 11.07% | 1.10 / 10.90% | 0.83 / 7.81% | 0.97 / 9.82% | 0.93 / 9.39% |
| b1 / (b2 + b3) | 1.15 / 11.55% | 1.02 / 10.29% | 0.79 / 7.38% | 0.98 / 10.11% | 0.89 / 8.79% |
| b1 / (b2 - b3) | 1.07 / 10.95% | 1.09 / 10.81% | 0.84 / 8.36% | 0.93 / 9.13% | 0.91 / 9.18% |
| (b1 - b2) / b3 | 1.09 / 11.27% | 1.09 / 10.86% | 0.80 / 7.64% | 0.96 / 9.94% | 0.94 / 9.51% |
| (b1 + b2) / b3 | 1.15 / 11.55% | 1.02 / 10.30% | 0.79 / 7.68% | 0.98 / 10.11% | 0.88 / 8.69% |
| (b1 + b2) / (b3 + b2) | 1.08 / 11.25% | 1.12 / 10.99% | 0.84 / 7.87% | 0.97 / 9.99% | 0.93 / 9.35% |
| (b1 - b2) / (b3 + b2) | 1.08 / 11.25% | 1.12 / 10.99% | 0.84 / 7.87% | 0.97 / 9.99% | 0.93 / 9.35% |
| (b1 + b2) / (b3 - b2) | 1.08 / 11.12% | 1.11 / 10.92% | 0.84 / 8.40% | 0.90 / 9.24% | 0.91 / 9.03% |
| (b1 - b2) / (b3 - b2) | 1.27 / 12.55% | 1.10 / 10.92% | 0.91 / 9.00% | 0.83 / 8.08% | 0.98 / 9.88% |
| b1 - b2 + b3 - b4 | 1.10 / 11.30% | 1.05 / 10.48% | 0.79 / 7.40% | 0.88 / 8.94% | 0.95 / 10.02% |
| (b1 - b2) / (b3 - b4) | 0.97 / 9.93% | 1.02 / 10.10% | 0.78 / 7.24% | 0.84 / 8.47% | 0.89 / 8.92% |

Note: each cell gives RMSE (t/ha) / MAPE.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Liu, Y.; Sun, L.; Liu, B.; Wu, Y.; Ma, J.; Zhang, W.; Wang, B.; Chen, Z. Estimation of Winter Wheat Yield Using Multiple Temporal Vegetation Indices Derived from UAV-Based Multispectral and Hyperspectral Imagery. Remote Sens. 2023, 15, 4800. https://doi.org/10.3390/rs15194800

