Article

Assessment of Maize Growth and Development with High- and Medium-Resolution Remote Sensing Products

by
Rocío Ballesteros
1,*,
Miguel A. Moreno
1,
Fellype Barroso
2,
Laura González-Gómez
3 and
José F. Ortega
1
1
Institute of Regional Development, University of Castilla-La Mancha, 02071 Albacete, Spain
2
Northeast Citizenship Institute, Fortaleza 60.724-502, Brazil
3
Irrigation Department, Centro de Edafología y Biología Aplicada del Segura (CEBAS-CSIC), 30100 Murcia, Spain
*
Author to whom correspondence should be addressed.
Agronomy 2021, 11(5), 940; https://doi.org/10.3390/agronomy11050940
Submission received: 13 April 2021 / Revised: 2 May 2021 / Accepted: 3 May 2021 / Published: 10 May 2021

Abstract:
The availability of a great amount of remote sensing data for precision agriculture purposes has raised the question of which resolution and which indices, derived from satellites or unmanned aerial vehicles (UAVs), offer the most accurate results to characterize vegetation. This study focused on assessing, comparing, and discussing the performances and limitations of satellite- and UAV-based imagery in terms of canopy development, i.e., the leaf area index (LAI), and yield, i.e., the dry aboveground biomass (DAGB), for maize. Three commercial maize fields were studied over four seasons to obtain the LAI and DAGB. The normalized difference vegetation index (NDVI) and visible atmospherically resistant index (VARI) from satellite platforms (Landsat 5 TM, 7 ETM+, 8 OLI, and Sentinel 2A MSI) and the VARI and green canopy cover (GCC) from UAV imagery were compared. The remote sensing predictors, in addition to the growing degree days (GDD), were assessed to estimate the LAI and DAGB using multilinear regression models (MRMs). For LAI estimation, better adjustments were obtained when predictors from the UAV platform were considered. DAGB estimation revealed similar adjustments for both platforms, although the Landsat imagery offered slightly better adjustments. The results obtained in this study demonstrate the advantage of remote sensing platforms as a useful tool to estimate essential agronomic features.

1. Introduction

Maize (Zea mays L.) is the most produced cereal in the world, with more than one billion tons harvested in 2019 [1]. The maize harvested area was close to 200 million hectares, representing more than 14% of the worldwide cropped land. It is the most important crop for grain production, although wheat and rice are the most important for direct human consumption [2]. Research forecasts that the world population will grow to reach 9.1 billion in 2050. Therefore, maize should be considered one of the main pillars of global food security.
It is not possible to speak about global population increase without considering future global warming scenarios in which water availability will decrease in both amount and quality. Maize water requirements range from 500 to more than 800 mm year−1 [2], which makes this cereal one of the most water-demanding. Most maize-cultivated lands require supplemental irrigation for optimal growth. Therefore, maximizing crop production while reducing the water applied to the same cultivated land (crop water productivity) is one of the main challenges for global agriculture. In this context, precision agriculture plays a key role by improving sustainable decision making, defined as a way to apply the right treatment in the right place at the right time [3].
Precision agriculture demands crop monitoring and management as well as crop biomass and yield estimation, considering the leaf area index (LAI, m2 leaf m−2 soil) as one of the most representative indicators [4]. With the objective of improving yield predictions, accurate LAI characterization constitutes one of the first steps for crop modeling and, therefore, helps farming decision making. LAI characterization along the growth cycle provides significant information regarding the canopy architecture, efficiency of light interception, and yield—primarily for those crops such as maize with a large photosynthetic system. Field measurements are commonly destructive and tedious, and sampling is not repeatable across time. Intensive work has been conducted to obtain non-destructive approaches based on the empirical relationships between LAI and spectral vegetation indices (VIs) [5] or crop structure parameters, such as green canopy cover (GCC) (Ballesteros et al. 2014).
Satellite remote sensing is widely used for crop monitoring through the relationship between the normalized difference vegetation index (NDVI) and crop vigor [6], where infrared bands are essential for NDVI calculation. High-spatial-resolution observations are not obtained with conventional satellite missions such as Landsat or Sentinel. A new generation of satellites, e.g., RapidEye or Formosat-2, offers shorter revisit cycles and higher resolutions, although they do not supply open data. Sentinel-2 appeared as an opportunity to address these concerns, given the effectiveness of decametric-resolution satellite imagery for describing and monitoring crop growth and development [6]. However, this open-data satellite imagery can be unsuitable for field monitoring purposes in agricultural systems because it produces mixed soil, crop, and weed pixels, leading to biased VIs [7]. Using different platforms and sensors demands an analysis of the interoperability between the different products. The authors of [8] compared the atmospherically corrected NDVI from Landsat 8 with that from Sentinel-2. They found that Sentinel-2 offered slightly lower values than Landsat 8, with a very high correlation coefficient. However, the RMSE was 0.06, which is lower than the generated model errors, so we can be confident that this aspect did not greatly impact the final results. The normalized difference formulation of the NDVI was itself developed to reduce such negative impacts. The interoperability analysis between Landsat 7 and 8 [9] showed no deviation for high NDVI values and slight deviation when sparse vegetation was found (early crop development stages). Additionally, [10] showed an RMSE of 0.08 for the TOA NDVI between Landsat 8 and Landsat 7. Again, the impact of this deviation was low compared with the accuracy of the final model.
When analyzing the interoperability between Landsat 7 and Landsat 5, [11] indicated a high degree of similarity, which implies that monitoring activities initiated using Landsat 5 data can be continued with a minimal amount of caution using Landsat 7 data. In the case of UAV-based information, vegetation segmentation was not performed. The spectral response of soil and vegetation was averaged for each sampling area. We followed this strategy to ensure interoperability between satellite-based and UAV-based data.
The studies of Haboudane et al. (2004) and Viña et al. (2011) are only two examples of the great efforts made to describe the LAI using spectral data since remote sensing technologies emerged in the early 1980s. VIs have demonstrated a high sensitivity to the LAI. The NDVI has been one of the most traditionally used indices, although it is well known that the soil background influences LAI characterization when the NDVI is used [12,13]. Other authors utilized indices such as the soil-adjusted vegetation index (SAVI) to overcome this problem [14].
The visible atmospherically resistant index (VARI) has been widely described in the scientific literature for crop monitoring tasks such as LAI estimation [15]. In satellite-based remote sensing, where spatial resolution is low, VARI is not widely utilized because it is based on the relationship between visible bands, and the response of vegetation in the visible spectrum is much lower than that in the infrared region of the spectrum.
In contrast, where a higher resolution is obtained, as in UAV-based remote sensing, the visible spectrum supplies useful information that is not contaminated by elements other than vegetation [16]. However, Haboudane et al. agreed that it is difficult to recommend a specific index that describes only the desired variable and is insensitive to other environmental or biological parameters [12].
Unmanned aerial vehicles (UAVs) are becoming a pillar of precision agriculture in the context of Agriculture 4.0. They are easy to use, acquiring a platform is becoming more and more affordable, the products obtained are independent of cloudy conditions, and they are able to capture images of a higher spatial and temporal resolution [17] than the traditionally used Landsat or modern Sentinel satellites. However, there are still some important limitations, e.g., in large-scale applications. Large-scale remote sensing data allow for monitoring the heterogeneity of cropland, whilst high temporal resolutions allow for an accurate description of crop growth, especially when farming management decisions are required [4].
Most of the studies based on information derived from UAVs utilize multispectral information, which is an attempt to adapt the indices and methodologies used for satellite-based remote sensing to the images captured by UAVs. The drastic change in the spatial resolution must be accompanied by a change in the approach to obtain useful information for agriculture. The choice of red, green, and blue (RGB) bands when using UAV platforms is one of the main changes, with more affordable sensors and easier photogrammetry processes. The use of RGB information can be exploited by generating new RGB-based indices [15,16] or even exploiting the generated point clouds and, therefore, the 3D information [16,18]. Many scientific works achieved significant results by using RGB cameras mounted on UAV platforms [19,20,21,22].
Currently, real-time crop growth and biomass prediction, where LAI plays a key role [23], is one of the main concerns not only for research purposes but also for farming decision management. Many research works have been developed with this aim [5,6,21,22,24]. Most of them focused on studying specific resolutions, high or medium (satellite platforms) and high or very high resolutions (UAV platforms). However, to the authors’ knowledge, comparisons of both platforms have not been sufficiently developed for most agricultural frameworks, as most of them were focused on vineyards, orchard agricultural systems, and forests [7,25].
The need to answer the question of whether a higher resolution is needed in addition to deciding which indices—derived from satellite or UAVs—offer more accurate results to characterize maize vegetation inspired this present research. The objective of this research work was to assess, compare, and discuss the performance and limitations of satellite-based and UAV imagery in terms of the canopy development (LAI) and dry aboveground biomass (DAGB) for maize. To achieve this objective, a reliable description of the crops is required.

2. Materials and Methods

2.1. Study Area

This study was carried out in Tarazona de La Mancha (Albacete) in the southeast of Spain (Figure 1) over four seasons in 2011, 2012, 2015, and 2016. This area is classified as semi-arid, as the aridity index reaches 0.26 [22]. The meteorological variables were recorded from an agro-meteorological station located close to the field sites (X = 593,176 m; Y = 4,346,196 m (ETRS89, UTM30N EPSG: 25830), 722 m asl).
The daily minimum temperature (Tmin) values for the maize growing season ranged from 7.7 °C in 2011 to 9.8 °C in 2016; the daily maximum temperatures (Tmax) were higher than 33 °C (Table 1). According to the registered agro-climatic data, the 2016 season was dry compared with the rest of the seasons, with precipitation (P) and reference evapotranspiration (ETo) values higher than the mean registered ones for the four seasons (P = 230 mm and ETo = 942 mm) (Table 1).
Field experiments were performed in three different fields of irrigated FAO-700 cycles of maize grain crop: maize was grown in field A (18.02 ha) for the 2011 and 2012 seasons, in field B (42 ha) for the 2015 season, and in field C (37 ha) for the 2016 season (Figure 1). Field A was cultivated under conventional farming practices, whilst fields B and C were cultivated following conservation agriculture practices, i.e., promoting minimum mechanical and soil disturbances and permanent soil organic coverage. The landscape is characterized by a vast plain. The soil in the study area is classified as Inceptisol Xerept [26,27]. Soil analyses were performed before seeding for every season. The soil was characterized as sandy clay loam for field A and as sandy loam for fields B and C. Local soils were characterized by alkaline pH values, i.e., higher than 8.0. Organic matter contents ranged from 0.91 to 1.58% for field A and from 3.8 to 4.2% for fields B and C. The higher values of organic matter content for fields B and C compared with field A show how no-tillage systems increase organic matter content. The maize was sown on 29 April 2011, 26 April 2012, 3 April 2015, and 29 March 2016 with a seed density of 120,000 seeds ha−1 for the four seasons. The field experiments were performed in the mentioned three commercial fields belonging to different landowners under different cultivation techniques.
The growing degree days (GDD) were calculated using the double sine wave method [28,29]. The minimum and maximum threshold temperatures were 6 and 26 °C for the FAO-I stage, 7 and 31 °C for the FAO-II stage, 12 and 38 °C for the FAO-III stage, and 6 and 40 °C for the FAO-IV stage, respectively [30,31]. Harvest occurred on 28 and 30 November in 2011 and 2012 and 18 and 28 September in 2015 and 2016.
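The study accumulates GDD with the double sine wave method; as a minimal illustration only, the simpler averaging method with the same lower and upper thresholds can be sketched as follows (the function name and example temperatures are hypothetical, not values from the study):

```python
def daily_gdd(t_min, t_max, t_base, t_upper):
    """Simplified daily growing-degree-day contribution.

    Clamps the daily minimum and maximum temperatures to the
    [t_base, t_upper] window, then averages them and subtracts
    the base temperature. The study itself uses the more accurate
    double sine wave method [28,29]; this is only an approximation.
    """
    t_min_c = min(max(t_min, t_base), t_upper)
    t_max_c = min(max(t_max, t_base), t_upper)
    return (t_min_c + t_max_c) / 2.0 - t_base

# Example with the FAO-III stage thresholds from the study (12 and 38 deg C)
gdd = daily_gdd(t_min=15.0, t_max=33.0, t_base=12.0, t_upper=38.0)  # 12.0
```

Seasonal GDD totals such as the >2000 GDD at full ripeness reported above would be the running sum of these daily contributions, with the thresholds switched per FAO stage.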
Although the harvest dates were quite different for 2011 and 2012 compared with 2015 and 2016, the four seasons reached full ripeness when more than 2000 GDD were accumulated. Field A was irrigated using a permanent solid system (4918 m3 ha−1 in 2011 and 5210 m3 ha−1 in 2012). Fields B and C were irrigated using a central pivot system (8816 m3 ha−1 and 8067 m3 ha−1). Differences in the total amounts of irrigation were related to different irrigation strategies and scheduling as well as different farm managing strategies. These differences in farming strategies will serve to evaluate the generalizability of the obtained relationships.

2.2. Field Data Collection

For the four analyzed seasons, the maize was monitored weekly according to the Biologische Bundesanstalt, Bundessortenamt und Chemische Industrie (BBCH) scale [32]. The yield values were 15,300, 13,900, 16,674, and 17,343 kg ha−1 for the 2011, 2012, 2015, and 2016 seasons, respectively, according to the last sampling event when full ripening was reached.
The dates for field data acquisition were selected according to the main canopy changes, considering the main phenological stages as defined by [33]. The dates, together with the main phenological stages, are presented in Table 3. The sampling plots were randomly selected for every date; their coordinates are shown in Supplementary Table S1. Three 1.4 m2 sampling frames delimited each sampling plot for the 2011, 2012, and 2016 seasons, and four for the 2015 season. The total number of maize plants per frame was counted to determine the plant density. Six plants per sampling plot were collected. The LAI was calculated from the leaf area (LA), determined using an automated infrared imaging system (LI-COR LI-3100C; LI-COR Inc., Lincoln, NE, USA), and the plant density (Figure 2, step 3). The sampled maize plants were carried to the laboratory, where the dry aboveground biomass (DAGB) was determined for every sampling event. The coordinates of the sample points were obtained with a GNSS-RTK receiver with an accuracy of 2 cm in planimetry.

2.3. UAV Imagery

Very-high-resolution flights and field sampling events were performed at the same time. Red, green, and blue band (RGB) aerial images were taken using a Microdrone md4-200 (Microdrones, Inc., Kreuztal, Germany) equipped with a PENTAX A40 RGB digital camera (Pentax, Golden, CO, USA) in 2011 and 2012 and a Microdrone md4-1000 (Microdrones, Inc., Kreuztal, Germany) equipped with a SONY α ILCE-5100L camera (SONY Corporation, Tokyo, Japan) in 2015 and 2016. To obtain a ground resolution of 1.03 cm pixel−1, flights were performed at a height of 44 m for the md4-200 and 48 m for the md4-1000.
Aerial images were taken automatically following a flight plan for each UAV platform generated by the Microdrones Photogrammetric Flight Planning software (MFLIP) [34] (Figure 2, step 1.2.). This software requires, as input parameters, a digital elevation model (DEM) of the flight area and the internal orientation parameters of the digital camera. It also accounts for GPS errors, which improves the flight planning performance. The forward overlap of the imagery was set to 60% and the side overlap to 25%. Blurred images were automatically detected following the methodology proposed by [35]. The geomatic products (orthoimage, digital surface model (DSM), and point cloud) were obtained using PhotoScan Professional version 1.4.1 software (Agisoft LLC, St. Petersburg, Russia). Flights were always performed at solar midday, and sun glints and hotspots were detected following the advice of [36].

2.4. Satellite Data Acquisition and Preprocessing

Satellite images were downloaded for the dates closest to the sampling events and UAV flights in every season, as climatic conditions, such as clouds, did not always allow satellite imagery to be obtained for the same dates on which the sampling events and UAV flights were performed. The analyzed satellite images were extracted from the operational Landsat 5 TM (2011 season), Landsat 7 ETM+ (2011 and 2012 seasons), Landsat 8 OLI (2015 and 2016 seasons), and Sentinel 2A MSI (2016 season) missions (Figure 2, step 1.1.). Reflectance images with atmospheric correction were used. The Landsat images were downloaded free of charge from the United States Geological Survey (USGS) [37]. The Landsat 8 OLI, Landsat 5 TM, and Landsat 7 ETM+ Collection 2 surface reflectances are Level-2 data products, which apply radiometric calibration and atmospheric correction algorithms to Level-1 Landsat data products. In the case of Landsat 8 OLI, the data were processed by the suppliers using the Land Surface Reflectance Code (LaSRC). Landsat 5 TM and Landsat 7 ETM+ data were processed by the suppliers using the Landsat Ecosystem Disturbance Adaptive Processing System (LEDAPS) algorithm [37]. The Sentinel images were downloaded from the Copernicus Open Access Hub website [38] and corresponded to the Level-2A product (orthorectified Bottom-Of-Atmosphere reflectances). The Level-2A product applies an atmospheric correction to Top-Of-Atmosphere (TOA) Level-1C orthoimage products [39] and was processed by the suppliers using the Sen2Cor processor. The final products present geometric, radiometric, and atmospheric corrections. Specifically, 43 cloud-free images were used in this work (Table 2).

2.5. Calculation of the Vegetation Indices and Parameters

In the case of satellite-based remote sensing, the relationships between the LAI and DAGB field data and the VIs, NDVILandsat, NDVISentinel, VARILandsat, and VARISentinel were computed. The NDVI is the most widespread VI, while VARI was calculated to compare the results of using the RGB bands obtained with the drones with the visible bands for each utilized mission. The NDVI was computed using Formula (1) [40], where NIR and R are the near-infrared and red bands of the image, respectively (Figure 2, step 2.1.).
NDVI = (NIR - R)/(NIR + R) (1)
VARI was determined following Gitelson et al. (2002) (2), where r, g, and b are the normalized red, green, and blue bands of the image, respectively (Figure 2, step 2.2.).
VARI = (g - r)/(g + r - b) (2)
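Formulas (1) and (2) can be sketched directly on band arrays; the example reflectance values below are hypothetical, not data from the study:

```python
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - R) / (NIR + R), Formula (1)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red)

def vari(r, g, b):
    """VARI = (g - r) / (g + r - b), Formula (2), on normalized bands."""
    r, g, b = (np.asarray(x, dtype=float) for x in (r, g, b))
    return (g - r) / (g + r - b)

# Hypothetical reflectances for a vigorous canopy pixel
v_ndvi = ndvi(0.45, 0.05)        # approx. 0.8
v_vari = vari(0.08, 0.12, 0.06)  # approx. 0.29
```

Both functions accept scalars or whole band arrays, so they can be applied to an entire clipped satellite scene at once.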
The NDVI and VARI values were linearly interpolated between adjacent acquisition dates (Table 2) to obtain daily values for every field on the dates when the sampling events and UAV flights analyzed in this study were performed (Table 3).
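This daily interpolation can be sketched with piecewise linear interpolation; the acquisition days of year and NDVI values below are hypothetical:

```python
import numpy as np

# Hypothetical days of year of two satellite acquisitions and their NDVI
image_doy = np.array([150.0, 166.0])
image_ndvi = np.array([0.40, 0.72])

# Dates for which daily values are needed (sampling events / UAV flights)
sample_doy = np.array([150.0, 158.0, 166.0])

# Linear interpolation between the two adjacent acquisition dates
daily_ndvi = np.interp(sample_doy, image_doy, image_ndvi)
# daily_ndvi is approximately [0.40, 0.56, 0.72]
```

The same call would be repeated per field and per index (NDVI and VARI) over the full acquisition series.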
In the case of the UAV-based remote sensing data, VARIUAV and GCCUAV were calculated to monitor the LAI and DAGB along the growth cycle (Figure 2, step 2.2.). The radiometric quality of conventional RGB cameras is much lower than that provided by satellite missions. Thus, to minimize this effect, before calculating the proposed VI, the digital numbers (DNs) of the R, G, and B bands were averaged over every sampling plot. Nevertheless, non-normalized RGB coordinates are directly proportional to the total light reflected from a surface and are highly sensitive to the intensity of the illuminating source and its angle with the target surface. To minimize this problem, the RGB coordinates (the DNs for every band) were used to derive the red, green, and blue chromatic coordinates (ρ, γ, and β) [16,41], ensuring normalized values for each band. This procedure also minimizes the negative effect of using different sensors in the different years. The GCC was determined using the Leaf Area Index Calculation software [42], which segments the vegetation in an orthoimage using a machine learning algorithm.
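The chromatic coordinate normalization can be sketched as follows (the digital numbers in the example are hypothetical):

```python
def chromatic_coords(dn_r, dn_g, dn_b):
    """Convert mean RGB digital numbers to chromatic coordinates.

    rho = R/(R+G+B), gamma = G/(R+G+B), beta = B/(R+G+B).
    The three coordinates sum to 1, which removes the dependence on
    overall illumination intensity that raw DNs carry.
    """
    total = dn_r + dn_g + dn_b
    return dn_r / total, dn_g / total, dn_b / total

# Hypothetical plot-averaged digital numbers
rho, gamma, beta = chromatic_coords(60.0, 120.0, 20.0)
# rho = 0.3, gamma = 0.6, beta = 0.1
```

VARIUAV would then be computed from these normalized coordinates with Formula (2).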

2.6. Statistical Analysis

Statistical analysis was performed using MATLAB™ (MathWorks, Natick, MA, USA) (Figure 2, step 4). Two different groups of predictor variables were used to assess the LAI and DAGB: (i) a predictor derived from the agro-climatic variable, i.e., GDD (Figure 2, step 2.1. and step 2.2.); and (ii) the previously mentioned VIs, i.e., NDVI and VARI for the satellite imagery and GCC and VARI for the UAV imagery (Figure 2, step 2.1. and step 2.2.).
A multilinear regression model (MRM) was explored to estimate both target variables, i.e., the LAI and DAGB, for the 2011, 2012, and 2015 seasons from the Landsat missions (Landsat 5 TM, Landsat 7 ETM+, and Landsat 8 OLI), using NDVILandsat, VARILandsat, and the calculated GDD as predictors. An MRM was likewise used to relate the LAI and DAGB field data to the GCCUAV and VARIUAV obtained from the UAV imagery and the calculated GDD. As with the satellite approach, the LAI and DAGB models were trained using GCCUAV, VARIUAV, and GDD as independent variables for the 2011, 2012, and 2015 seasons. The testing of the selected models was performed using independent data from the 2016 season, which was not used in the previous processes.
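Such an MRM fit reduces to ordinary least squares on an intercept plus the chosen predictors; a minimal sketch, with hypothetical GCC/GDD/LAI values standing in for the field data, is:

```python
import numpy as np

# Hypothetical training data: one row per sampling event
# Columns: GCC_UAV (%), GDD; target: LAI (m2 leaf m-2 soil)
X = np.array([[20.0,  400.0],
              [55.0,  800.0],
              [90.0, 1100.0],
              [85.0, 1600.0]])
y = np.array([0.5, 2.1, 5.0, 4.3])

# Design matrix with an intercept column: LAI = b0 + b1*GCC + b2*GDD
A = np.column_stack([np.ones(len(X)), X])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)

# Predictions on the training data (testing would use a held-out season)
lai_pred = A @ coeffs
```

In the study, the model trained on the 2011, 2012, and 2015 seasons would then be evaluated on the independent 2016 data in exactly the same `A @ coeffs` form.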
The Sentinel platform was launched in 2015. Stepwise linear regression (SLR) was explored to predict the LAI and DAGB for the 2016 season to evaluate the performance of this mission. SLR may be considered when a small dataset is used, without losing a significant portion of the explanatory power of the data [43]. This was the case for the evaluation of the Sentinel 2A MSI mission, for which only one irrigation season (2016) was available in this study.
The considered predictor variables were NDVISentinel, VARISentinel, and the computed GDD for the LAI and DAGB. SLR was performed using forward and backward regression to determine the final model. The criterion used to add or remove terms was the p-value of the F test: if the p-value for a term was smaller than 0.05, the term was added to the model, whereas if it was higher than 0.1, the term was removed from the model.
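One forward step of this selection procedure can be sketched with a partial F-test on the residual sum of squares; the data and helper names below are hypothetical, and the study performed its analysis in MATLAB rather than with this code:

```python
import numpy as np
from scipy import stats

def fit_rss(X, y):
    """OLS fit (with intercept); returns residual sum of squares
    and the number of fitted parameters."""
    if X.size:
        A = np.column_stack([np.ones(len(y))] + list(X.T))
    else:
        A = np.ones((len(y), 1))  # intercept-only model
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coeffs
    return float(resid @ resid), A.shape[1]

def forward_step(X_all, y, selected, p_enter=0.05):
    """Add the candidate predictor with the smallest partial F-test
    p-value, if that p-value is below the entry threshold (0.05 in
    the study); otherwise return the selection unchanged."""
    n = len(y)
    rss_red, _ = fit_rss(X_all[:, selected], y)
    best = None
    for j in range(X_all.shape[1]):
        if j in selected:
            continue
        rss_full, p_full = fit_rss(X_all[:, selected + [j]], y)
        f_stat = (rss_red - rss_full) / (rss_full / (n - p_full))
        pval = stats.f.sf(f_stat, 1, n - p_full)
        if pval < p_enter and (best is None or pval < best[1]):
            best = (j, pval)
    return selected + [best[0]] if best else selected

# Hypothetical data: y depends strongly on column 0 only
x0 = np.arange(20.0)
x1 = np.cos(x0)
y = 2.0 * x0 + 1.0 + 0.1 * np.sin(x0)
X = np.column_stack([x0, x1])
selected = forward_step(X, y, [])  # selects column 0
```

A full SLR alternates such forward steps with backward steps that drop any already-selected term whose p-value rises above 0.1.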
The adjusted coefficient of determination (R2adj) and the root mean squared error (RMSE) were used to assess the performance of the obtained models.
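These two performance metrics can be computed directly from the observed and predicted values; a minimal sketch (function names are ours):

```python
import numpy as np

def rmse(obs, pred):
    """Root mean squared error between observations and predictions."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return float(np.sqrt(np.mean((obs - pred) ** 2)))

def r2_adjusted(obs, pred, n_predictors):
    """Adjusted R2, penalizing R2 for the number of predictors."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    n = len(obs)
    ss_res = np.sum((obs - pred) ** 2)
    ss_tot = np.sum((obs - obs.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    return float(1.0 - (1.0 - r2) * (n - 1) / (n - n_predictors - 1))
```

For the models above, `n_predictors` would be 2 (one VI plus GDD), and RMSE carries the units of the target variable (m2 leaf m−2 soil for the LAI, g m−2 soil for the DAGB).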

3. Results

3.1. Field Data Sampling

The predictor variables NDVI, VARI, GCC, and GDD and the field sampling data LAI and DAGB for every sampling event are shown in Table 3. VARILandsat, VARISentinel, and VARIUAV for every sampling event in every season are shown in Supplementary Figure S1. LAI values ranged from 0.31 to 5.36 m2 leaf m−2. The maximum LAI values were sampled at the development of the fruit and ripening phenological stages, when maximum predictor variable values for NDVILandsat, NDVISentinel, VARILandsat, VARISentinel, GCCUAV, and VARIUAV were also reached (at around 1100 GDD). For the highest LAI values, the coefficients of variation (CVs) were the lowest.
The maximum DAGB reached 3473, 3479, 2910, and 2924 g m−2 soil in 2011, 2012, 2015, and 2016, respectively. The maximum DAGB values were registered at ripening when more than 3400 GDD was reached for the 2011 and 2012 seasons and at around 2900 GDD for the 2015 and 2016 seasons. The 2012 season showed higher heterogeneity regarding the DAGB compared with the others: the CV reached 36% in 2012, while it went from 8 to 11% for 2011, 2015, and 2016. The maximum values of all predictors from the satellite and UAV platforms appeared at the same time that the LAI did: the third FAO stage. The NDVILandsat and NDVISentinel maximum values ranged from 0.69 to 0.89. The VARILandsat and VARISentinel maximum values reached 0.25. The GCC reached 95.36%, and the VARIUAV reached 0.09.

3.2. Relationships between the Predictors, LAI, and DAGB

Table 4 shows the studied models’ estimation of the LAI from NDVILandsat and GDD, VARILandsat and GDD, GCCUAV and GDD, and VARIUAV and GDD for the 2011, 2012, and 2015 seasons. Better adjustments were obtained when the predictor variables from the UAV platform were considered. The GCCUAV and GDD predictors offered values of 0.931 and 0.468 m2 leaf m−2 soil for the R2adj and RMSE, respectively, whilst predictors from Landsat were lower than 0.730 for the R2adj, and the RMSE values were higher than 1.00 m2 leaf m−2 soil.
DAGB estimation revealed similar adjustments for both platforms, although the Landsat imagery offered slightly better adjustments: the VARILandsat and GDD predictors showed values of 0.779 and 578 g m−2 soil for the R2adj and RMSE, respectively, whilst NDVILandsat and GDD showed similar adjustments (0.770 and 589 g m−2 soil, respectively). Close to the adjustments obtained for the Landsat platform, the DAGB model obtained with GCCUAV and GDD offered values of 0.747 and 615 g m−2 soil for the R2adj and RMSE, respectively, and the model described by the VARIUAV and GDD predictors offered similar statistical adjustments to that of GCCUAV and GDD.
Figure 3 shows, graphically, the results obtained when the predicted and observed LAI values are represented for the 2016 season using GCCUAV and GDD as predictor variables and the selected model. We observed that the LAIpredicted values were overestimated when they were lower than 3.48 m2 m−2 soil.
The predicted versus observed DAGB values using VARILandsat and GDD as predictor variables can be observed graphically in Figure 3 for the 2016 season. The statistical indicators revealed significant adjustments: R2adj = 0.875. The maximum deviation was observed at the end of the growing cycle when the crop was completely developed according to the sampling data (Table 4). An overestimation for the DAGB predicted values can be observed along the whole growing cycle using VARILandsat and NDVILandsat (Figure 4).

3.3. Relationships between LAI and DAGB and the Predictor Variables Obtained from Sentinel

Table 5 shows the obtained models for LAI and DAGB from the Sentinel imagery in 2016. LAI prediction by NDVISentinel and GDD showed values of 0.871 and 0.540 m2 m−2 for the R2adj and RMSE, respectively. The DAGB model revealed that the only significant predictor was GDD. SLR did not consider any of the VIs proposed, and only GDD was selected as a predictor. Therefore, considering NDVISentinel or VARISentinel as predictor variables offered the same model and no statistical differences.

4. Discussion

The results obtained in this study demonstrate the advantage of remote sensing platforms as a tool to estimate the essential agronomic features of maize fields under different management practices. The significant correlations found in this study between the VIs derived from remote sensing platforms and the agronomic features, such as LAI and DAGB, offer promising results in the context of climate change and its implications for food security.
High canopy uniformity was observed when the LAI was at its maximum, as the CV values were low when most plants were at their maximum development. Therefore, the heterogeneity in canopy development across the maize crop disappeared when the fruit development phenological stage (full canopy) was reached. We observed higher heterogeneity for the DAGB than for the LAI. The 2012 season offered the highest CV (36%), whilst this heterogeneity did not appear for the LAI variable (CV of 10%) for the same season.
These results agree with those obtained by [44], who observed high heterogeneity (CV higher than 20%) when DAGB was measured. Higher CV values for the DAGB were noted at the end of the growing cycle, when the maximum values were reached, in the 2012 season, when the amount of water for irrigation was much lower than that applied in 2015 and 2016, although the ETo values were similar.
The VARI reached quite different values depending on the platform used when the maximum canopy development occurred: VARILandsat and VARISentinel offered significantly higher values than VARIUAV. It is important to highlight that the different ranges of VARI values extracted from the different platforms did not affect the goodness of the adjustments, as they were considered independent predictors of the LAI and DAGB for every season and were not combined. VARILandsat appeared to be the best indicator for DAGB estimation. Additionally, VARIUAV offered significant results. These indices showed a high dependency on full canopy development, as the CV was lower once the first growth and development stages were completed. The NDVILandsat and NDVISentinel revealed almost no variation when the full canopy was reached, indicating a saturation of this index when the canopy closes [43,44].
The obtained results showed better adjustments for the UAV predictor variables than for those obtained from the satellites in representing canopy development. The NDVI from satellites and crop canopy radiometric information can be altered by other sources, such as inter-row paths [6]. The information generated from satellite platforms could be insufficient to properly evaluate canopy behavior. Therefore, the LAI predicted from the satellite VIs was biased at the early crop growth stages, when the soil background influenced the predictions [25]. In this case, imagery with a very high resolution is required to properly assess the canopy development.
Discriminating soil from crops is feasible using raw RGB images, but quantifying the percentage of canopy cover requires classifying these two classes. As shown in the results, GCCUAV was highly sensitive in monitoring the canopy development. This predictor variable could potentially be used not only to determine the LAI but also to represent canopy development for irrigation scheduling and early farming management decisions. As mentioned before, at the end of the growing cycle, when the maximum LAI values are reached, the NDVI might not reflect the increase in canopy density linearly [44].
Although VARILandsat combined with GDD gave good results for DAGB prediction, it is not the only effective index for this task. NDVILandsat also offered very good fits when combined with the GDD predictor, and the differences between the two indices were not highly significant. We therefore recommend NDVILandsat combined with GDD to predict the DAGB, as NDVI has traditionally been the index of choice among remote sensing users.
The training and validation results for the DAGB model built from the Sentinel data using SLR indicated that the DAGB was determined by the GDD alone, suggesting that the remote sensing information contributed nothing to DAGB modeling. The absence of significant relationships may reflect the need for more data to build a proper DAGB model, as was observed for the Landsat satellites.
Therefore, a multilinear regression model (MRM) was used to evaluate the potential of NDVISentinel, VARISentinel, and GDD as predictors for calibrating a DAGB model (Table 6). As with the Landsat imagery, both VIs, NDVISentinel and VARISentinel, combined with the GDD yielded excellent fits. Nevertheless, the results showed how difficult it is to produce an accurate DAGB prediction model in advance, suggesting that such a model should instead be built in real time, retrained every season with the remote sensing data captured and the GDD recorded up to that point.
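A multilinear regression of the form y = b0 + b1·VI + b2·GDD, together with the R2adj and RMSE statistics reported in Tables 4–6, can be fitted by ordinary least squares. The following NumPy sketch is illustrative and not the authors' exact statistical pipeline:

```python
import numpy as np

def fit_mrm(vi, gdd, y):
    """Fit y = b0 + b1*VI + b2*GDD by ordinary least squares.

    Returns the coefficients, the adjusted coefficient of determination
    (R2adj), and the root mean squared error (RMSE).
    """
    vi, gdd, y = (np.asarray(a, dtype=float) for a in (vi, gdd, y))
    X = np.column_stack([np.ones_like(vi), vi, gdd])   # design matrix
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    n, p = len(y), X.shape[1] - 1                      # p = number of predictors
    ss_res = np.sum(resid ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    r2_adj = 1.0 - (1.0 - r2) * (n - 1) / (n - p - 1)
    rmse = np.sqrt(ss_res / n)
    return beta, r2_adj, rmse
```

In the paper's workflow, such a model is trained on the 2011, 2012, and 2015 field data and then tested against the 2016 season.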

5. Conclusions

Accurate prediction of the main agronomic features of maize is a pillar of agricultural management, above all in countries where this crop is essential for food and feed. Remote sensing platforms offer a huge amount of data for farming decision support tools. Comparing the accuracy of the available platforms, i.e., satellites and UAVs, and of the vegetation indices and parameters they generate is mandatory for better canopy and yield characterization. The LAI can be successfully characterized using indices derived from UAV platforms combined with phenological information (GDD).
Whilst the predictors from the Landsat satellites and the UAV platform offered similar results, more data are required to evaluate the potential of the Sentinel data for LAI and DAGB prediction. The difficulty of predicting the DAGB accurately underlines the importance of modeling the DAGB in real time, feeding the model with captured and processed remote sensing data. Satellite platforms offer open data and cover vast areas, whilst UAVs are more adaptable to different weather and field conditions. The use of RGB bands instead of the traditional multispectral bands of satellite data is one of the main opportunities, with more affordable sensors and simpler photogrammetric processing. Applying this methodology, LAI and DAGB prediction models should be calibrated for many other crops in different irrigated areas.

Supplementary Materials

The following are available online at https://www.mdpi.com/article/10.3390/agronomy11050940/s1, Table S1: Coordinates of the sampling events (ETRS89, UTM30N EPSG 25830), Figure S1: Visible Atmospherically Resistant Index from unmanned aerial vehicle values for the 2011, 2012, 2015, and 2016 seasons, Visible Atmospherically Resistant Index from Landsat values for the 2011, 2012, and 2015 seasons, and Visible Atmospherically Resistant Index from Sentinel values for the 2016 season.

Author Contributions

R.B. is the principal investigator and corresponding author. She led and supervised the overall research and field data collection. J.F.O. wrote the paper in collaboration with R.B. and M.A.M., and F.B. performed the field data collection and UAV flights. L.G.-G. processed the satellite imagery. M.A.M., R.B., and J.F.O. performed financial and contractual duties. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Castilla-La Mancha Regional Government SPLY/19/180501/0080 and FEDER funds.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. FAOSTAT. Agricultural Statistical Data of Food and Agricultural Organization of the United Nations. Available online: http://www.fao.org/faostat/es/#data (accessed on 15 January 2021).
  2. Steduto, P.; Hsiao, T.C.; Fereres, E.; Raes, D. Respuesta del Rendimiento de los Cultivos al Agua; Food and Agriculture Organization of the United Nations: Rome, Italy, 2012; Volume 66, ISBN 9789253085644. [Google Scholar]
  3. Gebbers, R.; Adamchuk, V.I. Precision Agriculture and Food Security. Science 2010, 327, 828–831. [Google Scholar] [CrossRef] [PubMed]
  4. Dong, T.; Liu, J.; Qian, B.; Zhao, T.; Jing, Q.; Geng, X.; Wang, J.; Huffman, T.; Shang, J. Estimating winter wheat biomass by assimilating leaf area index derived from fusion of Landsat-8 and MODIS data. Int. J. Appl. Earth Obs. Geoinf. 2016, 49, 63–74. [Google Scholar] [CrossRef]
  5. Viña, A.; Gitelson, A.A.; Nguy-Robertson, A.L.; Peng, Y. Comparison of different vegetation indices for the remote assessment of green leaf area index of crops. Remote Sens. Environ. 2011, 115, 3468–3478. [Google Scholar] [CrossRef]
  6. Khaliq, A.; Comba, L.; Biglia, A.; Ricauda Aimonino, D.; Chiaberge, M.; Gay, P. Comparison of satellite and UAV-based multispectral imagery for vineyard variability assessment. Remote Sens. 2019, 11, 436. [Google Scholar] [CrossRef]
  7. Mazzia, V.; Comba, L.; Khaliq, A.; Chiaberge, M.; Gay, P. UAV and machine learning based refinement of a satellite-driven vegetation index for precision agriculture. Sensors 2020, 20, 2530. [Google Scholar] [CrossRef]
  8. Chakhar, A.; Ortega-Terol, D.; Hernández-López, D.; Ballesteros, R.; Ortega, J.F.; Moreno, M.A. Assessing the accuracy of multiple classification algorithms for crop classification using landsat-8 and sentinel-2 data. Remote Sens. 2020, 12, 1735. [Google Scholar] [CrossRef]
  9. Xu, D. Compare NDVI Extracted from Landsat 8 Imagery with that from Landsat 7 Imagery. Am. J. Remote Sens. 2014, 2, 10. [Google Scholar] [CrossRef]
  10. Roy, D.P.; Kovalskyy, V.; Zhang, H.K.; Vermote, E.F.; Yan, L.; Kumar, S.S.; Egorov, A. Characterization of Landsat-7 to Landsat-8 reflective wavelength and normalized difference vegetation index continuity. Remote Sens. Environ. 2016, 185, 57–70. [Google Scholar] [CrossRef]
  11. Vogelmann, J.E.; Helder, D.; Morfitt, R.; Choate, M.J.; Merchant, J.W.; Bulley, H. Effects of Landsat 5 Thematic Mapper and Landsat 7 Enhanced Thematic Mapper plus radiometric and geometric calibrations and corrections on landscape characterization. Remote Sens. Environ. 2001, 78, 55–70. [Google Scholar] [CrossRef]
  12. Haboudane, D.; Miller, J.R.; Pattey, E.; Zarco-Tejada, P.J.; Strachan, I.B. Hyperspectral vegetation indices and novel algorithms for predicting green LAI of crop canopies: Modeling and validation in the context of precision agriculture. Remote Sens. Environ. 2004. [Google Scholar] [CrossRef]
  13. Bannari, A.; Morin, D.; Bonn, F.; Huete, A. A review of vegetation indices. Remote Sens. Rev. 2009, 13, 95–120. [Google Scholar] [CrossRef]
  14. Venancio, L.P.; Mantovani, E.C.; do Amaral, C.H.; Usher Neale, C.M.; Gonçalves, I.Z.; Filgueiras, R.; Campos, I. Forecasting corn yield at the farm level in Brazil based on the FAO-66 approach and soil-adjusted vegetation index (SAVI). Agric. Water Manag. 2019, 225, 105779. [Google Scholar] [CrossRef]
  15. Gitelson, A.A.; Kaufman, Y.J.; Stark, R.; Rundquist, D. Novel algorithms for remote estimation of vegetation fraction. Remote Sens. Environ. 2002, 80, 76–87. [Google Scholar] [CrossRef]
  16. Ballesteros, R.; Ortega, J.F.; Hernandez, D.; Campo, A.; Moreno, M.A. Combined use of agro-climatic and very high-resolution remote sensing information for crop monitoring. Int. J. Appl. Earth Obs. Geoinf. 2018, 72, 66–75. [Google Scholar] [CrossRef]
  17. Zhang, S.; Zhao, G.; Lang, K.; Su, B.; Chen, X.; Xi, X.; Zhang, H. Integrated satellite, unmanned aerial vehicle (UAV) and ground inversion of the SPAD of winter wheat in the reviving stage. Sensors 2019, 19, 1485. [Google Scholar] [CrossRef]
  18. Del-Campo-Sanchez, A.; Ballesteros, R.; Hernandez-Lopez, D.; Fernando Ortega, J.; Moreno, M.A. Quantifying the effect of Jacobiasca lybica pest on vineyards with UAVs by combining geometric and computer vision techniques. PLoS ONE 2019, 14, 1–20. [Google Scholar] [CrossRef]
  19. Chen, A.; Orlov-Levin, V.; Meron, M. Applying high-resolution visible-channel aerial imaging of crop canopy to precision irrigation management. Agric. Water Manag. 2019. [Google Scholar] [CrossRef]
  20. Pádua, L.; Marques, P.; Hruška, J.; Adão, T.; Peres, E.; Morais, R.; Sousa, J.J. Multi-temporal vineyard monitoring through UAV-based RGB imagery. Remote Sens. 2018, 10, 1907. [Google Scholar] [CrossRef]
  21. Santana, L.S.; Santos, L.M.; Maciel, D.A.; Barata, R.A.P.; Reynaldo, É.F.; Rossi, G. Vegetative vigor of maize crop obtained through vegetation indexes in orbital and aerial sensors images. Rev. Bras. Eng. Biossistemas 2019, 13, 195–206. [Google Scholar] [CrossRef]
  22. Steduto, P.; Hsiao, T.C.; Raes, D.; Fereres, E. AquaCrop—The FAO Crop Model to Simulate Yield Response to Water: I. Concepts and Underlying Principles. Agron. J. 2009, 101, 426. [Google Scholar] [CrossRef]
  23. Lukas, V.; Novák, J.; Neudert, L.; Svobodova, I.; Rodriguez-Moreno, F.; Edrees, M.; Kren, J. The combination of UAV survey and Landsat imagery for monitoring of crop vigor in precision agriculture. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. ISPRS Arch. 2016, 8. [Google Scholar] [CrossRef]
  24. Tian, J.; Wang, L.; Li, X.; Gong, H.; Shi, C.; Zhong, R.; Liu, X. Comparison of UAV and WorldView-2 imagery for mapping leaf area index of mangrove forest. Int. J. Appl. Earth Obs. Geoinf. 2017. [Google Scholar] [CrossRef]
  25. Soil Survey Staff; Natural Resources Conservation Service; U.S. Department of Agriculture. Claves para la Taxonomía de Suelos; USDA United States Department of Agriculture: Washington, DC, USA, 2014; ISBN 0926487221.
  26. IGN España. Mapas Edafológicos. 2005. Available online: https://www.ign.es/web/catalogo-cartoteca/resources/html/030769.html (accessed on 15 January 2021).
  27. Sevacherian, V.; Stern, V.M.; Mueller, A.J. Heat Accumulation for Timing Lygus Control Measures in a Safflower-Cotton Complex. J. Econ. Entomol. 1977, 70, 399–402. [Google Scholar] [CrossRef]
  28. Ballesteros, R.; Moreno, M.A.; Ortega, J.F. Calibration and validation of thermal requirement models for characterizing phenological stages. Ital. J. Agrometeorol. 2015, 3, 47–62. [Google Scholar]
  29. Barroso, F.R. Imágenes Aéreas de Muy Alta Resolución para la Caracterización del Maíz (Zea mays L.) de Regadío en una Zona Semiárida. Ph.D. Thesis, Castilla-La Mancha University, Albacete, Spain, 2017. [Google Scholar]
  30. Ballesteros, R.; Moreno, M.A.; Ortega, J.F. Calibration and validation of thermal requirement models for characterizing phenological stages. Ital. J. Agrometeorol. 2015, 3, 47–62. [Google Scholar]
  31. Meier, U. Growth Stages of Mono- and Dicotyledonous Plants. BBCH Monograph; Federal Biological Research Centre of Agriculture and Forest: Braunschweig, Germany, 2001. [Google Scholar]
  32. Allen, R.G.; Pereira, L.S.; Raes, D.; Smith, M. FAO Irrigation and Drainage Paper No. 56: Crop Evapotranspiration; Food and Agriculture Organization of the United Nations: Rome, Italy, 1998; ISBN 9251042195. [Google Scholar]
  33. Hernandez-Lopez, D.; Felipe-Garcia, B.; Gonzalez-Aguilera, D.; Arias-Perez, B. An automatic approach to UAV flight planning and control for photogrammetric applications: A test case in the asturias region (Spain). Photogramm. Eng. Remote Sens. 2013, 79, 87–98. [Google Scholar] [CrossRef]
  34. Ribeiro-Gomes, K.; Hernandez-Lopez, D.; Ballesteros, R.; Moreno, M.A. Approximate georeferencing and automatic blurred image detection to reduce the costs of UAV use in environmental and agricultural applications. Biosyst. Eng. 2016, 151, 308–327. [Google Scholar] [CrossRef]
  35. Ortega-Terol, D.; Hernandez-Lopez, D.; Ballesteros, R.; Gonzalez-Aguilera, D. Automatic hotspot and sun glint detection in UAV multispectral images. Sensors 2017, 17, 2352. [Google Scholar] [CrossRef] [PubMed]
  36. United States Geological Survey EarthExplorer. Available online: https://earthexplorer.usgs.gov/ (accessed on 7 February 2021).
  37. Copernicus. Available online: https://scihub.copernicus.eu/dhus/#/home (accessed on 1 January 2021).
  38. European Space Agency. Sentinel-2 User Handbook; European Space Agency: Paris, France, 2015. [Google Scholar]
  39. Rouse, J. Monitoring the Vernal Advancement and Retrogradation (Greenwave Effect) of Natural Vegetation; National Aeronautics and Space Administration: Washington, DC, USA, 1972.
  40. Woebbecke, D.M.; Meyer, G.E.; Von Bargen, K.; Mortensen, D.A. Color indices for weed identification under various soil, residue, and lighting conditions. Trans. Am. Soc. Agric. Eng. 1995. [Google Scholar] [CrossRef]
  41. Córcoles, J.I.; Ortega, J.F.; Hernández, D.; Moreno, M.A. Estimation of leaf area index in onion (Allium cepa L.) using an unmanned aerial vehicle. Biosyst. Eng. 2013. [Google Scholar] [CrossRef]
  42. Huang, C.; Townshend, J.R.G. A stepwise regression tree for nonlinear approximation: Applications to estimating subpixel land cover. Int. J. Remote Sens. 2003, 24, 75–90. [Google Scholar] [CrossRef]
  43. Li, W.; Niu, Z.; Chen, H.; Li, D.; Wu, M.; Zhao, W. Remote estimation of canopy height and aboveground biomass of maize using high-resolution stereo images from a low-cost unmanned aerial vehicle system. Ecol. Indic. 2016, 67, 637–648. [Google Scholar] [CrossRef]
  44. Duan, T.; Chapman, S.C.; Guo, Y.; Zheng, B. Dynamic monitoring of NDVI in wheat agronomy and breeding trials using an unmanned aerial vehicle. Field Crop. Res. 2017, 210, 71–80. [Google Scholar] [CrossRef]
Figure 1. Locations of the studied maize commercial fields and agro-climatic stations in Tarazona de La Mancha (Spain).
Figure 2. The workflow from the flight image acquisition to the statistical analysis. NDVILandsat: normalized difference vegetation index from Landsat imagery; VARILandsat: visible atmospherically resistant index from Landsat imagery; NDVISentinel: normalized difference vegetation index from Sentinel imagery; VARISentinel: visible atmospherically resistant index from Sentinel imagery; GDD: growing degree days; GCCUAV: green canopy cover from unmanned aerial vehicle imagery; VARIUAV: visible atmospherically resistant index from unmanned aerial vehicle imagery; LAI: leaf area index; DAGB: dry aboveground biomass; MRM: multilinear regression model; SLR: stepwise linear regression; R2adj: adjusted coefficient of determination; RMSE: root mean squared error.
Figure 3. Results of the estimation of the leaf area index (LAI) for the testing irrigation season (2016).
Figure 4. Results of the estimation of the dry aboveground biomass (DAGB) for the testing irrigation season (2016) using DAGB = −174.24 − 1317.3NDVILandsat + 2.3997GDD.
Table 1. The main registered agro-climatic data throughout the four seasons (2011, 2012, 2015, and 2016).
Season | Tmin | TMAX | P | ETo
2011 | 9.8 | 33.3 | 215.8 | 921.0
2012 | 9.2 | 33.8 | 311.6 | 899.9
2015 | 9.6 | 36.2 | 219.5 | 954.1
2016 | 7.7 | 33.6 | 175.3 | 995.5
Annual mean | 9.1 | 34.2 | 230.6 | 942.6
Tmin: mean daily minimum temperature (°C); TMAX: mean daily maximum temperature (°C); P: total rainfall (mm); ETo: total reference evapotranspiration (mm).
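The GDD predictor used throughout the models accumulates daily thermal time from temperature records such as those summarized in Table 1. A minimal sketch using the simple averaging method follows; the 10 °C base temperature for maize is an illustrative assumption, not necessarily the value used in the paper:

```python
def growing_degree_days(tmin, tmax, t_base=10.0):
    """Accumulated growing degree days from daily min/max air temperature (°C).

    Each day contributes max(0, (Tmin + Tmax)/2 - Tbase); the base
    temperature default is an assumption for illustration.
    """
    gdd = 0.0
    for lo, hi in zip(tmin, tmax):
        gdd += max(0.0, (lo + hi) / 2.0 - t_base)
    return gdd
```

For example, two days with min/max temperatures of 8/22 °C and 12/28 °C accumulate 5 + 10 = 15 GDD under this convention.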
Table 2. Dates of the images used in the analysis.
Field A (2011) | Field A (2012) | Field B (2015) | Field C (2016)
L5 9 April 2011 | L7 10 April 2012 | L8 6 May 2015 | S2A 1 May 2016
L5 11 May 2011 | L7 28 May 2012 | L8 22 May 2015 | S2A 21 May 2016
L5 19 June 2011 | L7 13 June 2012 | L8 7 June 2015 | L8 9 June 2016
L5 28 June 2011 | L7 8 July 2012 | L8 30 June 2015 | S2A 13 June 2016
L7 7 August 2011 | L7 31 July 2012 | L8 9 July 2015 | S2A 20 June 2016
L7 23 August 2011 | L7 25 August 2012 | L8 16 July 2015 | S2A 23 June 2016
L5 7 September 2011 | L7 3 October 2012 | L8 1 August 2015 | L8 2 July 2016
- | - | L8 26 August 2015 | S2A 10 July 2016
- | - | L8 11 September 2015 | L8 11 July 2016
- | - | - | L8 18 July 2016
- | - | - | L8 27 July 2016
- | - | - | S2A 2 August 2016
- | - | - | S2A 9 August 2016
- | - | - | L8 12 August 2016
- | - | - | L8 19 August 2016
- | - | - | S2A 22 August 2016
- | - | - | S2A 29 August 2016
- | - | - | L8 4 September 2016
- | - | - | S2A 8 September 2016
L5: Landsat 5; L7: Landsat 7; L8: Landsat 8; S2A: Sentinel 2A.
Table 3. Statistics of the field sampling data, satellite vegetation indices, and unmanned aerial vehicle (UAV) indices and data.
Season | Sampling date | Principal growth stage (BBCH scale) a | GDD | LAI x̄ (CV) | DAGB x̄ (CV) | NDVILandsat x̄ (CV) | VARILandsat x̄ (CV) | NDVISentinel x̄ (CV) | VARISentinel x̄ (CV) | GCC x̄ (CV) | VARIUAV x̄ (CV)
2011 | 22 June | 1: Leaf development | 409 | 0.85 (9.74) | 66.12 (19.27) | 0.43 (3.42) | −0.13 (−3.49) | - | - | 21.44 (4.43) | −0.23 (−42.45)
2011 | 20 August | 7: Development of fruit | 1244 | 5.14 (6.07) | 996.30 (20.97) | 0.69 (0.00) | 0.07 (0.00) | - | - | 88.25 (1.98) | 0.09 (19.85)
2011 | 30 August | 8: Ripening | 1412 | 4.78 (22.09) | 3473.17 (8.41) | 0.61 (0.00) | −0.04 (0.00) | - | - | 78.65 (22.20) | −0.03 (−57.96)
2012 | 21 June | 1: Leaf development | 428 | 1.54 (0.00) | 136.07 (0.00) | 0.40 (0.00) | −0.13 (0.00) | - | - | 34.84 (0.00) | −0.11 (0.00)
2012 | 12 July | 6: Flowering, anthesis | 698 | 4.36 (8.39) | 1333.65 (27.05) | 0.65 (0.00) | −0.01 (0.00) | - | - | 75.32 (16.75) | 0.08 (22.45)
2012 | 14 August | 7: Development of fruit | 1198 | 4.61 (5.69) | 2096.95 (2.12) | 0.71 (0.00) | 0.09 (0.00) | - | - | 95.36 (0.16) | 0.05 (15.60)
2012 | 28 August | 8: Ripening | 1455 | 3.84 (18.90) | 2750.07 (16.76) | 0.68 (0.73) | 0.09 (20.61) | - | - | 58.56 (6.38) | −0.03 (−21.79)
2012 | 10 September | 8: Ripening | 1640 | 2.88 (29.09) | 3478.70 (36.24) | 0.52 (1.47) | −0.04 (−16.95) | - | - | 37.23 (6.50) | −0.09 (−21.88)
2015 | 29 May | 1: Leaf development | 403 | 0.44 (14.46) | 32.33 (20.40) | 0.34 (66.69) | −0.06 (−66.83) | - | - | 12.60 (7.19) | −0.23 (−2.81)
2015 | 10 June | 3: Stem elongation | 576 | 2.14 (9.41) | 191.33 (10.12) | 0.60 (11.24) | 0.02 (90.36) | - | - | 47.98 (8.65) | −0.03 (−47.38)
2015 | 25 June | 6: Flowering, anthesis | 784 | 5.36 (10.42) | 658.52 (19.20) | 0.58 (66.69) | 0.08 (66.68) | - | - | 84.13 (7.48) | 0.05 (8.57)
2015 | 8 July | 6: Flowering, anthesis | 1016 | 5.35 (10.12) | 1098.14 (2.29) | 0.85 (0.52) | 0.19 (9.31) | - | - | 84.90 (3.29) | 0.05 (3.14)
2015 | 24 July | 7: Development of fruit | 1300 | 5.26 (7.38) | 1770.14 (11.59) | 0.86 (0.99) | 0.21 (3.36) | - | - | 82.48 (6.16) | 0.05 (9.62)
2015 | 6 August | 7: Development of fruit | 1534 | 5.29 (16.17) | 2565.51 (16.79) | 0.87 (0.53) | 0.25 (4.87) | - | - | 78.23 (7.35) | 0.04 (18.80)
2015 | 19 August | 8: Ripening | 1757 | 4.77 (8.05) | 2910.24 (9.11) | 0.84 (0.41) | 0.22 (4.62) | - | - | 66.93 (8.37) | 0.02 (51.00)
2015 | 2 September | 8: Ripening | 1984 | 2.14 (6.48) | 2477.75 (11.03) | 0.73 (1.29) | 0.10 (12.75) | - | - | 16.15 (8.42) | −0.21 (−4.53)
2016 | 14 June | 1: Leaf development | 561 | 0.31 (49.49) | 32.26 (25.04) | 0.37 (7.54) | −0.14 (−14.65) | 0.29 (11.93) | −0.20 (−11.90) | 23.00 (4.16) | −0.15 (−3.94)
2016 | 27 June | 3: Stem elongation | 739 | 1.14 (41.16) | 130.56 (66.27) | 0.56 (12.39) | −0.04 (−148.53) | 0.57 (10.75) | −0.05 (−46.35) | 32.34 (2.30) | −0.10 (−4.12)
2016 | 13 July | 6: Flowering, anthesis | 1011 | 3.88 (10.44) | 362.86 (76.27) | 0.79 (4.07) | 0.12 (25.60) | 0.82 (6.48) | 0.25 (33.29) | 58.26 (16.61) | 0.03 (33.49)
2016 | 21 July | 6: Flowering, anthesis | 1143 | 4.03 (13.61) | 650.56 (23.61) | 0.83 (1.64) | 0.17 (8.54) | 0.87 (3.36) | 0.37 (16.33) | 64.15 (12.14) | 0.04 (12.13)
2016 | 3 August | 7: Development of fruit | 1368 | 3.85 (13.57) | 1102.88 (2.59) | 0.84 (0.89) | 0.19 (6.39) | 0.97 (0.31) | 0.64 (2.35) | 66.63 (3.59) | 0.03 (15.82)
2016 | 19 August | 7: Development of fruit | 1628 | 4.35 (13.22) | 1533.02 (27.67) | 0.86 (1.16) | 0.25 (2.41) | 0.94 (0.23) | 0.45 (3.23) | 75.88 (6.00) | 0.05 (11.96)
2016 | 26 August | 8: Ripening | 1745 | 3.65 (6.36) | 2074.10 (25.91) | 0.81 (0.40) | 0.16 (6.39) | 0.88 (1.18) | 0.35 (9.87) | 72.72 (6.45) | 0.05 (17.76)
2016 | 7 September | 8: Ripening | 1952 | 3.64 (7.07) | 2535.06 (20.61) | 0.73 (0.07) | 0.05 (7.47) | 0.80 (2.85) | 0.18 (17.95) | 74.96 (4.53) | 0.05 (10.79)
2016 | 23 September | 8: Ripening | 2160 | 1.48 (58.53) | 2924.51 (11.04) | 0.57 (0.64) | −0.07 (−3.51) | 0.59 (5.40) | −0.09 (−35.13) | 34.81 (19.75) | −0.09 (−41.99)
LAI: leaf area index (m2 m−2 soil); DAGB: dry aboveground biomass (g m−2); NDVILandsat: normalized difference vegetation index obtained from Landsat; VARILandsat: visible atmospherically resistant index from Landsat; NDVISentinel: normalized difference vegetation index obtained from Sentinel; VARISentinel: visible atmospherically resistant index from Sentinel; GCC: green canopy cover (%); VARIUAV: visible atmospherically resistant index from unmanned aerial vehicle (UAV); x̄: mean value; CV: coefficient of variation (%). a BBCH scale (Biologische Bundesanstalt, Bundessortenamt und CHemische Industrie) [27].
Table 4. Trained models using the 2011, 2012, and 2015 field and remote sensing data.
Variable | Predictors | Equation | R2adj | RMSE
LAI | NDVILandsat, GDD | LAI = −2.4829 + 9.1596NDVILandsat + 2.1015 × 10^−5 GDD | 0.723 | 1.01
LAI | VARILandsat, GDD | LAI = 3.0711 + 11.68VARILandsat − 0.00012514GDD | 0.613 | 1.19
LAI | GCCUAV, GDD | LAI = −0.75607 + 5.4966GCCUAV + 0.00095473GDD | 0.931 | 0.468
LAI | VARIUAV, GDD | LAI = 2.8943 + 14.353VARIUAV + 0.0010841GDD | 0.853 | 0.682
DAGB | NDVILandsat, GDD | DAGB = −174.24 − 1317.3NDVILandsat + 2.3997GDD | 0.770 | 589
DAGB | VARILandsat, GDD | DAGB = −1001.9 − 2106.3VARILandsat + 2.4677GDD | 0.779 | 578
DAGB | GCCUAV, GDD | DAGB = −941.17 + 426.42GCCUAV + 2.0389GDD | 0.746 | 615
DAGB | VARIUAV, GDD | DAGB = −685.97 + 785.29792.66VARIUAV + 2.0598GDD | 0.741 | 621
LAI: leaf area index (m2 m−2 soil); NDVILandsat: normalized difference vegetation index obtained from Landsat; GDD: growing degree days; VARILandsat: visible atmospherically resistant index from Landsat; GCC: green canopy cover (%); VARIUAV: visible atmospherically resistant index from unmanned aerial vehicle (UAV); DAGB: dry aboveground biomass (g m−2); R2adj: adjusted coefficient of determination; RMSE: root mean squared error (m2 leaf m−2 soil for LAI and g m−2 for DAGB).
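As a usage illustration, the first trained DAGB model in Table 4 can be applied directly to a Landsat NDVI value and the accumulated GDD. The helper name is ours; the coefficients are taken verbatim from the table:

```python
def predict_dagb(ndvi_landsat, gdd):
    """Dry aboveground biomass (g m^-2) from the trained Landsat model in
    Table 4: DAGB = -174.24 - 1317.3*NDVI + 2.3997*GDD."""
    return -174.24 - 1317.3 * ndvi_landsat + 2.3997 * gdd
```

For instance, with NDVILandsat = 0.69 and GDD = 1244, the model returns roughly 1902 g m−2 of dry aboveground biomass.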
Table 5. Trained and calibrated models using the 2016 field and remote sensing data from Sentinel.
Variable | Predictors | Equation | R2adj | RMSE
LAI | NDVISentinel, GDD | LAI = −1.9427 + 6.5039NDVISentinel | 0.871 | 0.54
LAI | VARISentinel, GDD | LAI = 1.8985 + 4.906VARISentinel | 0.754 | 0.747
DAGB | NDVISentinel, GDD | DAGB = −1111.4 + 1.7402GDD | 0.944 | 244
DAGB | VARISentinel, GDD | DAGB = −1111.4 + 1.7402GDD | 0.944 | 244
LAI: leaf area index (m2 m−2 soil); NDVISentinel: normalized difference vegetation index obtained from Sentinel; GDD: growing degree days; VARISentinel: visible atmospherically resistant index from Sentinel; DAGB: dry aboveground biomass (g m−2); R2adj: adjusted coefficient of determination; RMSE: root mean squared error (m2 leaf m−2 soil for LAI and g m−2 for DAGB).
Table 6. Trained multilinear regression model (MRM) model using the 2016 field and remote sensing data from Sentinel.
Variable | Predictors | Equation | R2adj | RMSE
DAGB | NDVISentinel, GDD | DAGB = −1092.6 − 29.079NDVISentinel + 1.7415GDD | 0.942 | 248
DAGB | VARISentinel, GDD | DAGB = −1111.4 + 15.86VARISentinel + 1.7407GDD | 0.942 | 248
NDVISentinel: normalized difference vegetation index obtained from Sentinel; GDD: growing degree days; VARISentinel: visible atmospherically resistant index from Sentinel; DAGB: dry aboveground biomass (g m−2); R2adj: adjusted coefficient of determination; RMSE: root mean squared error (g m−2).
Share and Cite

MDPI and ACS Style

Ballesteros, R.; Moreno, M.A.; Barroso, F.; González-Gómez, L.; Ortega, J.F. Assessment of Maize Growth and Development with High- and Medium-Resolution Remote Sensing Products. Agronomy 2021, 11, 940. https://doi.org/10.3390/agronomy11050940

