Article

Herbage Mass, N Concentration, and N Uptake of Temperate Grasslands Can Adequately Be Estimated from UAV-Based Image Data Using Machine Learning

1
Institute of Geography, University of Cologne, 50923 Cologne, Germany
2
Research Center Hanninghof, Yara International ASA, 48249 Dülmen, Germany
3
INRES—Institute of Crop Science and Resource Conservation, University of Bonn, 53113 Bonn, Germany
*
Author to whom correspondence should be addressed.
Remote Sens. 2022, 14(13), 3066; https://doi.org/10.3390/rs14133066
Submission received: 17 May 2022 / Revised: 18 June 2022 / Accepted: 22 June 2022 / Published: 26 June 2022
(This article belongs to the Special Issue UAV Imagery for Precision Agriculture)

Abstract

Precise and timely information on biomass yield and nitrogen uptake in intensively managed grasslands is essential for sustainable management decisions. Imaging sensors mounted on unmanned aerial vehicles (UAVs), combined with photogrammetric structure-from-motion processing, can provide data on crop traits rapidly, non-destructively, and with a high spatial resolution. The aim of this multi-temporal field study is to estimate aboveground dry matter yield (DMY), nitrogen concentration (N%) and uptake (Nup) of temperate grasslands from UAV-based image data using machine learning (ML) algorithms. The study is based on a two-year dataset from an experimental grassland trial. The experimental setup regarding climate conditions, N fertilizer treatments, and slope yielded substantial variation in the dataset, covering a considerable amount of the naturally occurring differences in the biomass and N status of grasslands in temperate regions with similar management strategies. Linear regression models and three ML algorithms, namely random forest (RF), support vector machine (SVM), and partial least squares (PLS) regression, were compared with and without a combination of both structural (sward height; SH) and spectral (vegetation indices and single bands) features. Prediction accuracy was quantified using a 10-fold 5-repeat cross-validation (CV) procedure. The results show a significant improvement in prediction accuracy when all structural and spectral features are combined, regardless of the algorithm. The PLS models were outperformed by their respective RF and SVM counterparts. At best, DMY was predicted with a median RMSECV of 197 kg ha−1, N% with a median RMSECV of 0.32%, and Nup with a median RMSECV of 7 kg ha−1. Furthermore, computationally less expensive models incorporating, e.g., only the single multispectral camera bands and SH metrics, or selected features based on variable importance, achieved results comparable to the overall best models.

1. Introduction

Grasslands are one of the major terrestrial biomes providing important ecological and economic services [1]. To name but a few, grasslands provide forage for ruminants for milk and meat production, are a source for biofuels and fibers, serve as a wildlife habitat, prevent soil erosion, play an important role in carbon sequestration, and, finally, also contribute to water purification [2]. Thus, grasslands substantially contribute to climate protection and food security [3,4,5]. In the European Union, grasslands account for 31.2% of the utilized agricultural area [6] and are mainly used for fodder and forage production [7]. To safeguard these ecosystem services, agricultural management needs to adjust to site-specific characteristics of grasslands to prevent, e.g., nutrient leaching caused by excessive fertilizer use, while simultaneously securing grassland production in adequate quantity and quality [2]. Both dimensions, expressed as yield (quantity) on the one hand and digestibility, protein content, fibers, or metabolizable energy (quality) on the other hand, are strongly related to plant N concentration. N is among the key parameters limiting agricultural production but also ecosystem functioning [8]. Precise information about herbage yield and plant N concentration is key to calculating N uptake and specifying required fertilizer rates, in order to prevent the overuse of fertilizer and subsequent N leaching and eutrophication [9,10]. This is especially important in light of the EU Nitrates Directive, which obligates the member states to meet certain N application standards [11].
Thus, the sustainable and profitable management of grasslands relies on timely information on the quantity and quality of the sward to support management decisions, such as grazing rotation, timing of harvests, and fertilizer application [12,13]. Precision agriculture (PA) technologies are widely used to meet tasks regarding site-specific management, not only in arable crops [14,15], but also in grasslands [12].
However, PA technologies for herbage yield and quality assessment in high spatial and temporal resolution are still in the development phase due to the inherent spatio-temporal heterogeneity and variability of grasslands [13,16]. Different species and plant organs contribute to herbage yield throughout the growing season. Additionally, use intensity (cutting or stocking rates), fertilization, soil, climate, and weather characteristics have a strong impact on growth rates, yield, and N uptake [17,18,19]. The proper assessment of herbage yield and nutrient status traditionally implies destructive biomass sampling and subsequent laboratory analysis, such as wet chemistry or near-infrared spectroscopy (NIRS). However, laboratory analysis is costly, labor-intensive, and time-consuming, and thus less suitable for timely and site-specific management decisions [20].
Non-destructive methods for herbage yield assessment, such as simple rulers or rising plate meters, which exploit the allometric relationship between (compressed) sward height and biomass, have already been developed [21,22,23]. Technologically more advanced methods utilize LEDs or ultrasound to measure sward height [24,25]. Furthermore, a variety of vehicle-mounted or handheld proximal sensing applications exist to derive estimates of biomass and plant nutrient status exploiting canopy reflectance, such as the Yara N-Sensor [26,27], Trimble GreenSeeker [28], or general field spectroradiometers [29,30]. These methods have limitations, such as insufficiently representing spatial variability, allowing operator bias, requiring a high number of repetitions, and needing vehicles and direct access to the field, thereby potentially disturbing the canopy [31].
Alternatively, remote sensing technology has the potential to assess important plant traits (such as aboveground biomass, yield, and N status) with high spatial precision and on variable spatial scales [15,32,33,34]. Extensive overviews of remote sensing-based applications in grassland management are provided by Schellberg et al. [16], Wachendorf et al. [35], and Reinermann et al. [36]. Since the 1980s, mainly data from satellites and aircraft (active and passive sensors) have been employed for this purpose. Nevertheless, satellite-based data have limitations in terms of spatial resolution versus costliness, repetition rates, and weather conditions [37].
In the past 15 years, however, due to technological innovations in sensor and platform design, the use of unmanned aerial vehicles (UAVs) and small high-resolution camera systems in various spectral ranges (visible to near-infrared to shortwave infrared) has become increasingly important in research and practical farming. Their user-friendliness and flexibility ensure data acquisition aligned to particular management needs to deliver information on plant growth and status with high spatial and temporal resolution [38,39]. Furthermore, the developments in structure-from-motion (SfM) and multi-view stereopsis (MVS) provide tools to easily derive spatially explicit 3D data from UAV-based image data [40,41,42,43]. These methods have been effectively applied to grassland monitoring by utilizing UAV-mounted imaging sensors. The majority of these studies focus on estimating sward height or dry matter yield by means of structural or spectral features, or a combination of both [44,45,46,47,48,49,50,51,52,53,54]. Only a few studies addressed bio-chemical parameters, such as N concentration, N uptake or fixation, crude protein, or acid detergent fiber (i.e., quality parameters) [49,55,56,57,58,59,60].
The combination of structural and spectral features improved the estimation accuracy for biomass and N concentration; Viljanen et al. [61] reported a slight improvement in DMY estimation when combining structural and spectral features using random forest and multiple linear regression. In the study by Pranga et al. [62], the combination of spectral and structural features from a multispectral camera using random forest provided the best results for the DMY estimation of perennial ryegrass with an RMSE of 382 kg ha−1. Karunaratne et al. [63] reported a consistent improvement of DMY estimation when combining vegetation indices and 3D features over different flying heights. Oliveira et al. [59] reported an improvement for DMY estimation when using 3D features and spectral features from a hyperspectral camera, while spectral features from an RGB and multispectral camera in combination with 3D features provided the best results for the estimation of N concentration and uptake. The potential of combining features of different dimensions is therefore evident. To process the large quantity of data collected by UAV-based sensors, machine learning (ML) algorithms have gained popularity in recent decades. ML algorithms can handle multicollinearity (highly correlated features), high dimensional datasets, and non-linear relationships.
All in all, the potential of UAV-based sensors, the combination of structural and spectral features, and the use of ML algorithms to estimate the quality and quantity parameters of grasslands is evident. However, the above-mentioned studies are limited in growth stages (i.e., sample sizes) and the estimation of N concentration and uptake by UAV-based data is underrepresented in the current literature, since mainly ground-based approaches have been investigated to date. The present study aims to improve on these aspects.
Therefore, this study investigates UAV-based structural and spectral features from multispectral and RGB imaging sensors to quantify three important grassland management parameters: herbage dry matter yield (DMY), N concentration (N%), and N uptake (Nup), for an experimental grassland site in Germany based on data obtained over two years. The specific aims of this study are:
(i)
To develop estimation models for DMY, N%, and Nup utilizing three ML algorithms (namely, partial least squares, support vector machines, and random forest).
(ii)
To compare the prediction accuracy of the developed models with and without structural features.
(iii)
To identify potential key variables (most important features) for the estimation models.

2. Materials and Methods

2.1. Study Site

This study is based on data acquired from a grassland field trial in Germany. The site is located in Neunkirchen-Seelscheid (Bergisches Land region, North Rhine-Westphalia), about 30 km southeast of Cologne (50.858986N, 7.312736E). The mean annual temperature is 10 °C and mean annual precipitation 800 mm. The mean temperature in 2018 and 2019 was significantly higher than the long-term average (11.7 and 11.4 °C, respectively), and annual precipitation was lower (628 and 782 mm, respectively). As can be seen in Figure 1, distinct arid periods occur in the summer months.
The field experiment was established in March 2017 on a conventionally managed grassland field with a significant slope. The vegetation type was identified as Lolio-Cynosuretum and the main species were Lolium perenne, Alopecurus sp., and Festuca sp., among others. The field trial comprised three N-treatment levels (50, 100, 150 kg N ha−1 for growing periods one and two, and 25, 50, and 75 kg N ha−1 for growing period three) and one control treatment with no fertilizer application. Each treatment had 39 replicates arranged in a chessboard-like pattern, resulting in 156 plots. Each plot covered an area of 6 × 6 m. Figure 2 shows the layout of the experimental field.
The site was set up as a three-cut system. The first cut was harvested in May, the second in July/August, and the third in October, similar to local farming practice. To obtain field reference data, a conventional lawn mower (Viking Model MB_756_YC, STIHL AG & Co. KG, Dieburg, Germany) was used to harvest a 0.54 × 5.5 m strip of standing biomass from each of the 156 plots (see Figure 3c). The fresh biomass samples were weighed directly in the field and a subsample of about 300 g was obtained. The subsamples were dried in a forced-air dryer to a constant weight for three days at 65 °C. Subsequently, the weights of the dried subsamples were used to calculate the dry matter yield (DMY) in kg per hectare. N% was determined by the Kjeldahl method, which extracts total N. Nup was calculated from DMY and N% values for every plot ((DMY × N%)/100).
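The two calculations above can be sketched in a few lines of Python. Function names and the sample numbers are illustrative only (not the study's data); the strip area follows the 0.54 × 5.5 m harvest strip described above.

```python
def dry_matter_yield(fresh_kg, sub_fresh_g, sub_dry_g, area_m2):
    """DMY in kg ha^-1 from the fresh harvest weight, the fresh and
    dry subsample weights, and the harvested strip area."""
    dm_fraction = sub_dry_g / sub_fresh_g      # dry matter share of fresh weight
    dmy_kg = fresh_kg * dm_fraction            # dry matter of the harvested strip
    return dmy_kg / area_m2 * 10_000           # scale the strip to one hectare

def n_uptake(dmy_kg_ha, n_percent):
    """Nup in kg N ha^-1: (DMY x N%) / 100."""
    return dmy_kg_ha * n_percent / 100

strip_area = 0.54 * 5.5                        # harvested strip per plot (m^2)
dmy = dry_matter_yield(fresh_kg=4.2, sub_fresh_g=300, sub_dry_g=63,
                       area_m2=strip_area)
nup = n_uptake(dmy, n_percent=3.1)
```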

2.2. Sensors and Platform

A multirotor UAV (MK Okto XL 6S12, HiSystems GmbH, Moormerland, Germany) was used as a carrier platform for two cameras. RGB (visible spectral range) image data were recorded with a Sony Alpha 7r with 36 megapixels (MP) and a Zeiss Batis 25 mm lens. Multispectral image data were recorded with a Micasense RedEdge-M camera (MS) (AgEagle Sensor Systems Inc., d/b/a MicaSense, Wichita, KS, USA). The MS camera was equipped with five different sensors (1.2 MP each) in the visible-to-near-infrared (VNIR) region of the electromagnetic spectrum (see Figure 4 and Table 1). Additionally, the camera had an irradiance sensor and a GPS module.

2.3. UAV-Based Data Acquisition

Flight campaigns for biomass sampling dates (TS) were scheduled either one day in advance of the harvest or on the same day between 10 and 11 a.m. (before solar noon). To obtain the base models for sward height calculations for each growing period, flight campaigns with the RGB camera were scheduled after the complete harvest (T0) of the field. The image overlap for the MS camera was approximately 75% forward and 70% sideward (FOV diagonal 57°). The Sony camera had approximately 80% overlap in both directions (FOV diagonal 84°). Flight height of the UAV was set to 35 m above ground, following the terrain, resulting in a ground sampling distance (GSD) of ~0.7 cm for the RGB and ~2.3 cm for the MS camera. For accurate georeferencing, 15 ground control points (GCPs) were evenly distributed across the site (see Figure 2). The GCPs were measured on each sampling date with a real-time kinematic differential GPS (GR-5, Topcon, Japan). Before and after each flight, the radiometric calibration panel (gray standard provided by the manufacturer) was recorded with the MS camera to provide reflectance values for subsequent radiometric calibration.
For an additional radiometric assessment of the MS camera, six panels (80 × 80 cm) in different shades of gray were laid out at the experimental site (see Figure 5). The panels were coated with a matte color, exhibiting near-Lambertian reflectance properties (NEXTEL® color with proportions of 100, 75, 50, 25, 10, and 0% black). Unfortunately, the panels were not available for all sampling dates, and the 10 and 0% black panels exhibited extreme saturation. The reflectance of each panel was measured with an ASD FieldSpec 3 spectroradiometer (Malvern Panalytical Ltd., Cambridge, UK), which records reflectance in the range of 350–2500 nm. Subsequently, the spectroradiometer readings were resampled to match the bands of the MS camera. The data from the NEXTEL-coated panels were not included in the radiometric calibration workflow of the MS data.
To summarize the field sampling campaigns, Figure 6 shows a schematic overview of the sampling dates for UAV-based data acquisition and for destructive biomass sampling.

2.4. UAV-Based Data Processing and Feature Extraction

The relevant steps of data processing, feature extraction, and model building and assessment are depicted in the schematic workflow in Figure 7.
Image data were processed in the SfM software Agisoft Metashape (Agisoft LLC, St. Petersburg, Russia). After loading the images, at least ten markers were placed for each GCP. After image alignment (high-quality setting) the dense cloud was computed using the high-quality setting with aggressive depth filtering for the base models and mild depth filtering for the sampling dates to preserve the finer details of the sward. The datasets of the MS camera were then radiometrically calibrated by the calibrate reflectance function using the calibration factors of the irradiance sensor and the gray reference panel. After computing the dense point cloud, the DSM and orthomosaic were compiled. The DSMs of the RGB camera were exported in a resolution of 1 cm and the orthomosaics of the MS camera were exported with a 2.5 cm resolution.
To obtain the estimates of UAV-based sward height, the base model DSM of each growing period was subtracted from the sampling date DSMs of the respective growing period using ArcGIS Pro (ESRI, Redlands, CA, USA). In a second step, the following sward height (SH) metrics were calculated in the statistical computing software R [65] using the R Studio GUI [66]: mean, minimum, maximum, 90th, 75th, 50th (median), and 25th percentiles (SHmean, SHmin, SHmax, SHp90, SHp75, SHp50, SHp25). ModelBuilder, the graphical modeling interface in ArcGIS Pro, was used for vegetation index (VI) calculations from the raster bands of the MS orthomosaics. A total of 19 indices were selected based on their characterization of biochemical and structural traits of vegetation in order to be comparable to existing studies. For each date, a polygonal shapefile was created for the biomass sampling area per plot based on orthomosaics obtained directly after biomass sampling (see Figure 8). The respective shapefiles were used to extract height and spectral features by zonal statistics for each plot in ArcGIS Pro for further analysis in R. Table 2 lists the 15 VIs based on the visible and near-infrared region, hereafter referred to as VI.ms.
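The SH metric extraction (DSM differencing plus per-plot zonal statistics) was done in ArcGIS Pro and R; a minimal NumPy sketch of the same idea is given below. The array shapes, the synthetic DSMs, and the boolean plot mask are illustrative assumptions.

```python
import numpy as np

def sh_metrics(dsm_sampling, dsm_base, plot_mask):
    """SH metrics for one plot: subtract the bare-ground base-model DSM
    from the sampling-date DSM, then summarize the masked pixels."""
    sh = (dsm_sampling - dsm_base)[plot_mask]  # per-pixel sward height (m)
    return {
        "SHmean": sh.mean(),
        "SHmin": sh.min(),
        "SHmax": sh.max(),
        "SHp90": np.percentile(sh, 90),
        "SHp75": np.percentile(sh, 75),
        "SHp50": np.percentile(sh, 50),        # median
        "SHp25": np.percentile(sh, 25),
    }

rng = np.random.default_rng(0)
dsm_t0 = rng.normal(120.0, 0.01, (100, 100))           # base model (T0)
dsm_ts = dsm_t0 + rng.uniform(0.05, 0.35, (100, 100))  # sward on top (TS)
mask = np.zeros((100, 100), dtype=bool)
mask[20:80, 20:80] = True                      # pixels inside one plot polygon
metrics = sh_metrics(dsm_ts, dsm_t0, mask)
```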
In addition to the VI.ms, four indices from the visible spectrum were selected, hereafter referred to as VI.rgb (see Table 3). We decided to calculate the VI.rgb from the bands of the MS camera, since no radiometric correction was applied to the RGB camera data. Comparability between sampling dates with varying irradiation is rarely possible with an uncalibrated RGB camera, as shown by Lussem et al. 2019.

2.5. Statistical Analysis

UAV-based structural (height) and spectral (reflectance) information of the canopy and their combination were set as independent variables to estimate DMY, N%, and Nup based on the pooled dataset from 2018 and 2019. To test the predictive power of the extracted features, empirical models using a linear regression model (LM) and three machine learning algorithms were compared, i.e., partial least squares regression (PLS), random forest regression (RF), and support vector machine regression (SVM). The non-parametric algorithms were chosen (i) to handle multicollinearity and non-linear relationships present in the data, and (ii) to be comparable to similar studies.
Table 4 presents the feature sets considered in the analysis. Statistical analysis was performed in R. The package caret was chosen as a modeling framework, since it provides cross-validation procedures and most ML algorithms can be implemented [84,85].
PLS is a generalization of multivariate linear regression, but with the ability to handle the collinearity of multiple predictor variables. It also linearly combines and thus reduces the predictor variables to a few latent vectors, which cover the maximum of covariance between response and predictor variables [86]. Random forest (RF) is an ensemble algorithm [87] and has become popular in remote sensing research because of its good predictive performance [88]. For regression, a large number of decision trees is constructed with a random selection of variable subsets, and the mean prediction of all trees is taken as the output. The support vector machine (SVM) algorithm was developed by [89]. SVMs have gained popularity in remote sensing due to their capacity to generalize well, even with small sample sizes [90]. SVMs map the covariates into a high-dimensional feature space using the kernel trick to determine an optimal separating hyperplane between data points. Both linear and non-linear kernels can be implemented, depending on the relationship of the data [91].

Hyperparameter Tuning and Model Assessment

To achieve reliable estimates, the algorithm-specific hyperparameters should be tuned (i.e., optimized) [92,93]. Thus, for each algorithm and feature set combination, some of the respective hyperparameters were determined via grid search within a cross-validation process. Final parameters were chosen based on the lowest RMSE and implemented in the final model. We used the pls package [94] for PLS modeling. The algorithm-specific hyperparameter of PLS is the number of components (ncomp) to be considered. The parameter ncomp was tuned over all possible values for each feature set combination (e.g., for a model with 31 features, ncomp was searched from 2 to 31).
RF hyperparameters include mtry, the number of randomly selected variables at each split, and ntree, the number of decision trees built. To find the best value for mtry, a search was performed over all possible numbers for each feature set combination (e.g., for a model with 31 features, the search for mtry was set from 2 to 31). The parameter ntree was set to 500, as commonly recommended [88]. Tests with a higher number of trees did not yield significantly better results. The minimum number of observations per node (minimum node size) was set to five and the splitrule to extratrees (extremely randomized trees). The importance measure was set to permutation, as recommended for RF regression analysis [95]. The ranger package [96] was chosen as the RF implementation. Ranger is a computationally faster implementation of the original random forest algorithm by Breiman [87].
For the SVM algorithm, a radial basis function kernel was chosen, implemented in the kernlab package [97], to account for non-linear relationships present in the data. A search over the cost parameter C was performed with C equal to 0.25, 0.5, 1, 2, or 4. Parameter C allows a trade-off between training error and model complexity, and thus reduces overfitting [91].
For the linear models, all features were analyzed separately, whereas for the ML algorithms, the feature sets SH, SB, VI.rgb, and VI.ms were either analyzed separately or in combination (SH_SB, SH_VI.rgb, SH_VI.ms). Finally, all feature sets were combined (SH_SB_VI.rgb_VI.ms) for analysis. Since no genuinely independent test dataset was available (i.e., from a different site), models were built using 10-fold cross-validation with 5 repeats, resulting in 50 model runs per algorithm and feature set. The prediction accuracy was quantified by calculating the cross-validation (CV) coefficient of determination (R2CV) and root-mean-square error (RMSECV) for each model run to assess the models in terms of precision and variability. Additionally, the relative (or normalized) RMSE was calculated using the mean value of the respective response variable as the divisor.
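The evaluation scheme can be sketched as follows: 10-fold CV with 5 repeats yields 50 held-out folds, each contributing one RMSECV and one R2CV value, and the relative RMSE divides by the mean of the response. The data and the RF model here are synthetic stand-ins, not the study's pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import RepeatedKFold

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 10))
y = 2.0 + X[:, :3].sum(axis=1) + rng.normal(scale=0.3, size=200)

rkf = RepeatedKFold(n_splits=10, n_repeats=5, random_state=2)
rmse_cv, r2_cv = [], []
for train, test in rkf.split(X):               # 50 model runs in total
    model = RandomForestRegressor(n_estimators=100, random_state=2)
    pred = model.fit(X[train], y[train]).predict(X[test])
    rmse_cv.append(mean_squared_error(y[test], pred) ** 0.5)
    r2_cv.append(r2_score(y[test], pred))

median_rmse = float(np.median(rmse_cv))
rrmse = median_rmse / y.mean()                 # relative (normalized) RMSE
```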
Furthermore, variable importance was calculated for all models to determine the important features of the feature set combinations and evaluate the stability of feature importance over different models. To achieve comparability between the different ML algorithms, the variable importance was calculated by permutation, which is a model agnostic method and can be applied to any kind of model. This approach was implemented in the R package vip [98,99]. The permutation-based variable importance was defined as the decrease in a chosen error metric (here RMSE) when a single variable was randomly shuffled to break the connection to the response variable. Iteratively, all features were permuted and the permuted result was compared to the original result. Thus, the more the RMSE increased, the more important a specific feature was for the model. This procedure was applied to all variables of all models and repeated 25 times per model to account for the stability of the outcome.
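The permutation procedure above (the study used the R package vip) has a direct scikit-learn analogue, `permutation_importance`, sketched here on synthetic data in which feature 0 is constructed to carry most of the signal.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance

# Shuffling one feature at a time breaks its link to the response; the
# resulting increase in RMSE is that feature's importance. Repeated 25
# times per feature, as in the study.
rng = np.random.default_rng(3)
X = rng.normal(size=(300, 5))
y = 3.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.2, size=300)

model = RandomForestRegressor(n_estimators=200, random_state=3).fit(X, y)
result = permutation_importance(model, X, y,
                                scoring="neg_root_mean_squared_error",
                                n_repeats=25, random_state=3)
ranking = np.argsort(result.importances_mean)[::-1]   # most important first
```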

3. Results

For the statistical analysis, the dataset was filtered for plots with a lodging canopy. In 2018, 3 plots of the first growth, 84 plots of the second growth, and 1 plot of the third growth were excluded. In 2019, 16 plots of the first growth were excluded. Additionally, one observation with missing laboratory results was excluded. Overall, 832 observations were analyzed. In Figure 8, the selected images highlight the variability of the sward within one year (a and b) and the sampled areas within the quadratic plots (c).

3.1. Distribution of Response Variables and Correlation Analysis

The experimental setup of four treatments generated a wide range of biomass values (see Figure 8). However, due to severe drought in 2019, the biomass values were at a lower level for the second and third cuts compared to 2018. The range of DMY was 300–3376 kg ha−1 in 2018 and 158–3244 kg ha−1 in 2019.
Figure 8 shows that the first cut had the highest variability in DMY in both years, while the lowest variability was found in the third cut. The N concentration ranged from 1.67 to 5.06% in 2018 and from 1.70 to 4.92% in 2019. N uptake ranged from 10 to 108 kg N ha−1 in 2018 and from 5 to 107 kg N ha−1 in 2019 (see also Table 5).
The results of the bivariate linear correlation analysis of the pooled dataset are presented in a correlation map in Figure 9. DMY displays strong correlations with most canopy height metrics, moderate-to-strong correlations with most VI.ms, and weak correlations with VI.rgb and single bands, except for the NIR band. N% only shows very weak correlations to the predictor variables, with the highest, although negative, correlation to the PPRI. Nup displays a similar pattern in correlation strength as DMY, although slightly weaker. Furthermore, the predictor variables show a high degree of multicollinearity.
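A pandas sketch of such a bivariate (Pearson) correlation analysis is given below. The column names mirror features from the study, but the values are synthetic and only illustrate the expected pattern of a strong DMY-height relationship.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(4)
n = 300
df = pd.DataFrame({"SHmean": rng.uniform(0.05, 0.40, n)})      # sward height (m)
df["NIR"] = 0.30 + 0.80 * df["SHmean"] + rng.normal(0.0, 0.05, n)
df["DMY"] = 8000.0 * df["SHmean"] + rng.normal(0.0, 200.0, n)  # kg ha^-1

corr = df.corr(method="pearson")       # full bivariate correlation matrix
r_dmy_sh = corr.loc["DMY", "SHmean"]   # one cell of the correlation map
```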

3.2. Error Assessment of SfM/MVS Processing

Although all image datasets were processed using the same parameters, small differences in the processing results are apparent (see Table 6). The errors in the X- and Y-coordinates were 0.97–6.74 cm, while the height error ranged from 0.50–1.43 cm for the Sony α 7r datasets. The MS orthomosaics were calculated only for the sampling dates. In general, the errors of the MS data were lower than those of the Sony data, especially the Z error.

3.3. Radiometric Assessment MS Camera

The radiometric assessment of the MS camera compared to the reflectance values of the reference panels measured by an ASD FieldSpec 3 spectroradiometer is presented in Figure 10 and Table 7. Unfortunately, the reference panels were not available for all sampling dates. In particular, the blue and red bands exhibit the lowest R2 values with a distinct saturation effect above reflectance of 0.2. However, the reflectance of plants in the blue and red regions seldom exceeds this value; thus, it can be presumed that the MS camera provides reliable results. The NIR and RE bands have the closest relationship to the benchmark instrument, although the MS camera overestimates the NIR reflectance. However, on the third sampling date in 2019, the NIR channel was underestimated by the MS camera.
Table 7 presents the R2 and RMSE between the spectroradiometer measurements and the MS camera of the reference panels. In particular, the blue and red bands show weaker relationships with the ASD FieldSpec3 measurements (R2 0.58–0.74 and 0.69–0.88, respectively). As mentioned above, these low R2 values are due to the saturation effect above a reflectance of 0.2. On the other hand, the red edge and NIR bands show the best agreement with the benchmarking instrument.

3.4. Dry Matter Yield Prediction

In Figure 11, the results of all cross-validation model runs of the simple linear regression (SLR) are presented. The metrics SHmean, SHp90, SHp75, and SHp50 proved to be better predictors for the DMY estimation than all spectral features in simple linear regression based on the median RMSECV (below 350 kg ha−1). VI.ms (GNDVI, NDREI, SR, CCCI, MSAVI, MSR, and NIR.RE) all had a median R2CV above 0.75 and RMSECV below 400 kg ha−1. The VI.rgb and single bands, except the NIR band, all exhibited very low R2CV (almost zero) and high RMSECV.
The results for the performance metrics of all cross-validation model runs of the three ML algorithms (PLS, SVM, RF) for the DMY estimation are shown in Figure 12, based on different feature sets. In general, the PLS algorithm provides poorer results compared to the RF and SVM algorithms. Furthermore, the RGB feature set performed worse, irrespective of the modeling algorithm. The best models per algorithm were the combinations of all features and the SH_VI.ms feature combination. Although other feature combinations showed similar R2CV values, the median RMSECV was lower and less spread in these two mentioned combinations when comparing the performance by algorithm.
For the DMY estimation, the interquartile range (IQR) of the error distribution (RMSECV) was the lowest for the RF models. The best performing RF models were the feature combinations SH_VI.ms and SH_SB_VI.ms_VI.rgb (median RMSECV 206 and 203 kg ha−1, respectively, and IQR of 25 kg ha−1 for both models). For the SVM models, the SH_VI.ms feature combination performed best (median RMSECV 199 and IQR 27 kg ha−1). The SH_SB models using SVM and RF performed well with a median RMSECV of 214 and 226 kg ha−1. Additionally, the VI.ms (RF and SVM) were among the best ten models ranked by RMSE. The highest error distribution was found using only VI.rgb with the SVM algorithm (IQR 62 kg ha−1). Surprisingly, using only the single bands already led to accurate results (median RMSECV of 245 and 230 kg ha−1 and IQR of 37 and 34 kg ha−1 for RF and SVM, respectively). The best model by the PLS algorithm using all features combined ranked in between the previously mentioned SB models (median RMSECV of 232 kg ha−1 and IQR of 27 kg ha−1). For a detailed listing of the performance metrics per model, the reader is referred to Table A2 in Appendix A.
Since it was assumed that the inclusion of the SH metrics significantly improved the prediction accuracy, the models with and without SH metrics were compared using the Wilcoxon signed-rank test (one-sided) based on the RMSECV values. The assumption was met in all instances. Additionally, the two best models per algorithm were compared (SH_SB_VI.ms_VI.rgb and SH_VI.ms). The inclusion of all variables improved the prediction accuracy for the PLS and RF models; however, for the SVM model, the prediction accuracy of the full model was not higher than for SH_VI.ms model (p-value: 0.157).
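A SciPy sketch of this paired one-sided test is given below; the study does not name an implementation, so `scipy.stats.wilcoxon` is used as a stand-in. The two vectors represent the 50 per-fold RMSECV values of a model without and with SH metrics; the "with SH" values are synthetically constructed to be consistently lower.

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(5)
rmse_without_sh = rng.normal(260.0, 15.0, 50)             # 50 CV runs, no SH
rmse_with_sh = rmse_without_sh - rng.normal(40.0, 5.0, 50)  # same folds, with SH

# One-sided: does adding SH metrics lower the RMSECV?
stat, p = wilcoxon(rmse_with_sh, rmse_without_sh, alternative="less")
significant = p < 0.05
```

Because the two RMSECV vectors come from the same cross-validation folds, the paired signed-rank test is appropriate here rather than an unpaired alternative.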
In Figure 13, the ten most important variables for the two best models per algorithm are presented (variable importance plots for all models, including all features for DMY estimation, are presented in Figure A1, Figure A2 and Figure A3 in Appendix A). The SH metrics were among the top variables for SVM and RF; whereas, for the PLS models, these features were less important. Except for the NGRDI in the full PLS model, no VI.rgb was among the top ten variables. In the RF models, MSAVI, GNDVI, NIR.RE, and NDREI were among the most important variables in both the full and SH_VI.ms models. In contrast, the ranking of the spectral variables in the SVM models was not as stable as in the RF models.

3.5. N-Concentration Prediction

The results of all cross-validation model runs of the SLR for N% prediction are presented as box and whisker plots in Figure 14. No single predictor was able to adequately predict N%, and all models showed high variability in the cross-validation results. However, PPRI, a VI.rgb, showed the highest median R2CV and lowest median RMSECV.
In Figure 15, the results of all cross-validation model runs of the three ML algorithms (PLS, SVM, RF) are shown based on different feature sets to predict N%. Irrespective of the algorithm, the SH and VI.rgb features and their combination provided the poorest results for N% estimation. Again, the PLS models generally exhibited poorer results than RF and SVM. The two best models were the combinations of all feature sets with a median RMSECV of 0.31 and 0.32 N% for RF and SVM, respectively (both median R2CV of 0.83). Surprisingly, the SH_SB combination yielded the third best result using the SVM algorithm with a median R2CV of 0.81 and median RMSECV of 0.32 N%. The SVM models using the VI.ms and SH_VI.ms feature sets showed similar performances to the SH_SB model. A detailed listing of the performance metrics can be found in Table A3 in Appendix A.
Similar to the DMY prediction models, it was tested with the Wilcoxon signed-rank test whether the inclusion of SH metrics improved the prediction accuracy of N%. In most cases, this was confirmed; only for the RF models using the single bands and the SVM models using the VI.ms was the difference not significant. The comparison of the best models across algorithms showed that only the differences relative to the PLS model were significant (for the test results, see Table A5 in Appendix A).
In Figure 16, the variable importance plots of the two best models per algorithm are depicted. The SH features are among the most important features only in the SVM models. Features common to most models include the blue band and the NIR and red edge bands. The PPRI, an index from the visible region only, is the most important variable for the RF and SVM algorithms in the full model. Without VI.rgb included, the BNDVI is the top-ranked variable. Again, the most important variables in both best RF models are similar, whereas greater variation is present among the top variables of the SVM models. Furthermore, the SVM full model ranks the RGBVI, which, like the PPRI, is an index from the visible region, as the third most important variable. For the PLS models, mainly the VI.ms are of higher importance than the SH, VI.rgb, or SB features. The VIPs for all models and all features can be found in Appendix A in Figure A4, Figure A5 and Figure A6 for completeness.

3.6. N Uptake Prediction

In Figure 17, the results of all cross-validation model runs of the SLR for N-uptake prediction are presented. The SH metrics mean, p90, p75, and p50 (median) show the best results, followed by GNDVI, NDREI, SR, CCCI, and the NIR/RE ratio with a median R2CV of around 0.6. Except for the NIR band, VI.rgb and SB exhibit very low R2CV (almost zero).
The results for the N uptake estimations of all cross-validation model runs of the three ML algorithms (PLS, SVM, RF) are shown for the different feature sets in Figure 18. In general, the PLS algorithm provides poorer results than the RF and SVM algorithms, and the VI.rgb feature set performs worst, irrespective of the modeling algorithm. Again, the best results were achieved with the RF and SVM algorithms using all features or the SH_VI.ms combination (median RMSECV of 7 kg N ha−1 with an IQR of 1 kg N ha−1 for all four models). Furthermore, the SH_SB and VI.ms models using RF or SVM performed similarly well. Compared to the linear models, the feature sets and combinations performed significantly better using the ML algorithms, except for the PLS-VI.rgb model. A detailed listing of the performance metrics for all N uptake estimation models can be found in Table A4 in Appendix A.
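The SVM models above are tuned over a cost and kernel-width grid; the `sigma` values listed in Tables A2–A4 follow the kernlab/caret convention for the RBF kernel, for which the analogous scikit-learn knob is `gamma`. The sketch below illustrates such a tuning step; the data and grid values are illustrative assumptions, not the study's setup.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV, RepeatedKFold
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(3)
X = rng.normal(size=(120, 17))                                # placeholder features
y = 50.0 + 20.0 * X[:, 0] + rng.normal(scale=5.0, size=120)   # Nup, kg N/ha

# Grid search over cost C and RBF width, scored by cross-validated RMSE.
grid = GridSearchCV(
    make_pipeline(StandardScaler(), SVR(kernel="rbf")),
    param_grid={"svr__C": [0.5, 1, 2, 4], "svr__gamma": [0.05, 0.1, 0.2, 0.5]},
    cv=RepeatedKFold(n_splits=10, n_repeats=5, random_state=1),
    scoring="neg_root_mean_squared_error",
)
grid.fit(X, y)
print("best parameters:", grid.best_params_, "| RMSECV:", -grid.best_score_)
```

Standardizing the features inside the pipeline keeps the scaling inside each CV fold, avoiding leakage from the held-out data into the tuning.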
Again, the models with and without SH metrics were compared using the Wilcoxon signed-rank test. The difference was significant for all combinations except the RF SB model; thus, including the SH metrics yielded no improvement for N-uptake prediction with the single bands of the MS camera. The comparison between the two best models showed significant differences for the PLS- and RF-based models, but no significant difference between the RF and SVM models.
The variable importance plots (Figure 19) reveal no clear pattern for Nup estimation between the best models of SVM and PLS. The SH metrics are of higher importance for the RF and SVM models than for the PLS models. In the RF models, the important features are comparable in their ranking, but no such stability is observed for the SVM models. Variable importance plots for all Nup estimation models, including all features, can be found in Figure A7, Figure A8 and Figure A9 in Appendix A for completeness.

3.7. Models with Reduced Features

Since it is desirable to implement models with as few features as possible, a set of important features was selected based on the models developed in the previous sections. As the performance of the RF and SVM models utilizing all features was not significantly different, the RF algorithm was chosen to train new models including only the ten most important variables based on the variable importance scores (see Figure 13, Figure 16 and Figure 19). The RMSECV of the three models with reduced features for the estimation of DMY, N%, and Nup is shown in Figure 20. The respective median R2CV values were 0.93, 0.80, and 0.91. For DMY estimation, the reduced feature set resulted in a 5% loss of accuracy in median RMSECV (213 kg ha−1 compared to 203 kg ha−1 for the full model). For N% estimation, the loss of accuracy was higher (9%; 0.34 N% compared to 0.31 N% for the full model), but for Nup, the loss of accuracy was negligible (1.5%; 6.8 kg N ha−1 compared to 6.7 kg N ha−1 for the full model).
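The feature-reduction step can be sketched as follows: rank features by RF importance, keep the top ten, and refit. The data below are synthetic placeholders with two informative features, not the study's variables; in the study itself, the importance scores come from the fitted DMY, N%, and Nup models.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 30))                             # 30 candidate features
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(size=120)   # features 0 and 1 carry signal

# Fit the full model, then keep the ten highest-scoring features.
full = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, y)
top10 = np.argsort(full.feature_importances_)[::-1][:10]

# Retrain on the reduced design matrix only.
reduced = RandomForestRegressor(n_estimators=300, random_state=0).fit(X[:, top10], y)
print("indices of retained features:", sorted(top10.tolist()))
```

Since the reduced model is refitted from scratch, its cross-validated error can be compared directly with the full model, as done in Figure 20.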

4. Discussion

The primary aim of this study was to evaluate the suitability of combining UAV-based sward height metrics and vegetation indices to estimate DMY, N%, and Nup of mixed temperate grassland by comparing different feature combinations and machine learning models (PLS, RF, SVM). The models were evaluated by applying a 10-fold cross-validation procedure with five repeats to assess the variability in predictive accuracy. There are only a few studies investigating the combination of canopy height and spectral data to estimate DMY, N%, and Nup in grassland. However, the approach of combining structural and spectral data is not new. Earlier studies combined, e.g., radar satellite data with hyperspectral satellite data for wheat monitoring [100], or LiDAR and hyperspectral data for forest classification [101,102]. However, over the last decade, platforms, sensors, and algorithms have become more accessible to a broader scientific audience [43].

4.1. Data Accuracy

The conditions for data acquisition, such as sun angle, temperature, humidity, rain, wind speed, dust, and haze, are crucial for the accurate prediction of plant traits by UAV-based data. Furthermore, image quality can be influenced by flight speed, flying height, camera tilt, and exposure time. Ideally, these factors are stable throughout each flight mission and comparable across missions. However, in practical farming and even under experimental conditions, these requirements are seldom met.
Although the MS sensor in this study was equipped with an irradiance sensor to correct for changes in incoming light, multitemporal data should ideally be acquired under either clear-sky or fully overcast conditions to ensure consistent irradiance, and with no or only low wind to avoid moving canopies and ensure a stable flightpath. These instabilities are apparent in the radiometric assessment of the camera against the ASD readings using the additional gray panels on four sampling dates (Figure 10). The radiometric correction of the MS camera in the present study was based on the suggested settings of the official Agisoft Metashape workflow for MicaSense RedEdge cameras [103]. A custom calibration procedure, as suggested by [104], was therefore not applied. However, a thorough evaluation of the camera’s spectral response with laboratory equipment, such as an integrating sphere or monochromator, preferably on a regular basis, is helpful in quantifying errors caused by hardware degradation and the standard radiometric correction workflow. This is especially important in light of real farming applications, where a shift in camera sensitivity could result in a severe loss of profit when a field is incorrectly characterized as unhealthy or healthy.
Regarding UAV-based structural features, in recent studies, it has been shown that UAV-based canopy height data can well replace RPM and other manual measurements and work well in swards above 30 cm height [44,45]. This was also one of the key findings of our study. However, a degree of uncertainty remains in the proposed method; in particular, the calculation of the base model generally seems to be influenced by the stubble height after the full harvest, as well as the remains of the harvest, and possible rodent activity. Furthermore, the calculation of a base model by interpolating bare soil points is not feasible for practical farming applications, since no paths between the plots were set up (in this case) and the sward was permanently managed for several years (even decades) without any plowing. Furthermore, calculating a base model by interpolating RTK-GPS points is rather laborious, especially when the site is not flat and shows small depressions and mounds. Therefore, our method was a compromise for practical farming applications, but further research should be directed to incorporate LiDAR data, especially for base model acquisition [105].
In addition, the timing of the sampling harvests was not ideal in every case. As mentioned in the Methods Section, lodging or heavily bent plots were excluded from the analysis, since they do not reflect the actual farming practice; swards are mown before reaching a certain maturity due to decreasing quality parameters, such as digestibility. Furthermore, the relationship between the studied parameters (DMY, N%, Nup) and UAV-based height or reflectance of the canopy is impaired when the plants are bent or even lodging [30,106]. Ultimately, a procedure, as proposed by Wilke et al. [107], could be integrated into DMY estimation models to correct for lodging severity.
The dry periods in the summer months were unusual compared to the long-term mean temperature and precipitation. Consequently, sward growth was negatively affected, and the third harvest in both years might not be representative of typical biomass values. This phenomenon might reflect the increasing prevalence of drought, warming, and climate variability [108]; the investigated years might therefore reflect changing yields in the future.

4.2. Impact of Combining Structural and Spectral Data on Predictive Performance

An important aspect of assessing biomass by means of spectral reflectance features is the well-known saturation effect of spectral reflectance with increasing biomass and leaf area index (LAI) [109]. Furthermore, the nutrient status of plants, especially N concentration, affects chlorophyll concentration in the leaves and therefore influences the reflectance characteristics [110,111,112]. These challenges can be addressed by integrating other biophysical plant traits into the modeling framework, such as plant or canopy height measured with various sensors. This has been observed for various crops [30,113] and grasslands [24,114].
Recently, comparable studies highlighted the potential of UAV-based data for trait estimation in grassland. In the study by Viljanen et al. [61], the best model utilized both feature types (derived from a UAV-mounted RGB and hyperspectral camera) using the RF algorithm, with an RMSE of 0.34 t ha−1 for the DMY estimation for the primary growth. When structural and spectral features were combined, the CH features were of higher importance for the DMY prediction models than the spectral features. This was also observed in our study, except when using the PLS algorithm. Michez et al. [115] achieved an RMSE of 0.09 kg m−2 (900 kg ha−1) for the DMY estimation by combining VIs and CH (adj. R2 = 0.49), and also found that the CH had the highest relative importance in the multilinear regression model. Grüner et al. [55] investigated DMY and N fixation by comparing RF and PLS models of spectral features with and without texture features. They concluded that the additional texture features substantially improved the estimation models: the RMSE of the DMY estimation improved from 0.86 t ha−1 to 0.52 t ha−1 when texture features were included using RF. In the study of Karunaratne et al. [63], different feature combinations and flying heights were tested, and the best models always considered both feature types (~400 kg ha−1 RMSE). Similar to [63], a MicaSense RedEdge camera was employed in the study conducted by Pranga et al. [62]. Here, the combination of height and VIs yielded an RMSE of 382 kg ha−1 DMY. Pranga et al. also found the CH features to be of the highest importance when predicting DMY with a fused dataset (from the RGB camera as well as the MS camera); however, estimating the DMY with pure CH features yielded an rRMSE of 30–35%. In comparison, the rRMSE of the DMY estimation was 10% lower on average for the purely CH-based models in our study.
In a purely spectral approach, Togeiro de Alckmin et al. [116] achieved an RMSE of 397–464 kg ha−1 depending on the modeling algorithm for the DMY estimation using a small MS camera (Parrot Sequoia) in a two-year study. Oliveira et al. [59] employed a similar sensor setup as [61] and reported an RMSE of 389 kg ha−1 for the combination of CH fused with narrowband indices and single bands from an HS camera modeled by RF regression. In comparison, our study yielded significantly lower RMSECV than all of the studies mentioned above. Depending on the algorithm, we achieved a median RMSECV of around 200 kg ha−1 DMY with a very narrow error range. However, the low RMSE might be affected by the range of the DMY samples, since both sampling years were affected by drought.
N concentration and uptake were also assessed by Oliveira et al. [59]. They found that the combination of structural (CH) and spectral (RGB and VNIR) features showed the best results (12.5% and 19% rRMSE, respectively, for the primary growth). This feature combination is comparable to our full model, although our study yielded an even lower median rRMSE of 10% for N% and 17% for Nup.
It needs to be noted that N uptake generally depends on accurate measurements of both DMY and N%, since it is calculated as the product of the two parameters. In the present experiment, the most important features for Nup were therefore the CH features and the VIs that also performed well in our N% estimation models. Predicting the N concentration with a single predictor is limited [109,117,118]. Therefore, it was not surprising that no single feature was able to adequately estimate N%. However, when a set of features or a combination of structural and spectral features was used together with ML algorithms, the estimation accuracy rose significantly. The most important features for N% estimation were based on the blue or red edge bands, such as the PPRI, the BNDVI, the NDREI, the CCCI, and the NIR.RE ratio. This is in line with the literature, as these bands are sensitive to chlorophyll concentration in the leaves, which is closely, but not necessarily linearly, linked to N [109,119,120,121,122].
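The dependency is purely multiplicative, as the following minimal sketch with illustrative values shows:

```python
def n_uptake(dmy_kg_ha: float, n_percent: float) -> float:
    """N uptake (kg N/ha) as the product of dry matter yield and N concentration."""
    return dmy_kg_ha * n_percent / 100.0

# 2000 kg/ha DMY at 3.0% N corresponds to 60 kg N/ha taken up.
print(n_uptake(2000.0, 3.0))
```

Because of this product structure, relative errors in the DMY and N% estimates propagate approximately additively into the Nup estimate.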
A promising approach is integrating the SWIR spectral domain into biomass and N monitoring [34]. Such an application was successfully implemented by Jenal et al. [48,123] using selected spectral bands from a novel VNIR/SWIR imaging system. As our study showed, the information contained in the single MS camera bands fused with CH features already yielded accurate results. For DMY, N%, and Nup estimations, the performance of the SH_SB combination was in the same range as the computationally more expensive SH_VI.ms combination using RF or SVM. Thus, the selection of distinct wavelengths that are more directly linked to, e.g., plant N, as in the SWIR domain, instead of using indirect links via chlorophyll absorption in the visible and red edge region, might reduce the need for VI calculation in an ML modeling framework. While hyperspectral sensors may outperform MS sensors, the latter are easier to operate and investment costs are substantially lower.

4.3. Transferability and Generality of Models

The practical benefit of prediction models based on spatially explicit data lies in the possibility of deriving maps of management zones depicting the desired feature, such as N uptake or DMY. These maps can aid in decision making for site-specific treatments (e.g., fertilizer, pest control), optimal time of harvest for silage and hay production, or grazing management, e.g., in combination with virtual fencing technology [124]. However, the empirically derived models need to generalize well and have to be transferable to different sites and years.
We hypothesized that combining UAV-based structural and spectral data with ML modeling could build adequate models to estimate important grassland management parameters, such as DMY, N%, and Nup. By incorporating different years with various weather and climate conditions, applying different N treatments, and having a significant slope in our field experiment, we introduced substantial variation into our dataset. Thus, the data cover a considerable amount of the naturally occurring differences in the biomass and N status of grasslands in temperate regions with similar management practices. As discussed by Geipel et al. [58], the transferability and generalizability of their models increased when calibration was performed using the pooled dataset. This was also observed in our models. Models built with a dataset based on multiple years, and preferably on multiple sites, are able to generalize better due to the higher variation of the dataset, and thus may better reflect conditions at other sites and during different years. However, ML algorithms are known for their excellent in-sample predictive performance, but poorer performance on unseen data [125,126]. Thus, our hypothesis has yet to be proven by transferring the derived models to unseen data, e.g., new sites.

5. Conclusions and Outlook

Monitoring biomass and N-related traits is crucial for sustainable and profitable management decisions in grasslands. This study investigated the potential of UAV-based structural and spectral features and their combination to predict DMY, N%, and Nup. Models built with structural features (SH metrics) from a consumer-grade RGB camera and spectral features from a small MS camera using linear regression and three ML algorithms (PLS, RF, SVM) were compared. Overall, combining both structural and spectral features improved the prediction of DMY, N%, and Nup, regardless of the ML algorithm. However, the PLS models were outperformed by their RF and SVM counterparts. The best models for DMY estimation were the full models using all features (SH_SB_VI.ms_VI.rgb) and the combination SH_VI.ms using the RF or SVM algorithms, with a median RMSE of 197–206 kg ha−1. Similarly, Nup was best estimated with the aforementioned feature combinations by RF and SVM (median RMSE of 7 kg ha−1). N% was best estimated using the full model or the combinations SH_SB and SH_VI.ms (median RMSE of 0.31–0.33%). Surprisingly, the combination of the single bands of the MS camera and the SH features already yielded accurate results for all three traits. This observation implies that accurate modeling of DMY, N%, and Nup could be achieved with less computational effort, since the VI calculation could be omitted. Furthermore, the performance of the models with a reduced feature set based on feature selection (best ten features using RF) was only marginally lower than that of the best models. Thus, estimates of the three important grassland parameters, DMY, N%, and Nup, are readily achievable with UAV-based data from relatively inexpensive, small imaging sensors and ML models. The presented approaches may offer alternatives to traditional destructive measurements of important plant traits and improve estimation accuracy.
These spatially explicit predictions can aid in a range of agricultural applications for site-specific management, such as yield maps. Although the transferability of our models needs to be validated with independent datasets (e.g., different sites and years), this study demonstrated the applicability of UAV-based structural and spectral features for accurate DMY, N%, and Nup estimations. Future research should be directed towards integrating LiDAR data, especially for the acquisition of an accurate base model for CH calculation. Additionally, the integration of texture features seems to be a promising approach, especially for biomass estimation. Furthermore, the short-wave infrared region would be a beneficial addition to the models, due to the direct link to plant N.
This study expanded on the current research of the combination of structural and spectral features from UAV-based imaging sensors to estimate the important traits in temperate grasslands using machine learning algorithms. All in all, our study demonstrated the potential of UAV-based data for agricultural applications, and highlighted areas for future research using remote sensing for grassland management.

Author Contributions

Conceptualization, U.L., M.L.G., J.S. and G.B.; data curation, U.L., A.B. and M.L.G.; formal analysis, U.L., I.K. and M.L.G.; funding acquisition, J.J., J.S. and G.B.; investigation, U.L., A.B. and M.L.G.; methodology, U.L., A.B., M.L.G. and G.B.; supervision, J.J., J.S. and G.B.; writing—original draft, U.L. and G.B.; writing—review and editing, U.L., A.B., I.K., J.J., M.L.G., J.S. and G.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research was partly funded by the German Federal Ministry of Education and Research (BMBF) (Grant number: 031B0734F), as part of the consortium research project “GreenGrass”. We acknowledge support for the Article Processing Charge from the DFG (German Research Foundation, 491454339).

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Acknowledgments

We would like to thank the local farmer Reinhard Mosler and our student staff Jannis Menne, Joscha Grasshoff, Lilian Bromen, Marina Herbrecht, Christoph Müller, Nikolai Kirch, Mirijam Zickel, and Mika Thuning. Furthermore, we would like to thank Michael Schmidt from Arbeitskreis Landwirtschaft, Wasser und Boden (ALWB) in Siegburg. Last, but not least, we would like to thank Hubert Hüging from Bonn University.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. Cross-validation results of the simple linear regression models for all features.

| Feature | DMY R² ± sd | DMY RMSE ± sd (kg ha⁻¹) | N% R² ± sd | N% RMSE ± sd (N%) | Nup R² ± sd | Nup RMSE ± sd (kg ha⁻¹) |
|---|---|---|---|---|---|---|
| SHmean | 0.82 ± 0.04 | 342 ± 31 | 0.02 ± 0.02 | 0.75 ± 0.02 | 0.77 ± 0.04 | 11.5 ± 1.0 |
| SHp90 | 0.82 ± 0.04 | 341 ± 32 | 0.03 ± 0.02 | 0.74 ± 0.03 | 0.74 ± 0.04 | 12.2 ± 1.1 |
| SHp75 | 0.83 ± 0.04 | 337 ± 30 | 0.02 ± 0.02 | 0.75 ± 0.02 | 0.76 ± 0.04 | 11.6 ± 1.0 |
| SHp50 | 0.82 ± 0.04 | 342 ± 31 | 0.02 ± 0.02 | 0.75 ± 0.02 | 0.78 ± 0.04 | 11.3 ± 0.9 |
| SHp25 | 0.81 ± 0.04 | 357 ± 32 | 0.02 ± 0.02 | 0.75 ± 0.02 | 0.77 ± 0.04 | 11.3 ± 0.9 |
| SHmax | 0.77 ± 0.04 | 391 ± 33 | 0.07 ± 0.03 | 0.73 ± 0.03 | 0.63 ± 0.05 | 14.4 ± 1.3 |
| SHmin | 0.62 ± 0.06 | 497 ± 36 | 0.02 ± 0.02 | 0.75 ± 0.02 | 0.61 ± 0.07 | 14.9 ± 1.5 |
| NGRDI | 0.44 ± 0.05 | 606 ± 40 | 0.11 ± 0.07 | 0.72 ± 0.03 | 0.29 ± 0.05 | 20.1 ± 1.5 |
| PPRI | 0.01 ± 0.01 | 807 ± 28 | 0.26 ± 0.07 | 0.65 ± 0.03 | 0.07 ± 0.05 | 23.1 ± 1.6 |
| RGBVI | 0.22 ± 0.05 | 717 ± 40 | 0.20 ± 0.10 | 0.69 ± 0.04 | 0.09 ± 0.04 | 22.7 ± 1.6 |
| VARI | 0.55 ± 0.05 | 546 ± 36 | 0.07 ± 0.05 | 0.73 ± 0.03 | 0.42 ± 0.05 | 18.2 ± 1.4 |
| BNDVI | 0.64 ± 0.03 | 485 ± 25 | 0.16 ± 0.08 | 0.70 ± 0.03 | 0.44 ± 0.06 | 17.9 ± 1.2 |
| EVI | 0.70 ± 0.03 | 445 ± 21 | 0.16 ± 0.05 | 0.70 ± 0.03 | 0.52 ± 0.06 | 16.5 ± 1.3 |
| GNDVI | 0.79 ± 0.02 | 370 ± 23 | 0.06 ± 0.04 | 0.73 ± 0.03 | 0.65 ± 0.05 | 14.1 ± 1.3 |
| NDREI | 0.79 ± 0.03 | 374 ± 24 | 0.07 ± 0.04 | 0.73 ± 0.03 | 0.64 ± 0.06 | 14.3 ± 1.3 |
| NDVI | 0.67 ± 0.04 | 470 ± 31 | 0.08 ± 0.06 | 0.73 ± 0.03 | 0.52 ± 0.05 | 16.7 ± 1.3 |
| OSAVI | 0.74 ± 0.02 | 415 ± 20 | 0.16 ± 0.06 | 0.69 ± 0.03 | 0.54 ± 0.06 | 16.2 ± 1.3 |
| RDVI | 0.74 ± 0.03 | 416 ± 20 | 0.16 ± 0.05 | 0.69 ± 0.03 | 0.54 ± 0.06 | 16.1 ± 1.3 |
| SR | 0.78 ± 0.03 | 384 ± 28 | 0.08 ± 0.05 | 0.73 ± 0.03 | 0.62 ± 0.06 | 14.7 ± 1.2 |
| CCCI | 0.77 ± 0.03 | 389 ± 24 | 0.07 ± 0.04 | 0.73 ± 0.03 | 0.63 ± 0.06 | 14.5 ± 1.3 |
| MCARI1 | 0.70 ± 0.03 | 447 ± 22 | 0.18 ± 0.05 | 0.69 ± 0.03 | 0.51 ± 0.06 | 16.7 ± 1.3 |
| MSAVI | 0.79 ± 0.03 | 369 ± 23 | 0.11 ± 0.05 | 0.71 ± 0.03 | 0.62 ± 0.06 | 14.7 ± 1.3 |
| MSR | 0.78 ± 0.03 | 383 ± 27 | 0.09 ± 0.05 | 0.72 ± 0.03 | 0.61 ± 0.05 | 14.9 ± 1.2 |
| MTVI2 | 0.72 ± 0.03 | 431 ± 20 | 0.19 ± 0.05 | 0.68 ± 0.03 | 0.51 ± 0.06 | 16.6 ± 1.3 |
| NIR.RE | 0.76 ± 0.03 | 397 ± 26 | 0.06 ± 0.04 | 0.73 ± 0.03 | 0.64 ± 0.06 | 14.3 ± 1.3 |
| RE.R | 0.53 ± 0.07 | 554 ± 47 | 0.10 ± 0.06 | 0.72 ± 0.03 | 0.38 ± 0.06 | 18.8 ± 1.5 |
| B | 0.01 ± 0.02 | 805 ± 31 | 0.01 ± 0.01 | 0.75 ± 0.02 | 0.01 ± 0.01 | 23.7 ± 1.6 |
| G | 0.02 ± 0.02 | 804 ± 30 | 0.13 ± 0.06 | 0.71 ± 0.03 | 0.04 ± 0.03 | 23.4 ± 1.6 |
| R | 0.20 ± 0.05 | 726 ± 39 | 0.01 ± 0.02 | 0.75 ± 0.02 | 0.18 ± 0.03 | 21.6 ± 1.4 |
| RE | 0.02 ± 0.02 | 804 ± 29 | 0.15 ± 0.06 | 0.70 ± 0.03 | 0.01 ± 0.01 | 23.7 ± 1.6 |
| NIR | 0.69 ± 0.03 | 455 ± 24 | 0.17 ± 0.05 | 0.69 ± 0.03 | 0.51 ± 0.06 | 16.7 ± 1.2 |
Table A2. Cross-validation results for DMY estimation using PLS, RF and SVM regression (nRMSE normalized via mean observed value). Bold highlights indicate the two best models per algorithm. iqr: interquartile range; sd: standard deviation.

| Algorithm | Model | Hyperparameters | R² median ± iqr | R² mean ± sd | RMSE median ± iqr (kg ha⁻¹) | RMSE mean ± sd (kg ha⁻¹) | nRMSE median ± iqr (%) | nRMSE mean ± sd (%) |
|---|---|---|---|---|---|---|---|---|
| PLS | SH | ncomp = 3 | 0.84 ± 0.05 | 0.83 ± 0.04 | 334 ± 41 | 336 ± 31 | 25.1 ± 3.1 | 25.2 ± 2.3 |
| PLS | SB | ncomp = 4 | 0.84 ± 0.04 | 0.83 ± 0.02 | 331 ± 28 | 331 ± 23 | 24.9 ± 2.1 | 24.9 ± 1.7 |
| PLS | VI.rgb | ncomp = 3 | 0.67 ± 0.08 | 0.67 ± 0.05 | 461 ± 35 | 465 ± 35 | 34.6 ± 2.6 | 34.9 ± 2.6 |
| PLS | VI.ms | ncomp = 14 | 0.87 ± 0.03 | 0.86 ± 0.02 | 302 ± 31 | 302 ± 24 | 22.7 ± 2.3 | 22.6 ± 1.8 |
| PLS | SH_SB | ncomp = 11 | 0.89 ± 0.03 | 0.89 ± 0.02 | 269 ± 43 | 269 ± 25 | 20.2 ± 3.2 | 20.2 ± 1.9 |
| PLS | SH_VI.rgb | ncomp = 5 | 0.84 ± 0.05 | 0.83 ± 0.03 | 331 ± 41 | 333 ± 30 | 24.9 ± 3.1 | 25.0 ± 2.2 |
| PLS | **SH_VI.ms** | ncomp = 18 | 0.91 ± 0.03 | 0.91 ± 0.02 | 247 ± 30 | 247 ± 24 | 18.6 ± 2.2 | 18.6 ± 1.8 |
| PLS | **SH_SB_VI.ms_VI.rgb** | ncomp = 29 | 0.92 ± 0.02 | 0.92 ± 0.02 | 232 ± 27 | 232 ± 21 | 17.4 ± 2.0 | 17.5 ± 1.6 |
| RF | SH | mtry = 2 | 0.85 ± 0.05 | 0.84 ± 0.03 | 319 ± 49 | 320 ± 32 | 23.9 ± 3.7 | 24.0 ± 2.4 |
| RF | SB | mtry = 4 | 0.91 ± 0.02 | 0.91 ± 0.02 | 245 ± 37 | 242 ± 22 | 18.4 ± 2.8 | 18.2 ± 1.7 |
| RF | VI.rgb | mtry = 2 | 0.76 ± 0.05 | 0.76 ± 0.04 | 402 ± 48 | 395 ± 37 | 30.2 ± 3.6 | 29.7 ± 2.8 |
| RF | VI.ms | mtry = 15 | 0.92 ± 0.02 | 0.92 ± 0.01 | 226 ± 24 | 225 ± 18 | 17.0 ± 1.8 | 16.9 ± 1.3 |
| RF | SH_SB | mtry = 12 | 0.92 ± 0.02 | 0.92 ± 0.01 | 226 ± 26 | 227 ± 21 | 17.0 ± 1.9 | 17.1 ± 1.5 |
| RF | SH_VI.rgb | mtry = 3 | 0.86 ± 0.04 | 0.86 ± 0.03 | 304 ± 41 | 306 ± 32 | 22.8 ± 3.1 | 23.0 ± 2.4 |
| RF | **SH_VI.ms** | mtry = 8 | 0.94 ± 0.02 | 0.94 ± 0.01 | 206 ± 25 | 205 ± 17 | 15.5 ± 1.9 | 15.4 ± 1.3 |
| RF | **SH_SB_VI.ms_VI.rgb** | mtry = 12 | 0.94 ± 0.02 | 0.94 ± 0.01 | 203 ± 25 | 203 ± 17 | 15.2 ± 1.9 | 15.2 ± 1.3 |
| SVM | SH | C = 0.5, sigma = 1.62 | 0.84 ± 0.05 | 0.84 ± 0.03 | 320 ± 48 | 320 ± 32 | 24.1 ± 3.6 | 24.0 ± 2.4 |
| SVM | SB | C = 4, sigma = 0.62 | 0.92 ± 0.02 | 0.92 ± 0.02 | 230 ± 34 | 226 ± 22 | 17.3 ± 2.6 | 17.0 ± 1.6 |
| SVM | VI.rgb | C = 4, sigma = 0.95 | 0.77 ± 0.07 | 0.77 ± 0.05 | 388 ± 62 | 393 ± 42 | 29.1 ± 4.7 | 29.5 ± 3.2 |
| SVM | VI.ms | C = 4, sigma = 0.24 | 0.93 ± 0.02 | 0.93 ± 0.02 | 221 ± 27 | 219 ± 21 | 16.6 ± 2.0 | 16.4 ± 1.6 |
| SVM | SH_SB | C = 4, sigma = 0.24 | 0.93 ± 0.03 | 0.93 ± 0.02 | 214 ± 35 | 215 ± 24 | 16.1 ± 2.6 | 16.2 ± 1.8 |
| SVM | SH_VI.rgb | C = 4, sigma = 0.27 | 0.87 ± 0.05 | 0.87 ± 0.03 | 288 ± 42 | 290 ± 33 | 21.6 ± 3.1 | 21.8 ± 2.5 |
| SVM | **SH_VI.ms** | C = 2, sigma = 0.16 | 0.94 ± 0.02 | 0.94 ± 0.01 | 199 ± 27 | 200 ± 19 | 14.9 ± 2.0 | 15.0 ± 1.4 |
| SVM | **SH_SB_VI.ms_VI.rgb** | C = 4, sigma = 0.07 | 0.94 ± 0.02 | 0.94 ± 0.01 | 197 ± 32 | 198 ± 20 | 14.8 ± 2.4 | 14.9 ± 1.5 |
Table A3. Cross-validation results for N% estimation using PLS, RF and SVM regression (nRMSE normalized via mean observed value). Bold highlights indicate the two best models per algorithm. iqr: interquartile range; sd: standard deviation.

| Algorithm | Model | Hyperparameters | R² median ± iqr | R² mean ± sd | RMSE median ± iqr (N%) | RMSE mean ± sd (N%) | nRMSE median ± iqr (%) | nRMSE mean ± sd (%) |
|---|---|---|---|---|---|---|---|---|
| PLS | SH | ncomp = 3 | 0.21 ± 0.07 | 0.21 ± 0.06 | 0.68 ± 0.05 | 0.67 ± 0.03 | 22.6 ± 1.5 | 22.4 ± 1.1 |
| PLS | SB | ncomp = 4 | 0.64 ± 0.07 | 0.63 ± 0.05 | 0.45 ± 0.05 | 0.46 ± 0.04 | 15.1 ± 1.6 | 15.3 ± 1.2 |
| PLS | VI.rgb | ncomp = 3 | 0.27 ± 0.13 | 0.28 ± 0.08 | 0.65 ± 0.06 | 0.64 ± 0.04 | 21.5 ± 2.1 | 21.4 ± 1.5 |
| PLS | VI.ms | ncomp = 14 | 0.75 ± 0.06 | 0.74 ± 0.05 | 0.39 ± 0.05 | 0.39 ± 0.04 | 13.0 ± 1.8 | 12.9 ± 1.3 |
| PLS | SH_SB | ncomp = 10 | 0.70 ± 0.07 | 0.70 ± 0.05 | 0.41 ± 0.06 | 0.42 ± 0.04 | 13.7 ± 2.0 | 13.9 ± 1.2 |
| PLS | SH_VI.rgb | ncomp = 10 | 0.49 ± 0.08 | 0.47 ± 0.06 | 0.55 ± 0.05 | 0.55 ± 0.04 | 18.4 ± 1.8 | 18.3 ± 1.3 |
| PLS | **SH_VI.ms** | ncomp = 21 | 0.76 ± 0.06 | 0.75 ± 0.05 | 0.37 ± 0.05 | 0.38 ± 0.04 | 12.4 ± 1.7 | 12.5 ± 1.3 |
| PLS | **SH_SB_VI.ms_VI.rgb** | ncomp = 30 | 0.77 ± 0.06 | 0.76 ± 0.05 | 0.37 ± 0.06 | 0.37 ± 0.04 | 12.3 ± 1.9 | 12.3 ± 1.3 |
| RF | SH | mtry = 7 | 0.28 ± 0.14 | 0.30 ± 0.09 | 0.64 ± 0.06 | 0.63 ± 0.05 | 21.3 ± 2.1 | 21.2 ± 1.7 |
| RF | SB | mtry = 5 | 0.78 ± 0.05 | 0.78 ± 0.04 | 0.35 ± 0.04 | 0.35 ± 0.03 | 11.7 ± 1.2 | 11.8 ± 1.0 |
| RF | VI.rgb | mtry = 2 | 0.41 ± 0.15 | 0.43 ± 0.10 | 0.59 ± 0.07 | 0.58 ± 0.05 | 19.8 ± 2.5 | 19.2 ± 1.8 |
| RF | VI.ms | mtry = 15 | 0.79 ± 0.05 | 0.79 ± 0.04 | 0.35 ± 0.04 | 0.35 ± 0.03 | 11.6 ± 1.3 | 11.6 ± 1.1 |
| RF | SH_SB | mtry = 12 | 0.79 ± 0.07 | 0.79 ± 0.05 | 0.35 ± 0.05 | 0.35 ± 0.03 | 11.6 ± 1.5 | 11.7 ± 1.1 |
| RF | SH_VI.rgb | mtry = 11 | 0.59 ± 0.11 | 0.59 ± 0.07 | 0.49 ± 0.07 | 0.49 ± 0.04 | 16.4 ± 2.2 | 16.3 ± 1.5 |
| RF | **SH_VI.ms** | mtry = 22 | 0.80 ± 0.04 | 0.81 ± 0.04 | 0.34 ± 0.04 | 0.34 ± 0.04 | 11.2 ± 1.3 | 11.2 ± 1.2 |
| RF | **SH_SB_VI.ms_VI.rgb** | mtry = 30 | 0.83 ± 0.05 | 0.83 ± 0.04 | 0.31 ± 0.04 | 0.32 ± 0.03 | 10.4 ± 1.2 | 10.5 ± 1.1 |
| SVM | SH | C = 0.5, sigma = 1.80 | 0.32 ± 0.10 | 0.32 ± 0.08 | 0.63 ± 0.06 | 0.63 ± 0.05 | 20.9 ± 1.9 | 21.0 ± 1.6 |
| SVM | SB | C = 4, sigma = 0.51 | 0.79 ± 0.06 | 0.79 ± 0.04 | 0.35 ± 0.05 | 0.35 ± 0.04 | 11.6 ± 1.8 | 11.5 ± 1.2 |
| SVM | VI.rgb | C = 4, sigma = 0.89 | 0.47 ± 0.11 | 0.47 ± 0.09 | 0.57 ± 0.08 | 0.57 ± 0.06 | 19.1 ± 2.6 | 18.9 ± 2.1 |
| SVM | VI.ms | C = 4, sigma = 0.27 | 0.81 ± 0.05 | 0.81 ± 0.04 | 0.33 ± 0.05 | 0.33 ± 0.04 | 11.0 ± 1.8 | 11.0 ± 1.3 |
| SVM | **SH_SB** | C = 4, sigma = 0.19 | 0.81 ± 0.05 | 0.81 ± 0.04 | 0.32 ± 0.05 | 0.33 ± 0.04 | 10.8 ± 1.8 | 10.9 ± 1.2 |
| SVM | SH_VI.rgb | C = 4, sigma = 0.31 | 0.60 ± 0.08 | 0.60 ± 0.06 | 0.48 ± 0.06 | 0.48 ± 0.04 | 16.0 ± 2.0 | 16.0 ± 1.5 |
| SVM | SH_VI.ms | C = 4, sigma = 0.17 | 0.81 ± 0.05 | 0.81 ± 0.04 | 0.32 ± 0.05 | 0.33 ± 0.04 | 10.8 ± 1.8 | 11.0 ± 1.2 |
| SVM | **SH_SB_VI.ms_VI.rgb** | C = 4, sigma = 0.07 | 0.83 ± 0.06 | 0.83 ± 0.04 | 0.32 ± 0.06 | 0.32 ± 0.04 | 10.6 ± 1.9 | 10.6 ± 1.3 |
Table A4. Cross-validation results for Nup estimation using PLS, RF and SVM regression (nRMSE normalized via mean observed value). Bold highlights indicate the two best models per algorithm. iqr: interquartile range; sd: standard deviation.

| Algorithm | Model | Hyperparameters | R² median ± iqr | R² mean ± sd | RMSE median ± iqr (kg ha⁻¹) | RMSE mean ± sd (kg ha⁻¹) | nRMSE median ± iqr (%) | nRMSE mean ± sd (%) |
|---|---|---|---|---|---|---|---|---|
| PLS | SH | ncomp = 3 | 0.79 ± 0.06 | 0.79 ± 0.05 | 11.0 ± 1.3 | 10.9 ± 1.2 | 28.6 ± 3.5 | 28.3 ± 3.1 |
| PLS | SB | ncomp = 4 | 0.78 ± 0.05 | 0.77 ± 0.04 | 11.3 ± 1.3 | 11.4 ± 0.9 | 29.4 ± 3.5 | 29.7 ± 2.5 |
| PLS | VI.rgb | ncomp = 3 | 0.67 ± 0.04 | 0.67 ± 0.04 | 13.6 ± 1.2 | 13.8 ± 1.1 | 35.5 ± 3.0 | 36.0 ± 2.8 |
| PLS | VI.ms | ncomp = 14 | 0.81 ± 0.04 | 0.81 ± 0.03 | 10.4 ± 1.3 | 10.4 ± 0.9 | 27.0 ± 3.4 | 27.0 ± 2.4 |
| PLS | SH_SB | ncomp = 9 | 0.86 ± 0.04 | 0.86 ± 0.04 | 9.2 ± 1.4 | 9.1 ± 1.2 | 23.9 ± 3.6 | 23.6 ± 3.2 |
| PLS | SH_VI.rgb | ncomp = 7 | 0.83 ± 0.05 | 0.83 ± 0.04 | 9.9 ± 1.3 | 9.8 ± 1.2 | 25.8 ± 3.4 | 25.5 ± 3.2 |
| PLS | **SH_VI.ms** | ncomp = 20 | 0.87 ± 0.04 | 0.87 ± 0.03 | 8.7 ± 1.3 | 8.6 ± 1.1 | 22.6 ± 3.4 | 22.4 ± 2.8 |
| PLS | **SH_SB_VI.ms_VI.rgb** | ncomp = 29 | 0.88 ± 0.04 | 0.88 ± 0.03 | 8.2 ± 1.3 | 8.2 ± 1.1 | 21.4 ± 3.4 | 21.3 ± 2.8 |
| RF | SH | mtry = 7 | 0.80 ± 0.05 | 0.80 ± 0.04 | 10.8 ± 0.9 | 10.7 ± 0.8 | 28.0 ± 2.5 | 27.9 ± 2.1 |
| RF | SB | mtry = 5 | 0.87 ± 0.03 | 0.87 ± 0.03 | 8.6 ± 1.0 | 8.7 ± 0.9 | 22.4 ± 2.5 | 22.5 ± 2.4 |
| RF | VI.rgb | mtry = 2 | 0.77 ± 0.05 | 0.76 ± 0.05 | 11.7 ± 1.4 | 11.6 ± 1.2 | 30.5 ± 3.7 | 30.2 ± 3.0 |
| RF | VI.ms | mtry = 15 | 0.90 ± 0.03 | 0.89 ± 0.03 | 7.7 ± 1.5 | 7.8 ± 1.0 | 20.0 ± 3.8 | 20.3 ± 2.6 |
| RF | SH_SB | mtry = 12 | 0.88 ± 0.03 | 0.88 ± 0.03 | 8.3 ± 1.2 | 8.4 ± 0.9 | 21.7 ± 3.1 | 22.0 ± 2.3 |
| RF | SH_VI.rgb | mtry = 10 | 0.85 ± 0.05 | 0.85 ± 0.03 | 9.2 ± 1.5 | 9.2 ± 0.9 | 24.0 ± 4.0 | 24.0 ± 2.5 |
| RF | **SH_VI.ms** | mtry = 21 | 0.91 ± 0.03 | 0.91 ± 0.02 | 7.0 ± 1.1 | 7.1 ± 0.9 | 18.3 ± 3.0 | 18.5 ± 2.3 |
| RF | **SH_SB_VI.ms_VI.rgb** | mtry = 27 | 0.92 ± 0.03 | 0.92 ± 0.02 | 6.7 ± 1.1 | 6.9 ± 0.8 | 17.5 ± 2.7 | 17.9 ± 2.1 |
| SVM | SH | C = 1, sigma = 1.62 | 0.80 ± 0.07 | 0.79 ± 0.05 | 10.9 ± 1.9 | 10.9 ± 1.2 | 28.4 ± 4.9 | 28.4 ± 3.0 |
| SVM | SB | C = 4, sigma = 0.62 | 0.89 ± 0.03 | 0.89 ± 0.02 | 8.0 ± 1.2 | 8.1 ± 0.9 | 20.8 ± 3.1 | 21.0 ± 2.4 |
| SVM | VI.rgb | C = 4, sigma = 0.95 | 0.76 ± 0.05 | 0.77 ± 0.05 | 11.5 ± 1.4 | 11.6 ± 1.2 | 30.0 ± 3.6 | 30.1 ± 3.1 |
| SVM | VI.ms | C = 4, sigma = 0.24 | 0.89 ± 0.03 | 0.89 ± 0.02 | 7.9 ± 1.4 | 7.9 ± 0.9 | 20.7 ± 3.6 | 20.6 ± 2.4 |
| SVM | SH_SB | C = 4, sigma = 0.24 | 0.90 ± 0.04 | 0.90 ± 0.03 | 7.6 ± 1.4 | 7.6 ± 1.0 | 19.8 ± 3.6 | 19.9 ± 2.5 |
| SVM | SH_VI.rgb | C = 2, sigma = 0.27 | 0.86 ± 0.05 | 0.86 ± 0.04 | 8.9 ± 1.5 | 9.0 ± 1.1 | 23.2 ± 3.9 | 23.5 ± 2.9 |
| SVM | **SH_VI.ms** | C = 4, sigma = 0.16 | 0.91 ± 0.03 | 0.91 ± 0.02 | 7.1 ± 1.4 | 7.1 ± 0.9 | 18.4 ± 3.7 | 18.6 ± 2.4 |
| SVM | **SH_SB_VI.ms_VI.rgb** | C = 4, sigma = 0.07 | 0.91 ± 0.03 | 0.91 ± 0.02 | 7.1 ± 1.0 | 7.1 ± 0.8 | 18.6 ± 2.7 | 18.5 ± 2.2 |
Table A5. Results of the Wilcoxon signed-rank test.
| Algorithm | Model 1 | Model 2 | DMY | N% | Nup |
|---|---|---|---|---|---|
| PLS | VI.ms | SH_VI.ms | **** | **** | **** |
| PLS | SB | SH_SB | **** | **** | **** |
| PLS | VI.rgb | SH_VI.rgb | **** | **** | **** |
| PLS | SH_VI.ms | SH_SB_VI.ms_VI.rgb | **** | **** | **** |
| RF | VI.ms | SH_VI.ms | **** | *** | **** |
| RF | SB | SH_SB | **** | ns | ns |
| RF | VI.rgb | SH_VI.rgb | **** | **** | **** |
| RF | SH_VI.ms | SH_SB_VI.ms_VI.rgb | **** | ** | **** |
| SVM | VI.ms | SH_VI.ms | **** | ns | **** |
| SVM | SB | SH_SB | **** | ** | *** |
| SVM | VI.rgb | SH_VI.rgb | **** | **** | **** |
| SVM | SH_VI.ms | SH_SB_VI.ms_VI.rgb | ns | **** | ns |

Significance levels of adjusted p-values: **** ≤ 0.0001, *** ≤ 0.001, ** ≤ 0.01, ns > 0.05.
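Table A5 tests whether adding sward height (or all features) significantly changes the paired, fold-wise cross-validation errors of two models. A hedged sketch of such a comparison is shown below with scipy's Wilcoxon signed-rank test; the fold-wise RMSE values are synthetic, and since this appendix does not restate the multiple-comparison adjustment, the Holm step-down procedure used here is an assumption for illustration only.

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(42)
# Hypothetical paired per-fold RMSE values from the same 50 CV folds
# (10-fold, 5-repeat) for two models, e.g. VI.ms vs. SH_VI.ms.
rmse_model1 = rng.normal(7.9, 0.9, size=50)
rmse_model2 = rmse_model1 - rng.normal(0.8, 0.3, size=50)  # model 2 usually better

# Paired, non-parametric test on the fold-wise differences.
stat, p = wilcoxon(rmse_model1, rmse_model2)

def holm(pvals):
    """Holm step-down adjustment for a family of p-values (assumed method)."""
    p_arr = np.asarray(pvals, dtype=float)
    order = np.argsort(p_arr)
    m = len(p_arr)
    adj = np.empty(m)
    running_max = 0.0
    for rank, idx in enumerate(order):
        # Multiply the k-th smallest p-value by (m - k), enforce monotonicity.
        running_max = max(running_max, (m - rank) * p_arr[idx])
        adj[idx] = min(1.0, running_max)
    return adj

# Adjust across a (hypothetical) family of three pairwise comparisons.
adj_p = holm([p, 0.2, 0.004])
```

Adjusted p-values are then mapped to the star levels of Table A5 (e.g., **** for p ≤ 0.0001).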
Figure A1. VIP of DMY estimation with PLS.
Figure A2. VIP of DMY estimation with RF.
Figure A3. VIP of DMY estimation with SVM.
Figure A4. VIP of N% estimation with PLS.
Figure A5. VIP of N% estimation with RF.
Figure A6. VIP of N% estimation with SVM.
Figure A7. VIP of Nup estimation with PLS.
Figure A8. VIP of Nup estimation with RF.
Figure A9. VIP of Nup estimation with SVM.
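Figures A1–A9 rank the structural and spectral predictors by variable importance (VIP), computed in the study with R's vip package. As a rough, model-agnostic analogue, permutation importance can be computed with scikit-learn; the feature names and data below are synthetic placeholders, not the study's predictors.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance

# Synthetic features standing in for sward height plus spectral bands/indices.
X, y = make_regression(n_samples=150, n_features=8, n_informative=3,
                       random_state=0)
names = [f"feat_{i}" for i in range(8)]  # hypothetical predictor names

rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# Importance = mean drop in score when a single feature column is shuffled.
result = permutation_importance(rf, X, y, n_repeats=10, random_state=0)

# Sort predictors from most to least important, as in a VIP plot.
ranked = sorted(zip(names, result.importances_mean), key=lambda t: -t[1])
```

A bar chart of `ranked` would correspond to one panel of Figures A1–A9; such rankings also motivate the reduced, computationally cheaper feature sets mentioned in the abstract.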

Figure 1. Walter–Lieth climate diagram from the German Weather Service (DWD) weather station (Cologne Bonn airport, station ID 2667) for the long-term average (1958–2016) and the sampling years 2018 and 2019.
Figure 2. Map of study site Neunkirchen-Seelscheid (50.858986N, 7.312736E), Germany. (Left) Digital surface model. (Right) RGB orthomosaic.
Figure 3. Close-up views of (a) bending and lodging canopy (22 May 2019), (b) rodent activity (23 July 2019), and (c) sampling area (white rectangles) within the square plots (3 July 2018).
Figure 4. Multirotor UAV equipped with MicaSense RedEdge-M and Sony Alpha 7r cameras.
Figure 4. Multirotor UAV equipped with MicaSense RedEdge-M and Sony Alpha 7r cameras.
Figure 5. (Left) Acquiring images of the MicaSense RedEdge-M radiometric calibration panel (image credit: Christoph Hütt). (Right) Additional gray panels for radiometric assessment and a ground control point.
Figure 6. Schematic overview of sampling dates for UAV-based data acquisition and destructive biomass sampling. T0: base model after equalization cut, Ts: sampling date, GP: growing period.
Figure 7. Schematic workflow of data acquisition, processing, feature extraction, and modeling.
Figure 8. Violin plots of DMY, N%, and Nup by harvest cut and year. Violins indicate point density, black boxes show the 25th and 75th percentiles, whiskers show the 5th and 95th percentiles, and white circles represent the median.
Figure 9. Correlation matrix (Pearson correlation coefficient) of response variables and prediction features (see Table 2, Table 3 and Table 4 for explanation of abbreviations).
Figure 10. Scatterplot of reflectance values from MS camera and ASD FieldSpec 3 measurements of the reference panels.
Figure 11. Box and whisker plot of R2CV and RMSECV (kg ha−1) of all resamples of linear models (simple linear regression) for herbage yield prediction.
Figure 12. Box and whisker plot of R2CV and RMSECV (kg ha−1) of all resamples of RF, SVM, and PLS models for DMY prediction (SH: sward height metrics; SB: single bands; VI.rgb: RGB indices; VI.ms: VNIR indices).
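The RMSECV distributions summarized in these box plots come from a 10-fold, 5-repeat cross-validation. The following is an illustrative reconstruction of that procedure with scikit-learn, not the authors' code: the feature matrix, target, and hyperparameters below are synthetic placeholders.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import RepeatedKFold, cross_val_score

# Synthetic stand-in for the plot-level feature matrix and DMY target
rng = np.random.default_rng(42)
X = rng.normal(size=(120, 8))
y = 500 * X[:, 0] + 200 * X[:, 1] + rng.normal(scale=100, size=120) + 1500

# 10-fold, 5-repeat CV: one RMSE per resample (50 values in total)
cv = RepeatedKFold(n_splits=10, n_repeats=5, random_state=1)
scores = cross_val_score(
    RandomForestRegressor(n_estimators=100, random_state=1),
    X, y, cv=cv, scoring="neg_root_mean_squared_error",
)
rmsecv = -scores  # sklearn returns negated errors; flip the sign
```

The per-resample `rmsecv` values are exactly what a box-and-whisker plot like Figure 12 would aggregate (median, quartiles, whiskers).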
Figure 13. Variable importance plots of best models (10 most important variables shown) for DMY prediction. Variable importance is based on permutation.
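Permutation-based variable importance, as used for Figure 13, scores each feature by how much model performance degrades when that feature's values are shuffled. A minimal sketch with scikit-learn on synthetic data (feature layout and model settings are assumptions, not the study's configuration):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)
X = rng.normal(size=(150, 5))
# Only the first two columns carry signal; column 0 dominates
y = 3.0 * X[:, 0] + 1.0 * X[:, 1] + rng.normal(scale=0.1, size=150)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
result = permutation_importance(model, X, y, n_repeats=10, random_state=0)

# Features ranked from most to least important
ranking = np.argsort(result.importances_mean)[::-1]
```

Plotting `result.importances_mean` for the top-ranked features reproduces the style of importance plot shown in Figures 13, 16, and 19.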
Figure 14. Box and whisker plot of R2CV and RMSECV (%) of all resamples of linear models (simple linear regression) for N-concentration prediction.
Figure 15. Box and whisker plot of R2CV and RMSECV (%) of all resamples of RF, SVM, and PLS models for N-concentration prediction (SH: sward height metrics; SB: single bands; VI.rgb: RGB indices; VI.ms: VNIR indices).
Figure 16. Variable importance plots of best models (10 most important variables shown) for N% prediction.
Figure 17. Box and whisker plot of R2CV and RMSECV (kg ha−1) of all resamples of linear models (simple linear regression) for N-uptake prediction.
Figure 18. Box and whisker plot of R2CV and RMSECV (kg ha−1) of all resamples of RF, SVM, and PLS models for N-uptake prediction (SH: sward height metrics; SB: single bands; VI.rgb: RGB indices; VI.ms: VNIR indices).
Figure 19. Variable importance plots of best models (10 most important variables shown) for N-uptake prediction.
Figure 20. Box and whisker plot of RMSECV of all resamples of the reduced feature sets and all features included (random forest) for DMY, N%, and Nup predictions. The ten best features based on variable importance of the full model are included in the reduced feature set.
Table 1. Band specifications of the MicaSense RedEdge-M camera (RedEdge-M user manual [64]).

| Band Number | Band Name | Center Wavelength (nm) | Bandwidth FWHM (nm) |
|---|---|---|---|
| 1 | Blue | 475 | 20 |
| 2 | Green | 560 | 20 |
| 3 | Red | 668 | 10 |
| 4 | Near IR | 840 | 40 |
| 5 | Red Edge | 717 | 10 |
Table 2. Vegetation indices derived from the visible-to-near-infrared spectral region.
| Name | Equation | Application | Reference |
|---|---|---|---|
| Normalized Difference Vegetation Index | NDVI = (NIR − R)/(NIR + R) | Greenness, green biomass, phenology | [67] |
| Green Normalized Difference Vegetation Index | GNDVI = (NIR − G)/(NIR + G) | Green biomass, N concentration, LAI | [68] |
| Blue Normalized Difference Vegetation Index | BNDVI = (NIR − B)/(NIR + B) | Greenness, green biomass, phenology | [69] |
| Optimized Soil-Adjusted Vegetation Index | OSAVI = (1 + 0.16)(NIR − R)/(NIR + R + 0.16) | Green biomass, photosynthesis rate | [70] |
| Modified Soil-Adjusted Vegetation Index | MSAVI = 0.5[2·NIR + 1 − √((2·NIR + 1)² − 8(NIR − R))] | Green biomass, photosynthesis rate | [71] |
| Modified Chlorophyll Absorption in Reflectance Index 1 | MCARI1 = 1.2[2.5(NIR − R) − 1.3(NIR − G)] | Chlorophyll concentration, plant stress, photosynthesis rate | [72] |
| Enhanced Vegetation Index | EVI = 2.5(NIR − R)/(NIR + 6·R − 7.5·B + 1) | Green biomass, greenness, phenology | [73] |
| Normalized Difference Red Edge Index | NDREI = (NIR − RE)/(NIR + RE) | Chlorophyll | [74] |
| Renormalized Difference Vegetation Index | RDVI = (NIR − R)/√(NIR + R) | Green biomass | [75] |
| Simple Ratio | SR = NIR/R | Green biomass | [76] |
| (Simplified) Canopy Chlorophyll Content Index | CCCI = [(NIR − RE)/(NIR + RE)]/[(NIR − R)/(NIR + R)] | Chlorophyll concentration, photosynthesis | [77] modified by [78] |
| Modified Triangular Vegetation Index 2 | MTVI2 = 1.5[1.2(NIR − G) − 2.5(R − G)]/√((2·NIR + 1)² − (6·NIR − 5·√R) − 0.5) | Green biomass | [72] |
| Modified Simple Ratio | MSR = (NIR/R − 1)/(√(NIR/R) + 1) | Green biomass | [79] |
| Near-Infrared to Red Edge Ratio | NIR.RE = NIR/RE | Chlorophyll, N | [80] |
| Red Edge to Red Ratio | RE.R = RE/R | Chlorophyll, N | [80] |
Table 3. Vegetation indices derived from the visible spectral region.
| Name | Equation | Application | Reference |
|---|---|---|---|
| Normalized Green Red Difference Index | NGRDI = (G − R)/(G + R) | Green biomass | [81] |
| Plant Pigment Ratio Index | PPRI = (G − B)/(G + B) | Chlorophyll | [82] |
| Red Green Blue Vegetation Index | RGBVI = (G² − R·B)/(G² + R·B) | Green biomass | [30] |
| Visible Atmospherically Resistant Index | VARI = (G − R)/(G + R − B) | Green biomass | [83] |
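The simpler band-ratio indices from Tables 2 and 3 can be computed directly from per-plot band reflectances. A minimal sketch (reflectances as 0–1 fractions; the example values are illustrative, not study data):

```python
import numpy as np

def vegetation_indices(b, g, r, re, nir):
    """Compute a subset of the vegetation indices in Tables 2 and 3
    from band reflectances (scalars or arrays, 0-1 fractions)."""
    b, g, r, re, nir = (np.asarray(x, dtype=float) for x in (b, g, r, re, nir))
    return {
        # VNIR indices (Table 2)
        "NDVI":  (nir - r) / (nir + r),
        "GNDVI": (nir - g) / (nir + g),
        "OSAVI": (1 + 0.16) * (nir - r) / (nir + r + 0.16),
        "NDREI": (nir - re) / (nir + re),
        "SR":    nir / r,
        # RGB-only indices (Table 3)
        "NGRDI": (g - r) / (g + r),
        "RGBVI": (g**2 - r * b) / (g**2 + r * b),
        "VARI":  (g - r) / (g + r - b),
    }

# Example: one healthy-canopy pixel (illustrative reflectance values)
vi = vegetation_indices(b=0.03, g=0.08, r=0.05, re=0.30, nir=0.50)
```

Applied band-wise to orthomosaic rasters instead of scalars, the same function yields per-pixel index maps that can then be averaged per plot.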
Table 4. Feature sets.
| Name | Description | Features Included |
|---|---|---|
| SH | Sward height metrics | SHmean, SHmin, SHmax, SHp90, SHp75, SHp50, SHp25 |
| VI.ms | Vegetation indices, visible to near-infrared spectrum | See Table 2 |
| SB | Single bands of the MicaSense RedEdge-M | Blue, green, red, red edge, near-infrared (B, G, R, RE, NIR) |
| VI.rgb | Vegetation indices, visible spectrum | See Table 3 |
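The SH feature set aggregates the pixel-wise sward heights of each plot into summary statistics. A possible implementation, assuming the percentile-based naming used in the table (SHp90 = 90th percentile, and so on):

```python
import numpy as np

def sward_height_metrics(heights_cm):
    """Aggregate per-pixel canopy heights of one plot into the SH
    feature set of Table 4 (metric names taken from the table)."""
    h = np.asarray(heights_cm, dtype=float)
    p90, p75, p50, p25 = np.percentile(h, [90, 75, 50, 25])
    return {
        "SHmean": float(h.mean()),
        "SHmin":  float(h.min()),
        "SHmax":  float(h.max()),
        "SHp90":  float(p90),
        "SHp75":  float(p75),
        "SHp50":  float(p50),
        "SHp25":  float(p25),
    }

# Tiny illustrative plot with five height pixels (cm)
metrics = sward_height_metrics([12.0, 14.5, 15.0, 16.5, 20.0])
```

In practice the input would be the per-plot crop surface model (UAV surface model minus the T0 base model), masked to the sampling area.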
Table 5. Descriptive statistics for the response variables DMY, N%, and Nup per year (max: maximum; min: minimum; sd: standard deviation).
| Year | Variable | Mean | Max | Min | sd |
|---|---|---|---|---|---|
| 2018 | DMY (kg ha−1) | 1503 | 3376 | 300 | 791 |
| 2018 | N% | 2.99 | 5.06 | 1.67 | 0.88 |
| 2018 | Nup (kg N ha−1) | 43 | 108 | 10 | 24 |
| 2019 | DMY (kg ha−1) | 1189 | 3244 | 158 | 796 |
| 2019 | N% | 3.01 | 4.92 | 1.70 | 0.64 |
| 2019 | Nup (kg N ha−1) | 35 | 107 | 5 | 23 |
Table 6. Error report of the photogrammetric data processing in Metashape for each UAV campaign: Sony Alpha 7r for canopy height feature calculation; MicaSense RedEdge-M for spectral features (only for sampling dates). T0: date of base model acquisition; TS: sampling date; GP: growing period.

| Year | GP | Date | Sony α7r X (cm) | Y (cm) | Z (cm) | MS RedEdge-M X (cm) | Y (cm) | Z (cm) |
|---|---|---|---|---|---|---|---|---|
| 2018 | 1 | T0 (April 24) | 1.50 | 1.26 | 0.72 | – | – | – |
| 2018 | 1 | TS (May 25) | 0.97 | 1.43 | 1.02 | 0.89 | 1.38 | 0.42 |
| 2018 | 2 | T0 (May 29) | 2.72 | 1.83 | 0.62 | – | – | – |
| 2018 | 2 | TS (July 02) | 1.96 | 1.57 | 0.50 | 0.80 | 1.28 | 0.35 |
| 2018 | 3 | T0 (July 12) | 3.97 | 1.87 | 0.98 | – | – | – |
| 2018 | 3 | TS (October 10) | 1.50 | 1.54 | 0.57 | 0.99 | 1.26 | 0.48 |
| 2019 | 1 | T0 (April 17) | 1.65 | 1.60 | 0.93 | – | – | – |
| 2019 | 1 | TS (May 22) | 1.60 | 1.66 | 0.99 | 1.02 | 1.00 | 0.22 |
| 2019 | 2 | T0 (June 18) | 2.88 | 1.47 | 0.58 | – | – | – |
| 2019 | 2 | TS (August 05) | 1.25 | 1.44 | 0.68 | 1.10 | 1.22 | 0.34 |
| 2019 | 3 | T0 (September 02) | 6.74 | 3.93 | 1.43 | – | – | – |
| 2019 | 3 | TS (October 14) | 3.87 | 2.43 | 0.77 | 0.94 | 1.39 | 0.23 |
Table 7. Coefficient of determination R2 and RMSE (% reflectance) of the MS RedEdge-M camera and ASD Fieldspec 3 measurements of reference panels for four sampling dates.
| Band | May 25, 2018 (R² / RMSE) | July 2, 2018 (R² / RMSE) | May 22, 2019 (R² / RMSE) | October 14, 2019 (R² / RMSE) |
|---|---|---|---|---|
| Blue | 0.74 / 0.17 | 0.58 / 0.22 | 0.71 / 0.20 | 0.71 / 0.21 |
| Green | 0.89 / 0.09 | 0.92 / 0.09 | 0.90 / 0.07 | 0.94 / 0.12 |
| Red | 0.88 / 0.13 | 0.69 / 0.19 | 0.83 / 0.15 | 0.80 / 0.19 |
| Red Edge | 0.93 / 0.08 | 1.00 / 0.02 | 0.95 / 0.06 | 0.96 / 0.11 |
| NIR | 1.00 / 0.09 | 1.00 / 0.10 | 1.00 / 0.06 | 1.00 / 0.15 |
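Agreement metrics of this kind compare camera-derived and field-spectrometer reflectances over the gray reference panels. A sketch of how such values can be computed; treating R² as the squared Pearson correlation is our assumption, and the example reflectances are illustrative, not study data:

```python
import numpy as np

def panel_agreement(camera, asd):
    """R2 and RMSE (% reflectance) between camera-derived and ASD
    FieldSpec reference-panel reflectances, in the style of Table 7.
    R2 here is the squared Pearson correlation; the paper's exact
    fitting procedure may differ."""
    c = np.asarray(camera, dtype=float)
    a = np.asarray(asd, dtype=float)
    r2 = float(np.corrcoef(c, a)[0, 1] ** 2)
    rmse = float(np.sqrt(np.mean((c - a) ** 2)))
    return r2, rmse

# Three hypothetical panels measured by both instruments (% reflectance)
r2, rmse = panel_agreement([5.0, 20.0, 40.0], [6.0, 21.0, 39.0])
```

Run per band and per sampling date, this yields one (R², RMSE) pair per table cell.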
Share and Cite

Lussem, U.; Bolten, A.; Kleppert, I.; Jasper, J.; Gnyp, M.L.; Schellberg, J.; Bareth, G. Herbage Mass, N Concentration, and N Uptake of Temperate Grasslands Can Adequately Be Estimated from UAV-Based Image Data Using Machine Learning. Remote Sens. 2022, 14, 3066. https://doi.org/10.3390/rs14133066
