Article

Sentinel-2 Data and Unmanned Aerial System Products to Support Crop and Bare Soil Monitoring: Methodology Based on a Statistical Comparison between Remote Sensing Data with Identical Spectral Bands

1 Department of History and Cultures (DiSCi), Geography Section, University of Bologna, Via Guerrazzi 20, 40125 Bologna, Italy
2 Department of Earth and Environmental Sciences (DSTA), University of Pavia, Via Ferrata 1, 27100 Pavia, Italy
3 Department of Civil, Chemical, Environmental and Materials Engineering (DICAM), University of Bologna, Viale Risorgimento 2, 40136 Bologna, Italy
4 Italian Institute for Environmental Protection and Research (ISPRA), Via Vitaliano Brancati 48, 00144 Rome, Italy
* Author to whom correspondence should be addressed.
Remote Sens. 2022, 14(4), 1028; https://doi.org/10.3390/rs14041028
Submission received: 14 January 2022 / Revised: 16 February 2022 / Accepted: 18 February 2022 / Published: 20 February 2022

Abstract: The growing need for sustainable management of crops and bare soils requires increasingly accurate measurements at multiple spatial and temporal scales of the field system. In this context, the cooperation of proximal and satellite remote sensing data seems good practice for the present and future. The primary purpose of this work is the development of a sound protocol based on a statistical comparison between Copernicus Sentinel-2 MSI satellite data and data from a multispectral sensor mounted on an Unmanned Aerial Vehicle (UAV), featuring a spectral band deployment identical to Sentinel-2. The experimental dataset, based on simultaneously acquired proximal and Sentinel-2 data, concerns an agricultural field near Pisa (Tuscany) cultivated with corn. To understand how the two systems, comparable but quite different in terms of spatial resolution and atmospheric impact, can effectively cooperate to create a value-added product, statistical tests were applied to the bands and to the derived vegetation and soil indices. Overall, as expected, due to these impacts the outcomes show heterogeneous behavior, with differences between the coincident bands as well as between the derived indices, modulated in the same manner by the phenological status (e.g., during canopy development) or by the absence of vegetation. By contrast, the two sensors behaved similarly during the maturity phase of the crop.

1. Introduction

For several years, agriculture has been called upon to play its role as a supplier of food and other fundamental raw materials in a context of environmental sustainability and climate change [1,2,3]. This emergency was announced in the reports of the largest international organizations, such as the Food and Agriculture Organization (FAO) and the United Nations (UN) [4,5,6]. As highlighted by Schiavon et al. (2021) [7], the user needs of national and international institutional communities (i.e., policy makers, environmental agencies, regional governments and local authorities) as well as private entities (i.e., insurance companies, businesses) require value-added information related to cropland mapping and phenological studies to cope with the Common Agricultural Policy (CAP) 2023–2027 objectives, such as sustainable agricultural production and natural resource management. For more than 40 years, remote sensing (RS) has contributed to the evolution of agricultural practice in support of agricultural policy and economics. Exploiting various types of sensors in the optical, thermal and microwave spectral domains, mounted on aircraft, satellites, drones or tractors, RS provides repetitive information on crop and soil status throughout the seasonal stages, at different scales and for different actors [8,9]. Furthermore, beginning in 2018 [10], the European Commission (EC) has encouraged the use of new technologies, such as Copernicus Sentinel data integrated with EGNOS/Galileo, UAS data, geotagged photographs and EO products, possibly taking advantage of the Copernicus In Situ Component, for the control and granting of CAP payments by local authorities. This promotes open data with a common data-sharing approach.
Nowadays, the technological innovation required in agricultural management, so-called smart farming [11], concerns the use of Information and Communication Technology (ICT) [12], big data [13], Artificial Intelligence (AI) tools, specific sensors and integrated procedures combining satellite and UAS data [14] to monitor crop growth and development, plant health and productivity, and to manage nutrient and water use optimization programs. This is a great opportunity to boost the free and open market of Copernicus, the EO program of the European Union, as demonstrated by the analysis of users' technical and operational requirements for the expansion of the European space component and its services in different application domains [15].
Regarding the use of satellite and UAS data, many applications have been tested in precision agriculture (PA) [16,17,18]. The first satellite products emerged in the 1970s [19], although their spatial resolution, as with Landsat data, was very low [20,21]. However, more recent advances in high-resolution satellites with a larger number of wavebands, as well as the advent of new constellations such as GeoEye [22] or WorldView [23], have greatly facilitated the application of the technological achievements of remote sensing studies [8,9]. Nevertheless, unfavorable weather conditions, the timeliness of observation, high prices and unsuitable spatial resolution still represent the most critical aspects in their practical application to smart farming [9]. On the other hand, UASs, whose use is also defined as proximal sensing, are applied within PA [24,25] through the redefinition of several techniques, from the soil plant analysis development (SPAD) method [26] to fluorescence sensing for nitrogen deficiencies [27,28]. UASs offer a series of advantages, such as better spatial resolution, the ease of reaching narrow places that are difficult for humans to access and the ability to program the acquisition date [29]. On the contrary, the limitations of UASs are their limited coverage per tile, the need for ongoing maintenance of mechanical parts, battery or fuel problems, short flying times and small payloads.
Considering that the two systems are potentially complementary, several studies have taken into account the hypothesis of integrating satellite and UAS data within the same time series [30]. The comparison of the results of certain elaborations carried out on the two remote sensing products suggests the possibility of integrating both data sources [31,32,33,34] or specifies particular conditions of use [35,36,37,38,39]. In some cases, however, the spatial and spectral characteristics of the two considered systems are very different and/or the comparison method applied, such as linear regression only, might not consider all the variables involved.
The study presented here focuses on a statistical comparison between data acquired by two multispectral instruments characterized by the same spectral bands, i.e., the Sentinel-2 satellite sensor (S2) and the MAIA S-2 (MS2) camera (SAL Engineering S.r.l.), mounted on an unmanned aerial vehicle. The purpose was to evaluate the feasibility of a robust synergistic use of the two sensors, verifying the correspondence between the S2 and MS2_10 products. The ability to generate more information through sensor cooperation was analyzed and adapted to crop and bare soil recognition. As already experimented by other authors in different contexts [40,41], the favorable and unfavorable conditions of the synergistic use of UAS and satellite data were studied. Therefore, the procedure provided a statistical quantification of any differences found and evaluated under which conditions they can be considered acceptable. The derived advantage is a greater awareness in the application of mixed time series to monitor and discriminate crops and bare soil. Like S2 [42,43,44,45], MS2 has also found several applications due to its centimetric spatial resolution, such as PA, crop classification and assessment [46,47,48]. Using a corn crop field as a case study, the statistical comparison between S2 and MS2 imagery was based on the Normalized Difference Vegetation Index (NDVI) [49] and the Soil Adjusted Vegetation Index (SAVI) [50]. Several studies are based on these vegetation indices for different purposes, such as estimating living biomass in semi-arid areas [51] and in pasture fields [52], identifying tree infections [53], assessing crop hail damage [54], comparing the reliability of the two indices under different land uses [55] and cross-comparing two different sensor products [56,57]. Considering the different crop vegetal cover densities and their time evolution, the indices were compared using Welch's test [58]. This test can be performed when sample variances are not equal and allows easy inferences to be made using a null hypothesis based on the mean. Then, to better understand the results, the same test was also applied to the corresponding Near Infrared (NIR) and Red bands.
Finally, as suggested by other studies [31,37,59,60], the linear regression between S2 and MS2 index values was calculated to verify the reliability of their joint use in the various scenarios analyzed.

2. Study Area and Materials

2.1. Study Area

The study area is located in Madonna dell’Acqua in the municipality of San Giuliano Terme (Pisa), a few kilometers north of Pisa (Figure 1). The field selected for the joint survey (S2-MS2) was cultivated in 2019 with corn (Zea mays) in an area of about 10 ha.
The soils that characterize the area, and in particular the studied field, are mainly Eutric Cambisols, according to the World Reference Base (WRB) taxonomy [61]. These are relatively young soils, featuring a more or less evolved subsurface horizon that presents pedogenetic alterations of a chemical/physical nature. These occurred on site (cambic horizon) and are not due to illuviation processes. The soils are very deep, with an Ap–Bw (cambic horizon)–Bg profile. They are non-gravelly, with a sandy to silty texture, non-calcareous to slightly calcareous, with a neutral to moderately alkaline reaction and very high base saturation, and are moderately well drained. The field has conventional transversal drainage channels across its entire extension, with an average depth of about 50 cm below ground level.

2.2. Data

Sentinel-2 comprises two polar-orbiting satellites in the same sun-synchronous orbit: Sentinel-2A, launched in 2015, and Sentinel-2B, launched in 2017. They carry an optical instrument, the MultiSpectral Instrument (MSI), that collects energy in 13 spectral bands at several spatial resolutions (Table 1). The data used were downloaded from the Theia web page (https://www.theia-land.fr/en/satellite-data/, accessed on 30 September 2020). Theia is a Scientific Expertise Center conducting research on remote sensing, mobilizing satellite, airborne and in situ data on 'continental surface' issues.
The S2 data used are level 2-A. These data were corrected for atmospheric effects using the MAJA software [62]. This was developed by the Centre d'Etudes Spatiales de la Biosphère (CESBIO), which developed the methods and a prototype, and by the Centre National d'Etudes Spatiales (CNES), which funded the operational version of the processor. In the case of the S2 satellite, the first step consists of estimating and subsequently correcting for gaseous absorption. The next step is the creation of a composite image in which the unclouded pixels from the processed data are used as a reference for cloud detection [63]. After a multitemporal test, which signals cloud presence, a final test measures the correlation of the pixel neighborhood with the previous images. If a large correlation is observed, the pixel is, ultimately, not declared a cloud. After the aerosol optical thickness (AOT) is estimated [64], it is possible to retrieve the surface reflectance using look-up tables (LUT). Before writing the output image, the software corrects the adjacency effects and the effects of terrain slopes on illumination [65]. The S2 radiometric resolution is 12 bit, meaning the sensor can record 4096 energy levels. This range is converted into floating point type (from 0 to 1). S2 pixel values correspond to the 'original' reflectance × 10,000 (scale factor) but, at times, a value may be higher than 10,000 due to specific angular reflectivity effects [66].
MAIA S-2 is a multispectral camera that permits the simultaneous acquisition of high-resolution images at various wavelength intervals in the Visible (VIS) and NIR regions of the electromagnetic spectrum. In particular, it is equipped with the same first nine wavelength intervals as the S2 satellite (Table 1). The MS2 system comprises an array of nine monochromatic sensors that collect energy in unison thanks to global shutter technology, permitting 'one-off' acquisition for synchronized multiband measurements [67]. It was mounted on a UAV: a DJI S-900. For each survey performed, a single multispectral orthophoto was generated using the 300 acquired images. The orthorectification was carried out by photogrammetric processing with Metashape-Agisoft software. The software uses Structure from Motion algorithms for homologous point definition and for the Bundle Adjustment based on Ground Control Points (GCPs) [68]. All flights were programmed and performed in the same way: an altitude of 100 m Above Ground Level (AGL) and strips able to ensure a longitudinal overlap of over 80% and a lateral overlap of about 35% between frames. For each frame, the Ground Sample Distance (GSD) of 4.7 cm and footprint of 63 m × 48 m resulted from the combination of the study area size (9.1 ha), its almost rectangular shape and the multispectral camera sensors, with dimensions of 4.8 mm × 3.6 mm and a pixel size of 3.75 microns. For a correct Bundle Adjustment, 8 GCPs were placed on the ground (Figure 1), uniformly distributed over the entire study area. All the GCPs, clearly visible on the images, were measured with dual-frequency GNSS (Global Navigation Satellite System) geodetic instrumentation (Topcon GB500) equipped with dual-frequency geodetic antennas (Topcon PG-A1 L1/L2). To guarantee centimeter accuracy (1–2 cm in the three dimensions), the single-base static GNSS method was used.
The reference instrument (Master Station) was positioned near the study area (maximum distance from the GCPs of less than 1 km).
To compare MS2 and S2 data, the S2 radiometric resolution was reduced from 12 bit to 8 bit. To achieve this, the value of each DN was divided by 10,000 and then multiplied by 255 using the Raster Calculator in QGIS. To guarantee an accurate comparison between the two instruments, three acquisition dates were chosen based on the corn phenological cycle. For June and August, there was a perfect match between MS2 and S2, with about half an hour's difference between the two acquisitions. Unfortunately, for July, the contemporaneity of the two acquisitions was hampered because the satellite data were affected by the presence of clouds. The cloud-free satellite acquisition closest to the proximal acquisition date was therefore chosen (Table 2).
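The bit-depth conversion described above is simple enough to sketch. The following NumPy fragment is an illustrative equivalent of the QGIS Raster Calculator operation; the clipping of DN values above 10,000 (which the text notes can occur for angular reflectivity reasons) is our own assumption, not part of the published procedure:

```python
import numpy as np

def s2_to_8bit(dn, scale=10_000, levels=255):
    """Rescale Sentinel-2 L2A digital numbers (reflectance x 10,000)
    to the 8-bit range used for comparison with the MS2 data."""
    reflectance = np.asarray(dn, dtype=float) / scale   # back to 0-1 reflectance
    # Assumption: DN > 10,000 (reflectance > 1) is clipped before rescaling
    return np.clip(reflectance, 0.0, 1.0) * levels

# e.g. a DN of 5000 (reflectance 0.5) maps to 127.5 on the 8-bit scale
```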
Before applying the statistical tests, to also make the S2 and MS2 data geometrically comparable, the MS2 spatial resolution was resampled from 0.047 m to 10 m, using a grid framework derived from S2. The resampled MS2 data are hereafter referred to as MS2_10.
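A minimal sketch of coarsening by block averaging may clarify the kind of operation involved; the integer aggregation factor and the toy array are illustrative assumptions (the actual 0.047 m to 10 m resampling used the S2-derived grid framework):

```python
import numpy as np

def block_average(img, factor):
    """Resample a fine-resolution raster to a coarser grid by averaging
    non-overlapping factor x factor blocks of pixels."""
    h, w = img.shape
    h2, w2 = h - h % factor, w - w % factor              # trim to a multiple of the factor
    blocks = img[:h2, :w2].reshape(h2 // factor, factor, w2 // factor, factor)
    return blocks.mean(axis=(1, 3))                      # one mean value per block

fine = np.arange(16.0).reshape(4, 4)
coarse = block_average(fine, 2)   # 2 x 2 raster of 2 x 2 block means
```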

3. Methods

3.1. Georeferencing

The S2 orthoimages are in UTM-WGS84 projection [69]. Concerning MS2, to ensure a final centimeter accuracy (1–2 cm), the Master Station point geodetic coordinates were determined by a long-term static GNSS survey based on the connection to three EUREF network permanent stations: GENO, IGMI and ELBA. The chosen geodetic reference system was IGb14, epoch 2019.5, projected in the UTM-WGS84 cartographic system. For the processing of the GNSS data of all three surveys, the precise ephemeris distributed by the IGS (International GNSS Service) was always used. The final orthoimage accuracy, confirmed by some GCPs not used for the Bundle Adjustment calculation, was always less than 5 cm, i.e., less than the GSD. This result, also in relation to the chosen number of GCPs, is in agreement with that reported by [70].
A quality control of the relative georeferencing between the MS2_10 and S2 images was required. In fact, if the georeferencing were not correct, the value measured in pixel (i, j) of the first dataset would not correspond to that contained in pixel (i, j) of the second, as they would cover different portions of land. Therefore, to achieve the correct overlap, it would be necessary to shift one image relative to the other by a certain lag (u, v). The square of the radiometric intensity difference (Equation (1)) is a measure of the fit between an image f(x, y) and a feature t(x, y) shifted by (u, v):

d²_{f,t}(u, v) = Σ_{x,y} [f(x, y) − t(x − u, y − v)]²  (1)

where the sum Σ extends over the window containing the feature t translated by (u, v). To identify the minimum of the distance d²_{f,t}(u, v), Lewis (1995) [71] proposed a robust algorithm that searches for the maximum of the normalized cross-correlation matrix γ(u, v) (Equation (2)):

γ(u, v) = Σ_{x,y} [f(x, y) − f̄_{u,v}] [t(x − u, y − v) − t̄] / √( Σ_{x,y} [f(x, y) − f̄_{u,v}]² · Σ_{x,y} [t(x − u, y − v) − t̄]² )  (2)
where:
f(x, y) is the image;
t(x, y) is the feature;
t̄ is the mean of the feature;
f̄_{u,v} is the mean of f(x, y) in the region under the feature.
The sum is taken over x, y under the window containing the feature t positioned at (u, v). The maximum of γ(u, v) corresponds to the best-matching position of the feature [71].
The calculation was performed by comparing the MS2_10 and S2 bands over the study area in the three different phenological stages [72]. The maximum of the normalized cross-correlation matrix was obtained at indices (8, 8), i.e., for a lag of (0, 0), considering that the image size is 8 × 8 pixels. This means that no shift improves the correlation already achieved by the georeferencing.
Figure 2 shows an example of the values obtained for the correlation coefficients matrix, for the observable NIR (June date).
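As an illustration of this check, the sketch below evaluates γ(u, v) over a small lag window with NumPy. The shift convention (a window offset rather than a signed lag) and the synthetic 12 × 12 image are assumptions for demonstration, not the actual data or implementation:

```python
import numpy as np

def ncc(f, t, u, v):
    """Normalized cross-correlation of feature t with image f at shift (u, v):
    the image mean is taken under the shifted window, as in Equation (2)."""
    h, w = t.shape
    win = f[v:v + h, u:u + w]                 # image region under the feature
    fw, tw = win - win.mean(), t - t.mean()
    return (fw * tw).sum() / np.sqrt((fw ** 2).sum() * (tw ** 2).sum())

# The best relative shift maximizes the correlation surface
img = np.random.default_rng(0).random((12, 12))
feat = img[2:10, 2:10]                        # 8 x 8 feature cut from the image itself
surface = np.array([[ncc(img, feat, u, v) for u in range(5)] for v in range(5)])
best = np.unravel_index(surface.argmax(), surface.shape)   # (v, u) of the maximum
```

Because the feature was cut from the image at offset (2, 2), the surface peaks there with a correlation of exactly 1; in the paper's case the analogous peak at zero lag confirmed the georeferencing.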

3.2. Data Accuracy

For each band, the slightest noticeable radiometric difference between S2 and MS2_10 data depends on the acquisition accuracy level of the two sensors.
The MS2_10 translates the received radiation intensity (photons) into a discrete DN representable on 256 radiometric levels. This range was converted into floating point type, from 0 to 1. Numerous experimental tests, implemented by the authors with technicians from SAL Engineering of Modena (Italy), showed that the MS2 (8 bit) measurements have a repeatability of one bit. The tests were carried out on a few hundred images, defining the black level of all the CMOS sensors that make up the MAIA S-2 camera. The optical black pixel method [73] was used to define the dark voltage value. The test results confirmed the stability of the DN across all image pixels, with a constant black level of DN = 9. Therefore, we can assume an accuracy of τ = 1/256 ≈ 0.004. This value is considered the resolution of the measurement, i.e., the minimum discriminable quantity.
Concerning S2, the original signal discrimination occurs at 12 bits but a resampling at 8 bits was carried out. This range was converted into floating point type, from 0 to 1. Similar considerations to the previous ones can be made about S2 radiometric accuracy.
Based on this hypothesis, the differences between S2 and MS2 measurements were considered significant if they exceeded τ′ = 0.005.

3.3. Test Area Choice and Vegetation Index Applications

Significant within-field spatial variability exists in the factors that influence crop yield; this variability can be measured [74]. Consequently, it is important that the resulting information is as accurate as possible so it can be used to improve crop management practices, increasing profit and decreasing environmental impact. In order to analyze several vegetation spatial distribution scenarios, three test areas with different percentages of vegetation cover were identified inside the corn crop field. The areas had to be characterized by a high, medium and low vegetation percentage, respectively. For this purpose, a procedure based on a chlorophyll content map was applied. The chlorophyll content was used as a parameter of maturity and quality of food crops [75,76]. Several authors have shown that reflectance in the green and/or red-edge spectral regions is an optimal parameter for the non-destructive estimation of leaf chlorophyll content across a wide range of its variation [77,78,79,80]. They used different vegetation indices, such as the MERIS Terrestrial Chlorophyll Index (MTCI) [81], the Enhanced Vegetation Index (EVI2) [82] and the Red-edge Chlorophyll Index (CIRed-Edge) [80,83]. In this paper, the considered vegetation index was the green Chlorophyll Index (CIgreen, Equation (3)) [75]:
CIgreen = ρ_NIR / ρ_Green − 1  (3)

where ρ_NIR and ρ_Green are the NIR and green reflectance, respectively.
As shown by [84,85,86], this index has a strong linear correlation with chlorophyll content at the leaf scale.
The procedure applied a moving window over the CIgreen map and calculated statistical parameters on the window elements. In this way, the characteristics of homogeneity and intensity were calculated in the neighborhood of the current position (i, j). The calculated statistical parameters were the average, for intensity, and the standard deviation, for dispersion. The association with the vegetation percentage was based on the histograms of the average and standard deviation values referred to the moving window centers.
Several suitable positions were identified. These showed the desired characteristics, i.e., a high value and low dispersion of CIgreen (area A), low intensity and high dispersion of CIgreen (area C), and intermediate conditions for area B. Then, an expert operator chose the three definitive zones from the suitable positions, defining the test area boundaries based on rapid field surveys and a photo-interpretation of the MS2 data. Square areas measuring 80 × 80 m (equivalent to 8 × 8 pixels) were chosen to facilitate subsequent comparisons.
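The moving-window procedure can be sketched as follows; the 3 × 3 window size is an arbitrary assumption for illustration, since the paper does not state the window size used:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def ci_green(nir, green):
    """Green Chlorophyll Index, Equation (3): rho_NIR / rho_Green - 1."""
    return nir / green - 1.0

def window_stats(ci_map, size=3):
    """Mean (intensity) and standard deviation (dispersion) of the CIgreen
    values inside a size x size moving window centered on each position."""
    wins = sliding_window_view(ci_map, (size, size))
    return wins.mean(axis=(2, 3)), wins.std(axis=(2, 3))
```

Histograms of the two returned maps would then be inspected to pick the thresholds separating high, medium and low vegetation cover.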
Then, the NDVI [49] and SAVI [50] indices (Equations (4) and (5)) were applied to both the S2 and MS2_10 data. For both indices, the bands used were B4 (Red) and B8 (NIR) for Sentinel-2 and S4 (Red) and S8 (NIR) for MS2_10.
NDVI = (NIR − Red) / (NIR + Red)  (4)
Concerning SAVI index, the Huete equation was used:
SAVI = [(NIR − Red) / (NIR + Red + L)] (1 + L)  (5)
where:
L = 0.75 for June data and A, B, C areas;
L = 0.50 for July data and B, C areas;
L = 0.25 for July data and A area.
Considering that the proximal data were resampled to the S2 data resolution, the choice of L values was based on their predominant use in satellite data processing [50,87]. Considering the total vegetation cover shown in the August data, the SAVI was not calculated for this epoch.
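Both indices reduce to one-line functions; the sketch below is a direct transcription of Equations (4) and (5), with band reading and masking omitted:

```python
def ndvi(nir, red):
    """Equation (4): NDVI = (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red)

def savi(nir, red, L):
    """Equation (5), Huete's formulation, with the soil adjustment factor L
    (0.75, 0.50 or 0.25 depending on epoch and test area, as in the text)."""
    return (nir - red) / (nir + red + L) * (1.0 + L)
```

Note that for L = 0 the SAVI reduces to the NDVI, which is why larger L values are used where bare soil dominates the signal.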

3.4. Statistical Test

When the index differences exceeded τ′, it was necessary to evaluate their statistical significance by relating their value to the associated uncertainty level.
After the identification of the three test areas with different homogeneous vegetation covers, the radiometric parameters obtained from the S2 and MS2_10 vegetation index values were compared. The comparison was validated by statistical tests.
The radiometric homogeneity of each test area allowed its 64 pixels (n) to be considered as belonging to the same statistical distribution. The greater the test area homogeneity, the greater the robustness of this hypothesis. Therefore, both for the 64 S2 pixels and for MS2_10, the sample mean and variance (x̄_k, s_k²) were calculated (Equation (6)):

x̄_k = ( Σ_{i=1}^{n} x_{ki} ) / n;  s_k² = Σ_{i=1}^{n} (x_{ki} − x̄_k)² / (n − 1)  (6)

and s_{x̄_k} is the standard deviation associated with the sample mean (Equation (7)):

s_{x̄_k} = s_k / √n  (7)
The equality test between the two means, with variances estimated from the samples they belong to, was based on the null hypothesis of equality of the expected values (Equation (8)). Namely, the expected value of the difference between corresponding values was considered equal to zero, where corresponding values are related to pixels in the same position (Equation (8)):

H₀: E{x̄₁} = E{x̄₂}, i.e., H₀: E{x̄₁ − x̄₂} = 0  (8)
where:
k = 1 for S2
k = 2 for MS2_10.
The hypothesis was tested through Welch's t-test. The null hypothesis (Equation (8)), based on the mean, offers an intuitive interpretation. Nonparametric tests based on the comparison of ranks, which generally have less power, were not chosen. Tests based on trimmed means have the disadvantage that the amount of trimming must be chosen arbitrarily. On the other hand, the Welch test requires normality of the sample distributions, which does not always occur [88].
The considered statistic was the ratio (Equation (9)) between the difference of the index means and its standard deviation, estimated from the two samples (Equation (10)):

t_ν = d_x̄ / s_{d_x̄}  (9)

d_x̄ = x̄₁ − x̄₂;  s_{d_x̄} = √( s₁²/n₁ + s₂²/n₂ )  (10)
where, for i = 1, 2:
x̄_i = sample mean;
s_i = sample standard deviation;
n_i = sample size.
In Welch's t-test, the associated degrees of freedom are approximated by Equation (11):

ν ≈ ( s₁²/n₁ + s₂²/n₂ )² / [ s₁⁴/(n₁² ν₁) + s₂⁴/(n₂² ν₂) ],  with ν_i = n_i − 1  (11)
Considering that the two compared samples had the same number of elements, n₁ = n₂ = n, Equations (9) and (11) become:

t_ν = √n (x̄₁ − x̄₂) / √( s₁² + s₂² )  (12)

ν = (n − 1)(s₁² + s₂²)² / (s₁⁴ + s₂⁴)  (13)
The Welch test statistic t follows a Student's t distribution with ν degrees of freedom (Equations (12) and (13)). The t_ν value was compared with the assumed critical value t_Wc corresponding to a chosen confidence level.
It should be underlined that the previous test concerns the difference between the average values and does not consider the oscillation of the differences around the average observed in the various pixels. Therefore, the differences between the averages reflect effects affecting the entire area.
The quantity s_{d_x̄} can help to evaluate the extent of the fluctuation of the values in the various pixels around the average of the considered area.
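Equations (12) and (13) for the equal-sample-size case translate directly into code. The following standard-library sketch is illustrative; the critical value t_Wc = 2.6 would be taken from a Student's t table at the 99% confidence level:

```python
import math

def welch_equal_n(x1, x2):
    """Welch's t statistic and approximate degrees of freedom for two samples
    of equal size n, following Equations (12) and (13)."""
    n = len(x1)
    m1, m2 = sum(x1) / n, sum(x2) / n
    s1 = sum((v - m1) ** 2 for v in x1) / (n - 1)   # sample variance, Equation (6)
    s2 = sum((v - m2) ** 2 for v in x2) / (n - 1)
    t = math.sqrt(n) * (m1 - m2) / math.sqrt(s1 + s2)
    nu = (n - 1) * (s1 + s2) ** 2 / (s1 ** 2 + s2 ** 2)
    return t, nu

# Equality of the means is accepted when |t| < t_Wc = 2.6 (99% confidence)
```

For equal variances the degrees of freedom reduce to 2(n − 1), the pooled-variance limit, as expected from Equation (13).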

3.5. Linear Regression

The evaluation of the correlation between the values obtained by the two sensors in the corresponding pixels was performed by linear regression (according to the least squares procedure). For each regression Y = p₁X + p₀, the following quantities were calculated: the regression parameters p₁ and p₀; their standard deviations σ_{p₁}, σ_{p₀}; the standard error of the regression σ₀; and the determination and linear correlation coefficients R² and ρ.
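A self-contained least-squares sketch computing the quantities listed above may be useful; this is a generic textbook implementation, not the software actually used by the authors:

```python
import math

def linreg_with_errors(x, y):
    """Least-squares fit Y = p1*X + p0, returning the slope and intercept,
    their standard deviations, the regression standard error sigma_0 and
    the determination coefficient R^2."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    p1 = sxy / sxx                              # slope
    p0 = my - p1 * mx                           # intercept
    res = [yi - (p1 * xi + p0) for xi, yi in zip(x, y)]
    sigma0 = math.sqrt(sum(r ** 2 for r in res) / (n - 2))
    sigma_p1 = sigma0 / math.sqrt(sxx)
    sigma_p0 = sigma0 * math.sqrt(1.0 / n + mx ** 2 / sxx)
    syy = sum((yi - my) ** 2 for yi in y)
    r2 = 1.0 - sum(r ** 2 for r in res) / syy
    return p1, p0, sigma_p1, sigma_p0, sigma0, r2
```

The linear correlation coefficient ρ then follows as the signed square root of R².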

4. Results

4.1. Test Areas

To identify several potential test areas, a procedure was implemented based on the calculation of statistical parameters for the pixels contained in a moving window on the image. The moving window was applied to the MS2_10 CIgreen map that originated from the MS2 data, characterized by higher geometric resolution (GSD 4.7 cm). Only the July map was used to recognize the three scenarios, because it includes an overall nonhomogeneous spatial vegetation distribution. The June and August MS2 CIgreen maps, by contrast, were not suitable: the first showed mainly bare soil because most corn plants had not yet sprouted in that month; the second showed a homogeneous plant distribution over the whole crop field, so differentiation among vegetation densities was not possible.
In order to quantify and discriminate the different vegetation distributions, the criteria considered were: a high average (>4.4) and low dispersion (<1.4) identify area A; a low mean (<4.0) and low dispersion (<1.1) identify area C; mean values between 4.05 and 4.15 and high dispersion values (>1.75) describe area B. These values were chosen by analyzing the histograms of the mean and dispersion of the CIgreen in the moving windows.
Figure 3 shows the pixels that satisfied both previous conditions (in yellow) in the corn field. To simplify the moving window application, the field surface was included in a larger rectangular area (in purple).
The agronomic expert operator then defined the final size of the test areas around the identified pixels, i.e., 80 × 80 m (8 × 8 S2 pixels), (Figure 4). Concerning test area B, to clearly separate it from test areas A and C, the expert preferred to consider the northernmost zone among those available.

4.2. Vegetation Index and Statistical Test Results

For each analyzed stage and for each test area, the NDVI and SAVI averages were calculated on the 64 vegetation index values included in each test area. For each average value, the corresponding root-mean-square error (r.m.s.e.) values were reported (Table 3).
The S2—MS2_10 vegetation index differences were calculated based on these average values (Table 3). When these differences exceeded the minimum appreciable radiometric resolution threshold τ’, the equality hypothesis of their expected values was tested. To confirm the reliability of the Welch’s test, it was applied on all test areas and all stages.
The difference values were compared with the assumed critical value t_Wc, and the hypothesis of equality of the means is accepted for t_ν < t_Wc = 2.6 (confidence level 99%) (Table 3).
To better understand the results obtained from the indices, the Welch’s test was also applied to the bands used, NIR and Red.
For each analyzed epoch and for each test area, the NIR and Red averages were calculated on the 64 band values included in the test area. For each average value, the corresponding root-mean-square error (r.m.s.e.) values were reported (Table 4). Skewness values were also shown in the same table. The skewness values are generally low and comparable for both samples, except for some MS2_10 Red samples.
The S2—MS2_10 band differences were calculated based on these average values (Table 4). When these differences exceeded the minimum appreciable radiometric resolution threshold τ’, the equality hypothesis of their expected values was tested. To confirm the reliability of the Welch’s test, it was applied on all test areas and all epochs.
The difference values were compared with the assumed critical value t_Wc, and the hypothesis of equality of the means is accepted for t_ν < t_Wc = 2.6 (confidence level 99%) (Table 4).
The results of Table 4 show two anomalous cases where the equality hypothesis was not accepted, despite the corresponding differences being lower than the significance threshold τ′: the B and C test areas, for the Red band, at the August epoch. In these cases, the two datasets were assumed to be equivalent.

4.3. Linear Regression: MS2_10 NDVI vs. S2 NDVI

Regressions of the MS2_10 NDVI values against the S2 NDVI values of the corresponding pixels were performed.
Linear regression estimates the linear law Y = p1·X + p0, which transforms the X measures (S2) into the Y measures (MS2_10). The functional relationship is all the more reliable the less the experimental points are dispersed around the regression line, i.e., the lower the standard error σ0 and the parameter standard deviations σp1 and σp0.
Figure 5 shows some interesting results compared with the Welch's test findings, as commented on in the Discussion section. The graphs report the NDVI value points (S2, MS2_10), the estimated regression line and the lines representing the expected range for a new observation at the 0.99 confidence level. For the same samples, Table 5 shows the parameter estimates (p1, p0), their uncertainties (σp1, σp0), the standard error of the regression (σ0) and the determination and linear correlation coefficients (R², ρ).
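The regression diagnostics listed above (p1, p0, σp1, σp0, σ0, R²) can be computed with the standard ordinary-least-squares expressions; the sketch below is written with NumPy and is a generic reconstruction, not the authors' implementation.

```python
import numpy as np

def fit_line(x, y):
    """Least-squares estimate of y = p1*x + p0 with uncertainties.

    Returns (p1, p0, sigma_p1, sigma_p0, sigma0, r2), where sigma0 is
    the standard error of the regression.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = x.size
    xbar = x.mean()
    sxx = np.sum((x - xbar) ** 2)
    p1 = np.sum((x - xbar) * (y - y.mean())) / sxx
    p0 = y.mean() - p1 * xbar
    resid = y - (p1 * x + p0)
    sigma0 = np.sqrt(np.sum(resid ** 2) / (n - 2))          # std. error of regression
    sigma_p1 = sigma0 / np.sqrt(sxx)                        # slope uncertainty
    sigma_p0 = sigma0 * np.sqrt(1.0 / n + xbar ** 2 / sxx)  # intercept uncertainty
    r2 = 1.0 - np.sum(resid ** 2) / np.sum((y - y.mean()) ** 2)
    return p1, p0, sigma_p1, sigma_p0, sigma0, r2

# A perfect line gives sigma0 = 0 and R^2 = 1
x = np.arange(10, dtype=float)
y = 2.0 * x + 1.0
print(fit_line(x, y)[0])  # 2.0
```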

5. Discussion

Starting from a short temporal series of proximal measurements acquired by a UAV sensor compliant with S2 (MAIA-S2, par. 2.2), a comparative analysis was developed between these data and the nearly simultaneous S2 dataset. The study aimed to highlight their temporal behaviors through a robust statistical analysis of the two datasets and their derived VIs in a corn field (Figure 1 and par. 2.1), from the flowering to the ripening phase (June–August 2019). The main purpose was to verify the relationship between the dynamics of the canopy and the bare soil component, useful for precision agriculture approaches and for optimizing cropping systems in a more sustainable way.
Considering the finest S2 pixel size (10 × 10 m) and the field features (e.g., phenological phase, microtopography, moisture and agricultural practices), it is realistic to expect that an ever-changing mixture of canopy, bare soil and other elements affected the radiance signal of individual pixels during the research time span. In this perspective, the proximal MS2 data (about 5 cm) were resampled to the S2 satellite resolution (10 m). Specifically, the aim of studying the role of proximal data in a regional-scale analysis, compared with mid-resolution data, prompted this uncommon approach [37,89]. This choice was made because many crop field activities are carried out on a regional scale, and will be even more so in the future, through free mid-resolution satellite data with a temporal resolution of a few days (e.g., S2 and Landsat). Proximal data, even if more suitable for local-scale studies, are more expensive to acquire and process, in both time and economic terms.
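The aggregation of fine-resolution UAV pixels into the 10 m satellite grid can be sketched as a block average, as below. The actual resampling used in the study may differ (e.g., weighted or geometry-aware); this is only a minimal illustration, and the factor of 200 (5 cm to 10 m) is taken from the GSD values quoted in the text.

```python
import numpy as np

def block_mean(band, factor):
    """Aggregate a high-resolution band to a coarser grid by block averaging.

    `factor` is the number of fine pixels per coarse pixel along each axis
    (e.g., 200 when going from a 5 cm GSD to a 10 m grid). The input is
    cropped so that both dimensions are exact multiples of `factor`.
    """
    h, w = band.shape
    h, w = h - h % factor, w - w % factor
    blocks = band[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

# Illustrative: a 400x400 synthetic 5 cm band becomes 2x2 pixels at 10 m
fine = np.ones((400, 400))
print(block_mean(fine, 200).shape)  # (2, 2)
```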
After the MS2 data resampling (generation of the MAIA-S210 product), the quality of the relative georeferencing of the two data types (par. 2.2.1) was evaluated in order to proceed correctly with the next experimental design operations. The optimal spatial match, in this case probably favored by the Madonna dell'Acqua site conditions (a horizontal surface without rugged topography), requires that the UAV data acquisition be carried out to the highest reference standards.
The strategy of selecting the three test sub-areas A, B and C (par. 3.1) allowed the adequate exploitation of the homogeneity of the pixel radiometric characteristics within each area and, taken together, represented the heterogeneity of the crop and bare soil patterns in the field under observation. This condition corresponded to the basic hypothesis of the parametric test, i.e., that the measurements were realizations of a single variable and therefore constituted a random sample of its realizations. The mean and variance of the variable were unknown, as for the samples acquired with MAIA-S2 and Sentinel-2. There are three main advantages deriving from the analysis of subareas with different vegetation/bare soil cover:
it allows the identification of different behaviors of the relationship between bare soil and crop, and of their mixture;
it allows the application of the most suitable and diversified analysis strategy within each subarea up to the coverage of the entire plot (the moving window used was also essential for characterizing the entire plot); and
it allows the transfer of the method to different types of crop systems characterized by areas with different percentages of vegetation cover.
Indeed, for a specific multispectral analysis it is necessary to take into account the realistic possibility of having, within plots under extensive crops, areas, even of limited extent, with a different surface coverage ratio between canopy and soil. This ratio can be due to several factors, such as different nutritional supply, soil composition, irrigation methods and fertilizer spreading, or to the presence of anthropogenic elements such as ditches, drainage channels and service roads; all factors that must be understood and managed.
Welch's test was applied first on the NDVI average values and then on the NIR and Red average values, to evaluate the correspondence between the two remote sensing datasets that nominally work with the same bands, even if at completely different observation scales. Table 3 and Table 4 show that the two sensors provided statistically similar data, i.e., the mean equality hypothesis (H0) was accepted, for some specific scenarios. However, when the numerical difference of the means was less than τ′ = 0.005, the test was not necessary because the mean values are indistinguishable; in these cases, H0 was always accepted. Based on the Welch's test calculation, H0 was accepted in June for the NIR data in test areas A, B and C, and in August for the NDVI data in test areas A and C, the Red data in test area A and the NIR data in all test areas. Based on the epochs of acquisition, the analysis of these results provided much information [89] for the potential integrated use of the MS2 and S2 sensors.
In June, in dominant bare-ground conditions, the S2 and MS2_10 data could not be considered equivalent: although the NIR measurements in areas A, B and C passed Welch's test, H0 was never accepted for the Red band and the NDVI data.
In July, the test areas showed non-homogeneous plant cover because the corn plants had not yet flowered or were just blooming. Consequently, the bare soil influence was again considerable in area B and especially in area C. According to the agronomic expert who supported the entire study, the 5-day shift between the two acquisitions in the July stage did not affect the results because the vegetation change in the field was minimal. Based on these data, there is a significant difference between the two sensors (p-value < 0.001). Without total plant cover, the two sensors, looking at the same space, measured it in different ways. Even in the presence of fairly large bare patches, the S2 data (at 10 m) tend to level off the results, meaning that the most pronounced DN differences between pixels are flattened out. The proximal data (original GSD of 5 cm), by contrast, retain a greater dynamic of values even after aggregation at 10 m.
In August, on the other hand, canopy development was almost complete and fairly homogeneous, the plants having reached maximum phenological maturity and complete soil coverage. For the NDVI index, areas A and C passed the test but area B did not. However, in B, the NDVI average difference of 0.007 is very close to the minimum threshold τ′, i.e., an uncertainty of two levels out of 256, which can be considered negligible. Furthermore, the NIR measurements were similar in the three test areas and, for the Red band, the two statistically different areas showed differences lower than τ′, not satisfying the applicability conditions of the test itself. The results therefore revealed vegetation in full activity, with very strong reflectance in the NIR and near-total absorption in the Red, as expected [47,90].
Knowing the correspondences and differences between the two sensors is important to recognize what Sentinel-2 data can measure with certainty and what with uncertainty. Consequently, it becomes useful to understand when to use MS2_10 data as a calibration tool for S2 data and when to use them as an integration of S2 data. In the first case, as in August for corn, the MS2 data become ground truths suitable for calibrating the information extracted from the satellite data [57]; in the second case, as in June for corn, the MS2 data complete and expand the S2 measurements [89].
Different phenological stages gradually define different canopy/bare soil ratios. In terms of the relationship between irradiance and reflection, for both sensors, the Red and NIR bands behaved very differently over time because they were modulated by the evolution of the phenological phase. The transition from the flowering phase to the ripening phase involves an increase in the leaf surface [65] and a decrease in the bare soil effect, as evidenced by the NDVI and SAVI values. The differences between indices are greater than those between bands, and the differences in the Red are almost always greater than in the NIR, especially in periods with many bare soil patches (such as June in our example). Huang et al. [91] also demonstrated, by applying a paired t-test, that the differences in the Red channel between different satellite sensors are always highly significant (p < 0.0001). Consequently, the Red values entering the index formulas strongly influence their result. While the Red reflectance decreased with increasing growth (Table 4), the NIR had the opposite behavior, since vegetation reflects much of the NIR signal.
The use of SAVI [45], in addition to NDVI, stemmed from the interest in testing a vegetation index able to minimize the brightness of the soil. Considering the observed scenario, and to allow a temporal comparison between the VI products from S2 and MS2_10, the SAVI was calculated specifically for the June and July stages. It did not provide additional information; only a general decrease in values with respect to the NDVI values was noted. The trend of the average SAVI does not change, especially for June (A, B, C) and July (B, C), when bare soil is dominantly or significantly present, even though SAVI was designed to consider and reduce the contribution of the soil beneath the plants. In summary, based on the SAVI, the two sensors see different things, as already noted for the Red and NIR bands. MAIA-S2, due to its higher spatial resolution, is more suitable for detecting the soil contribution, and the differences in this index are greater than the threshold τ′; they are always significant. For these reasons, the authors think that the definition of the L coefficient for proximal data should be revised through experimental analyses and described by a specific function.
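For reference, the two indices compared throughout this section can be written as below. The SAVI form with its soil-adjustment coefficient L follows the standard Huete (1988) definition, with L = 0.5 as the conventional intermediate-cover default, the value whose suitability for proximal data the authors suggest revising; the reflectance values in the example are purely illustrative.

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red)

def savi(nir, red, L=0.5):
    """Soil-Adjusted Vegetation Index (Huete, 1988).

    L = 0.5 is the conventional default for intermediate vegetation cover;
    L -> 0 recovers the NDVI behavior, while larger L further damps the
    soil-brightness contribution.
    """
    return (1.0 + L) * (nir - red) / (nir + red + L)

# Illustrative reflectances for a vegetated pixel: NDVI = 0.8, SAVI = 0.6
print(ndvi(0.45, 0.05), savi(0.45, 0.05))
```

The systematically lower SAVI values relative to NDVI noted in the text follow directly from the extra L term in the denominator whenever reflectances are below 1.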
Concerning linear regression as a tool for analyzing the correspondence between S2 and MS2_10 data (cf. 4.3, Figure 5 and Table 5), its effectiveness was limited. A first comment concerns the significant difference found in some cases between the ranges of the S2 NDVI values and of the MS2_10 NDVI values derived from the resampled data. For example, in June, with a lot of bare soil, most S2 NDVI values in test area B are in the 0.18–0.33 range (a few up to 0.42); these correspond to lower MS2_10 NDVI values, mostly between 0.07 and 0.25 (a few up to 0.33). Again for area B, in July, in the presence of a variable bare/vegetated soil distribution, the S2 NDVI values were compacted into a much higher and narrower range (0.81–0.89), while the corresponding MS2_10 values were spread over a much wider one (0.58–0.85). This different behavior suggests that MS2_10, despite the resampling, is better able to discriminate the different responses linked to mixture patches, while S2 merges the values into a small range [89]. In July, the other test areas showed a similar trend, while in the August data the trend was very different: the values were distributed over NDVI intervals of only two to three hundredths for both measurement types. They did not manifest a linear relationship, although the estimation process obviously still provides a regression line. Even if the statistical tests and the evaluation of the means indicated a substantial correspondence between the S2 and MS2_10 data, the correlation was very low (in the A and C test areas, for example, R² was 0.28 and 0.47, respectively). Consequently, the value trend seemed to depend more on the measurement variability of the two sensors than on a functional relationship.

6. Conclusions

This paper evaluates the integrated use of multispectral data in order to accurately monitor the dynamics of the relationship between bare soil areas and the crop. The protocol developed to compare the two types of data is based on a statistical comparison between Sentinel-2 satellite data and UAV multispectral data, with the same spectral bands.
The activity was carried out in a corn field, considering three stages (June, July and August) and three test areas: A (high vegetation density), B (medium vegetation density), C (low vegetation density). The similarity between the two types of data was accepted:
for June, for NIR data in area A, B and C;
for August, for NDVI values in test areas A and C, Red data in test area A and NIR data in all test areas.
Based on the acquisition stages, these results provided much information for the potential integrated use of UAV and Sentinel-2 sensors. In particular, the analysis was useful in understanding when to use UAV data as a calibration tool for Sentinel-2 data and when to use UAV data as a supplement to Sentinel-2 data:
in August, MS2 data can become ground truths suitable for calibrating information extracted from satellite data;
in June, MS2 data can complement and extend S2 measurements.
Scaling surveys to the local level is very useful when precise or small-area assessments are required. This approach would also allow the definition of parameters that specialize satellite and proximal data in one or more processing steps, such as the spatial adaptation of atmospheric correction algorithms for satellite data, or the local characterization and estimation of the SAVI L parameter to account for the actual average percentage of bare soil. All the above considerations will be useful for those who will have to elaborate classifications or detailed analyses aimed at improving the discrimination of crops and bare soils. Finally, the analysis could also be completed in relation to moisture content with the addition of short-wave infrared and thermal sensors.

Author Contributions

Conceptualization, M.D., N.P. and M.D.G.; methodology, N.P., M.D.G. and M.B.; software, N.P. and M.B.; validation, F.Z.; formal analysis, M.D. and N.P.; investigation, N.P., M.D.G. and M.B.; resources, A.T., M.D. and F.Z.; data curation, N.P. and M.B.; writing—original draft preparation, N.P., M.D.G., M.B., F.Z. and M.D.; writing—review and editing, M.D.G. and F.Z.; supervision, A.T.; funding acquisition, A.T. and F.Z. All authors have read and agreed to the published version of the manuscript.

Funding

The research (F.Z.) was funded by the Italian Institute for Environmental Protection and Research (ISPRA) in the framework of agreement between ISPRA and Italian Space Agency (ASI) on “Air Quality” (Agreement number F82F17000000005).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The authors wish to thank the staff from SSSA for trial management and the staff from Italian Institute for Environmental Protection and Research (ISPRA) for their valuable support in the field activities.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Lipper, L.; Thornton, P.; Campbell, B.M.; Baedeker, T.; Braimoh, A.K.; Bwalya, M.; Caron, P.; Cattaneo, A.; Garrity, D.P.; Henry, K.; et al. Climate-smart agriculture for food security. Nat. Clim. Chang. 2014, 4, 1068–1072.
2. Way, D.A.; Long, S.P. Climate-smart agriculture and forestry: Maintaining plant productivity in a changing world while minimizing production system effects on climate. Plant Cell Environ. 2015, 38, 1683–1685.
3. United Nations. Sustainable Development Goals: Goal 2 Zero Hunger. Available online: https://www.un.org/sustainabledevelopment/hunger/ (accessed on 21 September 2020).
4. FAO. The Future of Food and Agriculture-Trends and Challenges; Food and Agriculture Organization of the United Nations: Rome, Italy, 2017; Volume 296, pp. 1–180.
5. FAOSTAT. Food and Agriculture Organization Corporate Statistical Database (FAOSTAT) 29. 2018. Available online: http://www.fao.org/faostat/en/#home (accessed on 30 December 2020).
6. IPCC. Climate change and land, the Intergovernmental Panel on Climate Change. In Special Report on Climate Change, Desertification, Land Degradation, Sustainable Land Management, Food Security, and Greenhouse Gas Fluxes in Terrestrial Ecosystems; IPCC: Geneva, Switzerland, 2019; Chapter 5; pp. 1–200.
7. Schiavon, E.; Taramelli, A.; Tornato, A.; Pierangeli, F. Monitoring environmental and climate goals for European agriculture: User perspectives on the optimization of the Copernicus evolution offer. J. Environ. Manag. 2021, 296, 113121.
8. Weiss, M.; Jacob, F.; Duveiller, G. Remote sensing for agricultural applications: A meta-review. Remote Sens. Environ. 2020, 236, 111402.
9. Inoue, Y. Satellite- and drone-based remote sensing of crops and soils for smart farming–A review. Soil Sci. Plant Nutr. 2020, 66, 798–810.
10. European Commission. Commission Implementing Regulation (EU) 2018/746 of 18 May 2018 Amending Implementing Regulation (EU) No 809/2014 as Regards Modification of Single Applications and Payment Claims and Checks; C/2018/2976, OJ L 125, 22.5.2018; EC: Brussels, Belgium, 2018; pp. 1–7.
11. Gupta, M.; Abdelsalam, M.; Khorsandroo, S.; Mittal, S. Security and privacy in smart farming: Challenges and opportunities. IEEE Access 2020, 8, 34564–34584.
12. Moysiadis, V.; Sarigiannidis, P.; Vitsas, V.; Khelifi, A. Smart Farming in Europe. Comput. Sci. Rev. 2021, 39, 100345.
13. Wolfert, S.; Ge, L.; Verdouw, C.; Bogaardt, M.-J. Big data in smart farming—A review. Agric. Syst. 2017, 153, 69–80.
14. Lytos, A.; Lagkas, T.; Sarigiannidis, P.; Zervakis, M.; Livanos, G. Towards smart farming: Systems, frameworks and exploitation of multiple sources. Comput. Netw. 2020, 172, 107147.
15. Taramelli, A.; Tornato, A.; Magliozzi, M.L.; Mariani, S.; Valentini, E.; Zavagli, M.; Costantini, M.; Nieke, J.; Adams, J.; Rast, M. An interaction methodology to collect and assess user-driven requirements to define potential opportunities of future hyperspectral imaging sentinel mission. Remote Sens. 2020, 12, 1286.
16. Murugan, D.; Garg, A.; Singh, D. Development of an adaptive approach for precision agriculture monitoring with drone and satellite data. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2017, 10, 5322–5328.
17. Pla, M.; Bota, G.; Duane, A.; Balagué, J.; Curcó, A.; Gutiérrez, R.; Brotons, L. Calibrating Sentinel-2 Imagery with Multispectral UAV Derived Information to Quantify Damages in Mediterranean Rice Crops Caused by Western Swamphen (Porphyrio porphyrio). Drones 2019, 3, 45.
18. Yeom, J.; Jung, J.; Chang, A.; Ashapure, A.; Maeda, M.; Maeda, A.; Landivar, J. Comparison of vegetation indices derived from UAV data for differentiation of tillage effects in agriculture. Remote Sens. 2019, 11, 1548.
19. Bauer, M.E.; Cipra, J.E. Identification of agricultural crops by computer processing of ERTS MSS data. In Proceedings of the Symposium on Significant Results Obtained from the Earth Resources Technology Satellite-1, New Carrollton, MD, USA, 5–9 March 1973; Volume 1, pp. 205–212.
20. Jürgens, C. The modified normalized difference vegetation index (mNDVI) a new index to determine frost damages in agriculture based on Landsat TM data. Int. J. Remote Sens. 1997, 18, 3583–3594.
21. Lyle, G.; Lewis, M.; Ostendorf, B. Testing the temporal ability of landsat imagery and precision agriculture technology to provide high resolution historical estimates of wheat yield at the farm scale. Remote Sens. 2013, 5, 1549–1567.
22. Akanwa, A.O.; Okeke, F.I.; Nnodu, V.C.; Iortyom, E.T. Quarrying and its effect on vegetation cover for a sustainable development using high-resolution satellite image and GIS. Environ. Earth Sci. 2017, 76, 505.
23. Upadhyay, P.; Kumar, A.; Roy, P.S.; Ghosh, S.; Gilbert, I. Effect on specific crop mapping using WorldView-2 multispectral add-on bands: Soft classification approach. J. Appl. Remote Sens. 2012, 6, 063524.
24. Primicerio, J.; Di Gennaro, S.F.; Fiorillo, E.; Genesio, L.; Lugato, E.; Matese, A.; Vaccari, F.P. A flexible unmanned aerial vehicle for precision agriculture. Precis. Agric. 2012, 13, 517–523.
25. Maresma, Á.; Ariza, M.; Martínez, E.; Lloveras, J.; Martínez-Casasnovas, J.A. Analysis of Vegetation Indices to Determine Nitrogen Application and Yield Prediction in Maize (Zea mays L.) from a Standard UAV Service. Remote Sens. 2016, 8, 973.
26. Deng, L.; Mao, Z.; Li, X.; Hu, Z.; Duan, F.; Yan, Y. UAV-based multispectral remote sensing for precision agriculture: A comparison between different cameras. ISPRS J. Photogramm. Remote Sens. 2018, 146, 124–136.
27. Li, J.; Zhang, F.; Qian, X.; Zhu, Y.; Shen, G. Quantification of rice canopy nitrogen balance index with digital imagery from unmanned aerial vehicle. Remote Sens. Lett. 2015, 6, 183–189.
28. Barbedo, J.G.A. Detection of nutrition deficiencies in plants using proximal images and machine learning: A review. Comput. Electron. Agric. 2019, 162, 482–492.
29. Radoglou-Grammatikis, P.; Sarigiannidis, P.; Lagkas, T.; Moscholios, I. A compilation of UAV applications for precision agriculture. Comput. Netw. 2020, 172, 107148.
30. Thapa, S.; Millan, V.G.; Eklundh, L. Assessing Forest Phenology: A Multi-Scale Comparison of Near-Surface (UAV, Spectral Reflectance Sensor, PhenoCam) and Satellite (MODIS, Sentinel-2) Remote Sensing. Remote Sens. 2021, 13, 1597.
31. Benincasa, P.; Antognelli, S.; Brunetti, L.; Fabbri, C.A.; Natale, A.; Sartoretti, V.; Modeo, G.; Guiducci, M.; Tei, F.; Vizzari, M. Reliability of NDVI derived by high resolution satellite and UAV compared to in-field methods for the evaluation of early crop N status and grain yield in wheat. Exp. Agric. 2018, 54, 604–622.
32. Mancini, A.; Frontoni, E.; Zingaretti, P. Satellite and UAV data for Precision Agriculture Applications. In Proceedings of the 2019 International Conference on Unmanned Aircraft Systems (ICUAS), Atlanta, GA, USA, 11–14 June 2019; pp. 491–497.
33. Pastonchi, L.; Di Gennaro, S.F.; Toscano, P.; Matese, A. Comparison between satellite and ground data with UAV-based information to analyse vineyard spatio-temporal variability. OENO One 2020, 54, 919–934.
34. Kavosi, Z.; Raoufat, M.H.; Dehghani, M.; Jafari, A.; Kazemeini, S.A.; Nazemossadat, M.J. Feasibility of satellite and drone images for monitoring soil residue cover. J. Saudi Soc. Agric. Sci. 2020, 19, 56–64.
35. Müllerová, J.; Brůna, J.; Bartaloš, T.; Dvořák, P.; Vitkova, M.; Pyšek, P. Timing is important: Unmanned aircraft vs. satellite imagery in plant invasion monitoring. Front. Plant Sci. 2017, 8, 887.
36. Rudd, J.D.; Roberson, G.T.; Classen, J.J. Application of satellite, unmanned aircraft system, and ground-based sensor data for precision agriculture: A review. In Proceedings of the 2017 ASABE Annual International Meeting, Spokane, WA, USA, 16–19 July 2017; American Society of Agricultural and Biological Engineers: St. Joseph, MI, USA, 2017; p. 1.
37. Nonni, F.; Malacarne, D.; Pappalardo, S.E.; Codato, D.; Meggio, F.; De Marchi, M. Sentinel-2 Data Analysis and Comparison with UAV Multispectral Images for Precision Viticulture. GI_Forum 2018, 1, 105–116.
38. Messina, G.; Peña, J.; Vizzari, M.; Modica, G. A Comparison of UAV and Satellites Multispectral Imagery in Monitoring Onion Crop. An Application in the ‘Cipolla Rossa di Tropea’ (Italy). Remote Sens. 2020, 12, 3424.
39. Sozzi, M.; Kayad, A.; Marinello, F.; Taylor, J.; Tisseyre, B. Comparing vineyard imagery acquired from Sentinel-2 and Unmanned Aerial Vehicle (UAV) platform. Oeno One 2020, 54, 189–197.
40. Riihimäki, H.; Luoto, M.; Heiskanen, J. Estimating fractional cover of tundra vegetation at multiple scales using unmanned aerial systems and optical satellite data. Remote Sens. Environ. 2019, 224, 119–132.
41. Alvarez-Vanhard, E.; Corpetti, T.; Houet, T. UAV & satellite synergies for optical remote sensing applications: A literature review. Sci. Remote Sens. 2021, 3, 100019.
42. Gitelson, A.A. Wide dynamic range vegetation index for remote quantification of biophysical characteristics of vegetation. J. Plant Physiol. 2004, 161, 165–173.
43. Bendig, J.; Yu, K.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Gnyp, M.L.; Bareth, G. Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley. Int. J. Appl. Earth Obs. Geoinf. 2015, 39, 79–87.
44. Gevaert, C.; Suomalainen, J.; Tang, J.; Kooistra, L. Generation of Spectral-Temporal Response Surfaces by Combining Multispectral Satellite and Hyperspectral UAV Imagery for Precision Agriculture Applications. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2015, 8, 3140–3146.
45. Mazzia, V.; Comba, L.; Khaliq, A.; Chiaberge, M.; Gay, P. UAV and Machine learning based refinement of a satellite-driven vegetation index for precision agriculture. Sensors 2020, 20, 2530.
46. Dubbini, M.; Pezzuolo, A.; De Giglio, M.; Gattelli, M.; Curzio, L.; Covi, D.; Yezekyan, T.; Marinello, F. Last generation instrument for agriculture multispectral data collection. Agric. Eng. Int. CIGR J. 2017, 19, 87–93.
47. Chauhan, S.; Darvishzadeh, R.; Lu, Y.; Stroppiana, D.; Boschetti, M.; Pepe, M.; Nelson, A. Wheat lodging assessment using multispectral UAV data. ISPRS-Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, XLII-2/W13, 235–240.
48. Caturegli, L.; Gaetani, M.; Volterrani, M.; Magni, S.; Minelli, A.; Baldi, A.; Brandani, G.; Mancini, M.; Lenzi, A.; Orlandini, S.; et al. Normalized Difference Vegetation Index versus Dark Green Colour Index to estimate nitrogen status on bermudagrass hybrid and tall fescue. Int. J. Remote Sens. 2019, 41, 455–470.
49. Tucker, C.J. Red and photographic infrared linear combinations for monitoring vegetation. Remote Sens. Environ. 1979, 8, 127–150.
50. Huete, A.R. A soil-adjusted vegetation index (SAVI). Remote Sens. Environ. 1988, 25, 295–309.
51. Fern, R.R.; Foxley, E.A.; Bruno, A.; Morrison, M.L. Suitability of NDVI and OSAVI as estimators of green biomass and coverage in a semi-arid rangeland. Ecol. Indic. 2018, 94, 16–21.
52. Insua, J.R.; Utsumi, S.A.; Basso, B. Estimation of spatial and temporal variability of pasture growth and digestibility in grazing rotations coupling unmanned aerial vehicle (UAV) with crop simulation models. PLoS ONE 2019, 14, e0212773.
53. Garcia-Ruiz, F.; Sankaran, S.; Maja, J.M.; Lee, W.S.; Rasmussen, J.; Ehsani, R. Comparison of two aerial imaging platforms for identification of Huanglongbing-infected citrus trees. Comput. Electron. Agric. 2013, 91, 106–115.
54. Zhou, J.; Pavek, M.J.; Shelton, S.C.; Holden, Z.J.; Sankaran, S. Aerial multispectral imaging for crop hail damage assessment in potato. Comput. Electron. Agric. 2016, 127, 406–412.
55. Vani, V.; Mandla, V.R. Comparative study of NDVI and SAVI vegetation indices in Anantapur district semi-arid areas. Int. J. Civ. Eng. Technol. 2017, 8, 559–566.
56. Xu, H.; Zhang, T.-J. Cross comparison of ASTER and Landsat ETM+ multispectral measurements for NDVI and SAVI vegetation indices. Spectrosc. Spectr. Anal. 2011, 31, 1902–1907.
57. Ryu, J.-H.; Na, S.-I.; Cho, J. Inter-Comparison of normalized difference vegetation index measured from different footprint sizes in cropland. Remote Sens. 2020, 12, 2980.
58. Welch, B.L. The generalization of ‘student’s’ problem when several different population variances are involved. Biometrika 1947, 34, 28–35.
59. Zhang, S.; Zhao, G.; Lang, K.; Su, B.; Chen, X.; Xi, X.; Zhang, H. Integrated Satellite, Unmanned Aerial Vehicle (UAV) and Ground Inversion of the SPAD of Winter Wheat in the Reviving Stage. Sensors 2019, 19, 1485.
60. Di Gennaro, S.F.; Dainelli, R.; Palliotti, A.; Toscano, P.; Matese, A. Sentinel-2 validation for spatial variability assessment in overhead trellis system viticulture versus UAV and agronomic data. Remote Sens. 2019, 11, 2573.
61. Gardin, L.; Vinci, A. Carta dei Suoli della Regione Toscana at 1:250,000 Scale. Available online: http://sit.lamma.rete.toscana.it/websuoli/ (accessed on 30 September 2020). (In Italian).
62. Baetens, L.; Desjardins, C.; Hagolle, O. Validation of Copernicus Sentinel-2 cloud masks obtained from MAJA, Sen2Cor, and FMask processors using reference cloud masks generated with a supervised active learning procedure. Remote Sens. 2019, 11, 433.
63. Hagolle, O.; Huc, M.; Pascual, D.V.; Dedieu, G. A multi-temporal method for cloud detection, applied to FORMOSAT-2, VENµS, LANDSAT and SENTINEL-2 images. Remote Sens. Environ. 2008, 114, 1747–1755.
64. Hagolle, O.; Huc, M.; Villa Pascual, D.; Dedieu, G. A Multi-Temporal and Multi-Spectral Method to Estimate Aerosol Optical Thickness over Land, for the Atmospheric Correction of FormoSat-2, LandSat, VENμS and Sentinel-2 Images. Remote Sens. 2015, 7, 2668–2691.
65. De Peppo, M.; Taramelli, A.; Boschetti, M.; Mantino, A.; Volpi, I.; Filipponi, F.; Tornato, A.; Valentini, E.; Ragaglini, G. Non-Parametric statistical approaches for leaf area index estimation from Sentinel-2 Data: A multi-crop assessment. Remote Sens. 2021, 13, 2841.
66. Main-Knorn, M.; Pflug, B.; Louis, J.; Debaecker, V.; Müller-Wilm, U.; Gascon, F. Sen2Cor for Sentinel-2. In Proceedings of the Image and Signal Processing for Remote Sensing, Warsaw, Poland, 4 October 2017; p. 3.
67. Nocerino, E.; Dubbini, M.; Menna, F.; Remondino, F.; Gattelli, M.; Covi, D. Geometric calibration and radiometric correction of the maia multispectral camera. ISPRS-Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, 42, 149–156.
68. Mancini, F.; Dubbini, M.; Gattelli, M.; Stecchi, F.; Fabbri, S.; Gabbianelli, G. Using Unmanned Aerial Vehicles (UAV) for High-Resolution Reconstruction of Topography: The Structure from Motion Approach on Coastal Environments. Remote Sens. 2013, 5, 6880–6898.
69. Thales Alenia Space France Team. Sentinel-2 Products Specification Document. 2021. Available online: https://sentinel.esa.int/documents/247904/685211/Sentinel-2-Products-Specification-Document.pdf/fb1fc4dc-12ca-4674-8f78-b06efa871ab9?t=1616068001033 (accessed on 30 September 2021).
70. Sanz-Ablanedo, E.; Chandler, J.H.; Rodríguez-Pérez, J.R.; Ordóñez, C. Accuracy of unmanned aerial vehicle (UAV) and SfM photogrammetry survey as a function of the number and location of ground control points used. Remote Sens. 2018, 10, 1606.
71. Lewis, J.P. Fast Normalized Cross-Correlation, Volume 10 of Vision Interface; 1995. Available online: https://www.academia.edu/653960/Fast_normalized_cross_correlation (accessed on 30 November 2021).
72. Murray, P.J.; Jorgensen, M.; Gill, E. Effect of temperature on growth and morphology of two varieties of white clover (Trifolium repens L.) and their impact on soil microbial activity. Ann. Appl. Biol. 2000, 137, 305–310.
73. Nakamura, J. Image Sensors and Signal Processing for Digital Still Cameras; CRC Press: Boca Raton, FL, USA, 2006; pp. 87–88. ISBN 978-0-8493-3545-7.
  74. Plant, R.E. Site-specific management: The application of information technology to crop production. Comput. Electron. Agric. 2001, 30, 9–29. [Google Scholar] [CrossRef]
  75. Wang, Q.; Chen, J.; Stamps, R.H.; Li, Y. Correlation of visual quality grading and SPAD reading of green-leaved foliage plants. J. Plant Nutr. 2005, 28, 1215–1225. [Google Scholar] [CrossRef]
  76. Limantara, L.; Dettling, M.; Indrawati, R.; Indriatmoko, I.; Brotosudarmo, T. Analysis on the Chlorophyll Content of Commercial Green Leafy Vegetables. Procedia Chem. 2015, 14, 225–231. [Google Scholar] [CrossRef] [Green Version]
  77. Gitelson, A.A.; Gritz, Y.; Merzlyak, M.N. Relationships between leaf chlorophyll content and spectral reflectance and algorithms for non-destructive chlorophyll assessment in higher plant leaves. J. Plant Physiol. 2003, 160, 271–282. [Google Scholar] [CrossRef] [PubMed]
  78. Blackburn, G.A. Hyperspectral remote sensing of plant pigments. J. Exp. Bot. 2006, 58, 855–867. [Google Scholar] [CrossRef] [Green Version]
  79. Hatfield, J.L.; Gitelson, A.A.; Schepers, J.S.; Walthall, C.L. Application of spectral remote sensing for agronomic decisions. Agron. J. 2008, 100, S-117–S-131. [Google Scholar] [CrossRef] [Green Version]
  80. Schlemmer, M.; Gitelson, A.; Schepers, J.; Ferguson, R.; Peng, Y.; Shanahan, J.; Rundquist, D. Remote estimation of nitrogen and chlorophyll contents in maize at leaf and canopy levels. Int. J. Appl. Earth Obs. Geoinf. 2013, 25, 47–54. [Google Scholar] [CrossRef] [Green Version]
  81. Dash, J.; Curran, P.J. The MERIS terrestrial chlorophyll index. Int. J. Remote Sens. 2004, 25, 5403–5413. [Google Scholar] [CrossRef]
  82. Jiang, Z.; Huete, A.R.; Didan, K.; Miura, T. Development of a two-band enhanced vegetation index without a blue band. Remote Sens. Environ. 2008, 112, 3833–3845. [Google Scholar] [CrossRef]
  83. Revill, A.; Florence, A.; MacArthur, A.; Hoad, S.P.; Rees, R.M.; Williams, M. The Value of Sentinel-2 Spectral Bands for the Assessment of Winter Wheat Growth and Development. Remote Sens. 2019, 11, 2050. [Google Scholar] [CrossRef] [Green Version]
  84. Viña, A.; Gitelson, A.A.; Nguy-Robertson, A.L.; Peng, Y. Comparison of different vegetation indices for the remote assessment of green leaf area index of crops. Remote Sens. Environ. 2011, 115, 3468–3478. [Google Scholar] [CrossRef]
  85. Wu, C.; Niu, Z.; Gao, S. The potential of the satellite derived green chlorophyll index for estimating midday light use efficiency in maize, coniferous forest and grassland. Ecol. Indic. 2012, 14, 66–73. [Google Scholar] [CrossRef]
  86. Thanyapraneedkul, J.; Muramatsu, K.; Daigo, M.; Furumi, S.; Soyama, N.; Nasahara, K.N.; Muraoka, H.; Noda, H.M.; Nagai, S.; Maeda, T.; et al. A Vegetation Index to Estimate Terrestrial Gross Primary Production Capacity for the Global Change Observation Mission-Climate (GCOM-C)/Second-Generation Global Imager (SGLI) Satellite Sensor. Remote Sens. 2012, 4, 3689–3720. [Google Scholar] [CrossRef] [Green Version]
  87. Xue, J.; Su, B. Significant Remote Sensing Vegetation Indices: A Review of Developments and Applications. J. Sens. 2017, 2017, 1353691. [Google Scholar] [CrossRef] [Green Version]
  88. Fagerland, M.W.; Sandvik, L. Performance of five two-sample location tests for skewed distributions with unequal variances. Contemp. Clin. Trials 2009, 30, 490–496. [Google Scholar] [CrossRef] [PubMed]
  89. Bollas, N.; Kokinou, E.; Polychronos, V. Comparison of Sentinel-2 and UAV Multispectral Data for Use in Precision Agriculture: An Application from Northern Greece. Drones 2021, 5, 35. [Google Scholar] [CrossRef]
  90. Volterrani, M.; Minelli, A.; Gaetani, M.; Grossi, N.; Magni, S.; Caturegli, L. Reflectance, absorbance and transmittance spectra of bermudagrass and manilagrass turfgrass canopies. PLoS ONE 2017, 12, e0188080. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  91. Huang, W.; Huang, J.; Wang, X.; Wang, F.; Shi, J. Comparability of Red/Near-Infrared Reflectance and NDVI Based on the Spectral Response Function between MODIS and 30 Other Satellite Sensors Using Rice Canopy Spectra. Sensors 2013, 13, 16023–16050. [Google Scholar] [CrossRef]
Figure 1. Study area: corn crop field (continuous red line) located in San Giuliano Terme (Pisa, Italy). The circled dots are the Ground Control Points used for the MS2 orthoimage processing. Image background: Google Earth Satellite (WGS 84/UTM zone 32N).
Figure 2. Example of the linear correlation coefficient as a function of the lag in x (abscissa) and y (ordinate), for NIR data, June epoch: (a) values shown as a 3D surface; (b) values shown on a 2D chromatic scale. The maximum is reached for lag (8,8), i.e., shift (0,0).
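The lag search illustrated in Figure 2 amounts to a normalized cross-correlation [71] between the resampled UAV raster and the Sentinel-2 raster. The sketch below is a minimal brute-force version of that idea (function and variable names are ours, not the authors' implementation): for every integer lag it correlates the overlapping portions of the two grids and keeps the lag with the highest Pearson coefficient.

```python
import numpy as np

def best_lag(ref, mov, max_lag=8):
    """Return the integer (dy, dx) lag that maximizes the Pearson correlation
    between two co-gridded rasters, searching lags in [-max_lag, max_lag]."""
    best_shift, best_r = (0, 0), -np.inf
    for dy in range(-max_lag, max_lag + 1):
        for dx in range(-max_lag, max_lag + 1):
            # Overlapping region when `mov` is shifted by (dy, dx) onto `ref`
            a = ref[max(0, dy):ref.shape[0] + min(0, dy),
                    max(0, dx):ref.shape[1] + min(0, dx)]
            b = mov[max(0, -dy):mov.shape[0] + min(0, -dy),
                    max(0, -dx):mov.shape[1] + min(0, -dx)]
            r = np.corrcoef(a.ravel(), b.ravel())[0, 1]
            if r > best_r:
                best_shift, best_r = (dy, dx), r
    return best_shift, best_r

# Toy check: two crops of the same synthetic scene with a known offset
rng = np.random.default_rng(0)
scene = rng.random((40, 40))
ref = scene[5:35, 5:35]
mov = scene[3:33, 4:34]       # mov[i, j] == ref[i - 2, j - 1]
shift, r = best_lag(ref, mov)
print(shift, r)               # the true lag (-2, -1) is recovered with r ~ 1
```

On real data the resulting correlation surface looks like the one in Figure 2, with a single sharp maximum at the best co-registration shift.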
Figure 3. Results of the moving-window analysis: yellow pixels satisfy both the vegetation-intensity and dispersion conditions at the indicated levels.
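The condition plotted in Figure 3 (keep only pixels whose neighbourhood shows high vegetation intensity and low dispersion) can be sketched with a sliding window over the NDVI raster. The window size and both thresholds below are placeholder values for illustration; the paper's actual levels are those indicated in the figure.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def window_mask(ndvi, size=3, min_mean=0.6, max_std=0.05):
    """Flag pixels whose local NDVI is high (vegetation intensity) and
    locally homogeneous (low dispersion) within a size x size window."""
    win = sliding_window_view(ndvi, (size, size))   # shape (H-size+1, W-size+1, size, size)
    local_mean = win.mean(axis=(-2, -1))
    local_std = win.std(axis=(-2, -1))
    return (local_mean > min_mean) & (local_std < max_std)

# Toy raster: a homogeneous vegetated patch inside a bare background
ndvi = np.full((20, 20), 0.2)
ndvi[5:15, 5:15] = 0.8
mask = window_mask(ndvi)       # mask[i, j] refers to the window centred at (i+1, j+1)
print(mask[9, 9], mask[0, 0])  # interior of the patch passes, background fails
```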
Figure 4. Test areas A, B, C.
Figure 5. The graphs report the NDVI value points (MS2_10, S2), the estimated regression line MS2_10 = p1·S2 + p0 (black line) and the lines representing the range of variability (red lines) expected for a new observation at the 0.99 confidence level.
Table 1. Characteristics of the two sensors used, in terms of band name, central wavelength and spatial resolution. The corresponding central wavelengths between the two instruments are highlighted in bold.
| Sentinel-2 Band Name | Central Wavelength (nm) | Spatial Resolution (m) | MAIA S-2 Band Name | Central Wavelength (nm) | GSD (m) |
|---|---|---|---|---|---|
| B1—Coastal aerosol | 443 | 60 | S1—Violet | 443 | 0.047 |
| B2—Blue | 490 | 10 | S2—Blue | 490 | 0.047 |
| B3—Green | 560 | 10 | S3—Green | 560 | 0.047 |
| B4—Red | 665 | 10 | S4—Red | 665 | 0.047 |
| B5—Vegetation Red Edge | 705 | 20 | S5—Red Edge 1 | 705 | 0.047 |
| B6—Vegetation Red Edge | 740 | 20 | S6—Red Edge 2 | 740 | 0.047 |
| B7—Vegetation Red Edge | 783 | 20 | S7—NIR 1 | 783 | 0.047 |
| B8—NIR | 842 | 10 | S8—NIR 2 | 842 | 0.047 |
| B8A—Narrow NIR | 865 | 20 | S9—NIR 3 | 865 | 0.047 |
| B9—Water vapour | 945 | 60 | / | / | / |
| B10—SWIR–Cirrus | 1375 | 60 | / | / | / |
| B11—SWIR | 1610 | 20 | / | / | / |
| B12—SWIR | 2190 | 20 | / | / | / |
Table 2. Sentinel-2 and MAIA S-2 acquisition time.
| Sensor | Acquisition Date | Time (UTC+1) | Sun Azimuth (°) | Sun Elevation (°) |
|---|---|---|---|---|
| Sentinel-2 | 3 June 2019 | 10:18:45 | 146.13 | 24.62 |
| Sentinel-2 | 16 July 2019 | 10:28:42 | 148.1 | 25.26 |
| Sentinel-2 | 5 August 2019 | 10:28:41 | 151.76 | 29.32 |
| MAIA S-2 | 3 June 2019 | 12:00:00 | 193.48 | 55.96 |
| MAIA S-2 | 11 July 2019 | 12:00:00 | 190.43 | 56.1 |
| MAIA S-2 | 5 August 2019 | 12:00:00 | 189.29 | 51.07 |
Table 3. For each test area and each stage: NDVI and SAVI mean values and corresponding r.m.s.e. for both sensors, the S2 − MS2_10 difference of the vegetation index mean values, and Welch's test output. Values in bold mark the cases in which the equality hypothesis was accepted; values in italics mark the cases in which the differences were considered not significant.
| Epoch | Test Area | Statistic | NDVI Sentinel-2 | NDVI MAIA S-2 | SAVI Sentinel-2 | SAVI MAIA S-2 |
|---|---|---|---|---|---|---|
| June | A | mean | 0.247 | 0.155 | 0.160 | 0.103 |
| | | r.m.s.e. | 0.043 | 0.047 | 0.030 | 0.032 |
| | | difference | 0.093 | | 0.057 | |
| | | W test | 11.6 | | 10.4 | |
| | B | mean | 0.269 | 0.177 | 0.160 | 0.110 |
| | | r.m.s.e. | 0.047 | 0.053 | 0.028 | 0.033 |
| | | difference | 0.092 | | 0.051 | |
| | | W test | 10.4 | | 9.3 | |
| | C | mean | 0.205 | 0.117 | 0.124 | 0.073 |
| | | r.m.s.e. | 0.038 | 0.041 | 0.021 | 0.024 |
| | | difference | 0.090 | | 0.103 | |
| | | W test | 12.7 | | 14.1 | |
| July | A | mean | 0.871 | 0.776 | 0.685 | 0.602 |
| | | r.m.s.e. | 0.013 | 0.048 | 0.020 | 0.038 |
| | | difference | 0.095 | | 0.083 | |
| | | W test | 15.3 | | 15.4 | |
| | B | mean | 0.856 | 0.728 | 0.575 | 0.488 |
| | | r.m.s.e. | 0.020 | 0.066 | 0.033 | 0.051 |
| | | difference | 0.128 | | 0.087 | |
| | | W test | 14.8 | | 11.4 | |
| | C | mean | 0.819 | 0.617 | 0.477 | 0.371 |
| | | r.m.s.e. | 0.035 | 0.081 | 0.045 | 0.035 |
| | | difference | 0.202 | | 0.106 | |
| | | W test | 18.2 | | 14.8 | |
| August | A | mean | 0.947 | 0.946 | Not applicable | |
| | | r.m.s.e. | 0.005 | 0.010 | | |
| | | difference | 0.001 | | Not applicable | |
| | | W test | 0.9 | | | |
| | B | mean | 0.956 | 0.948 | Not applicable | |
| | | r.m.s.e. | 0.007 | 0.009 | | |
| | | difference | 0.007 | | Not applicable | |
| | | W test | 4.9 | | | |
| | C | mean | 0.955 | 0.950 | Not applicable | |
| | | r.m.s.e. | 0.013 | 0.014 | | |
| | | difference | 0.006 | | Not applicable | |
| | | W test | 2.4 | | | |
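The W values in Tables 3 and 4 are Welch's t statistics [58], which can be reproduced from the tabulated means and r.m.s.e. values once a sample size is assumed. The n = 64 pixels per test area used below is our assumption (it is consistent with the June NDVI rows), not a number stated in the text.

```python
import math

def welch_t(m1, s1, n1, m2, s2, n2):
    """Welch's t statistic for two samples with unequal variances [58]."""
    return (m1 - m2) / math.sqrt(s1**2 / n1 + s2**2 / n2)

# Table 3, June, area A, NDVI: Sentinel-2 vs MAIA S-2 (assumed n = 64 each)
t = welch_t(0.247, 0.043, 64, 0.155, 0.047, 64)
print(round(t, 1))   # matches the tabulated W value of 11.6
```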
Table 4. For each test area and each stage: NIR and Red average values and corresponding r.m.s.e. for both sensors, skewness values, the S2 − MS2_10 difference of the band mean values, and Welch's test output. Values in bold mark the cases in which the equality hypothesis was accepted; values in italics mark the cases in which the differences were considered not significant.
| Epoch | Test Area | Statistic | NIR Sentinel-2 | NIR MAIA S-2 | Red Sentinel-2 | Red MAIA S-2 |
|---|---|---|---|---|---|---|
| June | A | mean | 0.274 | 0.268 | 0.160 | 0.192 |
| | | r.m.s.e. | 0.024 | 0.026 | 0.028 | 0.036 |
| | | skewness | 0.74 | 0.49 | −0.03 | −0.06 |
| | | difference | 0.005 | | −0.032 | |
| | | W test | 1.1 | | 5.5 | |
| | B | mean | 0.246 | 0.244 | 0.142 | 0.171 |
| | | r.m.s.e. | 0.018 | 0.022 | 0.013 | 0.019 |
| | | skewness | 0.98 | 0.22 | 0.02 | 0.41 |
| | | difference | 0.002 | | −0.029 | |
| | | W test | 0.5 | | −10.0 | |
| | C | mean | 0.242 | 0.241 | 0.165 | 0.198 |
| | | r.m.s.e. | 0.032 | 0.035 | 0.029 | 0.038 |
| | | skewness | 0.80 | 0.63 | 0.65 | 0.59 |
| | | difference | 0.001 | | −0.033 | |
| | | W test | 0.1 | | −5.5 | |
| July | A | mean | 0.396 | 0.363 | 0.027 | 0.046 |
| | | r.m.s.e. | 0.019 | 0.017 | 0.002 | 0.011 |
| | | skewness | −0.19 | 0.25 | 0.08 | 2.35 |
| | | difference | 0.033 | | −0.019 | |
| | | W test | 10.2 | | −13.6 | |
| | B | mean | 0.377 | 0.349 | 0.029 | 0.055 |
| | | r.m.s.e. | 0.028 | 0.024 | 0.003 | 0.012 |
| | | skewness | −0.05 | −0.16 | −0.16 | 0.34 |
| | | difference | 0.028 | | −0.026 | |
| | | W test | 6.0 | | −16.5 | |
| | C | mean | 0.340 | 0.318 | 0.033 | 0.076 |
| | | r.m.s.e. | 0.024 | 0.016 | 0.005 | 0.020 |
| | | skewness | −0.08 | 0.20 | 1.47 | 1.29 |
| | | difference | 0.020 | | −0.042 | |
| | | W test | 6.1 | | −16.6 | |
| August | A | mean | 0.436 | 0.436 | 0.012 | 0.012 |
| | | r.m.s.e. | 0.015 | 0.022 | 0.002 | 0.001 |
| | | skewness | 0.41 | 0.09 | 0.61 | 1.37 |
| | | difference | 0.000 | | 0.000 | |
| | | W test | 0.1 | | 0.8 | |
| | B | mean | 0.401 | 0.405 | 0.011 | 0.009 |
| | | r.m.s.e. | 0.019 | 0.026 | 0.002 | 0.001 |
| | | skewness | −0.11 | −0.02 | −0.06 | 1.18 |
| | | difference | −0.004 | | 0.002 | |
| | | W test | −0.9 | | −6.2 | |
| | C | mean | 0.419 | 0.424 | 0.011 | 0.010 |
| | | r.m.s.e. | 0.014 | 0.025 | 0.003 | 0.002 |
| | | skewness | −1.01 | −0.50 | 1.51 | 1.78 |
| | | difference | −0.005 | | −0.001 | |
| | | W test | −1.5 | | 3.4 | |
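The indices of Table 3 are computed per pixel from the Red and NIR reflectances summarized in Table 4, via the standard formulas NDVI = (NIR − Red)/(NIR + Red) and SAVI = (1 + L)(NIR − Red)/(NIR + Red + L). The sketch below assumes the customary soil factor L = 0.5, which is not stated explicitly in this excerpt. Note that the index of the mean reflectances need not equal the tabulated mean index, which is averaged pixel by pixel.

```python
import numpy as np

def ndvi(nir, red):
    # Normalized Difference Vegetation Index
    return (nir - red) / (nir + red)

def savi(nir, red, L=0.5):
    # Soil Adjusted Vegetation Index; L = 0.5 is the customary soil factor
    return (1 + L) * (nir - red) / (nir + red + L)

# Two illustrative pixels (dense canopy vs sparse cover)
nir = np.array([0.40, 0.27])
red = np.array([0.10, 0.16])
print(ndvi(nir, red))   # first pixel: 0.30 / 0.50 = 0.6
print(savi(nir, red))   # first pixel: 1.5 * 0.30 / 1.00 = 0.45
```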
Table 5. Estimates of the parameters (p1, p0), their uncertainties (σp1, σp0), the standard error of the regression (σ0), and the determination and linear correlation coefficients (R², ρ) for the five graphs of Figure 5.
| Epoch | Test Area | p1 | σp1 | p0 | σp0 | σ0 | R² | ρ |
|---|---|---|---|---|---|---|---|---|
| June | B | 0.83 | 0.04 | 0.12 | 0.01 | 0.018 | 0.87 | 0.93 |
| July | B | 0.27 | 0.02 | 0.66 | 0.01 | 0.009 | 0.79 | 0.89 |
| July | C | 0.40 | 0.02 | 0.57 | 0.01 | 0.015 | 0.84 | 0.92 |
| August | A | 0.26 | 0.05 | 0.70 | 0.05 | 0.004 | 0.28 | 0.53 |
| August | C | 0.60 | 0.08 | 0.39 | 0.08 | 0.009 | 0.47 | 0.68 |
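The regression line and the 0.99 prediction band of Figure 5 follow the standard simple-linear-regression formulas, with Table 5 reporting p1, p0, σ0, R² and ρ for each fit. The snippet below is an illustrative re-implementation on synthetic data shaped like the June/B fit (slope 0.83, intercept 0.12, σ0 ≈ 0.018); it is not the authors' code.

```python
import numpy as np
from scipy import stats

def fit_with_pi(x, y, x0, conf=0.99):
    """Least-squares line y = p1*x + p0, plus the prediction interval for a
    new observation at x0 (the red lines of Figure 5)."""
    n = len(x)
    p1, p0 = np.polyfit(x, y, 1)
    resid = y - (p1 * x + p0)
    s0 = np.sqrt(np.sum(resid**2) / (n - 2))          # standard error of regression
    sxx = np.sum((x - x.mean())**2)
    half = stats.t.ppf(1 - (1 - conf) / 2, n - 2) * s0 * np.sqrt(
        1 + 1 / n + (x0 - x.mean())**2 / sxx)
    yhat = p1 * x0 + p0
    return p1, p0, s0, (yhat - half, yhat + half)

# Synthetic (S2, MS2_10) NDVI pairs mimicking the June/B regression of Table 5
rng = np.random.default_rng(1)
x = rng.uniform(0.5, 0.9, 60)                         # Sentinel-2 NDVI
y = 0.83 * x + 0.12 + rng.normal(0, 0.018, 60)        # MS2_10 NDVI
p1, p0, s0, band = fit_with_pi(x, y, x0=0.7)
print(round(p1, 2), round(p0, 2), band)
```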
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Dubbini, M.; Palumbo, N.; De Giglio, M.; Zucca, F.; Barbarella, M.; Tornato, A. Sentinel-2 Data and Unmanned Aerial System Products to Support Crop and Bare Soil Monitoring: Methodology Based on a Statistical Comparison between Remote Sensing Data with Identical Spectral Bands. Remote Sens. 2022, 14, 1028. https://doi.org/10.3390/rs14041028