Article

Validating Digital Earth Australia NBART for the Landsat 9 Underfly of Landsat 8

1 Geoscience Australia, Cnr Jerrabomberra Ave. and Hindmarsh Drive, Symonston, ACT 2609, Australia
2 Remote Sensing and Satellite Research Group, School of Earth & Planetary Sciences, Curtin University, GPO Box U1987, Perth, WA 6845, Australia
3 School of Earth Sciences, University of Western Australia, Perth, WA 6009, Australia
4 AquaWatch Australia, CSIRO Space and Astronomy, Canberra, ACT 2601, Australia
5 CSIRO Environment, GPO Box 1700, Canberra, ACT 2601, Australia
6 Queensland Department of Environment and Science, GPO Box 2454, Brisbane, QLD 4001, Australia
7 School of the Environment, University of Queensland, Brisbane, QLD 4072, Australia
* Author to whom correspondence should be addressed.
Remote Sens. 2024, 16(7), 1233; https://doi.org/10.3390/rs16071233
Submission received: 9 February 2024 / Revised: 13 March 2024 / Accepted: 18 March 2024 / Published: 31 March 2024

Abstract:
In recent years, Geoscience Australia has undertaken a successful continental-scale validation program, targeting Landsat and Sentinel analysis-ready data surface reflectance products. The field validation model used for this program was successfully built upon earlier studies, and the measurement uncertainties associated with these protocols have been quantified and published. As a consequence, the Australian earth observation community was well-prepared to respond to the United States Geological Survey (USGS) call for collaborators with the 2021 Landsat 8 (L8) and Landsat 9 (L9) underfly. Despite a number of challenges, seven validation datasets were captured across five sites. As there was only a single 100% overlap transit across Australia, and the country was amidst a strong La Niña climate cycle, it was decided to deploy teams to the two available overpasses with only 15% side lap. The validation sites encompassed rangelands, chenopod shrublands, and a large inland lake. Apart from instrument problems at one site, good weather enabled the capture of high-quality field data allowing for meaningful comparisons between the radiometric performance of L8 and L9, as well as the USGS and Australian Landsat analysis-ready data processing models. Duplicate (cross-calibration) spectral sampling at different sites provides evidence of the field protocol reliability, while the off-nadir view of L9 over the water site has been used to better compare the performance of different water and atmospheric correction processing models.

1. Introduction

How appropriate a particular dataset is and which validation framework is best are essential questions in earth observation (EO) science [1]. The Committee on Earth Observation Satellites (CEOS) Quality Assurance Framework for Earth Observation (QA4EO) states that ‘Data and derived products shall have associated with them an indicator of quality to enable users to assess their suitability for particular applications, i.e., their “fitness for purpose” [2]’. The framework also suggests that ‘comparisons are an essential tool within any quality assurance (QA) framework as they provide a source of unequivocal information of differences and biases associated with similar activities’ [3]. While CEOS specifically references standards and instrumentation comparison, it is, in essence, the primary reason for the Landsat program’s history of ’underflies’, where a new satellite ’underflies’ the old, capturing data over the same sites. With this in mind, there are three complementary comparative elements in this paper:
  • We present the results of the Australian field validation campaign in support of the Landsat 8 and 9 underfly and demonstrate the level of agreement between the two sensors.
  • We compare and contrast the results presented by two differing analysis-ready data (ARD) processing models: Geoscience Australia (GA) and the United States Geological Survey (USGS).
  • We use underfly validation to prove the efficacy and comparative reliability of a satellite surface reflectance (SR) validation measurement model refined and proven by Digital Earth Australia (DEA).

1.1. A History of Underfly Validations

The launch of Landsat 9 on 27 September 2021 marks the latest milestone in a series that began with the launch of the first Landsat on 23 July 1972. The Landsat program has provided more than 50 years of continuous EO data that are now freely available to the global community as science-grade ARD [4]. The impact of the Landsat program on EO and our collective understanding of ecosystem functions and planetary changes over the last 50 years cannot be overstated. In recent years, significant progress in radiometric calibration and geolocation, coupled with nadir-adjusted bidirectional reflectance distribution function (NBAR) corrections, has enabled collection-based processing [4,5,6]. Advancements in computing power and storage have propelled EO science into an era where it is now possible to conduct meaningful analyses of temporal change and landscape function. Indeed, these developments have ushered in a golden age in EO [5]. As each successive Landsat mission was launched and commissioned, it under-flew its predecessor, during which validation data were collected and used to adjust the calibration coefficients of the new platform [7]. Since Landsat 4 (L4), each newly launched satellite has initially operated in an orbit offset from that defined by the Worldwide Reference System (WRS-2, https://landsat.gsfc.nasa.gov/about/the-worldwide-reference-system/, accessed on 26 February 2022); consequently, multiple scenes became available for cross-calibration [7,8] before the new satellite was moved into its final WRS-2 orbit. Since the underfly of L4 by Landsat 5 (L5), there have been many cross-calibration projects, but these have encountered various issues. For example, computing power limitations meant that only a single scene was used to evaluate L5 against L4, even though they carried nearly identical sensors and multiple scenes existed for cross-comparison. The later Landsat 7 ETM+ (L7) featured improved and different bandpass filters compared to L5, making direct comparison difficult.
Sensor design issues were also at play when the Landsat 8 (L8) Operational Land Imager (OLI) was launched with bandpass filters different from those of the L7 ETM+ [9,10]; later, EO-1 Hyperion hyperspectral data were used to create band-equivalent calibration factors [8]. L8 OLI and Landsat 9 (L9) OLI-2 are almost identical in terms of band characteristics and instrument design. With computing resources no longer a limiting factor, for the first time in the Landsat program, a rigorous cross-calibration between two Landsat satellites could be performed [8]. Against this background, this paper describes the results of a vicarious field validation campaign designed to compare the performances of both L8 and L9, as represented by their ARD SR data products.

1.2. Surface Reflectance ARD

Before the advent of ARD, EO researchers often devoted a large portion of a project to creating a bespoke, geometrically and atmospherically corrected ’postage stamp’-sized image for analysis. These projects often focused on image statistics, object analysis, and scene classification. There was little ability to study change detection, from which meaningful insights into dynamic natural systems are gleaned. Thankfully, because of ARD, all that has changed. GA has published extensively on the science behind the creation of ARD, and in collaboration with researchers at the Commonwealth Scientific and Industrial Research Organisation (CSIRO), created the first open data cube (ODC) [6,11,12,13,14]. The advantage of the ODC, where data are indexed as spatial stacks over time, is that consistent temporal analysis is now possible using the entire DEA ARD ODC archive, which now includes a substantial portion of all Landsat and Sentinel-2 data over Australia. A precondition for effective time series analysis is that each dataset occupies identical measurement domains—both radiometrically and geometrically. This ensures that perceived changes are not differences induced by the atmosphere or viewing geometry. To this end, all satellite data from DEA (i.e., each pixel) are atmospherically corrected to produce a bottom-of-atmosphere SR and then modified to a single nadir view and a solar zenith angle of 45°, called normalised BRDF-adjusted reflectance (NBAR) [11]. In areas with sloping terrain, the data are also corrected for terrain illumination. The combined product is called NBART [11,12,13]. The NBART is the foundational product, and a growing list of higher order mapping products flows directly from it, so ongoing validation of the NBART, including the performance of new sensors (e.g., L9) as they contribute data, is prudent.

1.3. Vicarious Field Validation

Advancements in space and airborne spectrometry have occurred in tandem with an understanding of the complexities and improvements in the measurement system and protocols used in field validation [15,16,17,18,19,20,21,22,23]. Within the Australian EO community, data quality assurance and integrity monitoring are part of the five key focus segments defined in the Australia EO Roadmap 2021–2030 [24]. Another focus is recommitting to the long-standing history of international EO partnerships. The underfly validation supports both of these stated goals.
Decades of field spectrometry have shown how difficult it is to capture meaningful and repeatable field spectral measurements. Indeed, experience has revealed an earlier hubris, and we now appreciate that the very idea of capturing ‘truth’ in the field is elusive. Field spectrometry, while seeming disarmingly simple, is far from it. Today, practitioners need to counter known observer effects and adhere carefully to procedures relating to instrument maintenance, calibration, and measurement protocols, while limiting field sampling to a few hours on either side of solar noon on the best cloud-free days. The hemispheric conical reflectance factor (HCRF) is the most common (and most reliably captured) form of surface-leaving radiance measured in the field [21]. However, because it is by definition a limited measurement, it denies a more holistic understanding of the light field and the target condition. As a measure of ‘truth’, a single HCRF is just a tiny fraction of the total illumination-reflectance geometry that comprises a surface BRDF (see Section 2.4).
Indeed, Ref. [17] suggests that ‘there is no truth as such when it comes to data obtained by measurements, be it on ground or from remote sensing platforms’; they are but well-constrained approximations. However, by paying careful attention to instrument calibration and management (both spectrometers and reflectance panels) and following exacting field protocols, capturing meaningful field spectra is achievable. That being said, this paper describes a relatively straightforward form of field-based spectral validation. This validation is not based on detailed end members or pure spectral types but involves the integration of spectral variation along transects across a single hectare (100 m × 100 m). The hectare site is chosen as a spectrally homogeneous patch, embedded within a much larger area of similar spectral homogeneity. This patch is heavily over-sampled (both spectrally and spatially). These data are then resampled (averaged) spectrally and spatially to simulate the larger spatial pixels and broad multispectral bands of the L8 and L9 OLI sensors. It is worth noting that the terms ‘calibration’ and ‘validation’ have occasionally been used interchangeably, although they technically refer to different purposes and functions [16]. The CEOS Working Group on Calibration and Validation (WGCV, https://www.isprs.org/proceedings/2003/isprs_ceos_workshop/Documents/Privette/Privette.pdf, accessed on 15 July 2022) states that ‘calibration is the process of quantitatively defining a system’s response to known and controlled signal inputs. Validation, on the other hand, is the process of assessing, by independent means, the quality of the data products derived from those system outputs’ [25].

2. Materials and Methods

Before satellite EO data and field-derived validation measurements can be compared, they need to be brought into a common measurement domain. Differences in the sensor field of view, path radiance look angle, solar illumination, and spectral bandwidths must all be accounted for. Terrestrial and aquatic surfaces also require different process models. The following section briefly details the GA Digital Earth Australia ARD processing. Because sunlight-surface interactions are fundamentally different over land and water surfaces, DEA produces two ARD products: one terrestrial (NBART) and one aquatic (A-ARD).

2.1. Digital Earth Australia Terrestrial Analysis Ready Data

L8 and L9 Level 1C (L1C) Top of Atmosphere (TOA) data provided by USGS need to be further processed before they can be ingested into ARD ODC. The pre-processing accounts for:
  • Atmospheric corrections;
  • BRDF;
  • Topographic effects.
This suite of pre-processing comprises NBART. The substantial difference from the USGS ARD product is the absence of BRDF and terrain corrections. Because the field of view (FOV) of Landsat sensors is relatively small (−7.5° to 7.5°), the effects of BRDF are considered minor over the Landsat FOV. However, even for this small FOV, view zenith angles across the Landsat swath overlaps can vary by up to 15° (θ_v in Figure 1). When seasonal solar position changes are included, BRDF effects become prevalent [11], so this correction is considered important for the DEA terrestrial ARD. In areas of medium to high terrain relief, the radiance received by the satellite is affected by the surface geometry: slopes facing toward the Sun receive more solar irradiance and appear brighter in satellite images than those facing away from the Sun [12].
In some areas, direct solar irradiance is blocked by hills and mountains. Other parts of the hills and mountains face the satellite sensor such that the observation geometry changes the observed radiance significantly with respect to the same observation over a horizontal surface, as shown in Ref. [12]. Figure 1a shows the simple case of reflection from a flat surface, Figure 1b shows how light is reflected from an inclined surface, and Figure 1c depicts self-shadowing, or cast shadows from the terrain. These shadowed areas are masked in the DEA product, as not enough signal is received by the sensor to produce a sensible reflectance value. Where facets face away from the Sun but are still irradiated, the radiative transfer equation is modified to account for the change in geometry, as seen in Figure 1a,b. Details on the DEA-ARD NBART methodology can be found in Ref. [12].
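The terrain illumination effect described above can be sketched with the standard cosine-of-incidence model. This is an illustrative simplification, not the full radiative transfer correction used in DEA-ARD NBART [12]; the function and variable names are our own.

```python
import math

def cos_incidence(slope_deg, aspect_deg, sun_zenith_deg, sun_azimuth_deg):
    """Cosine of the local solar incidence angle on a tilted facet.

    Standard formulation:
        cos(i) = cos(slope)cos(theta_s) + sin(slope)sin(theta_s)cos(phi_s - aspect)
    A non-positive result indicates a self-shadowed facet (masked in DEA NBART).
    """
    s = math.radians(slope_deg)
    a = math.radians(aspect_deg)
    tz = math.radians(sun_zenith_deg)
    ta = math.radians(sun_azimuth_deg)
    return math.cos(s) * math.cos(tz) + math.sin(s) * math.sin(tz) * math.cos(ta - a)

# With the sun at 45 deg zenith in the north (azimuth 180 deg in this convention):
flat = cos_incidence(0, 0, 45, 180)        # horizontal surface: cos(45) ~ 0.707
facing = cos_incidence(20, 180, 45, 180)   # 20 deg slope facing the sun: cos(25) ~ 0.906
away = cos_incidence(20, 0, 45, 180)       # 20 deg slope facing away: cos(65) ~ 0.423
```

The sun-facing slope receives roughly 28% more direct irradiance than flat ground in this example, which is the brightness asymmetry the terrain correction removes.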

2.2. DEA Aquatic A-ARD

While terrestrial remote sensing has its challenges, aquatic remote sensing can be described as something of a dark art. In optically deep water (where no measurable signal from the benthic layer is detected) [26], the number of photons that make up the water-leaving radiance (L_W, the signal of interest to EO) is usually less than 1% of the photon count that enters the water column. The signal imaged by the satellite sensor is dominated by noise in the form of Sun glint and sky glint, which confounds meaningful feature extraction and interpretation [26,27]. A conceptual model of how the various pathways and sources of light are viewed by a remote sensing platform is shown in Figure 2 [26].
Despite these challenges, aquatic remote sensing continues to achieve significant advancements in delivering increasingly accurate and reliable mapping products, whether in benthic mapping or in retrievals that inform the levels of biotic or abiotic materials in solution or suspension. Band selection, data quantisation, and signal-to-noise ratio levels are critical to the success of aquatic remote sensing. Ref. [28] states that ‘Landsat-8 outperforms the heritage Landsat optical sensors both radiometrically and spectrally. The OLI is a 12-bit push broom sensor that allows for a significantly higher signal-to-noise ratio (SNR) than that of the 8-bit whisk broom Thematic Mapper (TM) and Enhanced Thematic Mapper (ETM+)’. The only significant difference between the two OLI instruments is the level of quantisation, as, unlike L8, L9 transmits the full 14-bit data range. Herein lies the potential for L9 to offer improved radiometric detail to the aquatic remote sensing community. For the reasons described above, A-ARD treats water surfaces differently. BRDF and terrain shade corrections do not usually apply, except where nearby mountains may cast shadows. Instead, the adjacency effects, as well as Sun and sky glint corrections, are applied simultaneously with the Lambertian atmospheric correction [14,29,30,31]. These can be applied to the whole image. If the target reflectance is very different from the surrounding area (as often happens for water surfaces near land and for inland waters), the remote sensing reflectance can then be calculated [14].

2.3. USGS ARD and USGS Aquatic ARD

The USGS provides level 2 surface reflectance science products to the public. For Landsat 8 and 9, LaSRC (the land surface reflectance code) is used for the atmospheric correction and no BRDF or terrain illumination correction is applied. If BRDF and terrain correction are not applied to the GA ARD, the USGS and GA products are consistent—only differing in some ancillary data. For details about the method and an initial validation of the USGS product, see Ref. [32].
USGS also provides a Landsat Collection 2 Provisional Aquatic Reflectance product to the public. The product uses the sea-viewing wide field-of-view sensor (SeaWiFS) data analysis system (SeaDAS)-based approach [28,33]. The processing is different from the GA water processing as it applies specifically to water, including ocean areas. The GA aquatic ARD processing is consistent with the land processing model; it does not apply BRDF and terrain correction, but it takes account of glint (sky and sun glint) and adjacency.

2.4. DEA ARD Validation Protocol (ARD_VP)

The field validation protocols used during the Australian component of the L8/L9 underfly validations were established, refined, and proven during the Phase One Surface Reflectance Validation Project by Geoscience Australia (GA), which ran from 2018 to 2020. This continental-scale validation campaign set out to determine the accuracy and sources of uncertainty in DEA ARD Landsat and Sentinel SR products. This model of vicarious validation is essentially a site characterisation [19] and was first used in Australia (with some success) in a Lake Frome field campaign in support of the Hyperion calibration and validation [15,34]. The model was later refined, and an overall description is presented in Ref. [19]. A comprehensive summary of the Phase One Project and its results is presented in Ref. [35], while a sensitivity analysis of the sources of uncertainty using these protocols is described in Ref. [36]. The term ARD_VP is used to generically describe this approach. Underfly validation field spectral data conform to the definition of HCRF, as described by Ref. [21], which reports that field spectral measurements include components of both direct and diffuse atmospheric irradiance, and that to be meaningful, sampling needs to be designed and implemented such that consistently repeatable geometries are used. Moreover, spectral data should not be captured under highly unstable and scattering atmospheres. They also stress the need for recording good ancillary data that describe the prevailing conditions. In accordance with the ARD_VP, all the spectral data in this campaign were captured as surface-leaving radiances and converted to ARD-equivalent SR in post-processing. The full-range field spectrometers used in this validation are single-fibre instruments; therefore, readings of total downwelling irradiance are discrete measurements of a calibrated (and levelled) reflectance panel. Figure 3a, after Ref. [26], depicts the underlying geometric relationship between the illumination source, target surface, and sensor viewing angle. Figure 3b shows the capture of total downwelling radiance by way of a levelled, calibrated Labsphere Spectralon panel. Note the use of the foreoptic rest, as shown in Figure 3b; the rest greatly enhances the accuracy and repeatability of panel measurements. Refs. [17,36] show that a 10° angular error in the geometry of panel readings can translate into a ±0.8% change in the recorded radiance.

2.5. Field Spectral Data: Radiance Not Reflectance

Historically, field spectrometry has captured data in reflectance units. Reflectance is a dimensionless ratio of the upwelling surface radiance divided by the total downwelling irradiance. However, working in reflectance units precludes many simple but highly effective QA/QC measures that can significantly improve the accuracy and fidelity of field spectral data. The preference for collecting reflectance is probably due to a perceived ‘point and shoot’ simplicity, where one could easily recognise different surfaces based on their reflective ‘shape’. Ref. [22] notes that ‘one key problem is that the term “reflectance” is unspecific and offers little insight into the particular conditions in which measurements were made’. There was an underlying belief that relative reflectance was a robust measure of the spectral properties of a material or surface. It was assumed that the relative nature of reflectance was sufficient to account for differences in atmospheric conditions, field spectrometers, reflectance panels, and in situ sampling geometries. In reality, while the overall shape of reflectance spectra remains relatively stable, BRDF (differences in illumination and sampling geometry) induces variations in the shape of the reflectance spectra. Time, as well as a multitude of studies reporting poor correlations between field measurements and EO data, has proven that these shifts matter. More often than not, the signatures sought as indicative of a particular property were obscured by the variance, uncertainties, and noise within the data. Small differences in equipment management, calibration, and field sampling protocols greatly impact a study’s results and conclusions [17,22,35,36]. Moreover, they have a significant impact on spectral fidelity, as well as its inherent value as an instance of supposed ‘truth’. Ref. [17] refined the earlier comment to state that while ‘there is in fact no such thing as ground truth,’ if data are correctly gathered, then ‘it is at best an unbiased traceable representation of the true value with an associated uncertainty’. Traceability and uncertainty are critical issues in the argument for recording field spectral data in radiance as opposed to reflectance.

2.6. Field Spectrometers

Two types of full-range ‘backpack’ spectrometers were used during this campaign:
  • Malvern Panalytical: ASD-FR (Field Spec 4) [37];
  • Spectral Evolution: SR3500 [38].
The ASD-FR is a widely used field spectrometer, and its performance and reliability are fairly well understood [39,40]. ASD-FR instruments were used at the Perth, Wilcannia (Site 1), Cunnamulla, and Narromine sites. At Wilcannia (Site 2) and Narromine, a Spectral Evolution SR3500 was used. The SR3500, being a lightweight, compact, new-generation full-range point spectrometer, is better suited to conducting UAV spectral surveys. The results presented below suggest it performed well against the ASD-FR during this campaign. A third visible-to-near-infrared (VNIR) spectrometer was deployed on a DJI M600 UAV at Wilcannia (Site 2). The Ocean Insight Flame spectrometer [41] is small and lightweight (190 g) and is integrated into a payload that can be carried by the DJI M600. The sensor packages and the sites at which they were deployed are shown in Table 1.
During the underfly validation, an 8° foreoptic was used with the ASD, and a 10° foreoptic was used with the SR3500. With the foreoptic looking vertically down and held approximately 1 m above the ground, this resulted in a ground footprint of approximately 0.14 m in diameter for the ASD data and 0.17 m for the SR3500.
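These footprint figures follow from simple trigonometry for a nadir-pointing conical foreoptic; the sketch below is illustrative and the names are our own.

```python
import math

def footprint_diameter(fov_full_angle_deg, height_m):
    """Ground footprint diameter of a nadir-pointing conical foreoptic.

    The footprint is a circle of diameter 2 * h * tan(FOV / 2).
    """
    return 2.0 * height_m * math.tan(math.radians(fov_full_angle_deg / 2.0))

asd = footprint_diameter(8, 1.0)      # ~0.14 m for the 8-degree ASD foreoptic at 1 m
sr3500 = footprint_diameter(10, 1.0)  # ~0.17 m for the 10-degree SR3500 foreoptic at 1 m
```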

2.7. Field Site Characterisation and Data Processing

Field sites are typically one hectare, need to be flat, set within a much larger spectrally ‘uniformly variable’ area, and have low-growing vegetation (grasses and small shrubs) such that they are walkable. A series of transects, spaced every 10 m, are sampled using the field spectrometers (see Figure 3c and Figure 4). Depending on the level of field support and site trafficability, the field reflectance panel can be left in a central location. While this results in longer times between panel readings, by limiting these validations to days of very clear skies and a stable atmosphere, the compromise does not affect the overall result. Where possible, sites are orientated such that the operator walks toward (and away from) the Sun, while sampling, as this reduces the likelihood of their shadow affecting the measurement.
Instead of representing pure end-member point samples, these field spectra effectively integrate over large areas, blending the radiant flux from the various cover fractions, including both sunlit and shaded fractions of soil, photosynthetic vegetation (PV), and non-photosynthetic vegetation (NPV), that constitute any given landscape. The ASD spectrometer was configured to save each scan as the average of 25 spectra, and walking at a slow pace of around 1 m s⁻¹ tends to capture about 30 saved spectra per transect. This means that around 750 spectral readings inform each 100 m transect. Given that 10 or 11 lines are sampled, a full site encompasses over 7000 individual full-range radiances. The SR3500, having a higher sample rate than the ASD, was set to average 10 spectra per saved file. Based on the automatic integration times and a similar walking speed, the SR3500 captured 20 full spectra per 100 m transect, representing around 200 discrete readings per transect, or roughly 2000 across a full site. Since the panel is situated on one side of the study area, the order of measurements is as follows: take a panel measurement, then walk and sample a transect line. After completing the line, open a new file (for the next line) and spectrally characterise that line as the operator walks back toward the panel, concluding with a panel measurement. A visualisation of this (derived from Ref. [17]) is shown in Figure 5, where the x-axis shows time in seconds, the blue crosses represent panel radiance measurements, and the orange bars represent surface radiance measurements.
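The sampling arithmetic above can be tallied as follows; this is a bookkeeping sketch only, and the variable names are illustrative.

```python
# ASD: each saved file is the instrument average of 25 scans, and walking a
# 100 m line at ~1 m/s yields about 30 saved files per transect.
asd_saves_per_transect = 30
asd_avg_per_save = 25
asd_per_transect = asd_saves_per_transect * asd_avg_per_save   # 750 readings per line
asd_per_site = asd_per_transect * 10                           # >7000 for a 10-line site

# SR3500: 10 averaged spectra per saved file, ~20 saved files per transect.
sr_saves_per_transect = 20
sr_avg_per_save = 10
sr_per_site = sr_saves_per_transect * sr_avg_per_save * 10     # ~2000 readings per site
```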
GPS locations in each spectral file can be used to create a visualisation of the transects within the study area (see coloured dots in Figure 6). When these locations are superimposed on the pixel grid for the OLI data, it is possible to see the field sampling density within a given pixel. In this example, transect 1 is the brown line at the bottom of Figure 6, whereas line 11 is the black line at the top of the image. Line 12 is the diagonal line where the operator has traversed the site back to the reflectance panel to capture the final panel reading.
In the absence of being able to take simultaneous measurements of both the radiant flux from the target and the total downwelling atmospheric irradiance, it is important that these validation campaigns are only attempted on clear, cloud-free days, with all sampling conducted within two hours of solar noon. Figure 7 shows three of the Phase One QA and QC measures used to monitor both the stability of the atmosphere and the quality of the panel readings during an overpass validation. Figure 7a shows all the panel readings as part of the spectral transects. Figure 7b plots each panel reading against the cosine of the solar zenith angle with the line of best fit shown in red. This provides an easily interpreted measure of the stability and quality of these readings [42]. All being equal (i.e., sampling only in perfect conditions), departures from the trend line suggest an operator-induced geometry variation when reading the panel (i.e., the spectrometer foreoptic is not fully normal to the panel surface or the panel itself is not level). Being able to identify erroneous panel measurements allows them to be eliminated before creating the SR. This is a central reason for capturing the field spectra in radiances rather than reflectance. The impacts of the operator on making repeatable panel measurements and on moving the panel itself between transect lines in the context of the Phase One methodology are discussed in Ref. [36].
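The Figure 7b-style QA check, regressing panel radiance against the cosine of the solar zenith angle and flagging departures from the trend, can be sketched as below. This is a minimal illustration with synthetic numbers and hypothetical names, not the Phase One processing code; the 5% departure tolerance is an assumption for the example.

```python
def fit_panel_vs_cos_sza(cos_sza, radiance):
    """Least-squares line through panel radiances vs cos(solar zenith angle).

    Under a stable atmosphere, panel radiance should track cos(SZA) closely;
    departures suggest operator-induced geometry errors (tilted foreoptic or
    unlevelled panel).
    """
    n = len(cos_sza)
    mx = sum(cos_sza) / n
    my = sum(radiance) / n
    sxx = sum((x - mx) ** 2 for x in cos_sza)
    sxy = sum((x - mx) * (y - my) for x, y in zip(cos_sza, radiance))
    slope = sxy / sxx
    intercept = my - slope * mx
    return slope, intercept

def flag_outliers(cos_sza, radiance, tol_frac=0.05):
    """Indices of panel readings departing > tol_frac from the fitted trend."""
    slope, intercept = fit_panel_vs_cos_sza(cos_sza, radiance)
    return [i for i, (x, y) in enumerate(zip(cos_sza, radiance))
            if abs(y - (slope * x + intercept)) > tol_frac * (slope * x + intercept)]

# Synthetic example: readings follow the cos(SZA) trend except the fourth one.
cos_sza = [0.80, 0.82, 0.84, 0.86, 0.88]
radiance = [80.0, 82.0, 84.0, 78.0, 88.0]   # fourth reading is anomalously low
print(flag_outliers(cos_sza, radiance))      # → [3]
```

Flagged readings would be excluded before converting the surface radiances to reflectance, which is only possible because the data are recorded in radiance rather than as panel ratios.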
Finally, Figure 7c shows the direct and diffuse components of the total downwelling irradiance before and after sampling the one-hectare site. This visualisation confirms that, apart from the expected increase in brightness due to the approaching solar noon, the overall scattering within the atmosphere is consistent and relatively stable. Ref. [43] described the importance of acquiring detailed coincident atmospheric readings as these can be used to configure MODTRAN radiative transfer simulations that can support the assessment of the field data quality.
Ref. [35] details the data processing required to convert in-field radiance measurements to satellite equivalent values so that a direct comparison between the data can be conducted. As the field data are captured in radiance and the DEA ARD product is NBART, a few extra steps are conducted with the field data beyond what would be required if the data were captured as relative reflectance and the satellite ARD were instantaneous reflectance.
The radiance for each spectrum captured in the field is converted to reflectance using the panel data corrected to the epoch of the spectrum; the reflectance is a simple ratio of the field spectral radiance divided by the panel spectral radiance. The in situ hyperspectral data are converted to Landsat 8/9 band-equivalent values by convolving them with the spectral response function of each band.
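The band convolution step can be sketched as a spectral response function (SRF)-weighted average. The SRF values below are illustrative placeholders, not the actual OLI response curves, and the function names are our own.

```python
def band_equivalent(wavelengths, reflectance, srf):
    """Convolve a field spectrum to one broad band via an SRF-weighted average.

    `srf` maps wavelength (nm) to relative spectral response for the band;
    wavelengths missing from the SRF contribute nothing.
    """
    num = sum(reflectance[i] * srf.get(wl, 0.0) for i, wl in enumerate(wavelengths))
    den = sum(srf.get(wl, 0.0) for wl in wavelengths)
    return num / den

# Sanity check: a flat 0.30 reflectance spectrum convolves to 0.30 in any band.
wl = list(range(840, 890, 10))
spectrum = [0.30] * len(wl)
nir_srf = {840: 0.2, 850: 0.9, 860: 1.0, 870: 0.9, 880: 0.2}  # illustrative only
print(round(band_equivalent(wl, spectrum, nir_srf), 6))        # → 0.3
```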
For comparisons to the USGS ARD product, no further processing is needed. In order to compare field-derived reflectance with GA’s NBART products, the field reflectance has to be adjusted to match NBART. This is achieved using an approach described in Ref. [44], where a conversion ratio is calculated using the same MODIS kernel weights and Ross-thick Li-sparse BRDF model, as used in the derivation of the satellite NBART product.
The form of the semi-empirical BRDF model is described in Equation (1):
ρ s ( θ s , θ v , φ ) = F i s o + F v o l K v o l + F g e o K g e o
where F i s o is the isotropic contribution, and F v o l and F g e o are the volumetric and geometric contributions, respectively. Equation (1) can be refactored to the following:
ρ s ( θ s , θ v , φ ) = F i s o ( 1 + F v o l F i s o K v o l + F g e o F i s o K g e o ) = F i s o B ( θ s , θ v , φ , α 1 , α 2 )
where α 1 = F v o l F i s o , α 2 = F g e o F i s o . B ( θ s , θ v , φ , α 1 , α 2 ) is the BRDF shape function.
From Equation (2), a correction factor, C, is calculated at the epoch of each spectrum, where the solar zenith angle is calculated for each observation and the view zenith angle is 0 , as all spectra are captured at the nadir:
$$C = \frac{\rho_{MODIS}(45^{\circ}, 0, 0)}{\rho_{MODIS}(\theta_s, 0, 0)} \tag{3}$$
where $\theta_s$ is calculated at the epoch of each spectrum. The NBAR correction for each Landsat band-equivalent spectrum is then calculated by multiplying by $C$ such that:
$$NBAR = C \times \rho_{Landsat}(\theta_s, 0, 0) \tag{4}$$
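As a sketch of Equations (2)-(4), with hypothetical helper names: the Ross-thick and Li-sparse kernel values $K_{vol}$ and $K_{geo}$ are assumed to have been evaluated elsewhere from the MODIS kernel weights, and only the ratio step is shown.

```python
def brdf_shape(alpha1, alpha2, k_vol, k_geo):
    """BRDF shape function B = 1 + a1*Kvol + a2*Kgeo (Equation (2)),
    for kernel values already evaluated at the desired geometry."""
    return 1.0 + alpha1 * k_vol + alpha2 * k_geo

def nbar_from_field(rho_field, alpha1, alpha2, kernels_ref, kernels_obs):
    """Equations (3)-(4): C is the ratio of the shape function at the
    45-degree reference solar zenith to that at the observation's solar
    zenith (F_iso cancels in the ratio); NBAR = C * rho.  kernels_ref
    and kernels_obs are (Kvol, Kgeo) pairs for the two geometries, with
    nadir view in both cases."""
    c = brdf_shape(alpha1, alpha2, *kernels_ref) / brdf_shape(alpha1, alpha2, *kernels_obs)
    return c * rho_field
```

With $\alpha_1 = \alpha_2 = 0$ the surface is Lambertian, $C = 1$, and the field reflectance is returned unchanged.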
Each spectrum has a GPS location, so each can be placed on the same grid as the Landsat data. In practice, multiple spectra fall within each Landsat pixel, and each one-hectare survey site spans nine or more Landsat pixels. The surface NBAR values within each Landsat pixel are averaged so that a direct comparison can be made for each pixel. The surface NBAR and satellite NBART values are then averaged to give a single comparison per site, with the individual pixel values used to derive the standard deviation.
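The gridding step can be sketched as a simple binning of GPS-located spectra onto a 30 m Landsat-aligned grid. This is a hypothetical helper; a production workflow would use the actual scene geotransform.

```python
import numpy as np

def average_per_pixel(eastings, northings, values, origin, pixel_size=30.0):
    """Assign each field spectrum (here reduced to a single band value)
    to the 30 m pixel containing its GPS position, then average the
    values within each pixel.  `origin` is the (easting, northing) of
    the grid's upper-left corner."""
    cols = np.floor((np.asarray(eastings) - origin[0]) / pixel_size).astype(int)
    rows = np.floor((origin[1] - np.asarray(northings)) / pixel_size).astype(int)
    binned = {}
    for r, c, v in zip(rows, cols, values):
        binned.setdefault((r, c), []).append(v)
    return {rc: float(np.mean(v)) for rc, v in binned.items()}
```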
The results presented below are for site-averaged values.

2.8. Unmanned Aerial Vehicle (UAV) Flight at Wilcannia

A DJI M600 UAV with the Ocean Insight VNIR Flame Spectrometer package was used to characterise Site 2 at Wilcannia in conjunction with the SR3500 FR spectrometer (see Figure 8).
The UAV survey coincided with a standard ARD_VP site characterisation, as described above, using a Spectral Evolution SR3500. By its very nature, the UAV data collection is different and provides much greater flexibility. The flying height and speed coupled with the foreoptic and data capture rate all impact the nature of the area sampled. On this occasion, the flight lines were supposed to replicate the SR3500 transects.
The M600 flew at a height of 30 m relative to the take-off position, and given that the site (Wilcannia Site 2) is very flat, the UAV is assumed to have been 30 m above ground level for the entire flight. The FOV of the Flame Gershun tube foreoptic was 8°, which at 30 m above ground gives a footprint of just over 4 m. The flight speed was 5 m s⁻¹ and the sampling rate of the spectrometer was 1 Hz, achieved by optimising the spectrometer and then averaging spectra so that a 1 s averaged spectrum was captured. This provides a smear sample of 5 m with a width of approximately 4 m. With a UAV spectral survey, it is possible to achieve 100% cover of the field site, rather than the discrete narrow transects of ground-based sampling, by changing the flight height or the FOV; in this case, coverage of the field site was closer to 50%. In theory, this flight aimed to replicate, as far as possible, the lines walked by the surface-based instrument, so that a better comparison could be achieved. However, due to issues accessing the site before the underfly event, the flight plan for the UAV had to be estimated on the day, and as a consequence, the coverage of the SR3500 and the UAV did not exactly match, with transects having different headings (see Figure 9).
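The footprint figures quoted above follow directly from the foreoptic FOV and flying height; a quick check (simple trigonometry, not part of the paper's processing chain):

```python
import math

def footprint_diameter(height_m, fov_deg):
    """Nadir ground footprint of a conical foreoptic: a cone of full
    angle fov_deg intersecting flat ground at the given height."""
    return 2.0 * height_m * math.tan(math.radians(fov_deg / 2.0))

# 8 deg FOV at 30 m AGL -> ~4.2 m, the "just over 4 m" stated above;
# at 5 m/s with 1 s averaging, each spectrum smears ~5 m along track.
```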

2.9. Aquatic Validation—Lake Hume

During the underfly campaign, water surface-leaving radiance ( L w ) and apparent optical properties (AOPs) were collected using calibrated TriOS RAMSES sensors on Lake Hume at HUME_10 (Figure 10).
Three methods were used to obtain in situ radiometric measurements of remote sensing reflectance (Rrs). The three methods employed included the single-depth approach (SDA), the sky-blocked approach (SBA) [30] and the above-surface sky radiance method (SRM) [45]. The three deployment methods were initially conducted within 10 min of the simultaneous overpass. All measurement approaches were conducted from a 5.2 m vessel that had been anchored to reduce movement during the measurement process.
Due to the dynamic nature of aquatic systems and the impact of BRDF on aquatic EO, it is important to understand the relationship between the underfly overpasses. Table 2 summarises the timing of each Landsat image of Lake Hume. Landsat 9, descending to the east along WRS2 path 91 with the sensor tilted to the west by 12.886°, imaged the lake approximately nine minutes before Landsat 8, which descended along path 92 and captured images of Lake Hume at nadir.
During the on-water validation, all measurements were made from the sunward side of the vessel to minimise interference from vessel shading. The water state at the time of the overpass was optimal, being very stable and glassy (see Section 3.5). The sky overhead was clear of any clouds or haze, although small cumulus clouds were on the horizon during the overpass. Table 3 describes the methods, locations, and times of measurements undertaken during this campaign.
The SDA [30] was obtained by submerging a nadir-viewing radiance sensor 0.15 m below the water surface ($L_u(z_{0.15})$). This sensor was suspended from a 1.5 m pole on the sunward side of the vessel to reduce shading and other interference from the vessel. An irradiance sensor, viewing 180° off-nadir, was positioned above the radiance sensor. The measurement configuration for SDA is shown in Figure 11. To correct for the attenuation of light in the upper 0.15 m of the water column, the diffuse radiance attenuation coefficient ($k_{Lu}$) was calculated from a second radiance measurement made at a depth of 3 m ($L_u(z_{3.0})$), using the two-depth approach applied to the $L_u(z_{0.15})$ and $L_u(z_{3.0})$ measurements, as described in Ref. [45]. $k_{Lu}$ was then applied to $L_u(z_{0.15})$ to estimate $L_u(0^-)$. A deck-mounted irradiance sensor was used to correct all radiance measurements, according to Ref. [45]. The water-leaving radiance ($L_w$) was then calculated by accounting for the transmittance of $L_u(0^-)$ across the water–air interface ($T_{wa}$), using the method described in Ref. [46] and the refractive index of water from Ref. [47], whilst accounting for the temperature and salinity of the water. $R_{rs(SDA)}$ was calculated using Equation (5).
$$R_{rs(SDA)} = \frac{L_u(z_{0.15}) \times e^{(k_{Lu} \times 0.15)} \times T_{wa}}{E_d(0^+)} \tag{5}$$
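A minimal sketch of the SDA calculation (hypothetical function names), combining the two-depth attenuation estimate with Equation (5):

```python
import math

def k_lu_two_depth(lu_shallow, lu_deep, z_shallow=0.15, z_deep=3.0):
    """Diffuse radiance attenuation coefficient from radiance measured
    at two depths, assuming exponential decay between them."""
    return math.log(lu_shallow / lu_deep) / (z_deep - z_shallow)

def rrs_sda(lu_015, k_lu, t_wa, ed_0plus, z=0.15):
    """Equation (5): propagate Lu(z0.15) up to just below the surface,
    apply the water-air transmittance factor t_wa, and ratio against
    the deck-measured downwelling irradiance Ed(0+)."""
    return lu_015 * math.exp(k_lu * z) * t_wa / ed_0plus
```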
The sky-blocked approach (SBA) [30] uses a conical shade to block sky glint for a nadir radiance sensor positioned 0.05 to 0.1 m above the water surface. The shading effect from the cone is corrected using in situ absorption and backscattering measurements, obtained by an absorption and attenuation meter (ACS) and a backscattering meter (BB9) positioned near (but out of) the FOV of the radiance sensor.
$R_{rs(SBA)}$ was then calculated using Equation (6), where $L_{w(us)}$ is the shade-corrected water-leaving radiance, as shown in Figure 10 (right panel). The self-shading correction was applied as described in Ref. [48].
$$R_{rs(SBA)} = \frac{L_{w(us)}}{E_d(0^+)} \tag{6}$$
Note that the operator dropped below the $E_d$ sensor height prior to data acquisition. In the following, $\Delta\varphi$ is the azimuth angle relative to the Sun and $\theta_v$ is the nadir-viewing angle.
The sky-radiance corrected method (SRM), described in Ref. [45], is an above-surface approach that uses off-nadir radiance measurements of the water surface and sky to estimate the water-leaving radiance. The sky radiance ($L_{sky}$) is subtracted from the water radiance ($L_{w-sg}$) to remove the effects of sky reflection and surface glint. This method used a single radiance sensor viewing, in sequence, angles of 35° ($L_{w-sg}$) and 145° ($L_{sky}$) off-nadir with an azimuth of 135°. A deck-mounted irradiance sensor was used to obtain $E_d(0^+)$, with another sensor viewing 180° off-nadir. $R_{rs(SRM)}$ was then calculated using Equation (7), where $\rho_F$ is the fraction of incident skylight that is reflected toward the water-viewing radiance sensor, calculated as in Ref. [45].
$$R_{rs(SRM)} = \frac{L_{w-sg} - \rho_F \times L_{sky}}{E_d(0^+)} \tag{7}$$
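Consistent with the description above, where the skylight reflected at the water surface is removed from the water-viewing radiance, the SRM calculation can be sketched as follows. The default $\rho_F = 0.028$ is a commonly used flat-water value, not the campaign's; the campaign computed $\rho_F$ per Ref. [45].

```python
def rrs_srm(l_water, l_sky, ed_0plus, rho_f=0.028):
    """Above-surface sky-radiance method: remove the reflected skylight
    (rho_f * Lsky) from the 35-degree water-viewing radiance, then
    ratio against the downwelling irradiance Ed(0+)."""
    return (l_water - rho_f * l_sky) / ed_0plus
```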
To enhance the matchup analysis, water samples were collected at each site to obtain water quality parameters in addition to radiometric data. These included the total suspended sediment (TSS) concentration, particulate absorption, chlorophyll-a concentration, coloured dissolved organic matter (CDOM) absorption, total organic carbon (TOC), and dissolved organic carbon (DOC). At each site, bio-optical instruments were deployed through the water column to obtain inherent optical properties such as backscattering, attenuation and absorption, along with temperature, depth, and conductivity.

3. Results

Logistically, this was a difficult campaign. The delayed launch of Landsat 9, COVID-19-related travel restrictions in place across Australia, and a large rainfall event all threatened to compromise much of the planned field deployment. The location and timing of the underfly swaths were not confirmed until after Landsat 9 was launched. The delayed launch (due to a shortage of liquid oxygen) meant that the primary field sites that had been identified (with access negotiated) based on the initial launch schedule had to be abandoned, and replacement field sites had to be quickly located and access approved. Capturing meaningful spectral data for these validation sites requires a stable, clear atmosphere, and our experience with the Phase One study reinforced the uncertainty that the weather brings. So, it was decided to actively target all possible underfly swaths, regardless of the overlap percentage. Although Australia had many underfly swaths, COVID-19-related state border closures meant that possible overpasses, including one that tracked through South Australia on 14 November 2021, could not be targeted by the GA field team. However, teams from three different Australian states were deployed to five sites across three overpasses. The locations of these sites and the nature of the overlap are shown in Figure 12.
Site selection and planning for the final overpass on 17 November 2021 was based on expectations that this underfly featured a limited 15% overlap between the sensors. As it happened, NASA programmed L9 for an off-nadir view and tilted the sensor by 12.886° to the west so that the area viewed by L9 encapsulated most of the L8 swath. Significantly, this off-nadir view had positive implications for the aquatic SR validation that was undertaken on Lake Hume; these results are discussed below.
A Google Earth view of each of these sites is shown in Figure 13.

3.1. Perth

The first Australian underfly in situ data collection occurred over Perth on 13 November 2021. This underfly featured only a 10–15% overlap between the sensors. The area of overlap is highlighted in green (see Figure 14). The site itself is a horse paddock in a peri-urban area to the south of the Perth metropolitan area.
Eleven lines were sampled at Perth with the reflectance panel left in a central position, as shown in Figure 15.
The details of the Perth underfly are given in Table 4.
A MICROTOPS 540 II Sunphotometer was deployed to monitor atmospheric stability, and plots of the aerosol optical thickness across 380, 500, and 675 nm are shown in Figure 16, showing a stable atmosphere over the sampling period. The ASD field data are shown in Figure 17. The full-resolution ASD spectra are shown in the left-hand panel, with the transect averages individually coloured. The transect averages resampled to the Landsat 9 OLI are shown in the right-hand panel.
The results of the matchup between the field data and DEA and USGS ARD Landsat 8 and 9 products are shown in Figure 18. Note that specific bandpass functions for both Landsat 8 and 9 were used when resampling the field data for each OLI matchup.

3.2. Cunnamulla

On 15 November 2021, the sole 100% overlap orbit over Australia occurred, as shown in Figure 12. The first field site to be imaged was at Cunnamulla in southern central Queensland (Figure 19). The site was a stabilised sand dune complex and featured sparse, low-growing grasses and chenopod (saltbush) shrubs.
The team deployed not only an ASD-FR to sample the one-hectare field site, as described in the methods section, but also a UAV carrying a VNIR hyperspectral camera to image the site. Unfortunately, a hardware fault with the ASD rendered all the collected data unusable, which was scant reward for a team that had driven the long nine hours from Brisbane. Moments later, both satellites passed over Wilcannia, about 400 km to the southwest.

3.3. Wilcannia

The Wilcannia sites are located on the Darling River floodplain. The soil (when dry) is fine-grained grey cracking clay. When wet, these areas become dark, impassable quagmires. The surface vegetation is mostly low-growing chenopod (saltbush) shrubs. Access to this site area was only granted 18 h before the overpass, as the road was flood-affected and had been closed by the local council. Due to a very poor cloud forecast (60% cover) for the overpass time (Figure 20), it was decided to deploy teams to two different sites at Wilcannia.
Luckily, the ECMWF cloud model’s forecast for the day proved to be incorrect, and the forecast cloud remained well to the south, so both sites at Wilcannia were cloud-free. After days of forlorn optimism and dealing with last-minute road closures, there was much rejoicing out on the floodplain! The details of the underfly orbits at Wilcannia are presented in Table 5, where less than 40 s separates the two overpasses.

3.3.1. Wilcannia Site 1

The panel in the top right of Figure 21 shows the spatial context of the Wilcannia field sites.
Cloud was forecast to build over this region at around 10 a.m. (AEDT). As a precaution, it was decided to characterise the sites early, while they were still cloud-free, in the hope that the site might be at least partially visible when the satellites went over. When the cloud did not eventuate, a second site characterisation 'at time of overpass' was captured.
The results of both sample times for Site 1 are presented below, and they show the same patterns of agreement between the field data and the matchups. First, the results for the panel readings are shown in Figure 22.
At any given site, no two spectral traverses are exactly alike; operator-induced variations and equipment performance create differences in both data density and coverage. Figure 23 shows the relative location of the spectral data. The first site characterisation was more hurried, given the window of time available before the overpass, and the final transect cuts a diagonal path back to the reflectance panel location. As the forecast cloud build-up did not occur, the second characterisation (at the time of overpass) was less hurried but suffers from some data capture gaps, with a final line added to the north of the site. Despite this sampling variation, the overall matchups are similar, which demonstrates the inherent stability of this form of spectral sampling: spectrally homogeneous sites are 'oversampled' in such a way that small locational variations average out.
The matchup between these two site characterisations and the two OLI instruments is shown in Figure 24. Overall, both field acquisitions are in good agreement, demonstrating the repeatability of the ARD_VP approach. In Figure 24, we see a constant offset between the field data and the two Landsat satellites, where the field data appear less bright. The error bars in these plots represent the $1\sigma$ variation. Given that all error bars overlap, we argue that the offset between the field data and satellites is of no statistical significance. However, possible reasons for the brightness offset between the field data and the satellites are discussed below.
Total downwelling readings of the diffuse and direct irradiance were taken using the ASD-FR, using the cosine diffuser. The first readings were taken at the time of the satellite overpasses and the final reading was taken once the site characterisation was complete. Figure 25 shows these results for Wilcannia and suggests that stable atmospheric conditions persisted over the field site.
The Landsat 8 and 9 matchups against DEA ARD and the USGS SR products for Site 1 (second site characterisation) at Wilcannia are shown in Figure 26. Overall, the agreement between the Landsat sensors is very good. The field data are consistently darker across all the bands but converge at the longest wavelengths.
There are several possible reasons for the brightness offset between the field spectral data and L8 and L9 at this site. There may have been a very thin (unobserved) layer of cloud or water vapour high in the atmosphere; indeed, we have seen similar 'block shifts' between satellite and field spectra when a site was imaged through a very light fog [35]. Another possibility is the pronounced vegetation structure. The saltbush here stood 0.1–0.3 m high and cast significant shadows (see Figure 27). Walking at ground level, the field spectrometers (with a much narrower FOV) may have seen a much greater percentage of the shaded fractions of the soil and of the photosynthetic (PV) and non-photosynthetic (NPV) vegetation cover than the orbiting sensors. It is also possible that the sampling was biased by the trafficability of the site, where the operator may have inadvertently sampled more of the shadowed areas whilst walking around the saltbushes [49]. Finally, the BRDF kernel used in the DEA ARD correction applies to the scene as a whole, and these results may indicate that localised conditions can differ from the whole scene. Similar results were observed with the SR3500 and the UAV-derived Flame data taken at the second Wilcannia site.

3.3.2. Wilcannia Site 2

This site is 2.5 km from Wilcannia Site 1 (Figure 28) and was used to provide contingency against possible cloud contamination during the underfly event. It features similar soils and vegetation; however, a different instrument package was used. The field spectra were captured using a Spectral Evolution full-range SR3500 (400–2500 nm) spectrometer, and the site was also imaged using a UAV, specifically a DJI M600 carrying an Ocean Optics VIS/NIR Flame Spectrometer (see Figure 8). Phase two of the DEA validation program is specifically focused on adapting the terrestrial 'backpack' field spectrometry model to UAV-based sampling.
UAV-based spectrometry offers the potential to undertake meaningful spectral validation over complex vegetation and landforms and is supported by a growing body of literature [50,51,52,53,54], including validation of BRDF over a range of surfaces. The underfly provided an opportunity to test the UAV in this type of SR validation. While the Flame is only a VIS/NIR sensor, a larger UAV is being trialled to carry the full-range (400–2500 nm) SR3500.
Given its proximity to Site 1, it can be assumed that the inference made regarding the atmospheric stability at Site 1 also applies to Site 2 (Figure 25). It should be noted that the ground-level photo showing cloud cover in Figure 28 was taken some hours after the data sampling.
The matchup results for the DEA and USGS ARD Landsat 8 and 9 SR products and the ground-based field and UAV spectrometry are shown in Figure 29. Only slight differences between the two ARD models are evident. For the DEA product, the SR3500 spectrometer data have lower values than both OLI sensor measurements, as seen at Wilcannia Site 1. The USGS ARD matchup is less systematic: in Bands 1 and 2, the OLI values appear darker than the field data, but the OLI signal climbs above the SR3500 result at Band 3, after which an offset equivalent to that seen in the DEA ARD is apparent.

3.4. Narromine

The final Australian L8/L9 underfly field deployment occurred two days later, on 17 November 2021, some 430 km to the east of Wilcannia, at Narromine. This was originally defined as a 15% partial overlap underfly, but NASA adjusted L9 to point off-nadir and, in effect, a near 100% overlap was achieved (see Figure 12). The site was located at the western end of the primary runway of the local airport (made possible as the runway was closed for surface repairs). Furthermore, the site had recently been mowed, making it ideal for our purposes, as shown in Figure 30.
As a contingency against forecast cloud cover, we had also intended to sample two sites near Narromine. However, upon inspection, the secondary site proved unsuitable due to long grass that was difficult to traverse. As the weather forecast for the day had improved, it was decided that the ASD-FR, SR3500, and UAV would all sample the single airport site. These data could then be used as an ARD_VP cross-calibration between the two field spectrometers and the two operators. Sensitivity analysis of the ARD_VP has shown that up to 5% of the data variance can be attributed to the performance of the instrument and the operator [36]. A summary of these data, as they relate to the Landsat 8 data, is shown in Figure 31; the results affirm the stability and reliability of the ARD_VP. When well-calibrated instruments are used with a simple but rigorously applied measurement protocol, equivalent and repeatable results are achieved.
Figure 7c suggests that both atmospheric diffuse and direct irradiance underwent little change over Narromine during the site characterisation.
Despite the L8/L9 orbital track offsets and the different satellite viewing angles, the DEA ARD processing (a pixel-by-pixel nadir view and constant illumination model) renders near-identical imagery. Figure 32 shows the image tiles extracted from L8 and L9 for the data matchups. The field site is indicated by the small yellow inset tile in the centre of each image.
The matchups for the field spectral sampling (ASD and SR3500) and Landsat 8 and 9 data for this site are good, as shown in Figure 33. Differences in the DEA and USGS ARD algorithms are the most obvious in the Landsat 8 results based on the USGS ARD.
Finally, there were plans to fly the DJI M600 UAV (equipped with the Flame) at this site. Although the runway was closed and the field team had secured local approvals for the survey, the DJI flight control software (GS PRO Version 2.07) recognized the proximity to an airport and would not enable the flight plan.

3.5. Lake Hume

A significant element of the Australian underfly validation campaign involved a team of aquatic remote sensing scientists from CSIRO Environment who conducted an extensive on-water validation at Lake Hume in southern NSW (see Figure 34). Lake Hume is a productive eutrophic lake [55] located downstream from the confluence of the Murray River and Mitta Mitta River, on the border of NSW and Victoria in eastern Australia. The dam was opened in 1938, has a maximum depth of 40 m, and is used for hydropower, flood mitigation, irrigation, recreation, water supply, and wildlife conservation.
The weather forecast for this site had been questionable, yet on the day, near-perfect conditions prevailed, as shown in Figure 34. Capturing meaningful data on the water-leaving radiance and SR over a water body is challenging. It necessitates not only clear skies but also very low surface winds, which otherwise create wave facets that act as specular reflectors, producing glint that swamps the surface-leaving radiance. Furthermore, these data require an optimal Sun-sensor alignment; otherwise, the water surface, as seen in satellite imagery, is dominated by either specular reflection or sky glint effects, which seriously confound the determination of the surface-leaving radiance. Fortunately, at Lake Hume, clear skies and low winds prevailed, and meaningful field data of the above-surface and in-water radiance were acquired at several sites across the lake. However, the underfly took place in mid-November, some five weeks before the southern hemisphere summer solstice. This is less than ideal for the nadir-looking Landsat 8, and when the imagery arrived, it was found to be quite affected by sky glint. Landsat 9, which descended in a more easterly orbit with the sensor rolled off-nadir by 12.886°, produced imagery that was not affected by sky glint at all (see Figure 35).
Ref. [56] reports that very slight changes in the view angle relative to the illumination angle for imagery over water targets significantly impact quality and the uncertainties surrounding meaningful data retrievals. The day prior to the underfly, radiometric data were acquired at two sites in the southern and middle regions of the lake. Previous field campaigns had acquired data at these locations, so the site names from those campaigns (HUME_06 and HUME_DAMWALL) were retained. On the day of the underfly, HUME_01 was sampled first, prior to the overpass. For the underfly itself, a northern site in the Bowna arm of the lake (HUME_10) was selected to best align with the Landsat 8 and Landsat 9 overpasses; it was not known at the time that L9 (descending in the adjacent orbit) had been tilted, providing near-full coverage of the L8 swath.
Note that the GA A-ARD processing model has been used for water pixels used in these matchups. Specifically, a water/land mask was applied. The land pixels were processed using the usual GA ARD BRDF, terrain, and atmospheric corrections, while sky glint, adjacency, and atmospheric corrections were applied to the water pixels (A-ARD). The matchups between the field data and the GA A-ARD and USGS aquatic ARD products are shown below (see Figure 36, Figure 37, Figure 38 and Figure 39).
Figure 39 shows the comparison between the different field measurements and A-ARD and USGS-ARD. The SDA seems to provide the most robust data. Variations between the two OLI products could be due to the sensitivities in the viewing angle and different path radiances as well as small perturbations from the on-water sampling platform.

3.6. Summary of Results

A total of seven coincident underfly validations were attempted during the Australia campaign, although equipment issues resulted in unusable data for Cunnamulla; see Table 6.
Correlations summarising the relationship between the field data at all the terrestrial sites vs. the individual satellites, as well as DEA and USGS ARD models, are shown in Figure 40. These results suggest good overall agreement between the field data and the two ARD SR products, as well as good agreement between Landsat 8 and 9.
The overall matchup between Landsat 8 and Landsat 9 across all terrestrial sites—comparing the DEA and USGS ARD SR products—is shown in Figure 41. These results suggest that whilst both the GA and USGS ARD products are consistent for both L8 and L9 as an ensemble, there is more scatter for individual points within the USGS data.
Figure 40 and Figure 41 show that even though the field validation of the GA NBART involves three different layers of correction (atmospheric, BRDF, and terrain illumination), each with its own uncertainty, the GA NBART results still agree better with the field data than the USGS product, which addresses the uncertainty of atmospheric correction only (see Section 2.6). The specific benefit of the BRDF correction remains challenging to discern through field validation, as most field measurements were taken simultaneously with the satellite overpass. A field validation specific to BRDF would require comparisons involving both areas of scene overlap and time series, which was beyond the scope of this paper. Further discussion is presented in Ref. [11].

4. Conclusions

This paper addresses three complementary goals. Firstly, the Australian validation demonstrates that, at the time of the underfly, Landsat 8 and 9 recorded (as designed) near-identical EO data, which speaks positively to the ongoing Landsat EO program. Secondly, the performances of the GA and USGS Landsat 8 and Landsat 9 ARD SR products are compared. While both products show similar levels of accuracy and agreement, the field data agree slightly better with the GA DEA NBART product, despite the GA NBART addressing three layers of uncertainty while the USGS product addresses only one.
For all intents and purposes, the $R^2$ of 0.99 suggests that Landsat 8 and 9 can be described as radiometrically identical. The matchups between the field data and the two aquatic ARD products show more divergence. Although the GA aquatic ARD generally compares well with the field measurements, the coastal blue band (band 1) generally has higher Rrs, especially for Landsat 9, which was rolled to a high angle. The high value for this band in the nadir-viewing Landsat 8 could potentially be attributed to calibration issues, as pointed out by Ref. [57], who address this by adjusting the calibration. The larger difference for Landsat 9 appears to be angle-dependent. A similar divergence between algorithms for different water products in underfly matchups is reported in Ref. [58].
Finally, the third objective of this paper highlights the efficacy of the ARD_VP as an approach to SR validation. The duplicate sampling of the two Wilcannia sites and the cross-calibration exercise at Narromine have demonstrated that, with well-calibrated sensors and different operators adhering to the protocol, the ARD_VP is robust and repeatable. The challenge for DEA moving forward will be to adapt this vicarious validation method to more expansive approaches, using UAV platforms over more complicated real-world environments, such as complex vegetation structures and uneven terrain. Through participation in the L8/L9 underfly, the Australian EO community has demonstrated the capacity to mount effective national field validation campaigns. It is important that the skills, instrumentation, and supporting calibration facilities needed for such programs are secured and expanded into the future. Data quality assurance and integrity monitoring are important strategic goals for the Australian EO community, and sustaining and building this capability will enable continued support of international satellite validation efforts.

Author Contributions

All authors participated in various aspects of planning, data acquisition, and data processing. M.B. and G.B. were responsible for organising and planning the Australian underfly collaboration, managing logistics for the NSW sites, and handling data acquisition, processing, and manuscript preparation. A.J.W. took charge of the logistics for the WA site, as well as data acquisition and processing. E.H. and M.T. managed logistics and data acquisition for the NSW sites. F.L. processed the Landsat 9 data for all sites. R.G. and B.M. contributed to logistics and data acquisition for the WA site. J.B., R.D. and I.S. coordinated logistics and data acquisition for the QLD site. J.A., G.K. and N.D. were involved in the logistics, data acquisition, and processing of the Lake Hume site. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

Spectroscopy data collected as part of the Australian contribution to the Landsat underfly evaluation are available in the Australian National Spectral Database: https://www.dea.ga.gov.au/products/national-spectral-database (accessed on 18 February 2024).

Acknowledgments

Mounting field campaigns to new sites at short notice is always difficult. Special thanks must go to all those people who helped locate field sites and those who opened doors, farm gates and roads at very short notice; Geoff Walker (Murdoch University), Simone Carmichael, Ray Dayman and Glen Wheatley (NSW Parks), Peter Fearns (Curtin University), Matt Andrews (Avondale Station), Warren and Janine Duncan (Dunedin Park), Phil Johnston (Narromine Shire), Graham Black (Lake Yanga Station), Tim McMullen (Borambula Wines), Narromine Rifle Range, Patrick Foxley (GA-LAMA), Lisa (Warrawong on the Darling), and finally, Bill Elliot (Wilcannia). The authors would like to thank Tony Gill, Heidi Mawbey, and Geoff Horn from Environment NSW for their help at the Narromine site. This paper is published with the permission of the CEO of Geoscience Australia.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Loew, A.; Bell, W.; Brocca, L.; Bulgin, C.E.; Burdanowitz, J.; Calbet, X.; Donner, R.V.; Ghent, D.; Gruber, A.; Kaminski, T.; et al. Validation Practices for Satellite-Based Earth Observation Data across Communities. Rev. Geophys. 2017, 55, 779–817. [Google Scholar] [CrossRef]
  2. A Quality Assurance Framework for Earth Observation: Implementation Strategy and Work Plan. Available online: https://www.qa4eo.org/docs/QA4EO_Principles_v4.0.pdf (accessed on 18 February 2024).
  3. A Guide to Comparisons—Organisation, Operation and Analysis to Establish Measurement Equivalence to Underpin the Quality Assurance Requirements of GEO. Available online: https://www.qa4eo.org/docs/QA4EO-QAEO-GEN-DQK-004_v4.0.pdf (accessed on 18 February 2024).
  4. Wulder, M.A.; Roy, D.P.; Radeloff, V.C.; Loveland, T.R.; Anderson, M.C.; Johnson, D.M.; Healey, S.; Zhu, Z.; Scambos, T.A.; Pahlevan, N.; et al. Fifty Years of Landsat Science and Impacts. Remote Sens. Environ. 2022, 280, 113195. [Google Scholar] [CrossRef]
  5. Loveland, T.; Dwyer, J. Landsat: Building a Strong Future. Remote Sens. Environ. 2012, 122, 22–29. [Google Scholar] [CrossRef]
  6. Lewis, A.; Oliver, S.; Lymburner, L.; Evans, B.; Wyborn, L.; Mueller, N.; Raevski, G.; Hooke, J.; Woodcock, R.; Sixsmith, J.; et al. The Australian Geoscience Data Cube—Foundations and lessons learned. Remote Sens. Environ. 2017, 202, 276–292. [Google Scholar] [CrossRef]
  7. Kaita, E.; Markham, B.; Haque, M.O.; Dichmann, D.; Gerace, A.; Leigh, L.; Good, S.; Schmidt, M.; Crawford, C.J. Landsat 9 Cross-Calibration Under-Fly of Landsat 8: Planning and Execution. Remote Sens. 2022, 14, 5414. [Google Scholar] [CrossRef]
  8. Gross, G.; Helder, D.; Begeman, C.; Leigh, L.; Kaewmanee, M.; Shah, R. Initial Cross-Calibration of Landsat 8 and Landsat 9 Using the Simultaneous Underfly Event. Remote Sens. 2022, 14, 2418. [Google Scholar] [CrossRef]
  9. Holden, C.E.; Woodcock, C.E. An analysis of Landsat 7 and Landsat 8 underflight data and the implications for time series investigations. Remote Sens. Environ. 2016, 185, 16–36. [Google Scholar] [CrossRef]
  10. Flood, N. Continuity of Reflectance Data between Landsat-7 ETM+ and Landsat-8 OLI, for Both Top-of-Atmosphere and Surface Reflectance: A Study in the Australian Landscape. Remote Sens. 2014, 6, 7952–7970. [Google Scholar] [CrossRef]
  11. Li, F.; Jupp, D.; Reddy, S.; Lymburner, L.; Mueller, N.; Tan, P.; Islam, A. An Evaluation of the Use of Atmospheric and BRDF Corrections to Standardize Landsat Data. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2010, 3, 257–270. [Google Scholar] [CrossRef]
  12. Li, F.; Jupp, D.; Thankappan, M.; Lymburner, L.; Lewis, A.; Held, A. A physics-based atmospheric and BRDF correction for Landsat data over mountainous terrain. Remote Sens. Environ. 2012, 124, 756–770. [Google Scholar] [CrossRef]
  13. Li, F.; Jupp, D.; Thankappan, M. Using high resolution DSM data to correct the terrain illumination effect in Landsat data. In Proceedings of the 19th International Congress on Modelling and Simulation, Perth, WA, Australia, 12–16 December 2011. [Google Scholar]
  14. Li, F.; Jupp, D.; Schroeder, T.; Sagar, S.; Sixsmith, J.; Dorji, P. Assessing an Atmospheric Correction Algorithm for Time Series of Satellite-Based Water-Leaving Reflectance Using Match-Up Sites in Australian Coastal Waters. Remote Sens. 2021, 13, 1927. [Google Scholar] [CrossRef]
  15. Campbell, S.; Lovell, J.; Jupp, D.L.B.; Graetz, R.D.; Byrne, G. The Lake Frome field campaign in support of Hyperion instrument calibration and validation. In Proceedings of the IEEE 2001 International Geoscience and Remote Sensing Symposium, Sydney, NSW, Australia, 9–13 July 2001; pp. 2593–2595. [Google Scholar] [CrossRef]
  16. Doelling, D.; Helder, D.; Schott, J.; Stone, T.; Pinto, C. Vicarious Calibration and Validation. In Reference Module in Earth Systems and Environmental Sciences; Elsevier: Amsterdam, The Netherlands, 2017; pp. 475–518. [Google Scholar] [CrossRef]
  17. Hueni, A.; Damm, A.; Kneubühler, M.; Schläpfer, D.; Schaepman, M. Field and Airborne Spectroscopy Cross Validation—Some Considerations. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2017, 10, 1117–1135. [Google Scholar] [CrossRef]
  18. Teillet, P.M.; Barker, J.L.; Markham, B.L.; Irish, R.R.; Fedosejevs, G.; Storey, J.C. Radiometric cross-calibration of the Landsat-7 ETM+ and Landsat-5 TM sensors based on tandem data sets. Remote Sens. Environ. 2001, 78, 39–54. [Google Scholar] [CrossRef]
  19. Malthus, T.; Ong, C.; Lau, I.; Fearns, P.; Byrne, G.; Thankappan, M.; Chisholm, L. A Community Approach to Standardised Validation of Surface Reflectance Data; CSIRO: Canberra, ACT, Australia, 2019. [Google Scholar]
  20. Milton, E.; Schaepman, M.; Anderson, K.; Kneubühler, M.; Fox, N. Progress in Field Spectroscopy. Remote Sens. Environ. 2009, 113, S92–S109. [Google Scholar] [CrossRef]
  21. Schaepman-Strub, G.; Schaepman, M.; Painter, T.; Dangel, S.; Martonchik, J. Reflectance quantities in optical remote sensing–definitions and case studies. Remote Sens. Environ. 2006, 103, 27–42. [Google Scholar] [CrossRef]
  22. Anderson, K.; Dungan, J.L.; MacArthur, A. On the Reproducibility of Field-Measured Reflectance Factors in the Context of Vegetation Studies. Remote Sens. Environ. 2011, 115, 1893–1905. [Google Scholar] [CrossRef]
  23. Helder, D.; Thome, K.; Aaron, D.; Leigh, L.; Czapla-Myers, J.; Leisso, N.; Biggar, S.; Anderson, N. Recent Surface Reflectance Measurement Campaigns with Emphasis on Best Practices, SI Traceability and Uncertainty Estimation. Metrologia 2012, 49, S21–S28. [Google Scholar] [CrossRef]
  24. Earth Observation from Space Roadmap 2021–2030. Available online: https://www.industry.gov.au/publications/earth-observation-space-roadmap-2021-2030 (accessed on 1 April 2023).
  25. Working Group on Calibration & Validation. Available online: https://www.ceos.org/ourwork/workinggroups/wgcv/ (accessed on 12 February 2023).
  26. Harrison, B.A.; Jupp, D.L.B.; Lewis, M.M.; Forster, B.C.; Coppa, I.; Mueller, N.; Hudson, D.; Phinn, S.; Smith, C.; Anstee, J.; et al. Earth Observation: Data, Processing and Applications, Volume 1B: Data—Image Interpretation; CRCSI: Melbourne, VIC, Australia, 2017. [Google Scholar]
  27. Harrison, B.A.; Anstee, J.M.; Dekker, A.G.; King, E.A.; Griffin, D.A.; Mueller, N.; Phinn, S.R.; Kovacs, E.; Byrne, G. Earth Observation: Data, Processing and Applications. Volume 3B: Applications—Surface Waters; CRCSI: Melbourne, VIC, Australia, 2020. [Google Scholar]
  28. Pahlevan, N.; Schott, J.; Franz, B.; Zibordi, G.; Markham, B.; Bailey, S.; Schaaf, C.; Ondrusek, M.; Greb, S.; Strait, C. Landsat 8 Remote Sensing Reflectance (Rrs) Products: Evaluations, Intercomparisons, and Enhancements. Remote Sens. Environ. 2017, 190, 289–301. [Google Scholar] [CrossRef]
  29. Dekker, A.G.; Brando, V.E.; Anstee, J.M.; Pinnel, N.; Kutser, T.; Hoogenboom, E.J.; Peters, S.; Pasterkamp, R.; Vos, R.; Olbert, C.; et al. Imaging Spectrometry of Water. In Imaging Spectrometry; van der Meer, F.D., De Jong, S.M., Eds.; Springer: Dordrecht, The Netherlands, 2002; pp. 307–359. [Google Scholar] [CrossRef]
  30. Zibordi, G.; Talone, M. On the equivalence of near-surface methods to determine the water-leaving radiance. Opt. Express 2020, 28, 3200–3214. [Google Scholar] [CrossRef]
  31. Brando, V.; Lovell, J.; King, E.; Boadle, D.; Scott, R.; Schroeder, T. The Potential of Autonomous Ship-Borne Hyperspectral Radiometers for the Validation of Ocean Color Radiometry Data. Remote Sens. 2016, 8, 150. [Google Scholar] [CrossRef]
  32. Vermote, E.; Justice, C.; Claverie, M.; Franch, B. Preliminary analysis of the performance of the Landsat 8/OLI land surface reflectance product. Remote Sens. Environ. 2016, 185, 46–56. [Google Scholar] [CrossRef] [PubMed]
  33. Franz, B.A.; Bailey, S.W.; Kuring, N.; Werdell, P.J. Ocean color measurements with the Operational Land Imager on Landsat-8: Implementation and evaluation in SeaDAS. J. Appl. Remote Sens. 2015, 9, 096070. [Google Scholar] [CrossRef]
  34. Barry, P.; Jarecke, P.; Pearlman, J.; Jupp, D.; Lovell, J.; Campbell, S. Radiometric Calibration Validation of the Hyperion Instrument Using Ground Truth at a Site in Lake Frome, Australia. In Proceedings of the International Symposium on Optical Science and Technology, San Diego, CA, USA, 29 July–3 August 2001. [Google Scholar] [CrossRef]
  35. Byrne, G.; Walsh, A.; Thankappan, M.; Broomhall, M.; Hay, E. DEA Analysis Ready Data Phase 1 Validation Project: Data Summary; Geoscience Australia: Canberra, ACT, Australia, 2021. [Google Scholar] [CrossRef]
  36. Walsh, A.; Byrne, G.; Broomhall, M. A Case Study of Measurement Uncertainty in Field Spectroscopy. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2022, 15, 6248–6258. [Google Scholar] [CrossRef]
  37. ASD FieldSpec 4 Standard-Res Spectroradiometer. Available online: https://www.malvernpanalytical.com/en/products/product-range/asd-range/fieldspec-range/fieldspec-4-standard-res-spectroradiometer (accessed on 5 January 2023).
  38. SR3500 Full Range Spectroradiometer. Available online: https://www.spectralevolution.com/products/hardware/compact-lab-spectroradiometers/SR3500/ (accessed on 5 January 2023).
  39. MacArthur, A.; MacLellan, C.; Malthus, T. The Fields of View and Directional Response Functions of Two Field Spectroradiometers. IEEE Trans. Geosci. Remote Sens. 2012, 50, 3892–3907. [Google Scholar] [CrossRef]
  40. Hueni, A.; Bialek, A. Cause, Effect, and Correction of Field Spectroradiometer Interchannel Radiometric Steps. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2017, 10, 1542–1551. [Google Scholar] [CrossRef]
  41. Flame Series General Purpose Spectrometers. Available online: https://www.oceaninsight.com/blog/flame-series-general-purpose-spectrometers/ (accessed on 7 September 2023).
  42. Hueni, A.; Chisholm, L.; Ong, C.; Malthus, T.; Wyatt, M.; Trim, S.; Schaepman, M.; Thankappan, M. The SPECCHIO Spectral Information System. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2020, 13, 5789–5799. [Google Scholar] [CrossRef]
  43. Jupp, D.; Li, F.; Byrne, G.; Ong, C.; Lau, I.; Malthus, T.; Thankappan, M.; Fearns, P. Using Irr_Inv, Modtran and Spectral Irradiance Data to Estimate Atmospheric Parameters. In CSIRO Land and Water Technical Report EP2022-3491; CSIRO: Canberra, ACT, Australia, 2022; 124p. [Google Scholar] [CrossRef]
  44. Roy, D.; Zhang, H.; Ju, J.; Gomez-Dans, J.; Lewis, P.; Schaaf, C.; Sun, Q.; Kovalskyy, V. A general method to normalize Landsat reflectance data to Nadir BRDF-Adjusted Reflectance. Remote Sens. Environ. 2016, 176, 255–271. [Google Scholar] [CrossRef]
  45. Ruddick, K.; Voss, K.; Boss, E.; Castagna, A.; Frouin, R.; Gilerson, A.; Hieronymi, M.; Johnson, B.C.; Kuusk, J.; Lee, Z.; et al. Review of Protocols for Fiducial Reference Measurements of Water-Leaving Radiance for Validation of Satellite Remote-Sensing Data over Water. Remote Sens. 2019, 11, 2198. [Google Scholar] [CrossRef]
  46. Voss, K.; Flora, S. Spectral dependence of the seawater–air radiance transmission coefficient. J. Atmos. Ocean. Technol. 2017, 34, 1203–1205. [Google Scholar] [CrossRef]
  47. Quan, X.; Fry, E.S. Empirical equation for the index of refraction of seawater. Appl. Opt. 1995, 34, 3477–3480. [Google Scholar] [CrossRef]
  48. Shang, Z.; Lee, Z.; Dong, Q.; Wei, J. Self-shading associated with skylight-blocked approach system for the measurement of water-leaving radiance and its correction. Appl. Opt. 2017, 56, 7033–7044. [Google Scholar] [CrossRef]
  49. Jupp, D.; (CSIRO, Canberra, ACT, Australia). Personal communication, 2023.
  50. Aasen, H.; Bolten, A. Multi-Temporal High-Resolution Imaging Spectroscopy with Hyperspectral 2D Imagers—From Theory to Application. Remote Sens. Environ. 2018, 205, 374–389. [Google Scholar] [CrossRef]
  51. Aasen, H.; Honkavaara, E.; Lucieer, A.; Zarco-Tejada, P. Quantitative Remote Sensing at Ultra-High Resolution with UAV Spectroscopy: A Review of Sensor Technology, Measurement Procedures, and Data Correction Workflows. Remote Sens. 2018, 10, 1091. [Google Scholar] [CrossRef]
  52. Turner, D.; Malenovsky, Z.; Lucieer, A.; Turnbull, J.; Robinson, S. Optimizing Spectral and Spatial Resolutions of Unmanned Aerial System Imaging Sensors for Monitoring Antarctic Vegetation. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2019, 12, 3813–3825. [Google Scholar] [CrossRef]
  53. Gautam, D.; Lucieer, A.; Bendig, J.; Malenovsky, Z. Footprint Determination of a Spectroradiometer Mounted on an Unmanned Aircraft System. IEEE Trans. Geosci. Remote Sens. 2020, 58, 3085–3096. [Google Scholar] [CrossRef]
  54. Tmušić, G.; Manfreda, S.; Aasen, H.; James, M.R.; Gonçalves, G.; Ben-Dor, E.; Brook, A.; Polinova, M.; Arranz, J.J.; Mészáros, J.; et al. Current Practices in UAV-Based Environmental Monitoring. Remote Sens. 2020, 12, 1001. [Google Scholar] [CrossRef]
  55. Hakanson, L.; Peters, R. Predictive Limnology—Methods for Predictive Modeling; SPB Academic Publishing: Amsterdam, The Netherlands, 1995; 464p. [Google Scholar]
  56. Botha, E.; Brando, V.E.; Dekker, A. Effects of Per-Pixel Variability on Uncertainties in Bathymetric Retrievals from High-Resolution Satellite Images. Remote Sens. 2016, 8, 459. [Google Scholar] [CrossRef]
  57. Pahlevan, N.; Lee, Z.; Wei, J.; Schaaf, C.B.; Schott, J.R.; Berk, A. On-orbit radiometric characterization of OLI (Landsat-8) for applications in aquatic remote sensing. Remote Sens. Environ. 2014, 154, 272–284. [Google Scholar] [CrossRef]
  58. Kabir, S.; Pahlevan, N.; O’Shea, R.E.; Barnes, B.B. Leveraging Landsat-8/-9 underfly observations to evaluate consistency in reflectance products over aquatic environments. Remote Sens. Environ. 2023, 296, 113755. [Google Scholar] [CrossRef]
Figure 1. Surface reflectance over land, illustrating (a) a horizontal surface, (b) a sloped surface facing the Sun, and (c) a sloped surface facing away from the Sun [12].
Figure 2. Contributing pathways of electromagnetic radiation over water, as seen by remote sensing platforms. Source: Arnold Dekker, CSIRO [26].
Figure 3. Field measurement of hemispherical conical reflectance. (a) A basic geometric relationship between illumination and viewing position [26], (b) reflectance panel measurements, (c) capturing transect radiance data.
Figure 4. Phase 1 site sampling model, modified from Ref. [19].
Figure 5. Panel and surface radiance sampling sequence at Narromine, 17 November 2021.
Figure 6. Spectral sampling positions at Narromine, 17 November 2021.
Figure 7. QA and QC metrics in pre-processing before generating SR: (a) all panel readings; (b) panel readings plotted against the cosine of the solar zenith angle, with the line of best fit in red; (c) direct and diffuse downwelling irradiance measured at the start and end of the site characterisation.
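The panel-versus-cos(solar zenith) regression in Figure 7b doubles as a simple illumination-stability check: under a clear, stable sky, panel radiance should sit close to a straight line in the cosine of the solar zenith angle. The sketch below is illustrative only, using synthetic readings and an arbitrary 1% residual threshold, not the tolerances of the actual validation pipeline:

```python
import numpy as np

# Synthetic panel radiances (arbitrary units) recorded over a site
# characterisation, paired with cos(solar zenith angle) at each reading.
cos_sza = np.array([0.80, 0.82, 0.84, 0.86, 0.88, 0.90])
panel_radiance = np.array([101.0, 103.6, 106.1, 108.5, 111.2, 113.7])

# Line of best fit, as plotted in red in Figures 7b and 22.
slope, intercept = np.polyfit(cos_sza, panel_radiance, 1)
fitted = slope * cos_sza + intercept

# Flag readings departing from the fit by more than 1% (illustrative
# threshold) as possible cloud, shadow, or panel-alignment problems.
relative_residual = np.abs(panel_radiance - fitted) / fitted
suspect = relative_residual > 0.01
print(suspect.any())
```

Large or time-correlated residuals would indicate an unstable atmosphere during the characterisation, in which case the affected readings are excluded before generating surface reflectance.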
Figure 8. UAV and spectrometer package deployed over the Wilcannia Site 2.
Figure 9. Wilcannia Site 2 UAV flight line (blue) and ground survey transects (orange).
Figure 10. Measurement configurations for the TriOS RAMSES sensors used during the underfly campaign: the SBA approach (left), shown in the photo on the right on the day of the overpass. Note that the operator dropped below the E_d sensor height prior to data acquisition. Δφ is the azimuth angle relative to the Sun and θ_v is the nadir-viewing angle.
Figure 11. The SDA measurement configuration for the TriOS RAMSES sensors acquired data above the water surface, at the air–water interface, and at several depths within the water column. Δφ is the azimuth angle relative to the Sun and θ_v is the nadir-viewing angle.
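For reference, the remote sensing reflectance R_rs plotted in Figures 36–39 is, to first order, the ratio of water-leaving radiance L_w to above-surface downwelling irradiance E_d at each wavelength. A minimal sketch with hypothetical band values; the campaign's actual processing also applies corrections such as self-shading and the seawater–air transmission term [46,48]:

```python
import numpy as np

# Hypothetical values for three illustrative bands (wavelengths in nm).
wavelengths = np.array([443.0, 560.0, 665.0])
L_w = np.array([0.012, 0.035, 0.020])  # water-leaving radiance, W m-2 sr-1 nm-1
E_d = np.array([1.10, 1.35, 1.25])     # downwelling irradiance, W m-2 nm-1

# Remote sensing reflectance (units: sr-1).
R_rs = L_w / E_d
for wl, r in zip(wavelengths, R_rs):
    print(f"{wl:.0f} nm: R_rs = {r:.4f} sr-1")
```

Because R_rs is a ratio, small errors in E_d (e.g., from sensor tilt or changing sky conditions) propagate directly into the retrieved reflectance, which is why both radiometers are monitored throughout the acquisition.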
Figure 12. Australian underfly field sites.
Figure 13. Google Earth images of the underfly field sites.
Figure 14. Perth field site.
Figure 15. Perth sampling model.
Figure 16. MICROTOPS readings of aerosol optical thickness over the Perth site during the sampling period.
Figure 17. Perth field ASD reflectances, transect line averages (coloured) at full resolution (left) and then resampled to Landsat 9 bandwidths (right).
Figure 18. Matchup of Perth field data with L8 and L9 DEA ARD and USGS ARD.
Figure 19. Cunnamulla field site.
Figure 20. ECMWF cloud forecast for Wilcannia at the time of the 100% L8/L9 overpass (source: https://www.windy.com, accessed 14 November 2021).
Figure 21. Wilcannia field site 1.
Figure 22. Wilcannia Site 1: first and second site characterisation panel readings vs. solar zenith angle. The line of best fit is shown in red.
Figure 23. Wilcannia Site 1: First and second site characterisation sampling transects.
Figure 24. Wilcannia Site 1: the two site characterisations matched against the DEA Landsat 8 and 9 ARD.
Figure 25. Wilcannia downwelling irradiance readings (diffuse and direct), showing a stable atmosphere for the time of overpass sampling.
Figure 26. Wilcannia Site 1 DEA and USGS ARD matchups.
Figure 27. Wilcannia Site 1 NW corner point.
Figure 28. Wilcannia 2 field site.
Figure 29. Wilcannia Site 2 SR3500 and UAV Flame matchups.
Figure 30. Narromine field site, which was specifically chosen in anticipation of there only being a 15% side lap.
Figure 31. Narromine ASD and SR3500 datasets alongside Landsat 8: geolocation of spectral data points (left), panel readings on the line of best fit (centre), and matchup of field data vs. L8 (right).
Figure 32. Narromine L8 (left) and L9 (right) RGB images.
Figure 33. Narromine DEA ARD and USGS matchups against both the ASD-FR and SR3500 spectrometers.
Figure 34. Lake Hume field site: sample times and conditions at the time of overpass.
Figure 35. The impact of Sun–sensor alignment on sky glint at Lake Hume. Landsat 8 (left) views the lake at nadir, while Landsat 9 (right), descending in the adjacent orbit, is tilted while imaging the lake. The red square in each large image defines the inset zoom window.
Figure 36. Remote sensing reflectance, R_rs, for Lake Hume Site 1: A-ARD and USGS-ARD. Sampled 1 h before the overpasses.
Figure 37. Remote sensing reflectance, R_rs, at the Lake Hume dam wall: A-ARD and USGS-ARD. Site sampled 1–2 h after the Landsat overpasses.
Figure 38. Remote sensing reflectance, R_rs, at Lake Hume Site 6: A-ARD and USGS-ARD.
Figure 39. Remote sensing reflectance, R_rs, at Lake Hume Site 10: A-ARD and USGS-ARD.
Figure 40. Correlations of the Australian field data with L8 and L9 for all sites, for the DEA and USGS ARD SR products.
Figure 41. Correlation between Landsat 8 and Landsat 9 at the Australian terrestrial sites for the DEA and USGS ARD SR products.
Table 1. Underfly validation sites and sensor packages.

Site        | ASD-FR | SR3500 | Flame | Ramses
Perth       | Y      | -      | -     | -
Wilcannia 1 | Y      | -      | -     | -
Wilcannia 2 | -      | Y      | Y     | -
Cunnamulla  | Y      | -      | -     | -
Narromine   | Y      | Y      | -     | -
Lake Hume   | -      | -      | -     | Y
Table 2. Underfly overpass times for Lake Hume.

Satellite | Path | Row | UTC   | AEDT
Landsat 8 | 92   | 85  | 00:03 | 11:03
Landsat 9 | 91   | 85  | 23:56 | 10:56
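The UTC and AEDT columns in Tables 2, 3, and 5 differ by the UTC+11 daylight-saving offset in force during the November campaign. A small sketch of the conversion, assuming the IANA zone Australia/Sydney applies at Lake Hume:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Landsat 8 overpass time for path 92, row 85, given in UTC (Table 2).
utc_overpass = datetime(2021, 11, 16, 0, 3, tzinfo=timezone.utc)

# Lake Hume observes AEDT (UTC+11) in November; zoneinfo resolves the
# daylight-saving offset from the IANA time zone database.
local = utc_overpass.astimezone(ZoneInfo("Australia/Sydney"))
print(local.strftime("%H:%M"))  # 11:03
```

Using the zone database rather than a hard-coded +11 h offset keeps the conversion correct for acquisitions outside the daylight-saving period (AEST, UTC+10).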
Table 3. Lake Hume sample measurement methods, locations, and times. Note: Site 6 and the dam wall were sampled the day before the overpass; Hume Site 10 was sampled at the overpass time.

Site ID         | Method      | Latitude | Longitude | Time (UTC)       | Time (AEDT)
06-21-11-1      | SDA         | -36.2056 | 147.0798  | 16-11-2021 02:45 | 16-11-2021 13:45
DAMWALL-21-11-1 | SDA         | -36.1186 | 147.0338  | 16-11-2021 04:33 | 16-11-2021 15:33
01-21-11-1      | SDA         | -36.0984 | 147.052   | 16-11-2021 21:20 | 17-11-2021 08:20
10-21-11-2      | SBM         | -36.0132 | 147.0925  | 16-11-2021 23:52 | 17-11-2021 10:52
10-21-11-2      | L_sky, L_w  | -36.0132 | 147.0925  | 17-11-2021 00:04 | 17-11-2021 11:04
10-21-11-2      | SDA         | -36.0132 | 147.0925  | 16-11-2021 23:55 | 17-11-2021 10:55
10-21-11-1      | SDA         | -36.0132 | 147.0925  | 17-11-2021 00:19 | 17-11-2021 11:19
Table 4. Perth overpasses.

Satellite | Path | Row | Time (UTC) | AWST
Landsat 8 | 112  | 082 | 02:05:49   | 10:05
Landsat 9 | 113  | 082 | 02:11:21   | 10:11
Table 5. Wilcannia overpasses.

Satellite | Path | Row | Time (UTC) | AEDT
Landsat 8 | 094  | 082 | 00:14:34   | 11:14:34
Landsat 9 | 094  | 082 | 00:13:57   | 11:13:57
Table 6. Summary of field validation sites.

Site       | % Overlap | Field Validations
Perth      | 15        | 1
Wilcannia  | 100       | 2
Cunnamulla | 100       | 1
Narromine  | 100       | 2
Lake Hume  | 100       | 1 (six sites)
Total      |           | 7
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Byrne, G.; Broomhall, M.; Walsh, A.J.; Thankappan, M.; Hay, E.; Li, F.; McAtee, B.; Garcia, R.; Anstee, J.; Kerrisk, G.; et al. Validating Digital Earth Australia NBART for the Landsat 9 Underfly of Landsat 8. Remote Sens. 2024, 16, 1233. https://doi.org/10.3390/rs16071233

