Technical Note

Direct Georeferencing of a Pushbroom, Lightweight Hyperspectral System for Mini-UAV Applications

1 Laboratoire Géosciences Océan—UMR 6538, Université de Bretagne Occidentale, IUEM, Technopôle Brest-Iroise, Rue Dumont d’Urville, F-29280 Plouzané, France
2 CEREMA, Direction Eau Mer et Fleuves, 134 Rue de Beauvais, F-60280 Margny-lès-Compiègne, France
3 Laboratoire de Géologie de Lyon, Terre, Planètes, Environnement—UMR 5276, Université de Lyon, Université Claude Bernard Lyon 1, ENS Lyon, CNRS, F-69622 Villeurbanne, France
4 Geodetic Engineering Laboratory (TOPO), École Polytechnique Fédérale de Lausanne (EPFL), Bâtiment GC Station 18, CH-1015 Lausanne, Switzerland
* Author to whom correspondence should be addressed.
Remote Sens. 2018, 10(2), 204; https://doi.org/10.3390/rs10020204
Submission received: 4 December 2017 / Revised: 24 January 2018 / Accepted: 27 January 2018 / Published: 30 January 2018
(This article belongs to the Special Issue Remote Sensing from Unmanned Aerial Vehicles (UAVs))

Abstract:
Hyperspectral imagery has proven its potential in many research applications, especially in the field of environmental sciences. Currently, hyperspectral imaging is generally performed by satellite or aircraft platforms, but mini-UAV (Unmanned Aerial Vehicle) platforms (<20 kg) are now emerging. On such platforms, payload restrictions are critical, so sensors must be selected according to stringent specifications. This article presents the integration of a light pushbroom hyperspectral sensor onboard a multirotor UAV, which we have called Hyper-DRELIO (Hyperspectral DRone for Environmental and LIttoral Observations). This article depicts the system design: the UAV platform, the imaging module, the navigation module, and the interfacing between the different elements. Pushbroom sensors offer a better combination of spatial and spectral resolution than full-frame cameras. Nevertheless, data georectification has to be performed line by line, the quality of direct georeferencing depending on mechanical stability, timing accuracy, and the resolution and accuracy of the proprioceptive sensors. A georegistration procedure is proposed for the geometrical pre-processing of hyperspectral data. The specifications of Hyper-DRELIO surveys are described through two examples of surveys above coastal or inland waters, with different flight altitudes. This system can collect hyperspectral data in the VNIR (Visible and Near InfraRed) domain above small study sites (up to about 4 ha) with both high spatial resolution (<10 cm) and high spectral resolution (1.85 nm), and with a georectification accuracy on the order of 1 to 2 m.

Graphical Abstract

1. Introduction

Hyperspectral imagery offers a great potential for environmental research applications, particularly on topics linked with the critical zone [1]:
-
bare continental surfaces: soil quality, geophysical and geochemical surface properties [2,3,4];
-
vegetation: species distribution, ecosystems monitoring [5,6];
-
coastal and inland waters: bathymetry, water quality, benthic habitats classification [7,8,9,10].
Currently, hyperspectral imaging is mainly performed by satellite or aircraft platforms. There is, however, a major gap between the fine-resolution, small-scale observations resulting from field surveys and the coarse-resolution, large-scale information from satellite or aerial images [11]. Satellite imagery provides large-scale datasets with long time series, but with constraints on revisit time, cloud coverage, spatial resolution, and Signal-to-Noise Ratio (SNR) values, in that users cannot tune the acquisition parameters to fit the weather and the requirements of the site to be imaged. Aerial surveys by plane, on the other hand, are operated on demand but are expensive and require a considerable logistical effort, which is exacerbated by the dependency on meteorological conditions.
Development of Unmanned Aerial Vehicle (UAV) platforms during the last decade has enriched the range of opportunities to carry out remote sensing surveys, offering greater flexibility in survey planning and, by flying at lower altitude, enabling the collection of high spatial resolution data. UAVs can be divided into three categories. The first comprises helicopters, mainly equipped with a thermal engine and generally designed to support larger and heavier (>5 kg) payloads. This category includes very heavy-duty platforms, up to 100 kg. As an example, in a study by Gallay et al. [12], a helicopter weighing 47 kg was designed to support a payload of up to 30 kg, including an airborne laser scanner or a hyperspectral scanning camera. A hyperspectral sensor array mounted onto an unmanned radio-controlled helicopter accepting a payload of 30 kg is presented by Kosugi et al. [13]. A helicopter mini-UAV able to carry a payload of about 7 kg is proposed by Jaakkola et al. [14]. Helicopters can provide a higher scanning homogeneity than multi-rotor aircraft for data collection in continuous forward flight [12], being less sensitive to wind gusts or side winds. However, they require specially trained operators and generate high-frequency vibrations.
Another category comprises fixed-wing platforms. A 5 m wingspan, fixed-wing platform capable of carrying a 3 kg payload is proposed in a paper by Zarco-Tejada et al. [15]. A pushbroom hyperspectral sensor on a fixed-wing UAV designed to carry payloads up to 8 kg is tested in a study by Hruska et al. [16]. Fixed-wing UAVs are very stable during flight, but they require a large area suitable for take-off.
Finally, electric multi-rotor mini-UAVs (<20 kg) have experienced strong development over the last few years. These platforms are relatively low-cost and easy to pilot. To address the limited payload constraint of these mini-UAVs, lighter sensors are being developed. Importantly, the limited payload of mini-UAVs has obvious consequences on the type of imaging and positioning sensors that can be integrated into the onboard electronics package, since miniaturization usually comes at the expense of performance. In certain cases, data exploitation may be limited by insufficient positioning accuracy, especially with high spatial resolution data. Efficient direct-georeferencing capability is therefore critical. With respect to this lighter-sensors approach, a miniature image-frame sensor system composed of a hyperspectral sensor, GPS sensor, data logger, LCD display, control switches, and a power supply, weighing 400 g in total, was developed by Uto et al. [17]. Nevertheless, achieving high ground resolution with this system implies flying at low altitude, and therefore limited ground coverage, since its ground resolution is about 2 m at the UAV nadir from an altitude of 10 m. An octocopter UAV was outfitted with light hyperspectral full-frame cameras in a study by Bareth et al. [18]. Two camera models were tested. Nevertheless, because of binning (i.e., combining a cluster of spatial or spectral data into a single pixel) [19], it is necessary to choose between high spectral resolution with low spatial resolution, or high spatial resolution with a limited number of spectral bands. Furthermore, mini-UAV platforms are very light and thus generate more high-frequency vibrations and faster trajectory changes than larger platforms such as planes [16]. UAVs thus require very fast and accurate proprioceptive sensors in order to log their attitude correctly.
In this article, we present the Hyper-DRELIO system (Hyperspectral DRone for Environmental and LIttoral Observations) developed in the Laboratoire Géosciences Océan (Brest, France), in association with the Laboratoire de Géologie de Lyon (Lyon, France) and the Geodetic Engineering Laboratory (EPFL, Lausanne, Switzerland). Hyper-DRELIO is meant to be used for surveys of the coastal zone or over inland water bodies. Because of typical wind conditions, this context is generally associated with challenging constraints for UAV flight stability and safety for the platform and payload. Furthermore, these environments also raise specific questions in terms of radiometric measurements due to sunglint and sea spray aerosols, as well as for geo-referencing approaches relying on Ground Control Points (GCPs).
The system also needs to meet the following technical specifications:
-
high spectral resolution and broad spectral coverage to enable addressing a large variety of topics and questions, such as deriving water depth, water properties and benthic habitat by optimization [9,10,20], coral reef monitoring [20], or biofilm characterization [8,21];
-
high spatial resolution (<10 cm) to study structures with complex geometry, such as coral reefs [20], or small features, such as the holes of mangrove crabs [22].
Hyper-DRELIO therefore attempts to find certain trade-offs between vehicle mass and payload, ground coverage, spatial resolution, and spectral resolution. This system is composed of an octocopter UAV platform, equipped with commercially-available sensors: a light pushbroom scanning hyperspectral sensor, a light RGB camera, and a Global Navigation Satellite System—Inertial Navigation System (GNSS-INS).
This article describes the design of Hyper-DRELIO and presents two case studies, over a coastal site and a pond, to illustrate how the pushbroom hyperspectral data are geometrically corrected by direct georeferencing. The purpose of this system is to provide bottom-of-atmosphere reflectance measurements; however, the analysis of the radiometric signal recorded with the hyperspectral camera is beyond the scope of this paper, which focuses on geometrical corrections. In practice, considering the small size of the survey area, the short duration of a survey, and the fact that the presence of operators on site allows the collection of in situ spectroradiometric measurements for standardization, the approach for processing the radiometric signal will be quite different from that used for satellite or airplane surveys.

2. Materials and Methods

2.1. UAV Platform

The Hyper-DRELIO UAV platform (Figure 1) is an HALO_8 octocopter, a prototype designed by DroneSys©. This electric UAV has a diameter of 1.2 m, weighs 13.4 kg, and can handle a payload of 5 kg. Two TATTU® 6S1P Lithium-Polymer (LiPo) batteries, each of which weighs 1.92 kg, supply power to the eight rotors, allowing 12 min of autonomy. However, taking into account the duration of ascent and descent phases and a safety cushion, the duration of the programmed flight plan must not exceed 7–8 min. As impedance fluctuations of the rotors could damage certain components of the sensors, a Kypom® 3S LiPo battery is specifically dedicated to the power supply of the whole onboard package: electronics and sensors, with 28 min of autonomy.
The onboard flight control system is composed of a GNSS and an autopilot. Ground station software is used to control the UAV flight parameters during the survey. The flight control is run by DJI® iOSD® software. Although Hyper-DRELIO is able to perform an autonomous flight, take-off and landing are manually controlled. To help prevent severe damage in case of technical problems, the UAV is equipped with a parachute rescue system, released from the ground station, meant to ensure a softer landing.
The imaging sensors and the Inertial Motion Unit (IMU) are mounted together on the same support and placed in a waterproof protective case (Figure 1). This sensor case is mounted on a two-axis gyrostabilized gimbal.

2.2. Imaging Module

The imaging module (Table 1) is composed of a hyperspectral pushbroom camera and an auxiliary color RGB camera. The MicroHyperspec® hyperspectral camera (Headwall®) has a VNIR (Visible and Near InfraRed: 380–1000 nm) imaging sensor parametrized to collect 250 spectral bands, with a spectral resolution of 1.85 nm. The lens has a wide Field Of View (FOV = 48.7°) and a focal length of 8.2 mm. The small and lightweight (770 g) MicroHyperspec® is based on the “pushbroom” technology, which relies on the displacement of the UAV to cover the study area. The array consists of 1000 pixels in cross-track direction.
The onboard data acquisition system is based on the HyperBox developed by the Geodetic Engineering Laboratory [23]. This system enables lines to be stacked in an along-track direction during the survey, with a new image recorded every 1000 lines. An image is thus a 1000 × 1000 × 250 matrix, where each line will require a georegistration.
The HyperBox also enables the gain of the MicroHyperspec® camera to be varied, i.e., the multiplicative factor linearly amplifying the digital signal of ground radiance on a wavelength-per-wavelength basis. Varying from 2 to 8, the gain will be typically set to 6 or 7 in cloudy conditions, whereas it will be set to 5 or 6 in bright sunny conditions.
The acquisition frame rate is adjustable and defined according to the flight parameters. Considering a UAV speed of around 3–4 m/s, close to the lower bound for platform stability (typical flight characteristics are given in Section 3.1), the frame rate is generally set to 50 Hz, so as to optimize ground coverage while keeping the pixel size small and limiting along-track pixel elongation.
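The interplay between altitude, FOV, flight speed, and frame rate can be made concrete with a short calculation. The sketch below (Python, used here for illustration although the paper's processing chain is in MATLAB) derives the swath and ground sampling distances from the sensor parameters given in this section; the resulting figures match those quoted in Section 3.1.

```python
import math

def pushbroom_footprint(altitude_m, fov_deg=48.7, n_pixels=1000,
                        speed_ms=4.0, frame_rate_hz=50.0):
    """Swath width and ground sampling distances of a nadir pushbroom line."""
    swath = 2.0 * altitude_m * math.tan(math.radians(fov_deg / 2.0))
    gsd_cross = swath / n_pixels            # cross-track pixel size (m)
    gsd_along = speed_ms / frame_rate_hz    # along-track line spacing (m)
    return swath, gsd_cross, gsd_along

print(pushbroom_footprint(100.0))               # ~90.5 m swath, ~9 cm, 8 cm
print(pushbroom_footprint(50.0, speed_ms=3.0))  # ~45.3 m swath, ~4.5 cm, 6 cm
```

Keeping the along-track spacing (speed/frame rate) close to the cross-track pixel size is what limits along-track pixel elongation.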
The color camera of Hyper-DRELIO is a light iDS© uEye (UI-3250-C-HQ) RGB camera with a global shutter, a resolution of 1.92 MPix, and a focal length of 25 mm. Given the focal length and the sensor size, the ground coverage of the RGB camera is around 28 × 22 m for a flying altitude of 100 m, which is smaller than the coverage of the hyperspectral data. The RGB camera is parametrized to collect 10 photos per second. This rate is selected in order to limit surface changes between successive images and to maximize image overlap, which limits distortion in surface reconstruction [24].
During the flight, the RGB photographs enable the operator to have real-time control of the area being imaged. After the flight, these RGB photographs may provide a more user-friendly view of the study area than hyperspectral imagery. Above water, these photos can provide some information about surface roughness. Above emerged area, collecting RGB photographs allows an orthophoto and a Digital Elevation Model (DEM) to be computed, which may be useful for data processing as described in Section 3.
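The ~28 × 22 m footprint quoted above can be checked with the pinhole-camera scale relation. The sensor dimensions below (7.2 × 5.4 mm, i.e., 1600 × 1200 pixels at a 4.5 µm pitch) are an assumption typical for a 1.92 MPix global-shutter sensor of this class, not a value stated in the text.

```python
def frame_footprint(altitude_m, focal_mm, sensor_w_mm, sensor_h_mm):
    """Ground footprint of a nadir frame camera (simple pinhole model)."""
    scale = altitude_m / (focal_mm / 1000.0)   # ground metres per sensor metre
    return scale * sensor_w_mm / 1000.0, scale * sensor_h_mm / 1000.0

w, h = frame_footprint(100.0, 25.0, 7.2, 5.4)
print(w, h)   # ~28.8 m x 21.6 m, consistent with the ~28 x 22 m quoted above
```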

2.3. Proprioceptive Sensors: Navigation Module

Proprioceptive sensors record the internal state of the platform, including the navigation parameters in position and attitude. The quality of the geometrical rectification of the hyperspectral data, which uses the navigation parameters, depends on the accuracy of these sensors.
The Hyper-DRELIO is equipped with an Ekinox-D (SBG System®) Inertial Motion Unit (IMU) with 3-axis gyroscopes, 3-axis accelerometers, and 3-axis magnetometers. The position is recorded using a dual-frequency, dual-antenna RTK (Real Time Kinematics) GNSS receiver. GNSS/IMU data fusion uses a multisensor extended Kalman filter [25] proposed by SBG System® for helicopters.
This IMU has been selected to achieve a trade-off between its light weight (600 g, Table 1) and small size (10 × 8.6 × 7.5 cm) and its accuracy (0.05° for roll and pitch). The heading measurement from the dual GNSS antenna is used in preference to the magnetometer data, as the former is more accurate than the latter. Because of technical constraints related to the span and design of the UAV, the dual GNSS antenna has an 85 cm baseline, which provides a heading estimation with an accuracy of about 0.22°, against about 1° with the magnetometers. Moreover, this system provides “true heading”, i.e., the direction where the platform ‘nose’ is pointing with respect to true north on the horizontal plane, both while the UAV is stationary and in motion. Indeed, for such platforms, depending on wind conditions, the heading may differ from the route (the direction along which the platform is moving). Navigation data are recorded at a frequency of 200 Hz.
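The heading accuracy of a dual-antenna GNSS compass scales inversely with the baseline: the heading error is roughly the relative positioning error between the antennas divided by the baseline. The sketch below reproduces the 0.22° figure, assuming a relative positioning accuracy of about 3.3 mm between the antennas (an illustrative value chosen to match the quoted numbers, not stated in the text).

```python
import math

def heading_accuracy_deg(baseline_m, rel_pos_err_m=0.0033):
    """Approximate 1-sigma heading error of a dual-antenna GNSS compass.

    rel_pos_err_m is the assumed relative positioning accuracy between
    the two antennas (~3.3 mm here, an illustrative value).
    """
    return math.degrees(math.atan2(rel_pos_err_m, baseline_m))

print(heading_accuracy_deg(0.85))  # ~0.22 deg for the 85 cm baseline
print(heading_accuracy_deg(2.0))   # ~0.09 deg, i.e., the ~0.1 deg for a 2 m baseline
```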
The boresight is the position and angular offsets between the IMU, the GNSS antennas, and imaging sensors. It is estimated from external measurements and is assumed to remain constant during a flight and from one survey to another, so there is no need to estimate it again before each flight. Due to constraints related to the design of the drone, the GNSS antennas are not directly attached on the same gimbal as the IMU and imaging sensors, but on the main frame of the UAV (Figure 1). However, except for vibrations effects (considered negligible), parts alignment upon reassembling the drone remains unchanged.

2.4. Data Merging

Multi-sensor data are recorded by an Intel® NUC (Next Unit of Computing), a small box-shaped computer equipped with an Intel® Core™ i5 quad-core processor and fitted with a 220 GB Solid State Disk and 8 GB of RAM. The HyperBox consists of drivers written to handle both cameras and the GPS + IMU [23] (Section 2). Pulses are sent by the imaging module to the navigation module upon the start of an image exposure. The synchronization is based on CPU (Central Processing Unit) timestamps, the acquisition drivers matching navigation lines and image lines (Figure 2). The CPU clock’s drift is lower than 0.09 s/h. For each cross-track hyperspectral line collected, the line number, the CPU time, and the UAV position and attitude are written in the header file of the hyperspectral raw image. Acquisition software, coded in HTML, JavaScript, and C++, has been developed, allowing the operator on the ground to communicate with the navigation and imaging modules via a web interface and a Wi-Fi link, in particular to check the GNSS/IMU status and to adjust the parameters of the hyperspectral and RGB cameras as needed.
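Because the navigation stream (200 Hz) is denser than the line clock (typically 50 Hz), matching a navigation record to each image line amounts to interpolating the navigation parameters at the CPU timestamp of each line. A minimal sketch of that interpolation step (the function name is hypothetical, and heading angles would need unwrapping before interpolation to avoid jumps across 0°/360°):

```python
import numpy as np

def nav_for_lines(nav_t, nav_vals, line_t):
    """Interpolate a navigation parameter, sampled at nav_t (200 Hz),
    to the CPU timestamp of each hyperspectral line (line_t)."""
    return np.interp(line_t, nav_t, nav_vals)

# 200 Hz navigation stream vs. 50 Hz line clock over one second
nav_t  = np.arange(0.0, 1.0, 1 / 200)
roll   = np.sin(2 * np.pi * nav_t)        # synthetic roll signal (radians)
line_t = np.arange(0.0, 1.0, 1 / 50)
roll_per_line = nav_for_lines(nav_t, roll, line_t)
print(roll_per_line.shape)                # one roll value per line
```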

3. Field Operations and Data Processing

3.1. Practical Considerations for the Survey

Three operators are required to manage the following tasks: (i) piloting, (ii) real-time checking of the UAV parameters: altitude, speed, voltage, and position in the flight plan, and (iii) parametrization of the imaging sensors and real-time control of area being imaged.
While the flight autonomy is 12 min, the flight plan is designed by taking into account the combined duration of ascent and descent phases—about 2 min for a flight height of 50 m and 3 min for a flight height of 100 m—the effects of wind conditions on the flight plan duration, the maneuvers to pass the waypoints, and a 2 min safety cushion to ensure a safe flight. Thus, the duration of the programmed flight plan cannot exceed 8 min at 50 m, or 7 min at 100 m, which corresponds to a coverage of about 5 to 10 ha for a flight plan consisting of two parallel lines, out and back.
The French legislation limits UAV flight height to 150 m above ground level. Typically, Hyper-DRELIO performs surveys:
  • at 100 m of altitude, with an imaging swath of 90 m and a cross-track ground resolution of about 9 cm (Figure 3). With a speed of about 4 m/s, the along-track ground sampling is about 8 cm. For 6 min of effective flight time, about 13 ha are covered.
  • at 50 m of altitude, with an imaging swath of 45 m and a cross-track ground resolution of about 4.5 cm. With a speed of about 3 m/s, the along-track ground sampling is about 6 cm. For 6 min of effective flight time, about 4.8 ha are covered.
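The coverage figures in the two configurations above follow directly from swath, speed, and effective flight time, and the flight-line spacing follows from the swath and the desired lateral overlap (33% in the surveys of Section 4.1). A minimal sketch, in Python for illustration:

```python
def survey_plan(swath_m, speed_ms, eff_time_s=360.0, overlap=1 / 3):
    """Single-pass coverage (ha) and flight-line spacing (m) for a survey."""
    area_ha = swath_m * speed_ms * eff_time_s / 1e4   # swath x distance flown
    spacing = swath_m * (1.0 - overlap)               # spacing for lateral overlap
    return area_ha, spacing

print(survey_plan(90.0, 4.0))   # 100 m altitude: ~13 ha, 60 m line spacing
print(survey_plan(45.0, 3.0))   # 50 m altitude:  ~4.9 ha, 30 m line spacing
```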
To limit the along-track ground sampling distance, the flight speed is reduced within certain limits so as to ensure UAV stability. The stability of the platform also depends on the weather conditions. Wind conditions over 8 m/s or conditions with strong gusts of wind should be avoided. Head wind is preferred to side wind, which takes the UAV away from the forecast flight plan and so provokes in-flight trajectory readjustment. Therefore, the flight plan may have to be adapted to suit not only the survey objectives, whether for scientific or operational purposes, but also the weather conditions.
The flight path is defined by waypoints. Passing the waypoints implies dedicated maneuvers for the UAV, such as in-flight readjustment of trajectory and velocity and change in direction, which tend to be time-consuming. The number and position of waypoints are therefore set so as to optimize the survey time. Furthermore, some parts of the flight (the ascent, the change of direction at the waypoints, the descent) are unusable in hyperspectral images reconstruction. These maneuvers imply rapid variations in UAV attitude, which are not compatible with pushbroom acquisition since they create gaps and discontinuities between cross-track lines.
Moreover, even for the exploitable portions of the flight (i.e., straight lines at steady altitude), the UAV is likely to pitch or roll by several degrees, inducing shifts between successive hyperspectral lines. To avoid data gaps due to rolling, the flight plan may be defined so as to have an overlap between cross-track lines. Examples of flight lines are given in Section 4.1.

3.2. SfM Photogrammetric Processing

When the flight is performed above solid ground, the photographs collected by the RGB camera are processed using the photogrammetric software Agisoft® PhotoScan Professional (version 1.2.6, Agisoft®, St Petersburg, Russia). The procedure for the 3D surface reconstruction is based on the Structure from Motion (SfM) workflow for multiview stereophotogrammetry [26,27]. The DSM (Digital Surface Model) of the area is useful in the hyperspectral data georectification workflow to determine the exact height of the UAV above the ground and therefore to correct the hyperspectral data for relief effects.
Simultaneously acquiring RGB photographs from the hyperspectral imaging platform provides a DSM with higher spatial resolution than the existing DEMs available in national/global databases. Moreover, pre-existing DEMs are generally bare-earth representations, while vegetation, buildings, and ephemeral features are captured in the hyperspectral data and need to be accounted for in the image corrections. The DSM computed from RGB images has the advantage of depicting the current surface height.

3.3. Geometrical Pre-Processing of Hyperspectral Data

As mentioned above, this article focuses on hyperspectral data georectification. Pre-processing is carried out image by image, with a hyperspectral image consisting of a 1000 pixels × 1000 pixels array of 250-band spectrograms (see Section 2.2 and Figure 4b). Originally, each pixel of a given hyperspectral line is positioned in a sensor coordinate system, with the x, y, z axes being, respectively, the roll, pitch, and yaw axes. The georegistration process is performed line by line. This first step combines the following inputs (Figure 4):
  • geometrical acquisition settings: focal length, pixel size, sensor length, mounting offsets;
  • the absolute position (X, Y, Z) and orientation (roll, pitch, heading) of the hyperspectral camera, recorded in the image header;
  • the DSM of the study area to take into account the true height of the UAV above the terrain.
The absolute position of each imaged pixel is then computed into the selected reference coordinate system. Resampling by linear interpolation is finally applied to the hyperspectral images in order to obtain data on a regular grid (Figure 4c), which later may be exported and further processed using other software. Here, this processing chain has been developed with MATLAB® software. Typical hyperspectral imaging products can be generated from the georectified hyperspectral lines: spectra, hyperspectral cubes, etc.
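The line-by-line projection step can be illustrated with a simplified flat-ground version of the workflow (the actual processing chain, developed in MATLAB, intersects each ray with the DSM rather than a constant-elevation plane). In the Python sketch below, the angle conventions and rotation order are one plausible choice for a nadir-looking line scanner; the actual signs depend on the sensor mounting and are an assumption here.

```python
import numpy as np

def georeference_line(cam_xyz, roll, pitch, heading, ground_z=0.0,
                      n_pix=1000, fov_deg=48.7):
    """Project one pushbroom line onto a flat ground plane z = ground_z.

    cam_xyz: camera position (E, N, Up); angles in radians.
    Simplified sketch: the real workflow intersects rays with the DSM.
    """
    # Per-pixel view angles across the swath (camera frame: x cross-track,
    # y along-track, z up; rays point downwards)
    theta = np.linspace(-1.0, 1.0, n_pix) * np.radians(fov_deg) / 2.0
    rays_cam = np.stack([np.tan(theta), np.zeros(n_pix), -np.ones(n_pix)])

    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    ch, sh = np.cos(heading), np.sin(heading)
    R_roll  = np.array([[cr, 0, sr], [0, 1, 0], [-sr, 0, cr]])  # about along-track
    R_pitch = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])  # about cross-track
    R_head  = np.array([[ch, -sh, 0], [sh, ch, 0], [0, 0, 1]])  # about vertical
    rays = R_head @ R_pitch @ R_roll @ rays_cam                 # rays in map frame

    # Scale each ray so that it reaches the ground plane z = ground_z
    t = (ground_z - cam_xyz[2]) / rays[2]
    return cam_xyz[:, None] + t * rays                          # 3 x n_pix array

xyz = georeference_line(np.array([0.0, 0.0, 100.0]), 0.0, 0.0, 0.0)
# level flight at 100 m: swath extremes near +/-45 m cross-track
```

With the attitude angles set to zero, the edge pixels land about 45 m either side of nadir, matching the 90 m swath at 100 m of altitude.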

4. Results

4.1. Study Sites and Surveys

In this study, we present data collected at two sites (Figure 5a): (i) a coastal location, Porsmilin Beach (Brittany, France—Figure 5b) and (ii) a small inland water body, Lannénec Pond (Brittany, France—Figure 5c). For these surveys, the reference coordinate system is RGF93—Lambert 93.
The settings of both surveys are summarized in Table 2. At Porsmilin Beach, the weather was very sunny, without wind. The survey was performed along two flight lines at 50 m of altitude with the typical settings previously mentioned in Section 3.1. The flight plan is depicted in Figure 6a. The spacing between flight lines was set to 30 m in order to have 33% of overlap. Due to sunshine conditions, the gain of the hyperspectral camera was set to 5. The acquisition rate was 50 Hz. For a total flight duration of 6 min 15 s, 750 m of flight lines (more than 10,000 hyperspectral lines) are exploitable, covering about 2.8 ha with a spatial resolution of about 4.5 cm and representing over 5 Gb of data.
At Lannénec Pond, the weather was cloudy with sunny spells and slightly windy. Given the constraints for take-off and landing in some parts of the study area, the flight plan (Figure 6b) crossed the pond and then consisted of two flight lines at 100 m of altitude with the typical settings previously mentioned in Section 3.1. The spacing between flight lines was set to 60 m in order to have 33% of overlap. As the illumination was less intense than at Porsmilin, the gain of the hyperspectral camera was set to 6. The acquisition rate was 50 Hz. For a total flight duration of 5′40″, about 500 m of flight lines are exploitable (more than 8000 hyperspectral lines), covering about 3.7 ha with a spatial resolution of about 9 cm and representing more than 4 Gb of data.

4.2. Results of Hyperspectral Lines Georegistration

Data are processed as described in Section 3.3. As photogrammetry is ineffective above water surfaces, a DSM is only used above land surfaces, water surfaces being considered as flat surfaces with constant elevation. The results of hyperspectral lines georectification are depicted in Figure 7. To make data viewing easier, hyperspectral lines are represented as a 3-band (RGB) composite image. For georegistration assessment purposes, as it is not possible to place control points over the water, the spatial accuracy of georegistration is assessed using natural, stable elements (e.g., the shoreline of the pond, features of the underwater landscape) that are easily identifiable both in the georectified hyperspectral imagery and in orthophotographs from global databases. For this study, we use the IGN© BDortho® (French National Institute of Geographic and Forest Information—BDortho® 2013), which has a spatial resolution of 20 cm for Porsmilin Beach and 50 cm for Lannénec Pond. At Porsmilin Beach, right before the Hyper-DRELIO flight, another UAV platform performed a photogrammetric flight over the beach. From this survey, an orthoimage with 5 cm accuracy has been computed and is therefore also available to identify control points on hyperspectral data. Indeed, this quasi-simultaneous orthoimage makes it possible to use non-permanent features (e.g., pebbles, algae, etc.) as ground control points to estimate georegistration accuracy.
With seven control points (Figure 7a) in Porsmilin Beach, the horizontal Root Mean Square Error (RMSE) of hyperspectral lines georectification is about 1.4 m. The horizontal RMSE computed on five control points (Figure 7b) in Lannénec Pond is about 2.3 m. These values are not given as specifications but rather as an order of magnitude of the error for flights of different altitudes. The georectification RMSE is indeed very dependent on survey conditions, particularly wind gusts or side wind, provoking movements beyond the measurement and correction abilities of the IMU. The RMSE is therefore likely to vary from one site to another, from one day to another, and even over the duration of a survey. The dependence on flight conditions also implies that the georectification RMSE is heterogeneous at the scale of a study area. Therefore, a classical procedure of accuracy assessment with a high density of GCPs would not yield a reliable result. Multiplying surveys will allow the RMSE value to be statistically refined for various flight conditions. As an example, for another flight at 100 m of altitude, which has been performed above Saint Aignan pond (Brittany, France), the georectification RMSE computed on nine control points is about 1.9 m.
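The horizontal RMSE reported above is the root mean square of the horizontal distances between each control point in the georectified imagery and its reference position. A minimal sketch with synthetic coordinates (the values below are illustrative, not the survey data):

```python
import numpy as np

def horizontal_rmse(measured_xy, reference_xy):
    """Horizontal RMSE between georectified and reference control points."""
    d2 = np.sum((np.asarray(measured_xy) - np.asarray(reference_xy)) ** 2, axis=1)
    return np.sqrt(d2.mean())

# Toy example: a uniform 1.4 m easting offset yields a 1.4 m RMSE
ref = np.array([[0.0, 0.0], [10.0, 5.0], [20.0, -3.0]])
mes = ref + np.array([1.4, 0.0])
print(horizontal_rmse(mes, ref))   # 1.4
```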
Figure 8 presents a sample of each georectified dataset as hyperspectral image cubes, i.e., with the third dimension of the cube depicting the radiometric intensity as a function of the wavelength. Some isolated lines appear misaligned when the navigation system fails to register the real movement of the imaging system, creating artifacts on the reconstructed hyperspectral image.

5. Discussion

UAV platforms equipped with hyperspectral sensors, such as Hyper-DRELIO, provide both high spatial and high spectral resolution, a combination for which full-frame sensor solutions, multispectral sensors, or lower spatial resolution systems are less well suited. UAV platforms also provide flexibility regarding the time of survey. As an example, the physical properties of the water bodies being studied may evolve rapidly and require several surveys in the same day. Moreover, Hyper-DRELIO enables the collection of hyperspectral data above small-sized areas at a reasonable cost with respect to airplane surveys. Nevertheless, considering the number of sensors, the protocol to prepare the survey remains quite complex, and a survey requires three operators. Uto et al. [17] also point out that flying at low altitude, beneath the clouds, reduces the impact of cloud cover and of atmospheric effects on the collected data. This point appears as a real advantage in areas where the weather is often cloudy. Moreover, the flexibility of UAV platforms allows multi-angular surveys to compensate for bidirectional reflectance distribution function variations. For example, since the angle of view influences vegetation indices, Burkart et al. [28] consider angular effects by programming complex flight paths over vegetated environments. They note that with multi-angular acquisitions, the survey is more dependent on the position of the sun and may therefore require repeated acquisitions at different hours of the day. In the coastal domain, multi-angular surveys may help to characterize more accurately complex underwater structures such as coral reefs. In the context of the development of satellite hyperspectral sensors with higher spatial and spectral resolution and shorter revisit times, UAVs also appear as a practical platform to test and identify the best acquisition parameters for future satellite missions.
The georegistration performance of Hyper-DRELIO compares well with other existing systems described in the literature. For instance, Hruska et al. [16] achieved a planimetric RMSE of 4.63 m for direct georeferencing of hyperspectral lines with a fixed-wing UAV flying at 344 m of altitude. From the georegistration results, it appears that the RMSE is higher in Lannénec Pond than in Porsmilin Beach (2.3 m and 1.4 m, respectively). The difference may be due to the higher flight altitude, leading to a lower spatial resolution. Moreover, during the Lannénec survey, the wind was a bit stronger, which is especially noticeable at the altitude of the UAV. In addition, in the error estimation method, the resolution of the BDortho® reference image is lower in Lannénec than in Porsmilin Beach (50 cm and 20 cm, respectively). Another possible explanation for the difference in georeferencing error is that the appearance of the banks of the pond is very likely to vary over time, so that the identification of control points on the orthoimage lacks precision, whereas for the emerged part of Porsmilin Beach the orthoimage was obtained from a simultaneous UAV photographic survey.
The sources of georectification error can be multiple, the limited accuracy of lighter, UAV-borne sensors being among the main ones. Hruska et al. [16] mention poor estimation of the yaw angle as the largest source of error. To minimize this problem, Hyper-DRELIO has been equipped with a dual GNSS antenna to measure true heading. Nevertheless, the baseline between the GNSS antennas is limited to 85 cm, yielding a heading accuracy of about 0.22°. With a baseline of 2 m, for example, the heading accuracy would be about 0.1°. Gallay et al. [12], using a xNAV550 GPS/IMU system with a larger dual-GPS antenna baseline (around 1.5 m, estimated from their figures) on a helicopter UAV platform weighing more than 60 kg and offering higher flight stability than multi-rotor aircraft, report an orientation accuracy of 0.05° in roll and pitch and 0.11° in yaw. However, such heavy UAVs are more expensive and harder to transport than lightweight UAVs. In some countries, it can also be more difficult to obtain flight authorization for heavy platforms.
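The relation between antenna baseline and heading accuracy follows from small-angle geometry: the relative positioning noise between the two antennas, projected across the baseline, gives the angular error. A sketch, assuming a relative carrier-phase noise of about 3.3 mm (an assumed value, chosen here to reproduce the figures quoted above):

```python
import math

def heading_accuracy_deg(baseline_m, pos_noise_m):
    """Approximate 1-sigma heading error of a dual-antenna GNSS
    compass: relative position noise across the baseline."""
    return math.degrees(math.atan2(pos_noise_m, baseline_m))

noise = 0.0033  # assumed relative positioning noise (m)
print(round(heading_accuracy_deg(0.85, noise), 2))  # ~0.22 deg (85 cm baseline)
print(round(heading_accuracy_deg(2.0, noise), 2))   # ~0.09 deg (2 m baseline)
```

Doubling the baseline roughly halves the heading error, which is why heavier platforms with wider antenna separations achieve better yaw accuracy.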
To obtain the best possible georeferencing quality with Hyper-DRELIO, the weather conditions must be optimal (in particular, low wind). In addition, the flight plan should be as simple as possible, avoiding turns, curves, and an excessive number of waypoints, and it should be adapted to meteorological constraints, mainly to avoid crosswind. Despite these measures, some degree of georeferencing error persists, particularly from scan line to scan line. As Hruska et al. [16] also point out, such errors most likely result from differential movement or vibration of the platform. For Hyper-DRELIO, the speed of the motors at maximum output is 7640 rpm, that is, about 127 Hz. Since the navigation data are recorded at 200 Hz, corresponding to a Nyquist frequency of 100 Hz, the IMU cannot capture all the vibrations induced by the rotors. The Kalman filter implemented in the Ekinox-D IMU, designed for an aircraft-type sensor model, may also not be optimally parametrized for the Hyper-DRELIO UAV. Finally, because of these scan-line to scan-line errors, the boresight effect associated with the angular deviations between the positioning and imaging sensors is difficult to assess, and therefore to correct, which can significantly degrade georeferencing accuracy. In the near future, hyperspectral snapshot cameras are likely to improve, particularly regarding the trade-off between spatial and spectral resolution. Contrary to line scanners, these sensors do not require an accurate navigation system, as they provide the opportunity to derive orthorectified 3D data from Structure-from-Motion (SfM) reconstruction methods [29].
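The under-sampling argument can be made concrete: a 200 Hz recording has a Nyquist frequency of 100 Hz, so the ~127 Hz rotor vibration is not only missed but folded back (aliased) to a lower apparent frequency. A short sketch:

```python
def aliased_frequency(signal_hz, sample_hz):
    """Apparent frequency of an under-sampled tone: fold the true
    frequency back into the observable band [0, sample_hz / 2]."""
    f = signal_hz % sample_hz
    return min(f, sample_hz - f)

rotor_hz = 7640 / 60   # 7640 rpm -> ~127.3 Hz
imu_rate = 200.0       # navigation data recording rate (Hz)
nyquist = imu_rate / 2

print(rotor_hz > nyquist)                               # True: under-sampled
print(round(aliased_frequency(rotor_hz, imu_rate), 1))  # ~72.7 Hz alias
```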
The option of using the RGB camera external parameters computed by SfM photogrammetric processing to estimate the UAV trajectory has also been considered [30]. The hyperspectral acquisition is performed at 50 Hz, and taking 10 photos per second would enable the camera position and attitude to be computed at 10 Hz. Interpolation methods such as those described in Barbieux et al. [31] would then allow the orientation parameters obtained by bundle adjustment to be computed at the acquisition frequency of the pushbroom scan lines. However, the reliability of this method depends largely on the quality of the RGB photographs. Moreover, the collected RGB images tend to lack correlation features because of their reduced ground coverage, which degrades the performance of SfM photogrammetry over water or over vegetation shaken by the wind.
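A minimal sketch of such a resampling step, interpolating a 10 Hz attitude channel onto 50 Hz scan-line timestamps with linear interpolation (the timestamps and attitude values are hypothetical; Barbieux et al. [31] describe more elaborate interpolation schemes, and angle wrap-around or quaternion slerp would need dedicated handling):

```python
import numpy as np

# Hypothetical timestamps: camera poses from bundle adjustment at
# 10 Hz, pushbroom scan lines acquired at 50 Hz, over one second.
t_pose = np.arange(0.0, 1.01, 0.1)    # 10 Hz pose estimates (11 samples)
t_scan = np.arange(0.0, 1.001, 0.02)  # 50 Hz scan-line times (51 samples)

# Example attitude channel (e.g., roll in degrees) at pose times;
# assumes smooth, slowly varying attitude between poses.
roll_pose = np.sin(2 * np.pi * 0.5 * t_pose)

# Resample the orientation parameter onto each scan-line timestamp.
roll_scan = np.interp(t_scan, t_pose, roll_pose)
print(roll_scan.shape)  # (51,)
```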

6. Conclusions

For UAV surveys, payload restrictions are critical, particularly when the data collected require direct georeferencing. Indeed, the quality of the direct georeferencing is limited by the resolution and the accuracy of the proprioceptive sensors.
Hyper-DRELIO allows hyperspectral data to be collected above small study sites (up to about 4 ha) with both high spatial resolution (<10 cm) and high spectral resolution (1.85 nm) in VNIR domain, and with georectification accuracy on the order of 1 to 2 m.
The pushbroom image formation process is a major source of complexity in the geometrical correction step. While this problem should be overcome in the future with the development of light full-frame hyperspectral cameras, currently such sensors require a compromise between spatial coverage, spatial resolution, and spectral resolution.
In general, hyperspectral imagery from UAV platforms allows low-cost surveys with high spatial and spectral resolution and offers a high degree of flexibility regarding the acquisition parameters. Future work could assess how off-nadir camera orientations or multi-scale surveys affect the performance of hyperspectral inversion algorithms.

Acknowledgments

The authors acknowledge financial support provided by ANR project EQUIPEX CRITEX (ANR-11-EQPX-0011) and by the TOSCA project HYPERCORAL from the CNES (the French space agency).

Author Contributions

Marion Jaud took part in the instrumental development, performed data processing, and wrote the manuscript. Nicolas Le Dantec provided guidance on the overall project and supervised the writing of the manuscript. Jérôme Ammann and Philippe Grandjean took part in the instrumental development and surveys. Dragos Constantin, Yosef Akhtman, and Kevin Barbieux contributed to the core of instrumental, hardware, and software developments (HyperBox). Pascal Allemand, Christophe Delacourt, and Bertrand Merminod developed the research plan for the Hyperspectral UAV system and provided guidance on the overall project.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Richardson, M.; Kumar, P. Critical Zone services as environmental assessment criteria in intensively managed landscapes. Earth’s Future 2017, 5, 617–632. [Google Scholar] [CrossRef]
  2. Asadzadeh, S.; de Souza Filho, C.R. A review on spectral processing methods for geological remote sensing. Int. J. Appl. Earth Obs. Geoinf. 2016, 47, 69–90. [Google Scholar] [CrossRef]
  3. Gomez, C.; Lagacherie, P.; Coulouma, G. Regional predictions of eight common soil properties and their spatial structures from hyperspectral Vis–NIR data. Geoderma 2012, 189–190, 176–185. [Google Scholar] [CrossRef]
  4. Chabrillat, S.; Goetz, A.; Krosley, L.; Olsen, H.X. Use of hyperspectral images in the identification and mapping of expansive clay soils and the role of spatial resolution. Remote Sens. Environ. 2002, 82, 431–445. [Google Scholar] [CrossRef]
  5. Jetz, W.; Cavender-Bares, J.; Pavlick, R.; Schimel, D.; Davis, F.W.; Asner, G.P.; Guralnick, R.; Kattge, J.; Latimer, A.M.; Moorcroft, P.; et al. Monitoring plant functional diversity from space. Nat. Plants 2016, 2, 16024. [Google Scholar] [CrossRef] [PubMed]
  6. Feret, J.-B.; Asner, G.P. Tree Species Discrimination in Tropical Forests Using Airborne Imaging Spectroscopy. IEEE Trans. Geosci. Remote Sens. 2013, 51, 73–84. [Google Scholar] [CrossRef]
  7. Gege, P. A case study at Starnberger See for hyperspectral bathymetry mapping using inverse modelling. In Proceedings of the 2014 6th Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing (WHISPERS), Lausanne, Switzerland, 25–27 June 2014. [Google Scholar]
  8. Méléder, V.; Launeau, P.; Barillé, L.; Combe, J.-P.; Carrère, V.; Jesus, B.; Verpoorter, C. Hyperspectral imaging for mapping microphytobenthos in coastal areas. In Geomatic Solutions for Coastal Environments; Maanan, M., Robin, M., Eds.; Nova Science: Hauppauge, NY, USA, 2010; Chapter 4; ISBN 978-1-61668-140-1. [Google Scholar]
  9. Klonowski, W.M.; Fearns, P.R.; Lynch, M.J. Retrieving key benthic cover types and bathymetry from hyperspectral imagery. J. Appl. Remote Sens. 2007, 1, 011505. [Google Scholar] [CrossRef]
  10. Lee, Z.; Carder, K.L.; Mobley, C.D.; Steward, R.G.; Patch, J.F. Hyperspectral remote sensing for shallow waters: 2. Deriving bottom depths and water properties by optimization. Appl. Opt. 1999, 38, 3831–3843. [Google Scholar] [CrossRef] [PubMed]
  11. Schimel, D.S.; Asner, G.P.; Moorcroft, P. Observing changing ecological diversity in the Anthropocene. Front. Ecol. Environ. 2013, 11, 129–137. [Google Scholar] [CrossRef]
  12. Gallay, M.; Eck, C.; Zgraggen, C.; Kaňuk, J.; Dvorný, E. High resolution Airborne Laser Scanning and hyperspectral imaging with a small UAV platform. ISPRS—Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, XLI-B1, 823–827. [Google Scholar] [CrossRef]
  13. Kosugi, Y.; Mukoyama, S.; Takabayashi, Y.; Uto, K.; Oda, K.; Saito, G. Low-altitude hyperspectral observation of paddy using radio-controlled helicopter. In Proceedings of the IEEE International Geoscience and Remote Sensing Symposium, IGARSS, Vancouver, BC, Canada, 24–29 July 2011; pp. 1748–1751. [Google Scholar]
  14. Jaakkola, A.; Hyyppä, J.; Kukko, A.; Yu, X.; Kaartinen, H.; Lehtomäki, M.; Lin, Y. A low-cost multi-sensoral mobile mapping system and its feasibility for tree measurements. ISPRS J. Photogramm. Remote Sens. 2010, 65, 514–522. [Google Scholar] [CrossRef]
  15. Zarco-Tejada, P.J.; Guillén-Climent, M.L.; Hernández-Clemente, R.; Catalina, A.; González, M.R.; Martín, P. Estimating leaf carotenoid content in vineyards using high resolution hyperspectral imagery acquired from an unmanned aerial vehicle (UAV). Agric. For. Meteorol. 2013, 171–172, 281–294. [Google Scholar] [CrossRef]
  16. Hruska, R.; Mitchell, J.; Anderson, M.; Glenn, N.F. Radiometric and Geometric Analysis of Hyperspectral Imagery Acquired from an Unmanned Aerial Vehicle. Remote Sens. 2012, 4, 2736–2752. [Google Scholar] [CrossRef]
  17. Uto, K.; Seki, H.; Saito, G.; Kosugi, Y. Characterization of Rice Paddies by a UAV-Mounted Miniature Hyperspectral Sensor System. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2013, 6, 851–860. [Google Scholar] [CrossRef]
  18. Bareth, G.; Aasen, H.; Bendig, J.; Gnyp, M.L.; Bolten, A.; Jung, A.; Michels, R.; Soukkamäki, J. Spectral comparison of low-weight and UAV-based hyperspectral frame cameras with portable spectroradiometer measurements. Photogramm. Fernerkund. Geoinf. 2015, 1, 69–79. [Google Scholar] [CrossRef]
  19. Yang, C.; Everitt, J.H.; Davis, M.R.; Mao, C. A CCD Camera-based Hyperspectral Imaging System for Stationary and Airborne Applications. Geocarto Int. 2003, 18, 71–80. [Google Scholar] [CrossRef]
  20. Petit, T.; Bajjouk, T.; Mouquet, P.; Rochette, S.; Vozel, B.; Delacourt, C. Hyperspectral remote sensing of coral reefs by semi-analytical model inversion—Comparison of different inversion setups. Remote Sens. Environ. 2017, 190, 348–365. [Google Scholar] [CrossRef]
  21. Chennu, A.; Färber, P.; Volkenborn, N.; Al-Najjar, M.A.A.; Janssen, F.; de Beer, D.; Polerecky, L. Hyperspectral imaging of the microscale distribution and dynamics of microphytobenthos in intertidal sediments: Hyperspectral imaging of MPB biofilms. Limnol. Oceanogr. Methods 2013, 11, 511–528. [Google Scholar] [CrossRef] [Green Version]
  22. Aschenbroich, A.; Michaud, E.; Stieglitz, T.; Fromard, F.; Gardel, A.; Tavares, M.; Thouzeau, G. Brachyuran crab community structure and associated sediment reworking activities in pioneer and young mangroves of French Guiana, South America. Estuar. Coast. Shelf Sci. 2016, 182, 60–71. [Google Scholar] [CrossRef]
  23. Constantin, D. Miniature Hyperspectral Systems. PhD Thesis, École polytechnique fédérale de Lausanne EPFL, Lausanne, Switzerland, 1 September 2017. [Google Scholar]
  24. Tournadre, V.; Pierrot-Deseilligny, M.; Faure, P.H. UAV Linear Photogrammetry. ISPRS—Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2015, XL-3/W3, 327–333. [Google Scholar] [CrossRef]
  25. Kalman, R.E.; Bucy, R.S. New Results in Linear Filtering and Prediction Theory. Trans. ASME—J. Basic Eng. 1961, 83, 95–107. [Google Scholar] [CrossRef]
  26. Woodget, A.S.; Carbonneau, P.E.; Visser, F.; Maddock, I.P. Quantifying submerged fluvial topography using hyperspatial resolution UAS imagery and structure from motion photogrammetry. Earth Surf. Proc. Landf. 2015, 40, 47–64. [Google Scholar] [CrossRef]
  27. Javernick, L.; Brasington, J.; Caruso, B. Modeling the Topography of Shallow Braided Rivers Using Structure-from-Motion Photogrammetry. Geomorphology 2014, 213, 166–182. [Google Scholar] [CrossRef]
  28. Burkart, A.; Aasen, H.; Alonso, L.; Menz, G.; Bareth, G.; Rascher, U. Angular Dependency of Hyperspectral Measurements over Wheat Characterized by a Novel UAV Based Goniometer. Remote Sens. 2015, 7, 725–746. [Google Scholar] [CrossRef] [Green Version]
  29. Aasen, H.; Burkart, A.; Bolten, A.; Bareth, G. Generating 3D hyperspectral information with lightweight UAV snapshot cameras for vegetation monitoring: From camera calibration to quality assurance. ISPRS J. Photogramm. Remote Sens. 2015, 108, 245–259. [Google Scholar] [CrossRef]
  30. Rehak, M.; Skaloud, J. Applicability of New Approaches of Sensor Orientation to Micro-Aerial Vehicles. ISPRS—Int. Arch. Photogramm. Remote Sens. 2016, III-3, 441–447. [Google Scholar] [CrossRef]
  31. Barbieux, K.; Constantin, D.; Merminod, B. Correction of airborne pushbroom images orientation using bundle adjustment of frame images. ISPRS—Int. Arch. Photogramm. Remote Sens. 2016, XLI-B3, 813–818. [Google Scholar] [CrossRef]
Figure 1. Hyper-DRELIO (Hyperspectral DRone for Environmental and LIttoral Observations) platform, Unmanned Aerial Vehicle (UAV) for hyperspectral imagery. NUC: Next Unit of Computing; IMU: Inertial Motion Unit; GPS: Global Positioning System.
Figure 2. Diagram of the multi-sensor system.
Figure 3. Geometry of the acquisition at 100 m of altitude. FOV: Field Of View.
Figure 4. (a) Processing chain for hyperspectral lines georegistration. (b) Example of hyperspectral image (in natural colors) in sensor geometry and (c) of the georeferenced hyperspectral image after line-by-line georectification and georeferencing. DEM: Digital Elevation Model.
Figure 5. (a) Location of the study sites. (b) Aerial view of Porsmilin Beach (Geoportail IGN© image). (c) Aerial view of Lannénec Pond (Geoportail IGN© image). The red rectangles outline the study areas (displayed in detail in Figure 6 and Figure 7).
Figure 6. Flight lines followed by the UAV (a) with a flight altitude of 50 m at Porsmilin Beach (Figure 5a,b) and (b) with a flight altitude of 100 m at Lannénec pond (Figure 5a,c).
Figure 7. Results of hyperspectral lines georegistration at Porsmilin Beach (a) and at Lannénec Pond (b). The blue spots display the position of the control points. The red rectangles indicate the zones corresponding to the hyperspectral cubes displayed in Figure 8.
Figure 8. Hyperspectral cubes for Porsmilin Beach (a) and Lannénec Pond (b), located by the red boxes on Figure 7.
Table 1. Weights of the embedded sensors. GNSS: Global Navigation Satellite System.
Sensors and Related Equipment           Weight
Headwall Micro-Hyperspec® camera        680 g
Hyperspectral camera Schneider® lens    90 g
iDS© uEye RGB camera                    52 g
RGB camera Tamron® lens                 39 g
SBG System® Ekinox-D IMU                600 g
Dual GNSS antenna (without frame)       2 × 105 g
Intel® NUC                              450 g
Onboard package LiPo battery            230 g
Waterproof chamber and cables           2040 g
Total weight                            4.39 kg
Table 2. Settings of the surveys at Porsmilin Beach and Lannénec Pond.
                                     Porsmilin Beach    Lannénec Pond
Flight altitude                      50 m               100 m
Cross-track swath                    45 m               90 m
Flight lines overlapping             33%                33%
Spacing between flight lines         30 m               60 m
Ground spatial resolution            4.5 cm             9 cm
Speed                                3 m/s              4 m/s
Along-track ground sampling          6 cm               8 cm
Hyperspectral camera gain            5                  6
Hyperspectral acquisition rate       50 Hz              50 Hz
Flight duration                      6′15″              5′40″
Distance of exploitable recording    750 m              500 m
Number of exploitable flight lines   >10,000            >8000
Covered area                         2.8 ha             3.7 ha
Data volume                          >5 Gb              >4 Gb
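The tabulated swath widths are consistent with a simple nadir pushbroom geometry, w = 2h·tan(FOV/2). A sketch, with the field of view inferred from the table values (an assumption for illustration, not a quoted specification of the Micro-Hyperspec lens):

```python
import math

def cross_track_swath(altitude_m, fov_deg):
    """Ground swath width of a nadir-pointing pushbroom line:
    w = 2 * h * tan(FOV / 2)."""
    return 2 * altitude_m * math.tan(math.radians(fov_deg) / 2)

# FOV of ~48.5 deg inferred from the 45 m swath at 50 m altitude:
fov = 2 * math.degrees(math.atan(45.0 / (2 * 50.0)))
print(round(cross_track_swath(50.0, fov)))   # 45 (m, at 50 m altitude)
print(round(cross_track_swath(100.0, fov)))  # 90 (m, at 100 m altitude)
```

The swath, and hence the spacing between flight lines for a given overlap, scales linearly with the flight altitude, as the two survey configurations in the table show.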
