Review

Current Practices in UAS-based Environmental Monitoring

1 Department of Biology and Ecology, Faculty of Science, University of Novi Sad, Trg Dositeja Obradovića 3, 21000 Novi Sad, Serbia
2 Department of Civil, Architectural and Environmental Engineering, University of Naples Federico II, via Claudio 21, 80125 Napoli, Italy
3 Group of Crop Science, Institute of Agricultural Sciences, Department of Environmental Systems Science, ETH Zurich, Universitätstrasse 2, 8092 Zurich, Switzerland
4 Lancaster Environment Centre, Lancaster University, Lancaster LA1 4YQ, UK
5 Lancaster Intelligent Robotic & Autonomous Systems Centre, Lancaster University, Lancaster LA1 4WA, UK
6 Department of Mathematics & INESC-Coimbra, University of Coimbra, 3001-501 Coimbra, Portugal
7 Department of Geography, Porter School of Environment and Earth Science, Faculty of Exact Science, Tel Aviv University, P.O. Box 69978 Tel Aviv, Israel
8 Department of Geography and Environmental Studies, University of Haifa, 39105 Haifa, Israel
9 ETSI Topografía, Geodesia y Cartografía, Universidad Politécnica de Madrid, 28031 Madrid, Spain
10 Department of Soil Mapping and Environmental Informatics, Centre for Agricultural Research, Institute for Soil Sciences, Herman Ottó str. 15, 1022 Budapest, Hungary
11 Dipartimento delle Culture Europee e del Mediterraneo, Architettura, Ambiente, Patrimoni Culturali (DiCEM), Università degli Studi della Basilicata, 75100 Matera, Italy
12 Water Desalination and Reuse Center, King Abdullah University of Science and Technology, 23955 Thuwal, Saudi Arabia
13 Center for Remote Sensing Applications (CRSA), Mohammed VI Polytechnic University (UM6P), 43150 Benguerir, Morocco
14 Marine and Environmental Sciences Centre, Department of Civil Engineering, University of Coimbra, 3030-788 Coimbra, Portugal
15 NORCE Norwegian Research Centre, Siva Innovasjonssenter, Sykehusvn 21, 9019 Tromsø, Norway
16 Politehnica University of Timisoara, Traian Lalescu 2A, 300223 Timisoara, Romania
* Author to whom correspondence should be addressed.
Remote Sens. 2020, 12(6), 1001; https://doi.org/10.3390/rs12061001
Submission received: 3 February 2020 / Revised: 9 March 2020 / Accepted: 16 March 2020 / Published: 20 March 2020

Abstract

With the increasing role that unmanned aerial systems (UAS) are playing in data collection for environmental studies, two key challenges relate to harmonizing and providing standardized guidance for data collection, and establishing protocols that are applicable across a broad range of environments and conditions. In this context, a network of scientists is cooperating within the framework of the Harmonious Project to develop and promote harmonized mapping strategies and disseminate operational guidance to ensure best practice for data collection and interpretation. The culmination of these efforts is summarized in the present manuscript. Through this synthesis study, we identify the many interdependencies of each step in the collection and processing chain, and outline approaches to formalize and ensure a successful workflow and product development. Given the number of environmental conditions, constraints, and variables that could possibly be explored from UAS platforms, it is impractical to provide protocols that can be applied universally under all scenarios. However, it is possible to collate and systematically order the fragmented knowledge on UAS collection and analysis to identify the best practices that can ensure the streamlined and rigorous development of scientific products.

Graphical Abstract

1. Introduction

Over the last decade, lightweight unmanned aerial systems (UAS) have evolved to become an integral element of spatially distributed environmental monitoring. UAS are increasingly a routine component of remote sensing toolkits, providing a unique Earth observation capacity that has been incorporated across a large number of applications, including the mapping and monitoring of both natural ecosystems and managed forest plantations [1,2,3,4], precision agriculture [5,6], river and streamflow measurements [7], topographic surveys [8,9], and natural hazard mapping [10], to highlight just a few examples. The expanding use of UAS in environmental studies has produced a diversity of approaches for collection, processing and analysis, which emphasizes the need to develop standardized protocols for UAS mapping [11]. Within the last few years, a range of UAS survey protocols have emerged across specific environmental studies, such as the mapping of glacial geomorphology [12], river landscape mapping [13], classification of wetland vegetation [14,15], and routine vegetation monitoring using multispectral sensors [16]. In parallel, standardized workflows and reviews have been developed for the many different processes undertaken within UAS surveys, whether these are radiometric calibrations [16,17], optimization of topographic surveys [8], or assessments of the best classification methods [18,19].
Despite the diversity of environmental UAS surveys, the majority of applications cluster around a few core areas, such as topographic mapping, precision agriculture or forestry; yet we are still far from the definition of standard protocols that can be translated across different contexts and environments. For example, Singh et al. [20] observed that many surveys paid little attention to the planning and processing of UAS imagery, yet these key steps need to be taken with forethought and care to ensure useable imagery. Given the emerging nature of this technology, it is not surprising that many studies may lack the use of standardized georeferencing procedures, radiometric and atmospheric correction, sensor calibration and error characterization [21], or even a classification accuracy assessment [22]. Such procedures constitute a constellation of interconnected, and often interdependent, tasks that require a structure and framework to describe and develop a robust collection, characterization and processing methodology.
Across the wide variety of UAS environmental applications, the primary data source remains optical imagery (i.e., photographs) and video recordings used for the development of orthomosaics and digital elevation models (DEM) [23]. In parallel with advances in the technology and operation of the UAS flight systems, researchers are also exploring a range of new sensors and platforms for data collection and interpretation. Common to all applications, and often irrespective of the type and goal of the specific research, is the requirement for a comprehensive experimental design to enable the delivery of useable, useful, and informative output. From a more fundamental perspective, ensuring the retention of detailed metadata for each survey step facilitates quality assessment and reproducibility of research. While each specific research application may have its peculiarities or unique requirements, experiences gained from different contexts are likely to assist in avoiding the repetition of mistakes, collection and processing errors, identification of common challenges, and recognition of sensor constraints. Likewise, given the many commonalities across collection efforts, such aggregation of experience can help to inform the description and establishment of accepted protocols for mapping environmental features.
Although the field of UAS-driven data collection represents a multidisciplinary system of rapidly developing science areas, there are many unifying elements and associations in their application. One of the key objectives of the present work is to provide UAS users with general practical guidance that can optimize the collection and delivery of high-quality output for subsequent analysis and interpretation. Across diverse UAS applications, a general framework can describe the work within five interconnected steps (Figure 1): (1) study design; (2) pre-flight fieldwork; (3) flight mission; (4) processing of aerial data; and (5) quality assurance. Each of these steps can be subdivided into a sequence of activities that we have also attempted to detail and explore. In collating a standardized set of procedures and harmonizing different methods and approaches, we also examine the results and conclusions from a literature survey of recent research and reviews on different types of UAS surveys. Further details on each of these five steps are provided in the following sections.

2. Study Design

Each UAS study is unique in its own way, and detailed mission planning presents the first essential step to ensure a safe and successful UAS data acquisition. A wide range of different parameters must be considered before any flight collection survey. It is important to note that the mission planning should consider measurement requirements in conjunction with practical limitations, since the accuracy and resolution of the data collected by the UAS are affected not only by the flight mission parameters themselves, but also by target design, platform, sensor configuration, topography, and meteorological conditions [24,25]. Moreover, UAS survey requirements are obviously linked to the specific research question being addressed. Therefore, the data type and quality (resolution, precision, accuracy, repeat frequency, etc.) required for each variable being assessed should be identified in advance, to achieve sufficient detail for appropriate intra- and inter-survey comparisons.
To illustrate the complexity of the problem, Figure 2 attempts to summarize the diversity of factors involved in a UAS mapping application for the development of a digital terrain model. The graph illustrates how the different elements of data collection are strongly interconnected, and how concepts of optics, photogrammetry, programming and radiometry should be considered in the study design. In particular, the different factors influencing UAS-based products are divided into six groups: output product (red), quality parameters (purple), software and hardware (blue), experiment design (orange), environmental conditions (light blue), and internal parameters (green), with the mutual dependencies (direct or indirect) also identified.
For example, the final quality of a terrain model depends on a range of factors, including: UAS hardware (platform), environmental conditions (sun angle, cloudiness, wind speed), analysis tools (processing software), and of course the field experiment design (everything else). The equipment configuration directly affects the obtained data quality and completeness [26,27]. Exact equipment characteristics determine the subsequent image processing techniques, while the sensor's spectral resolution defines the required radiometric calibration and correction methods. A further independent factor is represented by the environmental conditions. The type of study area, its size, relief complexity, spatial heterogeneity, and meteorological conditions constrain the maximum achievable output model quality [2,28]. For example, elevation differences across the study area relative to the flight altitude lead to a variety of ground footprint sizes and can result in reduced accuracy in the lowlands [29]. The third factor affecting the terrain model quality is the experiment design, which is the most flexible factor and can be managed by the researcher according to the purposes of the study.
Therefore, the quality of UAS-based outputs is influenced by multiple interacting factors. For instance, to achieve a specific spatial resolution, it is necessary to take into account the sensor resolution and focal length, and the effect of the terrain complexity and flight altitude on the ground footprint. Increasing the spatial resolution by reducing flight height leads to smaller coverage areas, increasing the number of flight missions required for a specific study site and, consequently, potentially increasing the variability in environmental conditions (e.g., cloudiness, sun angle) that complicate radiometric correction and reduce spectral accuracy. Additionally, the flight altitude prescribes the minimum target size needed for target detection in the images. Understanding these cross-interactions between study conditions and data processing approaches can help to identify the controlling factors for a specific research objective.
The overall goal of this section is to provide a comprehensive workflow description for mission planning and control, with a specific focus on the following elements: (1) UAS regulations and legislation; (2) platform and sensor choice; (3) camera settings and UAS control software; and (4) georeferencing.

2.1. UAS Regulations and Legislation

Familiarization with local UAS regulations and restrictions is the first consideration in any UAS application [30]. Three key aspects encapsulated within most UAS regulations are [31]: (1) regulating the use of airspace by UAS; (2) imposing operational limitations; and (3) addressing the administrative procedures for flight permissions, pilot licenses, and data collection authorization. With regards to the first point, many countries have restrictions on the use of UAS in the vicinity of airports, military installations, or protected areas, such as national parks and nature reserves. Additionally, there are limitations regarding the distance to built-up areas and the general presence of people. UAS flights are often restricted to a specific maximum flight height, depending on the type of operation, the size and weight of the aircraft, and pilot and company licensing. Operational regulations frequently identify different limitations based on the type of flights adopted. The main categories are visual line of sight (VLOS), extended visual line of sight (EVLOS), and beyond visual line of sight (BVLOS). In all cases, it is the pilot's responsibility to ensure that flights are carried out according to local regulations [31]. Future standardized use of UAS in environmental studies may eventually lead to the harmonization of UAS regulations on a regional and even global scale. A promising step in this direction is represented by the recently adopted EU regulations on unmanned aircraft systems and third-country operators of unmanned aircraft systems [32].

2.2. Platform and Sensor Choice

The best choice of platform and sensor will ultimately depend on the requirements of the survey, in terms of spatial and spectral resolution, areal coverage, and image quality. In addition to these technical requirements, factors such as available time and budget may also need to be considered. More generally, the main issues that need to be addressed to establish the appropriate platform and sensor design are: the desired spatial coverage; the required ground resolution; and the needed spectral resolution.
The choice of UAS platform is between fixed- and rotary-wing, of which the latter has become the more frequently used. While we note that there are a number of hybrid options that combine elements of fixed-wing and rotary systems (particularly for vertical take-off and landing), they currently represent a relatively small proportion of the market. The main advantages of rotary-wing platforms are vertical takeoff and landing, hovering ability, recovery capability, and a generally much easier automated flying experience [33]. Even though the recent rise in UAS popularity has been driven in part by the flexibility of rotary-wing multi-copters, traditional fixed-wing platforms are still preferred for larger-scale surveys (e.g., 1 km² or more), due to their energy efficiency. Therefore, for a successful study design, researchers must consider the limits of UAS platforms for flight duration, based on battery capacity and maximum flight distance [30], both of which are increasing with new advances in power supply and operational efficiencies. Platform selection is further influenced by a combination of factors including user experience, research field, sensor type and required payload, and also the available dedicated software for flight control. In Table 1, we summarize some of the main advantages and disadvantages associated with the use of different platforms.
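As a rough planning aid for this platform trade-off, the short Python sketch below estimates the area that can be mapped on a single battery from the cruise speed, flight-line spacing and usable flight time. The numbers are purely illustrative assumptions (not recommendations), and the estimate ignores turns, climb and reserve time, so real coverage will be lower.

def coverage_per_flight_ha(cruise_speed_ms: float, line_spacing_m: float,
                           usable_flight_time_min: float) -> float:
    """Approximate mapped area (ha) per battery: swath advanced per second x flight time."""
    covered_m2 = cruise_speed_ms * line_spacing_m * usable_flight_time_min * 60.0
    return covered_m2 / 10_000.0

# Hypothetical multirotor (5 m/s, 20 min) versus fixed-wing (16 m/s, 45 min),
# both flown with a 44 m flight-line spacing.
print(f"multirotor: {coverage_per_flight_ha(5.0, 44.0, 20.0):.0f} ha")   # ~26 ha
print(f"fixed-wing: {coverage_per_flight_ha(16.0, 44.0, 45.0):.0f} ha")  # ~190 ha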
For most UAS surveys, commercial red-green-blue (RGB) cameras are the most common sensor choice, mainly due to their low cost [11,20]. RGB cameras are sufficient for many applications, such as land mapping, morphological studies, vegetation classification, and river surveys. More recently, researchers have also turned to modified off-the-shelf RGB and near-infrared (NIR) cameras, in pursuit of more accurate vegetation mapping [34,35,36,37,38,39]. This provides an advantage for vegetation mapping, leading to an increase in classification accuracy of up to 15% [40].
The miniaturization of sensors such as thermal, multispectral, and hyperspectral cameras (mostly in the VIS-NIR) has further enabled their installation on small UAS, allowing a dramatic increase in the range of potential applications (Table 2). Beyond their increased cost, these sensors are not always consumer-friendly, and can require significant additional resources to ensure calibration accuracy, as well as sufficient user training, in order to extract useable and scientifically relevant information.
The standalone use of thermal sensors is adequate for the field monitoring of brightness temperature differences, especially in precision agriculture, as the canopy temperature is closely related to plant stress [41,42]. Furthermore, thermal sensors have proved to be useful in mapping urban heat islands [43], heat losses in large objects [44], soil water monitoring [11], and animal detection [45]. When using thermal sensors, radiometric calibration needs to be considered, as well as their lower spatial resolution compared to RGB or multispectral sensors. The capability to combine multiple sensors to better inform process understanding has been a particularly useful benefit of UAS. For example, exploring the impact that these sensors may have on the accuracy of vegetation mapping, Komárek et al. [40] showed that the identification accuracy of aquatic vegetation increased from 60–68% (using only visible imagery) to 63–71% (combining visible + thermal). Replacing the visible with multispectral data further improved the accuracy interval to 74–81%. In other work, combining light detection and ranging (LiDAR) with multispectral images considerably increased the classification accuracy of individual species, with an overall classification accuracy of 96% [46,47].

2.3. Camera Settings and UAS Control Software

Testing and defining appropriate camera settings is crucial. One significant variable that needs to be determined is the ground sampling distance (GSD), which results from the combination of flight height, focal length, and sensor resolution. In addition to these parameters, factors that may affect image quality are motion blur (dependent on GSD, flight speed and shutter speed) and exposure (a function of shutter speed, aperture, and ISO level [48]). The required GSD should be defined based on the size of the features of interest, while other camera settings require trade-offs for survey optimization. Practical guidance for camera system selection, configuration, and image acquisition was presented by [49], while a comprehensive overview of the importance of camera settings was given by [48], with an in-depth analysis of parameters such as shutter speed, ISO, and lens and camera body choice. A typical planning workflow firstly ascertains the minimum feature size of interest, and uses this to determine the required GSD. Based on the camera, lens and aircraft characteristics, flying speed and height are then optimized. By dividing the product of the pixel pitch and flying height by the camera focal length, one can calculate the GSD and scale the estimated photogrammetric errors to the geographic coordinate system. Motion blur can be calculated by multiplying the flight speed of the UAS by the shutter speed, and then dividing by the intended GSD [48]. Considering the impact flight speed has on motion blur, it is one of the most underreported survey characteristics, and its values should be provided with published UAS surveys, along with other camera parameters. Flight speed and height, along with shutter speed, must be included in mission planning. Another important parameter is the image file format, as many researchers consider the RAW file format to be superior to compressed formats (e.g., JPEG, PNG), as the latter can introduce unnecessary noise [20,50]. The RAW format contains the unprocessed data acquired by the sensor, in contrast to JPEG, where image compression can lead to some loss in fidelity.
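As a minimal sketch of the two relationships above (GSD from pixel pitch, flying height and focal length; motion blur from flight speed, shutter speed and GSD), the Python snippet below uses hypothetical sensor values (a 2.4 µm pixel pitch and 8.8 mm lens) purely for illustration.

def ground_sampling_distance(pixel_pitch_m: float, flight_height_m: float,
                             focal_length_m: float) -> float:
    """GSD (m/pixel) = pixel pitch * flying height / focal length."""
    return pixel_pitch_m * flight_height_m / focal_length_m

def motion_blur_pixels(flight_speed_ms: float, shutter_speed_s: float,
                       gsd_m: float) -> float:
    """Blur (pixels) = distance travelled during the exposure / GSD."""
    return flight_speed_ms * shutter_speed_s / gsd_m

# Hypothetical example: 2.4 um pixel pitch, 8.8 mm lens, flown at 100 m, 5 m/s, 1/1000 s.
gsd = ground_sampling_distance(2.4e-6, 100.0, 8.8e-3)   # ~0.027 m/pixel
blur = motion_blur_pixels(5.0, 1 / 1000, gsd)            # ~0.18 pixel
print(f"GSD: {gsd*100:.1f} cm/pixel, motion blur: {blur:.2f} pixel")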
A significant component in mission planning is the determination of the forward and side image overlap, particularly for structure from motion (SfM) photogrammetric reconstructions, which require features observed in multiple images for developing digital models, orthomosaics, and 3D models [23]. The quality of the final SfM product can be influenced by the percentage of image overlap, where increasing overlap can lead to more precise final models. Yet, high overlap requires more images to be acquired, increasing data volumes and computation time. Published literature demonstrates significant variability in overlap values used, as they are affected by the characteristics of the study area, autopilot software, and SfM package used, as well as the user-driven specifications of the final product.
The flight plan should take into consideration the level of surface complexity, as well as the topographic variation, of the surveyed area in terms of the number of details or its spatial uniformity. To obtain the best quality final product within the shortest computation time possible, the optimal flight configuration can be defined as the minimum number of images with a sufficient amount of overlap. The optimal overlap will also depend on the sensor type, with thermal sensors (due to their lower spatial resolution) often requiring higher overlap [51] than multispectral sensors. In addition, the overlap also defines the angular composition of the data; thus, a balanced across- and along-track overlap is beneficial [42,52].
Many studies have used, or recommend, large forward and side overlap to compensate for aircraft instability [53,54,55,56,57]. In most of these studies, a forward overlap of 80% and a side overlap of 60–75% led to the production of high-quality 3D models and orthoimages [3,34,58]. Overlaps may need to be at the higher end of this range when surveying areas containing uniform patterns, which often result in fewer tie points being identified. For instance, thermal surveys (which generally contain much less feature information than optical images) often require overlaps of at least 80% [51]. The upper end of the image overlap range is therefore adopted in cases where a high level of accuracy is required [56,57,59], while the lowest overlaps, for rapid observation with low accuracy requirements, should not be less than 55% [60]. Some studies have also found that interactions between flight collection variables, including flight altitude, image overlap, flying direction, speed and solar elevation, affect UAS data quality and the subsequent processing results, emphasizing the need for flight planning optimization [61].
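To illustrate how the chosen overlaps translate into flight-line spacing and camera trigger distance, the sketch below derives the image footprint from the GSD and the sensor's pixel count and then applies the forward and side overlap fractions; all values are hypothetical.

def image_footprint_m(gsd_m: float, pixels_across: int, pixels_along: int):
    """Ground footprint (across-track width, along-track length) in metres."""
    return gsd_m * pixels_across, gsd_m * pixels_along

def spacing_from_overlap(footprint_width_m: float, footprint_length_m: float,
                         side_overlap: float, forward_overlap: float):
    """Distance between flight lines and between exposures for the given overlaps."""
    line_spacing = footprint_width_m * (1.0 - side_overlap)
    trigger_distance = footprint_length_m * (1.0 - forward_overlap)
    return line_spacing, trigger_distance

# Hypothetical 5472 x 3648 pixel sensor at a 2.7 cm GSD, 70% side / 80% forward overlap.
width, length = image_footprint_m(0.027, 5472, 3648)        # ~148 m x 99 m footprint
lines, trigger = spacing_from_overlap(width, length, 0.70, 0.80)
print(f"Flight-line spacing: {lines:.0f} m, trigger distance: {trigger:.0f} m")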
Such settings can be managed using flight planning and autopilot software, the details of which are explored in Table 3. Most of the flight planning applications for mobile devices are able to plan waypoint-based planar flight missions, while some have valuable additional functions. For example, Pix4Dcapture (Pix4D SA, Lausanne, Switzerland) provides the option of double-gridded flights, which benefits thermal mapping, due to the high image overlap needed, as well as the construction of high-quality 3D models. Litchi (VC Technology Ltd., London, England) can control both the gimbal and the drone's yaw axis and has an embedded computer vision algorithm for target tracking. DroneDeploy (DroneDeploy, San Francisco, USA) can generate cloud-based orthomosaics.
Ground station software usually has more functions than flight planning applications. Widely used examples include MavProxy [62], Mission Planner [63], APM Planner 2 [64], QGroundControl [65], and UgCS [66], which support MAVLink-protocol-based autopilots (e.g., ArduPilot or PX4). They share similar basic functions, such as mission planning and flight monitoring, while other features vary among them. For instance, MavProxy supports loadable modules such as antenna trackers and moving maps. Mission Planner can create a full hardware-in-the-loop UAV simulator via the interface with a PC flight simulator. APM Planner 2 allows the connection of a 3DR Radio to view live data and to initiate commands in flight. QGroundControl is an open-source environment for full flight control and for setting up and simultaneously controlling multiple vehicles [67]. UgCS can import DEM and KML files to enable map customization, has a built-in photogrammetry planning tool, and offers the option to resume a route from the last point after a battery change. Others, such as mdCOCKPIT, can receive and record real-time telemetered data into detailed flight logs, while the UAV Toolbox provides a flexible way to download and convert telemetry data. The eMotion 3 [68] supports both fixed-wing and multirotor operations and provides full 3D flight management. This last option is particularly valuable for maintaining a constant flight distance from the surface at sites with significant changes in elevation.
In a recent review of mission planning software, Roth et al. [50] identified two main causes of error in data collection: (1) inadequate values set for camera parameters, and (2) sub-optimal settings of the mapping mission, mostly caused by the tendency of UAS pilots to use automatic camera settings to "minimize experiment complexity". In addition, most studies fail to provide a comprehensive description of the survey in terms of parameter settings, limiting the replicability of the experiments.
Relatively few mission planning tools (e.g., DJI GS Pro, UgCS and DroneDeploy) consider camera settings such as shutter speed and may control the motion blur value to some extent, but these tend to be limited to specific cameras. In an attempt to bridge the gap between UAS-based mission planning and the control of camera settings, the free PhenoFly Planning Tool was developed as a community tool to increase the efficiency and success of UAS mapping [50]. This planning tool considers the GCP number, their placement pattern and frequency, as well as the viewing geometry. Although the PhenoFly Planning Tool does not provide autopilot functionality, it is the only free "offline" flight planning tool without vendor-specific and operating system restrictions, supporting all input parameters of various frame-camera and thin-lens combinations.
For initial flights with a new system, it is highly recommended to use the software provided by the manufacturer (e.g., mdCOCKPIT for Microdrones’ UAVs, DJI GS Pro for DJI products), because sometimes detailed aspects of flight preparation (e.g., calibration of onboard sensors and the remote controller) can be performed only by that particular software. In other cases, the operating system of the ground station hardware will influence the choice of the software used, e.g., Mission Planner can be installed only on Windows machines but QGroundControl GCS supports all three major operating systems (Android, iOS, Windows) with the same capabilities and tools as Mission Planner for ArduPilot/PX4 based UAVs. Only after several flights, when the pilot has built up flight experience, should other software options be explored, with this decision guided by the most frequent survey case that will be undertaken.

2.4. Georeferencing

The camera orientation and location for each captured image are defined by six exterior orientation parameters (EOP). Traditionally, in manned aerial photogrammetry, the determination of the EOPs in a photogrammetric block is solved indirectly using the aerial triangulation method [69,70]. In this method, the EOPs are estimated from several ground control points (GCPs), whose corresponding image coordinates are manually identified. If additional positioning and orientation sensors are used during the flight, direct georeferencing becomes possible [71]. Therefore, the accuracy of the georeferencing method depends on two aspects: (i) the accuracy of the GCP measurements (which are in general performed with GNSS receivers or total stations) and (ii) the accuracy of the onboard sensors.
Currently, onboard differential global navigation satellite system (DGNSS) techniques are being adopted for UAS-photogrammetry, to provide high-quality camera position measurements and enable direct georeferencing to centimeter-level accuracy [72,73,74]. Such techniques require two GNSS receivers: a base station receiver or a service of continuously operating reference stations (CORS), to provide differential correction data, along with a rover GNSS receiver mounted on the UAS. If the onboard receiver communicates with the base station using a radio link, the DGNSS corrections are applied in real-time and the method is referred to as real-time kinematic direct georeferencing (RTK-DG). If the differential corrections are computed by processing the two GNSS signals (base station and rover) and applied post-flight, the method is known as post-processed kinematic direct georeferencing (PPK-DG).
In any case, a few categories of errors appear in the process of direct georeferencing [75,76]:
  • errors generated from the exterior orientation sensors (e.g., GNSS and inertial measurement unit (IMU) errors);
  • errors generated from the interior orientation procedure (e.g., stability of the focal length, lens distortion, over-parameterization through self-calibration);
  • errors generated from the measurement of image point coordinates.
In a comparison of four georeferencing methods for environmental monitoring, Padró et al. [77] concluded that the traditional method based on the use of GCPs still presents an accurate and low-cost solution. However, it is a time-consuming process involving a considerable manual effort in field positioning, especially when the survey area includes inaccessible sites.
Currently, one of the best solutions is a combination of direct and indirect georeferencing, especially in cases of poorly textured areas, very steep forested areas, motion blur and image noise, or anywhere the SfM-based approaches may not succeed. Knowledge of the EOPs delivered by the direct georeferencing sensors can facilitate the tie-point matching process and reduce processing time.
In this context, technological miniaturization has led to the possibility of applying DGNSS methods for direct georeferencing on board. The RTK/PPK method is less influenced by flight patterns, vegetation phenology, and data post-processing. Notably, the RTK/PPK method succeeds in surveying inaccessible or hazardous terrains, such as dense forests [72], volcano-tectonic settings [78], settlement landfills [79] and ice sheets [80,81]. A recent study by Zhang et al. [82] showed that the PPK method can provide the same accuracy as the use of GCPs, with a small variation in vertical accuracy that could be resolved by including only one GCP. Furthermore, this study provides a UAV-PPK-SfM workflow to assess the positional accuracy, repeatability, and reproducibility of digital models. Additionally, using GPS precise point positioning (PPP) without GCPs has proved to be sufficient for large-scale UAS mapping of the most remote areas [83].
Regardless of the georeferencing method used and its accuracy, it is necessary to evaluate the accuracy of the geospatial products (see Section 5.1). This can be done by using independent checkpoints, reference surfaces or length measurements [84]. In particular, digital surface models (DSMs) generated by SfM-photogrammetry may contain errors and outliers. Existing accuracy metrics like the root mean square error (RMSE), standard deviation and mean error are based on the assumption that no outliers exist and that errors are normally distributed [85]. Therefore, in order to check the accuracy of DSMs, it is necessary to use a robust accuracy assessment methodology and metrics that are mainly based on independent checkpoints and preferably not influenced by outliers or by a skewed distribution of the errors. A robust RMSE estimator of vertical accuracy, able to account for non-normal error distributions in a DSM obtained by SfM and dense correlation techniques, was introduced by Gonçalves et al. [86].
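As an illustrative checkpoint-based accuracy assessment, the snippet below summarizes DSM-minus-checkpoint residuals with the classical metrics and with the median and normalized median absolute deviation (NMAD), which are commonly used robust alternatives; note that this is not the specific robust estimator of Gonçalves et al. [86], and the residuals are hypothetical.

import numpy as np

def vertical_accuracy(dsm_z: np.ndarray, checkpoint_z: np.ndarray) -> dict:
    """Summarise DSM-minus-checkpoint elevation residuals (all values in metres)."""
    dz = dsm_z - checkpoint_z
    return {
        "mean_error": float(np.mean(dz)),
        "std": float(np.std(dz, ddof=1)),
        "rmse": float(np.sqrt(np.mean(dz ** 2))),
        "median": float(np.median(dz)),
        # NMAD: robust spread estimate, comparable to the std for Gaussian errors.
        "nmad": float(1.4826 * np.median(np.abs(dz - np.median(dz)))),
    }

# Hypothetical elevations at five independent checkpoints, including one outlier.
dsm = np.array([101.02, 99.97, 100.05, 100.01, 100.63])
ref = np.array([101.00, 100.00, 100.02, 100.00, 100.05])
print(vertical_accuracy(dsm, ref))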

3. Pre-Flight Fieldwork

Three main phases have been identified in the pre-flight fieldwork: (1) reconnaissance of the surveyed area (Section 3.1); (2) GCP distribution and radiometric calibration (Section 3.2); and (3) field data collection (Section 3.3), each of which is explored in the following paragraphs. Given the importance of these various pre-flight decisions for accurate UAS data collection, we have provided a generalized checklist that can be used to guide users to avoid common planning and collection mistakes and to ensure the safe operation of surveys (see Appendix A, Table A1).

3.1. Reconnaissance of the Surveyed Area

For safety reasons, a visual assessment of the surveyed area is very important to identify obstacles that could damage the UAS or block either the line of sight or the radio signal. Additionally, it is important to choose safe takeoff and landing sites and to identify potential alternative spots for landing in case of unexpected situations [13]. During area reconnaissance, safe spots with maximum visibility for GCP deployment should be considered, including the potential for using existing stationary man-made or natural objects as GCPs.

3.2. Ground Control Point Distribution and Radiometric Calibration

As noted previously (Section 2.4), GCPs are an essential prerequisite for performing the classical aerial triangulation of any image block. In general, the coordinates of these points represent the only known vertices of a stereo-model. Therefore, their measurement, distribution, and placement have to be carefully planned, as they directly influence the accuracy of the bundle block adjustment (BBA). Taking into account the specificities of different study areas, the distribution, number, and density of GCPs largely depend on the required accuracy, precision and spatial extent of the survey (as determined from the study design). The required GCP deployment will also depend on the image content (e.g., vegetation and surface type), terrain characteristics, survey design and camera characteristics, which influence the potential for systematic error to develop within the image block. The optimal number of GCPs and their spatial distribution is one of the most important issues in indirect georeferencing [8,87]. From a mathematical point of view, a minimum of three GCPs is required to generate georeferenced geospatial products in a particular geographic or projected coordinate system. The literature suggests that DSM accuracy tends to increase with the number of GCPs, reflecting the increasing mitigation of systematic error. For surveys with little susceptibility to systematic error, asymptotic behavior can be reached rapidly. For the examples of this trend presented in Figure 3, optimal GCP densities are around five GCPs/ha for planar and 10 GCPs/ha for vertical georeferencing [73].
The spatial distribution of GCPs is also important, because evenly distributed points in space generate geospatial products of higher accuracy than grouped GCPs. As this GCP deployment is time-consuming, it may be advantageous to use double gridded flight plans, and include some oblique images, to increase 3D model accuracy and reduce the number of required GCPs [73]. Despite the difficulties in deploying GCPs and the expected contributions of high-precision IMU/GNSS technologies to reduce their number, GCPs are still needed for acquiring high geometric accuracies (see Section 2.4) [74]. In this context, PhenoFly Planning Tool [50] can be helpful in managing the arrangement and frequency of GCPs. An additional consideration is the physical dimension of GCPs, and it is advised to use targets with dimensions 7–10 times the GSD obtained with a particular survey [16]. GCP size must correspond to sensor type, resolution, and intended GSD, so increasing the flight height generally requires larger GCPs.
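The rules of thumb above (GCP densities from [73] and a target size of 7–10 times the GSD from [16]) lend themselves to simple back-of-the-envelope planning helpers; the sketch below should be read as a starting point under those assumptions, not as a guarantee of accuracy.

import math

def gcp_count(area_ha: float, density_per_ha: float) -> int:
    """Number of GCPs for a given area and density; at least three are always required."""
    return max(3, math.ceil(area_ha * density_per_ha))

def gcp_target_size_m(gsd_m: float, factor: float = 8.0) -> float:
    """Recommended target edge length, 7-10 times the GSD (factor chosen within that range)."""
    return factor * gsd_m

# Hypothetical 2.5 ha site: planar vs. vertical georeferencing densities, and target size.
print(gcp_count(2.5, 5), gcp_count(2.5, 10))
print(f"{gcp_target_size_m(0.027):.2f} m targets for a 2.7 cm GSD")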
Precise georeferencing of ground truth data presents another compromise between survey goals, the type of environmental survey, and time management. GCP and ground truth coordinate acquisition is accomplished through in-field measurements, using GNSS receivers or total stations. While accurate, acquiring GNSS data can be very time-consuming, as each georeferenced feature requires an observation time that can vary from a couple of minutes [83] to the suggested 15–20 min [5,92] or more. As mentioned (Section 2.4), GNSS stations can also read data from a network of permanent reference stations (PRS), and with the combination of DGNSS measurements, the receiver position can be determined more precisely than with RTK measurements [90]. Successful georeferencing can also depend on the type of sensor used, as many multispectral and hyperspectral sensors produce non-aligned bands, have an insufficient interior orientation, or produce low spatial resolution images [17].
Similar to considerations of the GCP design, the configuration of the spectral targets, along with their number and location, depends on the particular area and the study purpose. A control site can be equipped with various types of targets: resolution bar targets, Siemens-star targets, reflectance targets, spectral characteristic targets, moveable targets [93], and coded targets [93,94]. Most often, single white panels, or white panels combined with black ones, are used for radiometric correction [95,96] but, in some cases, there is a need for an increased number of radiometric targets. For example, light and dark panels can be placed at the end of each flight line to compute a calibration curve for every transect and mitigate changes in illumination during the collection mission [97]. The radiometric calibration can also be performed by combining irradiance measurements from a spectrometer with object radiance measurements from a UAS hyperspectral camera [97,98,99]. This approach can be used separately or in combination with calibration panels [99,100]. When hand-held spectrometer measurements are available for field-based assessments, the calibration target can be made from a range of commonly available materials, with Masonite and oak boards painted with matt paint proving to be acceptable standards if properly prepared [21]. Reflectance targets should have close to Lambertian reflectance characteristics, and the space between black and white-colored parts/targets should be sufficient to avoid adjacency errors. Alternatively, [101] recommended using eight radiometric calibration targets of different brightness, measuring their reflectance values with a field spectrometer at 13 different angles. A detailed insight into radiometric processing is given in Section 5.2. Although nadir images acquired by parallel flight line surveys are the predominant viewing geometry, for many applications it can be advisable to use additional oblique imagery and cross lines, as a method to improve both the radiometric and overall quality of the generated geospatial products [102].

3.3. Field Data Collection

Field data sampling and collection depend largely on the type and nature of the environmental research, so providing specific case-by-case guidance is not practical. Nevertheless, this step is as important as any other aspect of these guidelines, so instead of generalizing collection strategies, we provide a number of representative examples that can be used as guidance for other environmental studies. For instance, in the case of geomorphic studies, the most important requirement is precisely georeferenced GCPs and checkpoints. For fine-scale research on surface ice melt on an Arctic glacier, [103] used an ablation stake network as ground truth data for measuring melting variability, while an analysis of slow geomorphological processes [104] used a terrestrial laser scan as a reference. For soil water content measurement, the most common method is soil sampling, within defined sampling cells [105] or at precisely determined locations [106] of crop fields at a specific depth. Afterwards, the soil samples are oven-dried in the laboratory and used as reference data. The same practice was used in research on a karst mountainous area [107], where soil samples were collected from six different vegetation types. Field measurements are also very useful, as by receiving real-time data, researchers can acquire additional reference data and validate UAS and laboratory results [106].
Ground truth data can also be obtained by identifying major terrain and vegetation units within the surveyed zone [88], or by selecting samples directly on images on the basis of GNSS-surveyed points and handwritten documentation [18,36]. For vegetation patches that are easily distinguished from their surroundings by their size, specific growth pattern or phenological traits, there is no need for detailed georeferencing. On the other hand, for multi-temporal monitoring, exact georeferencing is crucial for image collocation, enabling researchers to fully understand spatio-temporal processes. For structural vegetation surveys, besides georeferencing, ground truth data are acquired by collecting structural data such as plant height and population count [32]; tree height and crown cover diameter for each tree [108]; tree girth [109]; or foliage, seed/flower/fruit, branching and bark characteristics [58]. As active LiDAR sensors are most prevalent for these types of studies, plant height and crown diameter are the most important ground truth data.

4. Flight Mission

Weather conditions are an obvious factor that cannot be planned with certainty. Factors such as illumination and cloudiness impact the quality of the images, while wind and fog are also unfavorable for survey quality. Illumination changes may affect the overall quality of the spectral data, while wind can impact flight duration and presents a major safety concern. Besides wind speed and direction, [110] showed that air humidity has a large impact on radiometric image quality, since higher humidity contributes to a greater dispersion of light. As such, air humidity should be reported alongside other research parameters. In this context, there are numerous useful real-time weather condition and forecast sites, and even a dedicated smartphone app (UAV Forecast™, covering Europe) that can forecast the optimal time for flight missions; however, all are based on operational meteorological systems, which may be less accurate in countries lacking ground-based meteorological infrastructure. In addition to the percentage of cloudiness and the chance of rain, the UAV Forecast™ platform offers values of the planetary K-index (Kp index), a measure of geomagnetic disruptions caused by solar activity that affect the satellite positioning signal of the UAS [111]. By interpreting all available data, it provides a final assessment of the flying conditions. Of course, no website or forecast can beat a site-based assessment, and this will always be the gold standard.
Most UAS-based surveys for landscape mapping are carried out by autopilot software on the ground control station, where the intended GSD, image overlap, flight speed, and altitude are defined in advance (see Section 2.3). Nevertheless, flight patterns should be developed in consideration of the weather conditions, and especially the wind direction, as these can have a direct influence on retrieval quality (e.g., movement of vegetation affecting 3D reconstructions; impact of wind speed on thermal measurements and sensors). If the area of interest has a particular structure, e.g., when the vegetation is planted in rows, it is beneficial to plan the flight path along these lines, so that spatial correction algorithms can later be used to minimize the impact of fluctuating environmental conditions on the results [42]. The appropriate flight pattern significantly impacts the overall quality of the survey. For instance, cross flight patterns significantly improve self-calibration processes, as more tie-points are acquired for the SfM algorithm. In mapping heterogeneous terrain this effect is not so conspicuous, but in the case of flat and homogeneous areas, a cross flight pattern improves the vertical and overall accuracy [112]. Furthermore, acquiring images with different camera tilts (e.g., nadir and oblique) also improves the self-calibration and the final quality of the digital models. This method is very useful when mapping steep slopes or river banks [13], or in forest monitoring [112,113]. In a recent assessment of DSM accuracy, [73] showed that the use of a tilted camera can vastly improve the DSM elevation accuracy of inclined surfaces, and that the combination of different flight trajectories is also highly beneficial. On the other hand, acquiring spectral data with different camera tilts is more complex and requires additional radiometric calibration. Given that the illumination and viewing geometry depend on the zenith and azimuth angles of the sun, some regions may show increased brightness ("hot spots") where the viewing and illumination angles coincide [114].
Since the accuracy of estimating tie points, and consequently the reconstructed surface geometry, depends on the ground resolution of the input imagery, flight altitude is an important parameter that defines the precision of the output terrain data. For mapping surfaces with elevation differences (including natural and built features), the altitude should be defined by taking into account the lowest point in the study area (i.e., the maximum GSD), or modified during the flight in order to maintain a uniform distance from the surface.
On the one hand, increasing the altitude reduces the flight duration and allows larger areas to be covered, which can be important for keeping environmental conditions (cloudiness, sun angle, and radiance) relatively constant during the flight mission. On the other hand, the increased field of view resulting from a high flight altitude leads to poorer spatial resolution, which may affect feature delineation [115]. In general, higher-altitude flights produce a sparser point spacing, leading to a less detailed DSM. Low-altitude flights, in turn, yield a more irregularly shaped DSM that may need to be filtered, and these effects have to be taken into account [54].
The UAS flight path is planned according to the defined flight parameters mentioned above, with the task largely automated via specific software (see Table 3). Given the mass utilization of UAS, flight path algorithms have been the subject of many studies and are mostly divided into two overall categories: (i) optimal algorithms and (ii) heuristic algorithms. Beside the general differences between these algorithms, we can also distinguish (i) optimal path planning and (ii) coverage path algorithms. The use of these algorithms is study-specific, and for environmental monitoring, coverage path planning is most commonly used [116]. The main purpose of the optimal flight plan is cost-efficiency, in terms of flying and computation time, respectively. By calculating the optimal path, passage through the waypoints is ensured with minimal energy consumption [117]. Currently, grid or double-grid flight missions, dependent on the required overlap, are the common approaches for terrain and vegetation monitoring [118,119,120], as sketched in the example below. Nevertheless, new cost-efficient flying paths are emerging, such as the spiral coverage path [121]. To reduce computation time, a new approach combining different methodologies for mapping the 3D environment was proposed [122]. In pursuit of the optimal flight plan, Fu et al. [116] proposed three cost functions, of which the path security cost is used to determine the feasibility of the survey, while the length cost and smoothness cost of the path relate to the energy consumption of the UAS flight mission.
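As a minimal illustration of the basic grid ("lawnmower") coverage pattern mentioned above, the sketch below generates waypoints for a rectangular area of interest from the flight-line spacing derived from the side overlap. Real planners (Table 3) additionally handle turn optimization, terrain following, double grids and no-fly constraints; the area and spacing used here are hypothetical.

def lawnmower_waypoints(width_m: float, height_m: float, line_spacing_m: float):
    """Return (x, y) waypoints sweeping a width x height rectangle in parallel lines."""
    waypoints, x, heading_up = [], 0.0, True
    while x <= width_m:
        ys = (0.0, height_m) if heading_up else (height_m, 0.0)
        waypoints += [(x, ys[0]), (x, ys[1])]
        x += line_spacing_m
        heading_up = not heading_up
    return waypoints

# Hypothetical 200 m x 150 m block flown with 44 m flight-line spacing.
for wp in lawnmower_waypoints(200.0, 150.0, 44.0):
    print(wp)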

5. Processing of Aerial Data

5.1. Geometric Processing

UAS imagery usually contains a range of distortions (barrel, pincushion, and mustache distortion) that alter the geometry of the represented object. Distortion is influenced by the camera lens and the point of view at the time of the shooting. Importantly, images can be orthorectified by applying the required corrections for the optical distortions associated with the adopted camera and for the apparent changes in the position of ground objects caused by the perspective of the sensor view angle and position. In aerial photogrammetry, the rectification plane is always the XY plane. However, in near-object photogrammetry, the rectification plane can be any plane defined by the user. In general, the third coordinate will be the distance to the object measured perpendicular to the rectification plane. In order to obtain the third coordinate, a 3D digital model is necessary. This digital model must faithfully represent the geometry of the objects that appear in the photograph. Due to the high point density obtained with SfM algorithms, it is usually possible to obtain a detailed model. SfM-photogrammetric processing can produce a range of mapping products: point clouds, digital terrain and surface models (DTM, DSM), 3D models, canopy height models and orthoimages [20,54].
For DSM and orthoimage production, SfM software follows a general workflow: (1) calculating camera locations to generate a low-density point cloud; (2) generating a high-density point cloud from these results; and (3) using the high-density point cloud to build a georeferenced mesh with a texture (color) overlay, or another geospatial product [123]. Most SfM software can be divided into either commercial packages (e.g., Pix4D and Agisoft Metashape), which have standardized workflows and "black box" type operation (with correspondingly little insight for researchers into their internal workings), or open-source software (e.g., VisualSFM, MicMac) with more complex workflows but which allows internal inspection.
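To make the three-step workflow concrete, the sketch below follows it with the Agisoft Metashape Pro Python API. It is only an outline under stated assumptions: method names follow the 1.x releases and differ slightly in other versions (e.g., buildDenseCloud versus buildPointCloud in 2.x), GCP marker handling and quality filtering are omitted, and the paths are placeholders, so it should be checked against the installed release rather than treated as a definitive recipe.

import glob
import Metashape

doc = Metashape.Document()
chunk = doc.addChunk()
chunk.addPhotos(glob.glob("/survey/images/*.JPG"))  # placeholder image folder

# (1) sparse reconstruction: tie-point matching and camera alignment
chunk.matchPhotos(downscale=1, generic_preselection=True, reference_preselection=True)
chunk.alignCameras()

# (2) dense reconstruction from depth maps
chunk.buildDepthMaps(downscale=2)
chunk.buildDenseCloud()  # renamed buildPointCloud() in Metashape 2.x

# (3) derived geospatial products: DSM and orthomosaic
chunk.buildDem()
chunk.buildOrthomosaic()
doc.save("/survey/project.psx")  # placeholder project path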
Extracting tie-points from low-quality images can introduce undesirably large errors into the SfM products. As such, these images need to be either corrected or excluded. Metashape offers an automatic image quality assessment, and it is advised to remove images with a value of less than 0.5 (or even 0.7) from the photogrammetric processing [123]. This value is calculated based on the sharpness level of the most focused part of the image. For GCP integration, the first essential step is consistency between the coordinate systems of the GCPs and the sensor GNSS data. After importing the GCP data, a manual association of points with their locations is required. If using Metashape, it is crucial to first uncheck all geotagged photos, to avoid producing a distorted DSM through the attempt to reconcile high-quality GCP data with the less accurate sensor GNSS data. After manual identification of one GCP in two photographs, the software automatically starts filtering out the rest of the images containing the given GCP. By optimizing the camera alignment, the georeferencing accuracy is significantly improved. Following optimization, the GCP and estimated camera coordinates are updated with the georeferencing errors. The RMSE reprojection error relates to the estimated planar and vertical positional accuracy. In the context of the required accuracy, GCPs with a high total reprojection error (larger RMSE value) should be examined or removed before generating the dense point cloud. The quality of final SfM products has been the focus of research over the past years [124], including efforts to optimize UAS topographic surveys and GCP deployment using Monte Carlo approaches [8], or to evaluate ground surface models in disturbed conifer forests [125,126]. Results from the latter research indicate that the RMSE is three times more strongly affected by canopy cover than by terrain slope.
The most critical aspect of geometric processing is determining the magnitude of the absolute vertical and horizontal errors that might restrict the application of topographic mapping, especially for change detection. The most efficient way to assure high-quality results with reduced absolute errors is to plan an appropriate study design. The distribution of GCPs also plays an important role, as a stratified arrangement inside the investigated area can optimize the vertical accuracy [90], and using a cross-flight pattern with nadir and oblique camera angles can decrease the vertical error [73]. Regardless of the choice of software, we recommend reporting processing parameters such as the accuracy level, preselection mode, key point limit, and tie point limit, as well as the parameters for the dense point cloud reconstruction.
Another important element for successful product retrieval is the hardware requirements of the SfM software. In most cases, central processing unit (CPU) power and random-access memory (RAM) are the limiting factors, but most of the software allows the use of graphics processing unit (GPU) cores, significantly improving the processing performance. For example, aligning 500 photos (12 megapixels) requires 2.5 GB of storage space, building a high-quality model another 8 GB, and building a high-quality arbitrary model up to 180 GB, with computation time approaching 24 h if a platform with a basic configuration is used.

5.2. Radiometric Processing

During the sensing process, the radiometric signal of an object is influenced by the measurement geometry (sun-object-sensor), the illumination conditions (direct-diffuse ratio), absorption and scattering by the atmosphere on the path from the object to the sensor, and the sensing system itself (vignetting and the response function of the optical path and the chip, respectively) [127,128]. To obtain a radiometrically reliable signal, these effects need to be quantified and, where possible, normalized. Additionally, the radiometric calibration of the sensor and the data processing procedures affect the final data product [17]. Thus, the spectral response of a system needs to be known to generate spectro-radiometrically consistent data products. Notably, [17] identified several essential steps that need to be carried out to obtain a radiometrically and spectrally consistent and comparable data product: (i) sensor characterization and calibration (both spectral and radiometric), (ii) reflectance factor generation (cf. [129]), (iii) radiometric scene normalization, (iv) radiometric validation and (v) metadata generation. When radiance is required, the transformation to reflectance in step (ii) can be omitted. Additionally, steps (ii) and (iii) can be swapped or carried out together.
Sensor calibration and characterization are usually carried out periodically in the lab. The sensor is exposed to a homogeneously illuminated target to normalize for the non-homogeneous illumination of the chip due to the optical path of the system (e.g., vignetting) and for differences in the radiometric response function of the individual elements of the chip. The result is a correction function that transforms the digital numbers recorded by the chip into linear radiometric coefficients. Additionally, if the radiometry of the light source is known, the response of the sensor can be mapped to physical units of radiance. For the spectral characterization of each band (response function, or at least central wavelength and full width at half maximum (FWHM)), the sensor is exposed to light with a known spectral emission [21]. Additionally, the spectral smile and keystone effects need to be characterized during lab calibration and should be taken into account in the post-processing of the data. Often, sensor calibration is carried out by sensor vendors, but ongoing assessment is important to ensure the stability of collection systems.
To generate reflectance images from the calibrated images, two approaches are usually used in low-altitude UAS remote sensing: (i) radiometric reference targets (RRT), or (ii) irradiance measurements based on a second sensor (or optical path) that measures the spectrally resolved downwelling illumination [99]. In the RRT approach, several (near-)Lambertian targets (mostly panels or tarps) of known spectroradiometric properties are placed within the survey area and overflown during the flight. The reflectance of these targets (after convolving them to the sensor's spectral band characteristics) can then be used with the empirical line method (ELM) [130] to generate reflectance and to normalize the signal for different illumination conditions between flights and for atmospheric effects (although the atmosphere between the ground and the UAS may only have a limited impact at low altitudes for standard reflectance measurements [131]). It has to be noted that a one-point empirical line calibration, where the sensor is pointed at one RRT before or after takeoff (similar to field-spectroscopy procedures), should be avoided, due to biases related to the shading of the hemisphere [cf. 52]. When irradiance measurements based on a second sensor are used, these measurements can be employed to transform the radiance measurements of the target objects to reflectance by dividing radiance by irradiance, after both sensing systems have been cross-calibrated for spectral and radiometric response. The assumption here is that the atmosphere between the ground and the sensing system does not influence the signal, and that the illumination is the same for both. This can be checked by comparing the signals over the calibration panels at the ground and at flight level in (or near) real time. Ideally, the irradiance sensor is also mounted on the UAS at a reciprocal angle to the measurement geometry of the sensor (if a sensor is pointing nadir, the irradiance sensor should point zenith), and both sensors should be actively stabilized to minimize the impact of the tilt and roll of the UAS. The latter approach has the advantage over stationary approaches (RRT or stationary irradiance measurements) that changes in illumination during the flight may be compensated (this does not work in cases where, for example, a shadow is cast from an oblique angle on the area within the field of view (FOV) of the sensor, such that it does not shadow the irradiance sensor). Ideally, the diffuse proportion of the irradiance field should also be characterized by having an additional sensor measuring the diffuse irradiance (e.g., by shading the direct illumination). The latter is particularly useful when radiative transfer models are to be used later on.
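To make the ELM step concrete, the sketch below fits a per-band linear relationship between the image signal extracted over the reference targets and their known reflectance factors, and then applies it to the whole band. The panel values are hypothetical, and a real workflow would repeat the fit for every band and preferably use more than two targets spanning the brightness range.

import numpy as np

def empirical_line(band_signal: np.ndarray, target_signal: np.ndarray,
                   target_reflectance: np.ndarray) -> np.ndarray:
    """Convert one band to reflectance via a least-squares linear (empirical line) fit."""
    gain, offset = np.polyfit(target_signal, target_reflectance, deg=1)
    return gain * band_signal + offset

# Mean digital numbers extracted over a dark and a bright reference panel, and the
# corresponding lab/field-measured reflectance factors for this band (hypothetical).
panel_dn = np.array([2100.0, 48500.0])
panel_reflectance = np.array([0.03, 0.56])

band = np.random.default_rng(0).uniform(1500, 52000, size=(4, 4))  # stand-in band data
print(empirical_line(band, panel_dn, panel_reflectance))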
Measurement geometry has a significant influence on the apparent reflectance, due to the anisotropic reflectance of most objects [52,132,133]. The anisotropy of an object can be quantified by assessing its bidirectional reflectance distribution function (BRDF) [129]. For imaging sensors, each pixel (due to its instantaneous field of view) has a different viewing geometry. Although the FOVs of 2D imaging sensors are rather small, it has been shown that even small differences in the measurement geometry significantly affect the apparent reflectance [52,133,134] and thermal signal [42,135] when standard photogrammetric software is used. The reason for this is that, when compiling an orthomosaic, the processing mode has a significant influence on which information from which pixel, and consequently with which measurement geometry, ends up in the orthomosaic. When the information from all images is averaged, the result represents the average of multiple viewing geometries. In contrast, when no blending is used, only viewing geometries close to nadir (depending on the image overlap) are used. Notably, [52] developed a conceptual framework to capture this principle and defined the specific field of view (SFOV), which describes the composition of pixels and their angular properties within a scene used to characterize a specific area of interest on the ground. It is very important to deliver information on the viewing geometry as metadata, since without it a spectral scene cannot always be interpreted accurately. In order to minimize viewing geometry effects as well as changes in illumination conditions during the flight, [136] developed a radiometric block adjustment for UAS remote sensing that can normalize for these effects. Finally, it is important to validate the results. However, as pointed out by [17,52,137], this might not be straightforward. Due to the anisotropy of most objects, validation targets need to be selected carefully, to prevent the angular properties of data from different sources (e.g., a non-imaging field spectrometer with a FOV of 25° versus imaging spectrometer data with a very narrow SFOV) from confounding the validation. One option is to use Lambertian targets that have not already been used for the ELM.
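To make the viewing-geometry metadata more tangible, the sketch below approximates the view zenith angle of each pixel of a nadir-pointing frame camera from its offset relative to the principal point and the focal length expressed in pixel units. This is only a first-order geometric approximation under assumed conditions (no lens distortion, no platform tilt), not a substitute for a full per-pixel geometry derived from the photogrammetric solution.

```python
import numpy as np

def view_zenith_angles(width, height, focal_length_px):
    """Approximate per-pixel view zenith angle (degrees) for a nadir frame camera."""
    cols, rows = np.meshgrid(np.arange(width), np.arange(height))
    # Offset of each pixel from the principal point (assumed at the image centre).
    dx = cols - (width - 1) / 2.0
    dy = rows - (height - 1) / 2.0
    radial = np.hypot(dx, dy)
    return np.degrees(np.arctan2(radial, focal_length_px))

# e.g., a 4000 x 3000 px sensor with a focal length of ~3600 px;
# the largest view zenith angles occur at the image corners.
# vza = view_zenith_angles(4000, 3000, 3600.0)
```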
While the aforementioned paragraphs provided the basics of radiometric processing, some additional aspects can be taken into account to obtain “perfect” radiometric data. Many spectral/hyperspectral approaches assume a perfectly diffusing reference for calculating relative reflectance (Lambertian reference). Likewise, the basic supervised vicarious calibration (SVC) method [138] does not consider the anisotropic nature of the hemispherical downwelling irradiance, which combines the direct solar and the anisotropic diffuse sky components. It is well established that, under theoretical conditions with the sun from near the zenith (90° elevation) down to about 60° elevation, the biconical reflectance factor (BCRF) provides radiance that can be corrected with a reflectance factor measured over a reference panel, e.g., SpectralonTM [139]. Part of the BCRF is a hemispherical-conical reflectance factor (HCRF) characterization of flat panel samples using an integrating sphere, a well-established technique from Nicodemus et al. [140]. All theoretical/laboratory approaches rely on goniometric systems and suggest applying a common ratio between HCRF (0°) and BCRF (45°), which was reported as a linear relationship by [141]. In outdoor illumination conditions, when the sky has low aerosol optical depth and no clouds, the diffuse component of the incoming light is often ignored [142]. A recent study [143] developed a new physically-based method to estimate the HCRF by combining photogrammetry and radiometry from images of lightweight multispectral cameras with a downwelling irradiance sensor aboard the UAS. The main advantages of this innovative concept are that no calibration targets are needed, it works in both clear-sky and overcast conditions, and it captures the directionality of the reflectance factor. The main drawback is that a well-calibrated sensor must be combined with a sky sensor.
Because of these influences on the spectral signature during data acquisition and processing, it is very important to capture metadata on the quality, processing and pixel properties of the final data product, and to supply this metadata with the product. A list of suggested metadata is provided in [17]. Additionally, [144] demonstrates how quality assurance information can be attached to a spectral scene. For a detailed description and literature on the procedures, please refer to the review provided by [17]. Examples of the use of these correction methods and their impact on the accuracy of the spectral information are presented in Table 4.

6. Quality Assurance

Whereas the acquisition of UAS imagery and photogrammetric products has proliferated in recent years, there is currently no standard specification for such UAS acquisitions that ensures data standardization across collection strategies, as exists in other remote sensing disciplines. As the practical usage of UAS technology involves diverse sensors and platforms, and many different applications, the uncertainties of the products can be high and are often user-dependent. Importantly, there is no easy-to-use tool available to perform quality assurance on UAS data, and the data provided to users are frequently of varying and often unknown quality. Accordingly, there is a need to provide coherent guidelines on how to acquire and process the data. Learning from lessons in other remote sensing disciplines (e.g., airborne and satellite retrieval), the parameters that need quality assurance are divided into two domains: radiometric and geolocation. The radiometric properties are important for providing reliable physical values of the sensed target, while the geolocation properties are important because users tend to request (and develop) applications that require high spatial resolution (and hence high spatial accuracy). UAS surveys impose an additional aspect of quality assurance, connected to the specific protocol adopted for flight planning, sensor settings, and data processing settings. Some comments on this are provided in the recent manuscript by [17].

6.1. Quality Assurance Metrics for Radiometric Data

The calibration/validation (cal/val) of onboard UAS sensors is an important step in ensuring the robustness of the collected data. The present section examines the cal/val process from a wider perspective, rather than focusing on the specifics of any published method. Following well-known practice, sensor output validation is performed by comparison with measured (known) data, including field in situ measurements and other validated UAS systems. The primary intention of cal/val approaches is to retrieve the best possible spectral ground validation results. This generally involves up-scaling data from the ground to the scale of the UAS sensing, and requires detailed quality assessment, often via standard statistical metrics. These might include Euclidean distance, spectral angle distance, comparison of absolute values [149], multiple linear regression and the average sum of deviation square score [150]. QA is mainly performed using correlation coefficients and RMSE. Alternative quality indicators are the at-sensor radiance-to-reflectance ratio (rad/ref) and the radiance-to-reflectance difference factor (RRDF), suggested by [138]. The rad/ref indicator is a quantitative measure for inspecting the data quality and radiometric performance of the sensor in question. The RRDF is calculated from two pairs of sensor and ground-truth spectra; confirmation over the entire spectral region is achieved only if both pairs provide fully overlapping curves. With all of these approaches, one has to be careful to avoid influences resulting from the differing viewing geometries of different instruments (cf. Section 5.2 and [17,52,115]). The cal/val concept plays an essential role in bringing UAS remote sensing closer to quantitative applications, and advanced cal/val technology and the QA/QI perspective should ensure that UAS remote sensing remains consistent with best practice.
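To make the comparison between up-scaled ground spectra and UAS-derived spectra concrete, the sketch below computes two of the metrics mentioned above, RMSE and the spectral angle, for a pair of spectra resampled to the same bands; the input arrays are assumed to have been prepared beforehand.

```python
import numpy as np

def rmse(reference, measured):
    """Root mean square error between two spectra sampled on the same bands."""
    diff = np.asarray(reference, dtype=float) - np.asarray(measured, dtype=float)
    return float(np.sqrt(np.mean(diff ** 2)))

def spectral_angle(reference, measured):
    """Spectral angle (radians) between two spectra, insensitive to brightness scaling."""
    r = np.asarray(reference, dtype=float)
    m = np.asarray(measured, dtype=float)
    cos_angle = np.dot(r, m) / (np.linalg.norm(r) * np.linalg.norm(m))
    return float(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
```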
The most common method for establishing a relationship between the imagery digital numbers (DNs) and ground-measured reflectance is the ELM, using radiometric reference targets [151,152]. The ELM is prone to accuracy deterioration due to variations in atmospheric conditions and reference target properties; therefore, stable weather conditions during missions and targets with close-to-Lambertian characteristics should be favored. Even though the ELM is one of the simplest and most accurate methods so far, overcoming the abovementioned constraints may significantly increase the cost and complexity of a UAS survey. Recently, Iqbal et al. [153] proposed a simplified radiometric calibration for the mini-MCA (multispectral camera array) sensor, in which white plastic boards serving as pseudo targets and radiometric calibration targets were used to convert DNs to spectral reflectance using a calibration equation. Xu et al. [154] emphasized that the ELM calibrates each channel independently, ignoring the correlation between spectral bands, and proposed a spectral angle constraint method (SACM) for the radiometric processing of all bands as a whole. To compare the ELM and the SACM, the results were evaluated against spectroradiometer data using the mean absolute error (MAE), mean relative percent error (MRPE), RMSE and standard deviation (SD). The proposed method contributes to radiometric calibration for bands located in the visible part of the spectrum, particularly over areas with varied vegetation coverage. The correlation between bands becomes more noticeable as more spectral channels are introduced, so quality assurance for the increasingly used hyperspectral sensors is needed. A detailed example of a procedure for hyperspectral snapshot cameras was presented in [144], introducing a novel quality assurance approach to trace pixel properties from the raw data of individual image cubes to the composed scene. Likewise, Barreto et al. [21] provided a comprehensive assessment of a line-scanning hyperspectral system, employing a range of approaches for quality assessment.
UAS mapping under variable solar conditions also has an impact on image quality. Wang et al. [155] proposed a pixel-wise radiometric and geometric calibration, extending the sensor calibration to such conditions and correcting vignetting effects. Stow et al. [114] showed that illumination geometry affects the retrieval of reflectance values, but that using vegetation indices and photogrammetry can mitigate those effects. Even though flying height did not prove to have a conclusive effect on radiometric quality, the authors raised the question of the atmospheric effect on radiometric calibration. Indeed, [156] highlights that atmospheric correction has been overlooked in UAS environmental studies, due to low-altitude (<120 m) flights. The same authors pointed out the impact of Rayleigh scattering (light scattering by particles smaller than the radiation wavelength, mostly gas molecules), and developed an atmospheric correction algorithm to eliminate path radiance. In a recent study, [110] included air humidity to determine its potential impact on the quality of low-altitude images. Based on missions at different sites in Poland, characterized by a specific climate, sun angles of 5°, 14° and 38°, and humidity ranging from 40% to 80%, the authors confirmed the impact of humidity and irradiation geometry on the radiometric quality of UAS images. Moreover, a new quality assessment indicator for the visible range was proposed, which accounts for the influence of air humidity and solar angle, including the high correlation between the humidity at a given altitude and the difference in reflection. This approach proved to be universal for flying heights from 75 to 300 m, different humidity levels and sun angles.

6.2. Thermal Domain

UASs typically carry uncooled thermal sensors because of their size and weight benefits. However, uncooled microbolometers tend to be less accurate and prone to thermal sensitivities, which can cause inconsistencies and reduced accuracy in thermal retrievals. Although UAS-based thermal sensors have improved over the last few years, calibration and validation remain challenging, yet they are critical to obtaining usable imagery. Generally, low-cost sensors are not radiometrically calibrated by the manufacturer, making it difficult to obtain absolute temperature measurements. Moreover, camera optics, design and enclosure influences can introduce energy dissipation effects in the microbolometers, resulting in distortion (vignetting) in the collected imagery [157]. As such, sensor calibration and vignetting correction are essential steps prior to any data collection effort. Several studies have developed protocols using blackbody devices in the laboratory to calibrate or correct uncooled sensors [158,159,160]. Nonetheless, the calibration often differs from sensor to sensor, and needs to be performed regularly [161].
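One simple possibility for such a laboratory correction is to image a blackbody at several set-point temperatures and fit a correction between the temperatures reported by the uncooled sensor and the blackbody reference; the sketch below fits a linear correction of this kind. The measurement values are illustrative assumptions, and published protocols typically also account for ambient and focal-plane temperature.

```python
import numpy as np

# Blackbody set-point temperatures (°C) and the temperatures reported by the
# uncooled sensor at those set points (illustrative values only).
blackbody_temps = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
sensor_temps = np.array([11.8, 21.2, 30.7, 40.3, 49.6])

# Linear correction: T_true ≈ gain * T_sensor + offset
gain, offset = np.polyfit(sensor_temps, blackbody_temps, deg=1)

def calibrate(temperature_image):
    """Apply the blackbody-derived correction to a sensor temperature image (°C)."""
    return gain * temperature_image + offset
```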
Complementary to the laboratory cal/val, coincident thermal ground measurements (using cooled cameras and/or well-calibrated small thermal sensors) from various targets are required during the UAS acquisition [157,162]. These measurements allow direct comparisons with the UAS-based data. Additionally, we highly recommend deploying aluminum targets (e.g., pizza trays), as their very low emissivity allows the sky temperature to be inferred [162]. Moreover, when used as GCPs within the observed scene, these targets help to ensure accurate absolute geolocation of the thermal imagery [163]. For robust detection, composite targets made of, e.g., styrofoam and black metal, have been shown to work well with thermal data [42]. Alternatively, temperature-monitored water baths (with a standard thermocouple assembly) can be installed within the observed scene to provide reference kinetic temperatures during the mission. However, the challenge is to effectively monitor the temperature of the whole water bath, particularly as environmental fluctuations occur.
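One reason for estimating the sky temperature is to correct observed brightness temperatures for target emissivity and reflected sky radiation. The sketch below implements a simplified broadband (Stefan-Boltzmann based) correction under the assumption of a transparent atmosphere between sensor and target; it is an approximation, not a full radiative transfer treatment.

```python
def surface_temperature(t_brightness_k, t_sky_k, emissivity):
    """Simplified broadband emissivity correction of a thermal measurement.

    t_brightness_k -- at-sensor brightness temperature of the target (K)
    t_sky_k        -- effective sky temperature (K), e.g., inferred from a
                      low-emissivity aluminum target in the scene
    emissivity     -- target emissivity (0-1)
    """
    # The observed signal is approximated as the sum of the emitted component
    # and the reflected (sky) component.
    return ((t_brightness_k ** 4 - (1.0 - emissivity) * t_sky_k ** 4) / emissivity) ** 0.25

# e.g., a canopy pixel observed at 302.0 K, emissivity 0.98, sky temperature 260 K:
# t_surface = surface_temperature(302.0, 260.0, 0.98)
```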
Obtaining accurate thermal orthomosaics remains challenging because of (1) the rapid changes in temperature during the morning and evening transitions [42], and (2) inconsistencies between images related to in-flight effects (wind speed and direction) on uncooled sensors. To date, there is still a lack of specific UAS-based thermal processing protocols for quantifying these uncertainties, so more work is required to characterize these effects. More generally, examining pixel-based temperature variability between orthoimages would be essential to provide QA of thermal data. Further, the mosaicking methods offered by commercial software are not able to account for these pixel-based thermal variabilities. Perich et al. [42] investigated the influence of different processing modes and found significant differences, resulting from the interaction between the viewing geometry and canopy structure in combination with soil influences. Thus, specific thermal QA protocols (e.g., the pixel-based standard deviation of the mosaicking) still need to be developed and integrated into software, as the standard practices for optical-based systems may not be appropriate.
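As one possible realization of such a pixel-based QA metric, the sketch below computes a per-pixel standard deviation across a stack of thermal images that have already been rectified onto a common grid, highlighting where overlapping observations disagree; the co-registration and resampling of the images is assumed to have been done beforehand.

```python
import numpy as np

def per_pixel_variability(image_stack):
    """Per-pixel standard deviation across co-registered thermal images.

    image_stack -- 3D array (n_images, rows, cols), with np.nan where an
                   image does not cover a given pixel
    """
    count = np.sum(~np.isnan(image_stack), axis=0)
    std_map = np.nanstd(image_stack, axis=0)
    # Mask pixels observed fewer than two times, where a deviation is undefined.
    std_map[count < 2] = np.nan
    return std_map, count
```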

6.3. Final Quality Assurance of UAS Products

Even though numerous UAS environmental studies have been undertaken using diverse system configurations in recent years, the level of quality assurance along each step of the workflow remains largely underreported. In the context of the harmonization of UAS environmental monitoring, quality assurance must play an important role in terms of standardized approaches, monitoring practices, and the reproducibility of the research itself. As already mentioned, platform choice, flying speed and altitude are highly correlated with GSD. Moreover, [164] emphasized that using portable resolution test charts before imaging is a very useful first step for UAS image quality assurance. These charts have specifically designed characteristics and are used for evaluation and camera calibration. Two types of test charts were examined by the authors: (i) the tri-bar pattern, in which resolution is calculated according to fixed bar size and frequency; and (ii) the Siemens star, in which resolution is calculated by measuring the boundary of the center in comparison to the target's outer diameter and size. The second method is more subjective, due to the visual demarcation of the boundary of the expanding black and white lines, whose size and number are specified. For both platform types, carrying two camera models at different flying altitudes, the GSD of the orthoimages was highly correlated with the theoretical resolution (GSD calculated prior to the mission; see Section 2.3).
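For reference, the theoretical GSD used in such comparisons follows directly from the camera geometry; a minimal sketch of this calculation is given below, with illustrative camera parameters rather than values from any specific study.

```python
def ground_sampling_distance(flight_height_m, focal_length_mm, pixel_size_um):
    """Theoretical GSD (cm/pixel) for a nadir-looking frame camera."""
    pixel_size_mm = pixel_size_um / 1000.0
    gsd_m = (pixel_size_mm / focal_length_mm) * flight_height_m
    return gsd_m * 100.0  # metres to centimetres

# e.g., a camera with an 8.8 mm focal length and 2.4 µm pixels flown at 100 m:
# gsd_cm = ground_sampling_distance(100.0, 8.8, 2.4)   # ~2.7 cm/pixel
```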
There is currently an unmet need for the standardization of SfM methods, along with detailed survey reporting and final product assessment. An example of such guidelines is given for the use of SfM photogrammetry in geomorphic research [84], where the authors pinpointed three key steps: (1) using an appropriate survey design; (2) identifying systematic errors and precision within results; and (3) propagating uncertainty estimates into the final data product. In order to ensure reproducible SfM-based surveys, the authors provided thirteen successive steps, including mandatory points such as literature review; equipment choice and calibration; detailed reporting of image acquisition and control measurements; photogrammetric processing; quality of final results and error management; and final product assessment. This workflow provides the most detailed quality assessment of photogrammetric processing, as it includes accuracy assessment, residual error on image observations, correlations between camera parameters, and finally comparison with independent checkpoint coordinates. Finally, the authors drew attention to determining the potential impact of residual uncertainty on SfM final products, whether through simulation or analytical solutions for the propagation of error.
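A simple component of such product assessment is the comparison against independent checkpoints; the sketch below computes horizontal and vertical RMSE from surveyed checkpoint coordinates and the corresponding coordinates extracted from the photogrammetric product, with both arrays assumed as inputs.

```python
import numpy as np

def checkpoint_rmse(surveyed_xyz, product_xyz):
    """Horizontal and vertical RMSE (input units) from independent checkpoints.

    surveyed_xyz -- (n, 3) array of checkpoint coordinates from an independent survey
    product_xyz  -- (n, 3) array of the same points measured in the SfM product
    """
    diff = np.asarray(product_xyz, dtype=float) - np.asarray(surveyed_xyz, dtype=float)
    rmse_xy = np.sqrt(np.mean(diff[:, 0] ** 2 + diff[:, 1] ** 2))
    rmse_z = np.sqrt(np.mean(diff[:, 2] ** 2))
    return rmse_xy, rmse_z
```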
Orthoimage mosaics are another product used for direct measurement, and classification is one of the most common analysis approaches [23]. The choice of classification method largely depends on the spatial and spectral quality of the data, the extent to which the targeted unit can be distinguished from its surroundings, and the particular goal of the environmental survey. The pixel-based approach analyzes only the spectral properties of each pixel, while object-based image analysis (OBIA) integrates information on the texture and shape of pixels and their neighbors, arranging them into segments and assigning them to different classes [165]. With the rise of high-resolution sensors, the spatial resolution has become finer than the object of interest; therefore, a pixel-based approach applied to high-resolution images can cause a “salt and pepper” effect, contributing to classification inaccuracies [165,166]. As such, the OBIA technique is gradually replacing the traditional pixel-based approach for the classification of high-resolution UAS imagery [167]. A systematic comparison of seven different OBIA classification techniques using RGB images with 0.2 m spatial resolution [168] concluded that the random forest classifier gave the most accurate results. Similarly, [19] came to the same conclusion in a review of 173 scientific publications on supervised OBIA image classification, where the random forest approach outperformed the others. An inseparable part of the classification choice is the accuracy assessment method. However, most accuracy measures were designed for the pixel-based approach, and area-based accuracy assessment and the size of the training data are proving to be more stable than traditional point-based methods. The accuracy level of the final product largely depends on the map purpose, and the rise of object-based supervised classification has created room for research on uncertainties and accuracy assessment in the OBIA field.
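As a schematic illustration of such a supervised workflow, the sketch below trains a random forest on a feature table and reports overall accuracy and a confusion matrix on a held-out set using scikit-learn. The randomly generated features and labels are placeholders for real per-segment (or per-pixel) spectral and textural attributes derived from the segmented orthomosaic, and the split-based accuracy shown here does not replace a proper area-based assessment.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, confusion_matrix
from sklearn.model_selection import train_test_split

# Placeholder data standing in for per-segment spectral/textural features and labels.
rng = np.random.default_rng(0)
features = rng.normal(size=(500, 8))
labels = rng.integers(0, 4, size=500)

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.3, random_state=0, stratify=labels)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

y_pred = clf.predict(X_test)
print("Overall accuracy:", accuracy_score(y_test, y_pred))
print("Confusion matrix:\n", confusion_matrix(y_test, y_pred))
```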
A recent and comprehensive document, which attempts to standardize UAS utilization in Australia and New Zealand, may be used as a basic guide for both UAS data acquisition and quality control [169]. Based on this “user needs report”, further points that require QA are provided, and a software package (QA4) addressing them has recently been developed.

7. Discussion and Final Remarks

The main purpose of this review is to provide insights into, and an overview of, the steps required to ensure a successful UAS environmental survey. We singled out five key elements of this process, which have been described to some extent in recent literature contributions. Throughout the proposed workflow, we suggested specific reviews or research studies for each step of the UAS data planning, acquisition, processing and quality assurance workflow (Figure 4). Even though each sub-section represents a scientific discipline of its own, their mutual dependence requires all the steps to be compatible to ensure successful outputs of a UAS study. We emphasize the importance of UAS regulations, but the significant heterogeneity of national regulations indicates that the implementation of UAS in everyday professional use is still progressing. The impact of regulations is most evident in restrictions on surveying urban areas, platform weight and allowed flight altitude. UAS weight affects platform and sensor choice, while flight altitude limitations can impact the intended spatial resolution, flight duration, image overlap, and area covered. GSD is also affected by camera properties and the type of sensor used. As the spatial resolution differs amongst sensors, proper calibration is needed, especially for simultaneous use. The intended GSD influences the choice of classification approach and method, as well as DSM and orthomosaic quality.
The success of the SfM algorithm also depends on image overlap, and for specific types of vegetation research it can almost be concluded that optimal overlap thresholds exist. The overlap settings may be strongly influenced by the specific characteristics (heterogeneity) of the area. When tie points can easily be identified in the scene, it is possible to use 70% front- and 40% side-overlap as minimum values for precision agriculture [5], with flying heights between 60 and 90 m. Forward- and side-overlaps of at least 80% and 60% are recommended for forestry surveys [2]. Similar overlap percentages have been used for natural vegetation and invasive plant surveys, even though different sensors require specific overlaps (e.g., suggested image overlaps for thermal sensors are above 80% [51]). There is a noticeable distinction in optimal side-lap percentage, as less side lap has been used for mapping structurally and spatially homogeneous areas, such as crop fields. It is therefore likely that for surveys of heterogeneous natural ecosystems (e.g., riparian vegetation), a higher side lap is desirable than for homogeneous ecosystems (e.g., rangelands, grasslands [37,170,171]). On the other hand, homogeneous scenes imply fewer distinct tie points for the SfM algorithm, so for quality final products a side overlap higher than 40% is required. Nevertheless, there is no general methodology for choosing the optimal overlap configuration for UAS flight planning, even though the monitored area can be characterized by specific quantitative parameters, e.g., the complexity of relief can be estimated from elevation differences, or scene homogeneity/heterogeneity can be assessed with specific tools such as FRAGSTATS [172]. In practice, the purpose of the monitoring itself plays an important role in choosing the optimal image overlap. At the same time, even small changes in other flight parameters can affect the accuracy of the data obtained under the same overlap conditions. For example, in a study on the monitoring of olive orchards [173], image overlap and flying height were tested to obtain the optimal configuration for generating DSMs in terms of processing time. Side overlap was kept constant at 60%, and the best results were obtained with 95% front overlap at a 100 m flying height. The quality of the 3D-reconstructed olive trees was strongly affected by the overlap percentage, as was the tree volume accuracy (TVA), which was linearly correlated with the front overlap. The same applies to the dependency of the optimal overlap configuration on the UAS sensor type. Similarly, [60] examined the connection between image overlap, processing time, and low-altitude flights, but the images were extracted from compressed MPEG-4 video as JPEG or PNG image files. This approach enables extremely high forward overlap (up to 98.8%) without an increase in flying time, and proved to be beneficial in low-altitude flights (15–30 m above the canopy). Still, using a forward overlap higher than 95% increases the processing time, while a side overlap ranging from 50% to 70% proved to be optimal for reconstruction accuracy. Including a cross-flight pattern and a tilted camera can also help in obtaining optimal image overlap. The optimal value represents a compromise between sufficient image overlap for a quality final product on the one hand, and computation time and, more importantly, flying time on the other.
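The geometric relationship behind these overlap choices is straightforward; the sketch below derives the ground footprint of a nadir image and the along-track (trigger) and across-track (flight-line) spacing implied by a given forward and side overlap, under the simplifying assumptions of flat terrain and a nadir-looking camera. The example sensor and lens values are illustrative only.

```python
def image_spacing(flight_height_m, focal_length_mm, sensor_width_mm,
                  sensor_height_mm, forward_overlap, side_overlap):
    """Footprint and spacing for a nadir frame camera over flat terrain.

    forward_overlap / side_overlap are fractions (e.g., 0.80 for 80%).
    Returns (footprint_across_m, footprint_along_m, line_spacing_m, trigger_spacing_m).
    """
    footprint_across = flight_height_m * sensor_width_mm / focal_length_mm
    footprint_along = flight_height_m * sensor_height_mm / focal_length_mm
    line_spacing = footprint_across * (1.0 - side_overlap)
    trigger_spacing = footprint_along * (1.0 - forward_overlap)
    return footprint_across, footprint_along, line_spacing, trigger_spacing

# e.g., a 13.2 x 8.8 mm sensor with an 8.8 mm lens at 100 m and 80/60% overlap:
# print(image_spacing(100.0, 8.8, 13.2, 8.8, 0.80, 0.60))
```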
Flight duration limits the range of the UAS; thus, depending on the scale of the survey, more energy-efficient fixed-wing platforms are used for mapping large areas. The payload weight affects flying time as well, as heavier sensors significantly reduce the acquisition period. Besides weight, each sensor type has a set of specific limitations that must be considered within the general workflow. The main limitation of the widely adopted RGB, modified RGB and color-infrared cameras is the overlap between spectral bands, and the fact that these bands are not always those originally used in calculating vegetation indices, so modified vegetation indices are often generated. Many of these 2D imagers record all bands at the same time, although there are also sequential 2D imagers that record the desired bands sequentially in time (e.g., Rikola or Cubert hyperspectral imagers, with 50–100 bands). The high spatial resolution follows this wide spectral selection to some extent, but the band spatial offset needs to be corrected in the image pre-processing step. Systems such as the Tetracam, MicaSense Parrot-Sequoia and RedEdge use several integrated cameras to record images of different spectral resolution; pushbroom sensors record a line of spectral data [174]; and point spectrometers acquire spectral signatures of objects. A detailed radiometric workflow was provided in the latest review of spectral sensor technology by Aasen et al. [17]. Due to the high diversity of spectral sensors, the border between classical multispectral and hyperspectral sensors is becoming blurred, and it is important to emphasize that no single device can meet all needs.
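For the band spatial offset mentioned above, one possible approach, shown as a sketch below, is to estimate a simple translational co-registration between bands with phase correlation (scikit-image) and to apply the shift with SciPy. This is only adequate when the misalignment is close to a pure translation; multi-camera systems with larger baselines generally require a full geometric alignment.

```python
import numpy as np
from scipy.ndimage import shift
from skimage.registration import phase_cross_correlation

def coregister_band(reference_band, moving_band):
    """Align one spectral band to a reference band by a translational shift."""
    offset, error, _ = phase_cross_correlation(
        reference_band.astype(float), moving_band.astype(float), upsample_factor=10)
    # Apply the estimated (row, col) offset to the moving band.
    aligned = shift(moving_band.astype(float), offset, order=1, mode="nearest")
    return aligned, offset

# e.g., aligning the NIR band of a multi-camera system to the red band:
# nir_aligned, row_col_offset = coregister_band(red_band, nir_band)
```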
Georeferencing is an indispensable part of UAS data processing, and besides providing spatial positioning for the spectral data, it is tightly incorporated within each workflow step. The development of synchronized IMU/GNSS technology has significantly improved the accuracy of the sensed data, while even a limited number of GCPs helps to minimize standardized error values. As GCPs still provide the most accurate results, a step towards mitigating their time-consuming placement and geolocation is the use of specially designed GCPs with integrated GNSS, such as Propeller® AeroPoints [175]. The location of the AeroPoints can be recorded for a couple of hours, automatically uploaded after the flight and subsequently post-processed [95,176]. The type of GCP deployed also depends on the sensor choice; for low-spatial-resolution and low-contrast thermal sensors, aluminum GCPs reflecting up to 90% of thermal radiation are needed [177]. The number and density of GCPs need to be defined in the initial study design, informed by the required internal precision and overall repeatability of the survey, along with the intended GSD of the final products [8]. Ultimately, precisely georeferenced ground data are perhaps the most crucial part of multi-temporal environmental monitoring.
Assuring the quality of each step included in this guidance guarantees the quality of the final products and ensures their suitable application. It is clear by now that QA is not an independent step within the workflow, but rather an inseparable part of each of the processes mentioned, especially as subsequent steps depend on the quality of previously collected data and on different system configurations. Survey cost is an integral part of the study design, and the price of each component included in the workflow can vary from several thousand dollars to considerably more. This price range implies that not all UASs can be considered low-cost remote sensing tools, and the utilization of more complex systems requires greater funding and expert knowledge. With this in mind, the quality assurance of data obtained with expensive equipment is crucial for realizing its full potential. Composing an optimal, cost-efficient study design ultimately represents a trade-off between the purpose of the final product on one side, and whether the improvement brought by complex systems justifies the additional survey cost on the other. Nevertheless, these expensive components are still generally used by remote sensing “core” experts, although prices of professional systems show a tendency to decrease.
We strongly recommend providing all metadata, parameters, and system configurations used within each step of a UAS environmental survey. Detailed overviews of camera settings, sensor calibration, flight configuration, geometric and radiometric processing, as well as the different software and approaches used, enable the independent analysis and reproducibility of a study [17]. Besides the suggested literature for each step and the associated literature for quality assurance, we also present, in Appendix B, a form with the necessary metadata that should be provided with each UAS survey. Ultimately, cooperation of experts within each component of the workflow could be one of the solutions for obtaining the highest quality data. Moreover, the networking of geologists, ecologists, land managers, etc., with engineers, programmers and other remote sensing specialists leads to best practice in designing the optimal study. These processes will result in standardized procedures and workflows, through the verification of past surveys or the adoption of new methods.

8. Conclusions

The long unmet demand for near real-time, cost-effective data has been largely resolved by advances in UAS technology. With the availability of these new sensor and platform combinations, an entirely new frontier in remote sensing has been revealed. While still relatively immature, new sensors (passive and active) with high spectral resolution and spatial coverage, along with capabilities for fusing data from several sensors, are shedding light on the potential of this technology. New platforms that can carry heavy payloads, together with easy-to-operate UAS with user-friendly operation and processing systems, will no doubt make this technology a major pillar of modern remote sensing, equal or even superior to the more traditional airborne and satellite domains.
The present manuscript provides a comprehensive review of the most recent results in the field of UAS environmental mapping using passive sensors. The literature offers a significant number of useful suggestions, which have been harmonized here to provide guidance on the use of drones for scientific applications. The information reviewed focuses mainly on the practical usage of UAS, and can be easily adapted to several fields, such as vegetation mapping, soil water monitoring, snow assessment, urban mapping, and more.
Currently, it is not standard practice for UAS publications to provide a detailed description of the specific data collection and processing steps that were followed. However, a detailed description of the flight characteristics and the preprocessing applied to the published data is extremely useful, not only for scientific reproducibility, but also to guarantee a certain quality, while also advancing and educating the broader field. For this reason, we developed a UAS survey form (see Appendix B, Table A2), in which all potential variables and details about a survey are listed. This form may be considered a potential reference in future UAS data collection studies, in order to guarantee the reproducibility of the results. Finally, we have identified the research pillars that should be used in an ideal workflow, where each step of the UAS survey is properly managed. Systematizing this information, we have built a workflow that can be used as a reference for UAS studies.

Author Contributions

S.M. conceived the structure of the research and coordinated the writing of the manuscript. G.T. carried out the literature research and wrote the first draft of the manuscript. J.M. contributed to Section 2.3. G.G., M.R.J., S.H., and J.J.A. contributed to Section 3.2 and Section 3.3. H.A. wrote major parts of the radiometric calibration, camera settings and UAS software sections and contributed to various other sections of the manuscript. A.B., M.P., K.J., Y.M. and E.B.-D. contributed to Section 6. M.F.M. provided comprehensive editing and review of the various paper sections. All authors contributed to the review of the manuscript and to the discussion and conclusion sections. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the COST Action CA16219 “HARMONIOUS—Harmonization of UAS techniques for agricultural and natural ecosystems monitoring”. H.A. is supported by the Swiss National Science Foundation (SNSF), within the framework of the National Research Programme “Sustainable Economy: resource-friendly, future-oriented, innovative” (NRP 73), in the InnoFarm project, Grant-N° 407340_172433. K.J., Y.M. and M.F.M. were supported by the King Abdullah University of Science and Technology.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A. Checklists Before A Flight

Table A1. Preliminary checklist before flying.
HARMONIOUS UAS Check-List:
1. Check the weather conditions (rain and strong wind may be particularly critical).
2. Identify the timing of the flight with respect to the best solar illumination; the central hours of the day help to avoid shadows in the scene.
3. Make sure that you have GPS coverage to fly in “safe” mode.
4. Take off from areas that are sufficiently large, free of obstacles and level.
5. Check for any deformation of the propellers or frame.
6. Execute a short manual flight to ensure that the vehicle is stable and the radio control is performing well.
7. If the presence of people and/or animals is expected in the survey area, plan the flight for when such presence is at a minimum.
8. In the case of critical operations, obtain all permits in advance.
9. Check the status of the batteries of your drone, controller, sensors, and tablet.
10. Check that the propellers are intact and well fixed.
11. For safety, deactivate the Bluetooth and Wi-Fi of your device (we recommend “Airplane” mode).
12. Check that you have enough free memory on the SD card used to store the acquired data.
13. Perform the compass (magnetic compass) calibration.
14. Wait for the drone to connect to as many satellites as possible (minimum required: 5).
15. Set the “return to home” point in case of anomaly before starting.
16. Take off and fly.

Appendix B. UAS-Survey Description

Table A2. Metadata associated with each UAS-survey.
Study design
- Platform characteristics: Platform type; Weight & payload capacity; Maximum speed; Flight height & coverage; On-board GNSS receiver
- Sensor characteristics: Sensor type & name; Sensor weight
- Camera settings: Pixel size; Sensor size; Focal length; ISO; Aperture; Shutter speed
- Flight plan: GSD (cm); Flight height; Flight speed; Forward & side image overlap
- UAS control software: Software name
- Georeferencing: Type of georeferencing; Number of GCPs; Arrangement of GCPs
Flight mission
- Weather: Wind power & direction; Illumination condition; Humidity
- Mission: Average flying speed; Flying time; Flight pattern; Camera angle; Image format
Processing of aerial data
- Geometric processing: SfM tool name; Final product type; Bundle adjustment
- Radiometric processing: Signal-to-noise ratio; Radiometric resolution; Viewing geometry; Band configuration; Reflectance calculation method; Vignetting; Motion blur
- Accuracy assessment: Error measure; Statistical value; Error management; Classification accuracy

References

  1. Adão, T.; Hruška, J.; Pádua, L.; Bessa, J.; Peres, E.; Morais, R.; Sousa, J.J. Hyperspectral Imaging: A Review on UAV-Based Sensors, Data Processing and Applications for Agriculture and Forestry. Remote Sens. 2017, 9, 1110. [Google Scholar] [CrossRef] [Green Version]
  2. Dandois, J.; Olano, M.; Ellis, E. Optimal Altitude, Overlap, and Weather Conditions for Computer Vision UAV Estimates of Forest Structure. Remote Sens. 2015, 7, 13895–13920. [Google Scholar] [CrossRef] [Green Version]
  3. Lisein, J.; Pierrot-Deseilligny, M.; Bonnet, S.; Lejeune, P. A Photogrammetric Workflow for the Creation of a Forest Canopy Height Model from Small Unmanned Aerial System Imagery. Forests 2013, 4, 922–944. [Google Scholar] [CrossRef] [Green Version]
  4. Alvarez-Taboada, F.; Paredes, C.; Julián-Pelaz, J. Mapping of the Invasive Species Hakea sericea Using Unmanned Aerial Vehicle (UAV) and WorldView-2 Imagery and an Object-Oriented Approach. Remote Sens. 2017, 9, 913. [Google Scholar] [CrossRef] [Green Version]
  5. Mesas-Carrascosa, F.J.; Clavero Rumbao, I.; Torres-Sánchez, J.; García-Ferrer, A.; Peña, J.M.; López Granados, F. Accurate ortho-mosaicked six-band multispectral UAV images as affected by mission planning for precision agriculture proposes. Int. J. Remote Sens. 2017, 38, 2161–2176. [Google Scholar] [CrossRef]
  6. Yuan, W.; Li, J.; Bhatta, M.; Shi, Y.; Baenziger, P.; Ge, Y. Wheat Height Estimation Using LiDAR in Comparison to Ultrasonic Sensor and UAS. Sensors 2018, 18, 3731. [Google Scholar] [CrossRef] [Green Version]
  7. Tauro, F.; Pagano, C.; Phamduy, P.; Grimaldi, S.; Porfiri, M. Large-Scale Particle Image Velocimetry from an Unmanned Aerial Vehicle. IEEE ASME Trans. Mechatron. 2015, 20, 3269–3275. [Google Scholar] [CrossRef]
  8. James, M.R.; Robson, S.; d’Oleire-Oltmanns, S.; Niethammer, U. Optimising UAV topographic surveys processed with structure-from-motion: Ground control quality, quantity and bundle adjustment. Geomorphology 2017, 280, 51–66. [Google Scholar] [CrossRef] [Green Version]
  9. Whitehead, K.; Hugenholtz, C.H. Applying ASPRS Accuracy Standards to Surveys from Small Unmanned Aircraft Systems (UAS). Photogramm. Eng. Remote Sens. 2015, 81, 787–793. [Google Scholar] [CrossRef]
  10. Gomez, C.; Purdie, H. UAV-based Photogrammetry and Geocomputing for Hazards and Disaster Risk Monitoring—A Review. Geoenviron. Disasters 2016, 3, 23. [Google Scholar] [CrossRef] [Green Version]
  11. Manfreda, S.; McCabe, M.; Miller, P.; Lucas, R.; Pajuelo Madrigal, V.; Mallinis, G.; Ben Dor, E.; Helman, D.; Estes, L.; Ciraolo, G.; et al. On the Use of Unmanned Aerial Systems for Environmental Monitoring. Remote Sens. 2018, 10, 641. [Google Scholar] [CrossRef] [Green Version]
  12. Ewertowski, M.W.; Tomczyk, A.M.; Evans, D.J.A.; Roberts, D.H.; Ewertowski, W. Operational Framework for Rapid, Very-high Resolution Mapping of Glacial Geomorphology Using Low-cost Unmanned Aerial Vehicles and Structure-from-Motion Approach. Remote Sens. 2019, 11, 65. [Google Scholar] [CrossRef] [Green Version]
  13. Rusnák, M.; Sládek, J.; Kidová, A.; Lehotský, M. Template for high-resolution river landscape mapping using UAV technology. Measurement 2018, 115, 139–151. [Google Scholar] [CrossRef]
  14. Hamada, Y.; Stow, D.A.; Coulter, L.L.; Jafolla, J.C.; Hendricks, L.W. Detecting Tamarisk species (Tamarix spp.) in riparian habitats of Southern California using high spatial resolution hyperspectral imagery. Remote Sens. Environ. 2007, 109, 237–248. [Google Scholar] [CrossRef]
  15. Pande-Chhetri, R.; Abd-Elrahman, A.; Liu, T.; Morton, J.; Wilhelm, V.L. Object-based classification of wetland vegetation using very high-resolution unmanned air system imagery. Eur. J. Remote Sens. 2017, 50, 564–576. [Google Scholar] [CrossRef] [Green Version]
  16. Assmann, J.J.; Kerby, J.T.; Cunliffe, A.M.; Myers-Smith, I.H. Vegetation monitoring using multispectral sensors—Best practices and lessons learned from high latitudes. J. Unmanned Veh. Syst. 2019, 7, 54–75. [Google Scholar] [CrossRef] [Green Version]
  17. Aasen, H.; Honkavaara, E.; Lucieer, A.; Zarco-Tejada, P. Quantitative Remote Sensing at Ultra-High Resolution with UAV Spectroscopy: A Review of Sensor Technology, Measurement Procedures, and Data Correction Workflows. Remote Sens. 2018, 10, 1091. [Google Scholar] [CrossRef] [Green Version]
  18. Gini, R.; Passoni, D.; Pinto, L.; Sona, G. Use of Unmanned Aerial Systems for multispectral survey and tree classification: A test in a park area of northern Italy. Eur. J. Remote Sens. 2014, 47, 251–269. [Google Scholar] [CrossRef]
  19. Ma, L.; Li, M.; Ma, X.; Cheng, L.; Du, P.; Liu, Y. A review of supervised object-based land-cover image classification. ISPRS J. Photogramm. Remote Sens. 2017, 130, 277–293. [Google Scholar] [CrossRef]
  20. Singh, K.K.; Frazier, A.E. A meta-analysis and review of unmanned aircraft system (UAS) imagery for terrestrial applications. Int. J. Remote Sens. 2018, 39, 5078–5098. [Google Scholar] [CrossRef]
  21. Barreto, M.A.P.; Johansen, K.; Angel, Y.; McCabe, M.F. Radiometric Assessment of a UAV-Based Push-Broom Hyperspectral Camera. Sensors 2019, 19, 4699. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  22. Müllerová, J.; Brůna, J.; Bartaloš, T.; Dvořák, P.; Vítková, M.; Pyšek, P. Timing Is Important: Unmanned Aircraft vs. Satellite Imagery in Plant Invasion Monitoring. Front. Plant Sci. 2017, 8, 887. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  23. Whitehead, K.; Hugenholtz, C.H. Remote sensing of the environment with small unmanned aircraft systems (UASs), part 1: A review of progress and challenges. J. Unmanned Veh. Syst. 2014, 2, 69–85. [Google Scholar] [CrossRef]
  24. Morgenthal, G.; Hallermann, N. Quality assessment of unmanned aerial vehicle (UAV) based visual inspection of structures. Adv. Struct. Eng. 2014, 17, 289–302. [Google Scholar] [CrossRef]
  25. Tziavou, O.; Pytharouli, S.; Souter, J. Unmanned Aerial Vehicle (UAV) based mapping in engineering geological surveys: Considerations for optimum results. Eng. Geol. 2018, 232, 12–21. [Google Scholar] [CrossRef] [Green Version]
  26. Ahmadzadeh, A.; Jadbabaie, A.; Kumar, V.; Pappas, G.J. Multi-UAV cooperative surveillance with spatio-temporal specifications. In Proceedings of the 45th IEEE Conference on Decision and Control, San Diego, CA, USA, 13–15 December 2006; pp. 5293–5298. [Google Scholar]
  27. Kontogiannis, S.G.; Ekaterinaris, J.A. Design, performance evaluation and optimization of a UAV. Aerosp. Sci. Technol. 2013, 29, 339–350. [Google Scholar] [CrossRef]
  28. Kedzierski, M.; Wierzbicki, D. Radiometric quality assessment of images acquired by UAV’s in various lighting and weather conditions. Measurement 2015, 76, 156–169. [Google Scholar] [CrossRef]
  29. Chen, H.; Wang, X.M.; Jiao, Y.S.; Li, Y. Research on search probability and camera footprint of region coverage for UAVs. In Proceedings of the IEEE International Conference on Control and Automation, Christchurch, New Zealand, 9–11 December 2009; pp. 1920–1924. [Google Scholar]
  30. Cruzan, M.B.; Weinstein, B.G.; Grasty, M.R.; Kohrn, B.F.; Hendrickson, E.C.; Arredondo, T.M.; Thompson, P.G. Small unmanned aerial vehicles (micro-UAVs, drones) in plant ecology. Appl. Plant Sci. 2016, 4, 1600041. [Google Scholar] [CrossRef]
  31. Stöcker, C.; Bennett, R.; Nex, F.; Gerke, M.; Zevenbergen, J. Review of the Current State of UAV Regulations. Remote Sens. 2017, 9, 459. [Google Scholar] [CrossRef] [Green Version]
  32. EU Commission Delegated Regulation (EU) 2019/945 on unmanned aircraft systems and on third-country operators of unmanned aircraft systems. Off. Journey 2019, L152, 1–40.
  33. Shi, Y.; Thomasson, J.A.; Murray, S.C.; Pugh, N.A.; Rooney, W.L.; Shafian, S.; Rajan, N.; Rouze, G.; Morgan, C.L.S.; Neely, H.L.; et al. Unmanned Aerial Vehicles for High-Throughput Phenotyping and Agronomic Research. PLoS ONE 2016, 11, e0159781. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  34. Michez, A.; Piégay, H.; Lisein, J.; Claessens, H.; Lejeune, P. Classification of riparian forest species and health condition using multi-temporal and hyperspatial imagery from unmanned aerial system. Environ. Monit. Assess. 2016, 188, 146. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  35. Müllerová, J.; Brůna, J.; Dvořák, P.; Bartaloš, T.; Vítková, M. Does the Data Resolution/Origin Matter? Satellite, Airborne and UAV Imagery and UAV Imagery to Tackle Plant Invasions. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, 41, 903–908. [Google Scholar] [CrossRef]
  36. Gini, R.; Sona, G.; Ronchetti, G.; Passoni, D.; Pinto, L. Improving Tree Species Classification Using UAS Multispectral Images and Texture Measures. ISPRS Int. J. Geo-Inf. 2018, 7, 315. [Google Scholar] [CrossRef] [Green Version]
  37. Lu, B.; He, Y. Species classification using Unmanned Aerial Vehicle (UAV)-acquired high spatial resolution imagery in a heterogeneous grassland. ISPRS J. Photogramm. Remote Sens. 2017, 128, 73–85. [Google Scholar] [CrossRef]
  38. Wan, L.; Li, Y.; Cen, H.; Zhu, J.; Yin, W.; Wu, W.; Zhu, H.; Sun, D.; Zhou, W.; He, Y. Combining UAV-Based Vegetation Indices and Image Classification to Estimate Flower Number in Oilseed Rape. Remote Sens. 2018, 10, 1484. [Google Scholar] [CrossRef] [Green Version]
  39. Weil, G.; Lensky, I.; Resheff, Y.; Levin, N. Optimizing the Timing of Unmanned Aerial Vehicle Image Acquisition for Applied Mapping of Woody Vegetation Species Using Feature Selection. Remote Sens. 2017, 9, 1130. [Google Scholar] [CrossRef] [Green Version]
  40. Komárek, J.; Klouček, T.; Prošek, J. The potential of Unmanned Aerial Systems: A tool towards precision classification of hard-to-distinguish vegetation types? Int. J. Appl. Earth Obs. Geoinf. 2018, 71, 9–19. [Google Scholar] [CrossRef]
  41. Yang, G.; Liu, J.; Zhao, C.; Li, Z.; Huang, Y.; Yu, H.; Xu, B.; Yang, X.; Zhu, D.; Zhang, X.; et al. Unmanned Aerial Vehicle Remote Sensing for Field-Based Crop Phenotyping: Current Status and Perspectives. Front. Plant Sci. 2017, 8, 1111. [Google Scholar] [CrossRef]
  42. Perich, G.; Hund, A.; Anderegg, J.; Roth, L.; Boer, M.P.; Walter, A.; Liebisch, F.; Aasen, H. Assessment of Multi-Image Unmanned Aerial Vehicle Based High-Throughput Field Phenotyping of Canopy Temperature. Front. Plant Sci. 2020, 11, 150. [Google Scholar] [CrossRef]
  43. Soto-Estrada, E.; Correa-Echeveri, S.; Posada-Posada, M. Thermal analysis of urban land cover using an unmaned aerial vehicle (UAV) in Medellin, Colombia. J. Urban Environ. Eng. 2017, 11, 142–149. [Google Scholar] [CrossRef]
  44. Anweiler, S.; Piwowarski, D.; Ulbrich, R. Unmanned Aerial Vehicles for Environmental Monitoring with Special Reference to Heat Loss. E3S Web Conf. 2017, 19, 02005. [Google Scholar] [CrossRef] [Green Version]
  45. Spaan, D.; Burke, C.; McAree, O.; Aureli, F.; Rangel-Rivera, C.E.; Hutschenreiter, A.; Longmore, S.N.; McWhirter, P.R.; Wich, S.A. Thermal Infrared Imaging from Drones Offers a Major Advance for Spider Monkey Surveys. Drones 2019, 3, 34. [Google Scholar] [CrossRef] [Green Version]
  46. Dinuls, R.; Erins, G.; Lorencs, A.; Mednieks, I.; Sinica-Sinavskis, J. Tree Species Identification in Mixed Baltic Forest Using LiDAR and Multispectral Data. J. Sel. Top. Appl. Earth Obs. Remote Sens. 2012, 5, 594–603. [Google Scholar] [CrossRef]
  47. Sankey, T.; Donager, J.; McVay, J.; Sankey, J.B. UAV lidar and hyperspectral fusion for forest monitoring in the southwestern USA. Remote Sens. Environ. 2017, 195, 30–43. [Google Scholar] [CrossRef]
  48. O’Connor, J.; Smith, M.J.; James, M.R. Cameras and settings for aerial surveys in the geosciences. Prog. Phys. Geogr. 2017, 41, 325–344. [Google Scholar] [CrossRef] [Green Version]
  49. Mosbrucker, A.R.; Major, J.J.; Spicer, K.R.; Pitlick, J. Camera system considerations for geomorphic applications of SfM photogrammetry. Earth Surf. Proc. Landf. 2017, 42, 969–986. [Google Scholar] [CrossRef] [Green Version]
  50. Roth, L.; Hund, A.; Aasen, H. PhenoFly Planning Tool: Flight planning for high-resolution optical remote sensing with unmanned aerial systems. Plant Methods 2018, 14, 116. [Google Scholar] [CrossRef] [Green Version]
  51. Boesch, R. Thermal remote sensing with UAV-based workflows. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, 42, 41. [Google Scholar] [CrossRef] [Green Version]
  52. Aasen, H.; Bolten, A. Multi-temporal high-resolution imaging spectroscopy with hyperspectral 2D imagers—From theory to application. Remote Sens. Environ. 2018, 205, 374–389. [Google Scholar] [CrossRef]
  53. Küng, O.; Strecha, C.; Beyeler, A.; Zufferey, J.C.; Floreano, D.; Fua, P.; Gervaix, F. The accuracy of automatic photogrammetric techniques on ultra-light UAV imagery. In Proceedings of the UAV-g 2011—Unmanned Aerial Vehicle in Geomatics, Zürich, Switzerland, 14–16 September 2011. [Google Scholar]
  54. Colomina, I.; Molina, P. Unmanned aerial systems for photogrammetry and remote sensing: A review. ISPRS J. Photogramm. Remote Sens. 2014, 92, 79–97. [Google Scholar] [CrossRef] [Green Version]
  55. Mayr, E. Storia del Pensiero Biologico. Diversità, Evoluzione, Eredità; Bollati Boringhieri: Torino, Italy, 2011; ISBN 9788833922706. [Google Scholar]
  56. Gonçalves, J.A.; Henriques, R. UAV photogrammetry for topographic monitoring of coastal areas. ISPRS J. Photogramm. Remote Sens. 2015, 104, 101–111. [Google Scholar] [CrossRef]
  57. Uysal, M.; Toprak, A.S.; Polat, N. DEM generation with UAV Photogrammetry and accuracy analysis in Sahitler hill. Measurement 2015, 73, 539–543. [Google Scholar] [CrossRef]
  58. Ahmed, O.S.; Shemrock, A.; Chabot, D.; Dillon, C.; Williams, G.; Wasson, R.; Franklin, S.E. Hierarchical land cover and vegetation classification using multispectral data acquired from an unmanned aerial vehicle. Int. J. Remote Sens. 2017, 38, 2037–2052. [Google Scholar] [CrossRef]
  59. Haala, N.; Cramer, M.; Weimer, F.; Trittler, M. Performance test on UAV-based photogrammetric data collection. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2012, 38, 7–12. [Google Scholar] [CrossRef] [Green Version]
  60. Seifert, E.; Seifert, S.; Vogt, H.; Drew, D.; van Aardt, J.; Kunneke, A.; Seifert, T. Influence of Drone Altitude, Image Overlap, and Optical Sensor Resolution on Multi-View Reconstruction of Forest Images. Remote Sens. 2019, 11, 1252. [Google Scholar] [CrossRef] [Green Version]
  61. Tu, Y.H.; Phinn, S.; Johansen, K.; Robson, A.; Wu, D. Optimising drone flight planning for measuring horticultural tree crop structure. ISPRS J. Photogramm. Remote Sens. 2020, 160, 83–96. [Google Scholar] [CrossRef] [Green Version]
  62. MAVProxy. Available online: https://ardupilot.github.io/MAVProxy/html/index.html (accessed on 24 February 2020).
  63. Mission Planner. Available online: http://ardupilot.org/planner/ (accessed on 24 February 2020).
  64. APM Planner 2. Available online: http://ardupilot.org/planner2/ (accessed on 24 February 2020).
  65. QGroundControl. Available online: http://www.qgroundcontrol.org (accessed on 24 February 2020).
  66. UgCS. Available online: https://www.ugcs.com (accessed on 24 February 2020).
  67. Ramirez-Atencia, C.; Camacho, D. Extending QGroundControl for Automated Mission Planning of UAVs. Sensors 2018, 18, 2339. [Google Scholar] [CrossRef] [Green Version]
  68. eMotion 3. Available online: https://www.sensefly.com/software/emotion) (accessed on 24 February 2020).
  69. Jacobsen, K. Exterior Orientation Parameters. Photogramm. Eng. Remote Sens. 2001, 67, 12–47. [Google Scholar]
  70. Kraus, K. Photogrammetry: Geometry from Images and Laser Scans; Walter de Gruyter: Berlin, Germany, 2007. [Google Scholar]
  71. Cramer, M.; Przybilla, H.J.; Zurhorst, A. UAV cameras: Overview and geometric calibration benchmark. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, 42, 85. [Google Scholar] [CrossRef] [Green Version]
  72. Tomaštík, J.; Mokroš, M.; Surový, P.; Grznárová, A.; Merganič, J. UAV RTK/PPK Method—An Optimal Solution for Mapping Inaccessible Forested Areas? Remote Sens. 2019, 11, 721. [Google Scholar] [CrossRef] [Green Version]
  73. Manfreda, S.; Dvorak, P.; Mullerova, J.; Herban, S.; Vuono, P.; Arranz Justel, J.; Perks, M. Assessing the Accuracy of Digital Surface Models Derived from Optical Imagery Acquired with Unmanned Aerial Systems. Drones 2019, 3, 15. [Google Scholar] [CrossRef] [Green Version]
  74. Stöcker, C.; Nex, F.; Koeva, M.; Gerke, M. Quality assessment of combined IMU/GNSS data for direct georeferencing in the context of UAV-based mapping. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, 42, 355. [Google Scholar] [CrossRef] [Green Version]
  75. Schenk, T. Towards automatic aerial triangulation. ISPRS J. Photogramm. Remote Sens. 1997, 52, 110–121. [Google Scholar] [CrossRef]
  76. Cramer, M.; Haala, N.; Stallmann, D. Direct Georeferencing Using GPS/Inertial Exterior Orientations for Photogrammetric Applications. ISPRS J. Photogramm. Remote Sens. 2000, 33, 198–205. [Google Scholar]
  77. Padró, J.-C.; Muñoz, F.-J.; Planas, J.; Pons, X. Comparison of four UAV georeferencing methods for environmental monitoring purposes focusing on the combined use with airborne and satellite remote sensing platforms. Int. J. Appl. Earth Obs. Geoinf. 2019, 75, 130–140. [Google Scholar] [CrossRef]
  78. Bonali, F.L.; Tibaldi, A.; Marchese, F.; Fallati, L.; Russo, E.; Corselli, C.; Savini, A. UAV-based surveying in volcano-tectonics: An example from the Iceland rift. J. Struct. Geol. 2019, 121, 46–64. [Google Scholar] [CrossRef]
  79. Baiocchi, V.; Napoleoni, Q.; Tesei, M.; Servodio, G.; Alicandro, M.; Costantino, D. UAV for monitoring the settlement of a landfill. Eur. J. Remote Sens. 2019, 52, 41–52. [Google Scholar] [CrossRef]
  80. Chudley, T.; Christoffersen, P.; Doyle, S.H.; Abellan, A.; Snooke, N. High accuracy UAV photogrammetry of ice sheet dynamics with no ground control. Cryosphere 2019, 13, 955–968. [Google Scholar] [CrossRef] [Green Version]
  81. Chudley, T.R.; Christoffersen, P.; Doyle, S.H.; Bougamont, M.; Schoonman, C.M.; Hubbard, B.; James, M.R. Supraglacial lake drainage at a fast-flowing Greenlandic outlet glacier. Proc. Natl. Acad. Sci. USA 2019, 116, 25468–25477. [Google Scholar] [CrossRef]
  82. Zhang, H.; Aldana-Jague, E.; Clapuyt, F.; Wilken, F.; Vanacker, V.; Van Oost, K. Evaluating the potential of post-processing kinematic (PPK) georeferencing for UAV-based structure- from-motion (SfM) photogrammetry and surface change detection. Earth Surf. Dyn. 2019, 7, 807–827. [Google Scholar] [CrossRef] [Green Version]
  83. Grayson, B.; Penna, N.T.; Mills, J.P.; Grant, D.S. GPS precise point positioning for UAV photogrammetry. Photogramm. Rec. 2018, 33, 427–447. [Google Scholar] [CrossRef] [Green Version]
  84. James, M.R.; Chandler, J.H.; Eltner, A.; Fraser, C.; Miller, P.E.; Mills, J.P.; Noble, T.; Robson, S.; Lane, S.N. Guidelines on the use of structure-from-motion photogrammetry in geomorphic research. Earth Surf. Proc. Landf. 2019, 44, 2081–2084. [Google Scholar] [CrossRef]
  85. Höhle, J.; Höhle, M. Accuracy assessment of digital elevation models by means of robust statistical methods. ISPRS J. Photogramm. Remote Sens. 2009, 64, 398–406. [Google Scholar] [CrossRef] [Green Version]
  86. Gonçalves, G.R.; Pérez, J.A.; Duarte, J. Accuracy and effectiveness of low cost UASs and open source photogrammetric software for foredunes mapping. Int. J. Remote Sens. 2018, 39, 5059–5077. [Google Scholar] [CrossRef]
  87. Rock, G.; Ries, J.B.; Udelhoven, T. Sensitivity analysis of UAV-photogrammetry for creating Digital Elevation Models (DEM). Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2012, 38, 69–73. [Google Scholar] [CrossRef] [Green Version]
  88. Dunford, R.; Michel, K.; Gagnage, M.; Piégay, H.; Trémelo, M.L. Potential and constraints of Unmanned Aerial Vehicle technology for the characterization of Mediterranean riparian forest. Int. J. Remote Sens. 2009, 30, 4915–4935. [Google Scholar] [CrossRef]
  89. Tonkin, T.; Midgley, N. Ground-Control Networks for Image Based Surface Reconstruction: An Investigation of Optimum Survey Designs Using UAV Derived Imagery and Structure-from-Motion Photogrammetry. Remote Sens. 2016, 8, 786. [Google Scholar] [CrossRef] [Green Version]
  90. Martínez-Carricondo, P.; Agüera-Vega, F.; Carvajal-Ramírez, F.; Mesas-Carrascosa, F.-J.; García-Ferrer, A.; Pérez-Porras, F.-J. Assessment of UAV-photogrammetric mapping accuracy based on variation of ground control points. Int. J. Appl. Earth Obs. Geoinf. 2018, 72, 1–10. [Google Scholar] [CrossRef]
  91. Rangel, J.M.G.; Gonçalves, G.R.; Pérez, J.A. The impact of number and spatial distribution of GCPs on the positional accuracy of geospatial products derived from low-cost UASs. Int. J. Remote Sens. 2018, 39, 7154–7171. [Google Scholar] [CrossRef]
  92. Oniga, V.-E.; Breaban, A.-I.; Statescu, F. Determining the Optimum Number of Ground Control Points for Obtaining High Precision Results Based on UAS Images. Proceedings 2018, 2, 352. [Google Scholar] [CrossRef] [Green Version]
  93. Duan, Y.; Yan, L.; Xiang, Y.; Gou, Z.; Chen, W.; Jing, X. Design and experiment of UAV remote sensing optical targets. In Proceedings of the 2011 International Conference on Electronics, Communications and Control (ICECC 2011), Ningbo, China, 9–11 September 2011; pp. 202–205. [Google Scholar]
  94. Grenzdörffer, G.J.; Niemeyer, F. UAV based BRDF-measurements of agricultural surfaces with pfiffikus. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2012, 38, 229–234. [Google Scholar] [CrossRef] [Green Version]
  95. Laliberte, A.S.; Goforth, M.A.; Steele, C.M.; Rango, A. Multispectral Remote Sensing from Unmanned Aircraft: Image Processing Workflows and Applications for Rangeland Environments. Remote Sens. 2011, 3, 2529–2551. [Google Scholar] [CrossRef] [Green Version]
  96. Nebiker, S.; Annen, A.; Scherrer, M.; Oesch, D. A Light-weight Multispectral Sensor for Micro UAV-Opportunities for very High Resolution Airborne Remote Sensing. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2008, 37, 1193–1200. [Google Scholar]
  97. Bondi, E.; Salvaggio, C.; Montanaro, M.; Gerace, A.D. Calibration of UAS imagery inside and outside of shadows for improved vegetation index computation. Auton. Air Ground Sens. Syst. Agric. Optim. Phenotyping 2016, 9866, 98660J. [Google Scholar]
  98. Näsi, R.; Honkavaara, E.; Lyytikäinen-Saarenmaa, P.; Blomqvist, M.; Litkey, P.; Hakala, T.; Viljanen, N.; Kantola, T.; Tanhuanpää, T.; Holopainen, M. Using UAV-Based Photogrammetry and Hyperspectral Imaging for Mapping Bark Beetle Damage at Tree-Level. Remote Sens. 2015, 7, 15467–15493. [Google Scholar] [CrossRef] [Green Version]
  99. Hakala, T.; Markelin, L.; Honkavaara, E.; Scott, B.; Theocharous, T.; Nevalainen, O.; Näsi, R.; Suomalainen, J.; Viljanen, N.; Greenwell, C.; et al. Direct Reflectance Measurements from Drones: Sensor Absolute Radiometric Calibration and System Tests for Forest Reflectance Characterization. Sensors 2018, 18, 1417. [Google Scholar] [CrossRef] [Green Version]
  100. Honkavaara, E.; Hakala, T.; Nevalainen, O.; Viljanen, N.; Rosnell, T.; Khoramshahi, E.; Näsi, R.; Oliveira, R.; Tommaselli, A. Geometric and reflectance signature characterization of complex canopies using hyperspectral stereoscopic images from UAV and terrestrial platforms. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, 41, 77–82. [Google Scholar] [CrossRef]
  101. Johansen, K.; Raharjo, T.; McCabe, M. Using Multi-Spectral UAV Imagery to Extract Tree Crop Structural Properties and Assess Pruning Effects. Remote Sens. 2018, 10, 854. [Google Scholar] [CrossRef] [Green Version]
  102. James, M.R.; Robson, S. Mitigating systematic error in topographic models derived from UAV and ground-based image networks. Earth Surf. Proc. Landf. 2014, 39, 1413–1420. [Google Scholar] [CrossRef] [Green Version]
  103. Bash, E.; Moorman, B.; Gunther, A. Detecting Short-Term Surface Melt on an Arctic Glacier Using UAV Surveys. Remote Sens. 2018, 10, 1547. [Google Scholar] [CrossRef] [Green Version]
  104. Klingbeil, L.; Heinz, E.; Wieland, M.; Eichel, J.; Laebe, T.; Kuhlmann, H. On the UAV based Analysis of Slow Geomorphological Processes: A Case Study at a Solifluction Lobe in the Turtmann Valley. In Proceedings of the 4th Joint International Symposium on Deformation Monitoring (JISDM 2019), Athens, Greece, 15–17 May 2019. [Google Scholar]
  105. Ge, X.; Wang, J.; Ding, J.; Cao, X.; Zhang, Z.; Liu, J.; Li, X. Combining UAV-based hyperspectral imagery and machine learning algorithms for soil moisture content monitoring. PeerJ 2019, 7, e6926. [Google Scholar] [CrossRef] [PubMed]
  106. Hassan-Esfahani, L.; Torres-Rua, A.; Jensen, A.; McKee, M. Assessment of Surface Soil Moisture Using High-Resolution Multi-Spectral Imagery and Artificial Neural Networks. Remote Sens. 2015, 7, 2627–2646. [Google Scholar] [CrossRef] [Green Version]
  107. Luo, W.; Xu, X.; Liu, W.; Liu, M.; Li, Z.; Peng, T.; Xu, C.; Zhang, Y.; Zhang, R. UAV based soil moisture remote sensing in a karst mountainous catchment. Catena 2019, 174, 478–489. [Google Scholar] [CrossRef]
  108. Goodbody, T.R.H.; Coops, N.C.; Hermosilla, T.; Tompalski, P.; Crawford, P. Assessing the status of forest regeneration using digital aerial photogrammetry and unmanned aerial systems. Int. J. Remote Sens. 2018, 39, 5246–5264. [Google Scholar] [CrossRef]
  109. Otero, V.; Van De Kerchove, R.; Satyanarayana, B.; Martínez-Espinosa, C.; Fisol, M.A.B.; Ibrahim, M.R.B.; Sulong, I.; Mohd-Lokman, H.; Lucas, R.; Dahdouh-Guebas, F. Managing mangrove forests from the sky: Forest inventory using field data and Unmanned Aerial Vehicle (UAV) imagery in the Matang Mangrove Forest Reserve, peninsular Malaysia. For. Ecol. Manag. 2018, 411, 35–45. [Google Scholar] [CrossRef]
  110. Kedzierski, M.; Wierzbicki, D.; Sekrecka, A.; Fryskowska, A.; Walczykowski, P.; Siewert, J. Influence of Lower Atmosphere on the Radiometric Quality of Unmanned Aerial Vehicle Imagery. Remote Sens. 2019, 11, 1214. [Google Scholar] [CrossRef] [Green Version]
  111. Akala, A.O.; Doherty, P.H.; Carrano, C.S.; Valladares, C.E.; Groves, K.M. Impacts of ionospheric scintillations on GPS receivers intended for equatorial aviation applications. Radio Sci. 2012, 47, 1–11. [Google Scholar] [CrossRef]
  112. Gerke, M.; Przybilla, H.-J. Accuracy Analysis of Photogrammetric UAV Image Blocks: Influence of Onboard RTK-GNSS and Cross Flight Patterns. Photogramm. Fernerkund. Geoinf. 2016, 2016, 17–30. [Google Scholar] [CrossRef] [Green Version]
  113. Howell, T.L.; Singh, K.K.; Smart, L. Structure from Motion Techniques for Estimating the Volume of Wood Chips. In High Spatial Resolution Remote Sensing: Data, Techniques, and Applications; He, Y., Weng, Q., Eds.; CRC Press: Boca Raton, FL, USA, 2018; pp. 149–164. [Google Scholar]
  114. Stow, D.; Nichol, C.J.; Wade, T.; Assmann, J.J.; Simpson, G.; Helfter, C. Illumination Geometry and Flying Height Influence Surface Reflectance and NDVI Derived from Multispectral UAS Imagery. Drones 2019, 3, 55. [Google Scholar] [CrossRef] [Green Version]
  115. Mesas-Carrascosa, F.-J.; Torres-Sánchez, J.; Clavero-Rumbao, I.; García-Ferrer, A.; Peña, J.-M.; Borra-Serrano, I.; López-Granados, F. Assessing Optimal Flight Parameters for Generating Accurate Multispectral Orthomosaicks by UAV to Support Site-Specific Crop Management. Remote Sens. 2015, 7, 12793–12814. [Google Scholar] [CrossRef] [Green Version]
  116. Fu, Z.; Yu, J.; Xie, G.; Chen, Y.; Mao, Y. A Heuristic Evolutionary Algorithm of UAV Path Planning. Wirel. Commun. Mob. Comput. 2018, 2018, 2851964. [Google Scholar] [CrossRef] [Green Version]
  117. Di Franco, C.; Buttazzo, G. Coverage Path Planning for UAVs Photogrammetry with Energy and Resolution Constraints. J. Intell. Robot. Syst. 2016, 83, 445–462. [Google Scholar] [CrossRef]
  118. Du, M.; Noguchi, N. Monitoring of Wheat Growth Status and Mapping of Wheat Yield’s within-Field Spatial Variations Using Color Images Acquired from UAV-camera System. Remote Sens. 2017, 9, 289. [Google Scholar] [CrossRef] [Green Version]
  119. Tahir, M.N.; Naqvi, S.Z.A.; Lan, Y.; Zhang, Y.; Wang, Y.; Afzal, M.; Cheema, M.J.M.; Amir, S. Real time estimation of chlorophyll content based on vegetation indices derived from multispectral UAV in the kinnow orchard. Int. J. Precis. Agric. Aviat. 2018, 1, 24–31. [Google Scholar]
  120. Pádua, L.; Marques, P.; Hruška, J.; Adão, T.; Peres, E.; Morais, R.; Sousa, J. Multi-Temporal Vineyard Monitoring through UAV-Based RGB Imagery. Remote Sens. 2018, 10, 1907. [Google Scholar] [CrossRef] [Green Version]
  121. Cabreira, T.M.; Di Franco, C.; Ferreira, P.R.; Buttazzo, G.C. Energy-Aware Spiral Coverage Path Planning for UAV Photogrammetric Applications. IEEE Robot. Autom. Lett. 2018, 3, 3662–3668. [Google Scholar] [CrossRef]
  122. Samaniego, F.; Sanchis, J.; García-Nieto, S.; Simarro, R. Recursive Rewarding Modified Adaptive Cell Decomposition (RR-MACD): A Dynamic Path Planning Algorithm for UAVs. Electronics 2019, 8, 306. [Google Scholar] [CrossRef] [Green Version]
  123. Agisoft LLC. AgiSoft Metashape User Manual; Professional Edition v.1.5; Agisoft LLC: St. Petersburg, Russia, 2019. [Google Scholar]
  124. James, M.R.; Robson, S.; Smith, M.W. 3-D uncertainty-based topographic change detection with structure-from-motion photogrammetry: Precision maps for ground control and directly georeferenced surveys. Earth Surf. Proc. Landf. 2017, 42, 1769–1788. [Google Scholar] [CrossRef]
  125. Salamí, E.; Barrado, C.; Pastor, E. UAV Flight Experiments Applied to the Remote Sensing of Vegetated Areas. Remote Sens. 2014, 6, 11051–11081. [Google Scholar] [CrossRef] [Green Version]
  126. Graham, A.; Coops, N.; Wilcox, M.; Plowright, A. Evaluation of Ground Surface Models Derived from Unmanned Aerial Systems with Digital Aerial Photogrammetry in a Disturbed Conifer Forest. Remote Sens. 2019, 11, 84. [Google Scholar] [CrossRef] [Green Version]
  127. Tu, Y.H.; Phinn, S.; Johansen, K.; Robson, A. Assessing radiometric correction approaches for multi-spectral UAS imagery for horticultural applications. Remote Sens. 2018, 10, 1684. [Google Scholar] [CrossRef] [Green Version]
  128. Oliveira, R.A.; Tommaselli, A.M.; Honkavaara, E. Generating a hyperspectral digital surface model using a hyperspectral 2D frame camera. ISPRS J. Photogramm. Remote Sens. 2019, 147, 345–360. [Google Scholar] [CrossRef]
  129. Schaepman-Strub, G.; Schaepman, M.E.; Painter, T.H.; Dangel, S.; Martonchik, J.V. Reflectance quantities in optical remote sensing—Definitions and case studies. Remote Sens. Environ. 2006, 103, 27–42. [Google Scholar] [CrossRef]
  130. Smith, G.M.; Milton, E.J. The use of the empirical line method to calibrate remotely sensed data to reflectance. Int. J. Remote Sens. 1999, 20, 2653–2662. [Google Scholar] [CrossRef]
  131. Aasen, H.; Van Wittenberghe, S.; Sabater Medina, N.; Damm, A.; Goulas, Y.; Wieneke, S.; Hueni, A.; Malenovský, Z.; Alonso, L.; Pacheco-Labrador, J.; et al. Sun-Induced Chlorophyll Fluorescence II: Review of Passive Measurement Setups, Protocols, and Their Application at the Leaf to Canopy Level. Remote Sens. 2019, 11, 927. [Google Scholar] [CrossRef] [Green Version]
  132. Burkart, A.; Aasen, H.; Alonso, L.; Menz, G.; Bareth, G.; Rascher, U. Angular Dependency of Hyperspectral Measurements over Wheat Characterized by a Novel UAV Based Goniometer. Remote Sens. 2015, 7, 725–746. [Google Scholar] [CrossRef] [Green Version]
  133. Aasen, H. Influence of the viewing geometry within hyperspectral images retrieved from UAV snapshot cameras. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, 3, 257–261. [Google Scholar] [CrossRef]
  134. Roosjen, P.; Suomalainen, J.; Bartholomeus, H.; Kooistra, L.; Clevers, J. Mapping Reflectance Anisotropy of a Potato Canopy Using Aerial Images Acquired with an Unmanned Aerial Vehicle. Remote Sens. 2017, 9, 417. [Google Scholar] [CrossRef] [Green Version]
  135. Jaud, M.; Passot, S.; Le Bivic, R.; Delacourt, C.; Grandjean, P.; Le Dantec, N. Assessing the Accuracy of High Resolution Digital Surface Models Computed by PhotoScan® and MicMac® in Sub-Optimal Survey Conditions. Remote Sens. 2016, 8, 465. [Google Scholar] [CrossRef] [Green Version]
  136. Honkavaara, E.; Khoramshahi, E. Radiometric Correction of Close-Range Spectral Image Blocks Captured Using an Unmanned Aerial Vehicle with a Radiometric Block Adjustment. Remote Sens. 2018, 10, 256. [Google Scholar] [CrossRef] [Green Version]
  137. Hueni, A.; Damm, A.; Kneubuehler, M.; Schlapfer, D.; Schaepman, M.E. Field and Airborne Spectroscopy Cross Validation—Some Considerations. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2017, 10, 1117–1135. [Google Scholar] [CrossRef]
  138. Brook, A.; Dor, E.B. Supervised vicarious calibration (SVC) of hyperspectral remote-sensing data. Remote Sens. Environ. 2011, 115, 1543–1555. [Google Scholar] [CrossRef]
  139. Durell, C.; Scharpf, D.; McKee, G.; L’Heureux, M.; Georgiev, G.; Obein, G.; Cooksey, C. Creation and validation of Spectralon PTFE BRDF targets and standards. Sens. Syst. Next Gener. Satell. XIX 2015, 9639, 96391D. [Google Scholar]
  140. Nicodemus, F.E. Reflectance nomenclature and directional reflectance and emissivity. Appl. Opt. 1970, 9, 1474–1475. [Google Scholar] [CrossRef] [PubMed]
  141. Cooksey, C.C.; Allen, D.W.; Tsai, B.K.; Yoon, H.W. Establishment and application of the 0/45 reflectance factor scale over the shortwave infrared. Appl. Opt. 2015, 54, 3064–3071. [Google Scholar] [CrossRef] [Green Version]
  142. Bourgeois, C.S.; Ohmura, A.; Schroff, K.; Frei, H.-J.; Calanca, P. IAC ETH Goniospectrometer: A Tool for Hyperspectral HDRF Measurements. J. Atmos. Ocean. Technol. 2006, 23, 573–584. [Google Scholar] [CrossRef]
  143. Schneider-Zapp, K.; Cubero-Castan, M.; Shi, D.; Strecha, C. A new method to determine multi-angular reflectance factor from lightweight multispectral cameras with sky sensor in a target-less workflow applicable to UAV. Remote Sens. Environ. 2019, 229, 60–68. [Google Scholar] [CrossRef] [Green Version]
  144. Aasen, H.; Burkart, A.; Bolten, A.; Bareth, G. Generating 3D hyperspectral information with lightweight UAV snapshot cameras for vegetation monitoring: From camera calibration to quality assurance. ISPRS J. Photogramm. Remote Sens. 2015, 108, 245–259. [Google Scholar] [CrossRef]
  145. Wehrhan, M.; Rauneker, P.; Sommer, M. UAV-Based Estimation of Carbon Exports from Heterogeneous Soil Landscapes—A Case Study from the CarboZALF Experimental Area. Sensors 2016, 16, 255. [Google Scholar] [CrossRef] [Green Version]
  146. Honkavaara, E.; Saari, H.; Kaivosoja, J.; Pölönen, I.; Hakala, T.; Litkey, P.; Mäkynen, J.; Pesonen, L. Processing and Assessment of Spectrometric, Stereoscopic Imagery Collected Using a Lightweight UAV Spectral Camera for Precision Agriculture. Remote Sens. 2013, 5, 5006–5039. [Google Scholar] [CrossRef] [Green Version]
  147. Hruska, R.; Mitchell, J.; Anderson, M.; Glenn, N.F. Radiometric and Geometric Analysis of Hyperspectral Imagery Acquired from an Unmanned Aerial Vehicle. Remote Sens. 2012, 4, 2736–2752. [Google Scholar] [CrossRef] [Green Version]
  148. Yang, G.; Li, C.; Wang, Y.; Yuan, H.; Feng, H.; Xu, B.; Yang, X. The DOM Generation and Precise Radiometric Calibration of a UAV-Mounted Miniature Snapshot Hyperspectral Imager. Remote Sens. 2017, 9, 642. [Google Scholar] [CrossRef] [Green Version]
  149. Soffer, R.J.; Ifimov, G.; Arroyo-Mora, J.P.; Kalacska, M. Validation of Airborne Hyperspectral Imagery from Laboratory Panel Characterization to Image Quality Assessment: Implications for an Arctic Peatland Surrogate Simulation Site. Can. J. Remote Sens. 2019, 45, 476–508. [Google Scholar] [CrossRef]
  150. Ben-Dor, E. Quality assessment of several methods to recover surface reflectance using synthetic imaging spectroscopy data. Remote Sens. Environ. 2004, 90, 389–404. [Google Scholar] [CrossRef]
  151. Markelin, L.; Suomalainen, J.; Hakala, T.; Oliveira, R.A.; Viljanen, N.; Näsi, R.; Scott, B.; Theocharous, T.; Greenwell, C.; Fox, N.; et al. Methodology for direct reflectance measurement from a drone: System description, radiometric calibration and latest results. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2018, 42. [Google Scholar] [CrossRef] [Green Version]
  152. Del Pozo, S.; Rodríguez-Gonzálvez, P.; Hernández-López, D.; Felipe-García, B. Vicarious radiometric calibration of a multispectral camera on board an unmanned aerial system. Remote Sens. 2014, 6, 1918–1937. [Google Scholar] [CrossRef] [Green Version]
  153. Iqbal, F.; Lucieer, A.; Barry, K. Simplified radiometric calibration for UAS-mounted multispectral sensor. Eur. J. Remote Sens. 2018, 51, 301–313. [Google Scholar] [CrossRef]
  154. Xu, K.; Gong, Y.; Fang, S.; Wang, K.; Lin, Z.; Wang, F. Radiometric Calibration of UAV Remote Sensing Image with Spectral Angle Constraint. Remote Sens. 2019, 11, 1291. [Google Scholar] [CrossRef] [Green Version]
  155. Wang, S.; Baum, A.; Zarco-Tejada, P.J.; Dam-Hansen, C.; Thorseth, A.; Bauer-Gottwein, P.; Bandini, F.; Garcia, M. Unmanned Aerial System multispectral mapping for low and variable solar irradiance conditions: Potential of tensor decomposition. ISPRS J. Photogramm. Remote Sens. 2019, 155, 58–71. [Google Scholar] [CrossRef]
  156. Yu, X.; Liu, Q.; Liu, X.; Liu, X.; Wang, Y. A physical-based atmospheric correction algorithm of unmanned aerial vehicles images and its utility analysis. Int. J. Remote Sens. 2017, 38, 3101–3112. [Google Scholar] [CrossRef]
  157. Kelly, J.; Kljun, N.; Olsson, P.O.; Mihai, L.; Liljeblad, B.; Weslien, P.; Klemedtsson, L.; Eklundh, L. Challenges and best practices for deriving temperature data from an uncalibrated UAV thermal infrared camera. Remote Sens. 2019, 11, 567. [Google Scholar] [CrossRef] [Green Version]
  158. Budzier, H.; Gerlach, G. Calibration of uncooled thermal infrared cameras. J. Sens. Sens. Syst. 2015, 4, 187–197. [Google Scholar] [CrossRef] [Green Version]
  159. Nugent, P.W.; Shaw, J.A.; Pust, N.J. Correcting for focal-plane-array temperature dependence in microbolometer infrared cameras lacking thermal stabilization. Opt. Eng. 2013, 52, 061304. [Google Scholar] [CrossRef] [Green Version]
  160. Ribeiro-Gomes, K.; Hernández-López, D.; Ortega, J.F.; Ballesteros, R.; Poblete, T.; Moreno, M.A. Uncooled thermal camera calibration and optimization of the photogrammetry process for UAV applications in agriculture. Sensors 2017, 17, 2173. [Google Scholar] [CrossRef]
  161. Sagan, V.; Maimaitijiang, M.; Sidike, P.; Eblimit, K.; Peterson, K.T.; Hartling, S.; Esposito, F.; Khanal, K.; Newcomb, M.; Pauli, D.; et al. UAV-based high resolution thermal imaging for vegetation monitoring, and plant phenotyping using ICI 8640 P, FLIR Vue Pro R 640, and thermomap cameras. Remote Sens. 2019, 11, 330. [Google Scholar] [CrossRef] [Green Version]
  162. Malbéteau, Y.; Parkes, S.; Aragon, B.; Rosas, J.; McCabe, M. Capturing the Diurnal Cycle of Land Surface Temperature Using an Unmanned Aerial Vehicle. Remote Sens. 2018, 10, 1407. [Google Scholar] [CrossRef] [Green Version]
  163. Turner, D.; Lucieer, A.; Malenovský, Z.; King, D.; Robinson, S. Spatial Co-Registration of Ultra-High Resolution Visible, Multispectral and Thermal Images Acquired with a Micro-UAV over Antarctic Moss Beds. Remote Sens. 2014, 6, 4003–4024. [Google Scholar] [CrossRef] [Green Version]
  164. Lee, J.; Sung, S. Evaluating spatial resolution for quality assurance of UAV images. Spat. Inf. Res. 2016, 24, 141–154. [Google Scholar] [CrossRef]
  165. Blaschke, T. Object based image analysis for remote sensing. ISPRS J. Photogramm. Remote Sens. 2010, 65, 2–16. [Google Scholar] [CrossRef] [Green Version]
  166. Weih, R.C.; Riggan, N.D. Object-based classification vs. pixel-based classification: Comparative importance of multi-resolution imagery. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2010, 38, 1–8. [Google Scholar]
  167. Blaschke, T.; Hay, G.J.; Kelly, M.; Lang, S.; Hofmann, P.; Addink, E.; Feitosa, R.Q.; van der Meer, F.; van der Werff, H.; van Coillie, F.; et al. Geographic Object-Based Image Analysis—Towards a new paradigm. ISPRS J. Photogramm. Remote Sens. 2014, 87, 180–191. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  168. Li, M.; Ma, L.; Blaschke, T.; Cheng, L.; Tiede, D. A systematic comparison of different object-based classification techniques using high spatial resolution imagery in agricultural environments. Int. J. Appl. Earth Obs. Geoinf. 2016, 49, 87–98. [Google Scholar] [CrossRef]
  169. Amirebrahimi, S.; Quadros, N.; Coppa, I.; Keysers, J. UAV Data Acquisition in Australia and New Zealand; FrontierSI: Melbourne, Australia, 2018; ISBN 978-0-6482278-7-8. [Google Scholar]
  170. Whitehead, K.; Hugenholtz, C.H.; Myshak, S.; Brown, O.; LeClair, A.; Tamminga, A.; Barchyn, T.E.; Moorman, B.; Eaton, B. Remote sensing of the environment with small unmanned aircraft systems (UASs), part 2: Scientific and commercial applications. J. Unmanned Veh. Syst. 2014, 2, 86–102. [Google Scholar] [CrossRef] [Green Version]
  171. Kattenborn, T.; Eichel, J.; Fassnacht, F.E. Convolutional Neural Networks enable efficient, accurate and fine-grained segmentation of plant species and communities from high-resolution UAV imagery. Sci. Rep. 2019, 9, 17656. [Google Scholar] [CrossRef] [PubMed]
  172. McGarigal, K.; Marks, B.J. FRAGSTATS: Spatial Pattern Analysis Program for Quantifying Landscape Structure; General Technical Report PNW-GTR-351; US Department of Agriculture, Forest Service, Pacific Northwest Research Station: Portland, OR, USA, 1995; pp. 1–122. [Google Scholar]
  173. Torres-Sánchez, J.; López-Granados, F.; Borra-Serrano, I.; Peña, J.M. Assessing UAV-collected image overlap influence on computation time and digital surface model accuracy in olive orchards. Precis. Agric. 2018, 19, 115–133. [Google Scholar] [CrossRef]
  174. Arroyo-Mora, J.; Kalacska, M.; Inamdar, D.; Soffer, R.; Lucanus, O.; Gorman, J.; Naprstek, T.; Schaaf, E.; Ifimov, G.; Elmer, K.; et al. Implementation of a UAV–Hyperspectral Pushbroom Imager for Ecological Monitoring. Drones 2019, 3, 12. [Google Scholar] [CrossRef] [Green Version]
  175. Propeller. AeroPoints. Available online: https://www.propelleraero.com/aeropoints/ (accessed on 3 January 2019).
  176. Tu, Y.-H.; Johansen, K.; Phinn, S.; Robson, A. Measuring Canopy Structure and Condition Using Multi-Spectral UAS Imagery in a Horticultural Environment. Remote Sens. 2019, 11, 269. [Google Scholar] [CrossRef] [Green Version]
  177. Hartmann, W.; Tilch, S.; Eisenbeiss, H.; Schindler, K. Determination of the uav position by automatic processing of thermal images. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2012, 39, 111–116. [Google Scholar] [CrossRef] [Green Version]
Figure 1. Activities involved in UAS-based mapping. The five topics identified in this graph also reflect the structure of the present manuscript, which addresses each of them in the following sections.
Figure 2. The influence of various study conditions on the output quality of a UAS observation: red = output product; purple = quality parameters; blue = software and hardware; orange = experiment design; light blue = environmental conditions; green = internal (affected) parameters. The arrow color indicates the type of relationship: gray = direct dependence; orange = inverse dependence.
Figure 3. Planar root mean square error (RMSExy) and vertical root mean square error (RMSEz) from a literature review, based on the work reported in [8,73,88,89,90,91,92].
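To make the accuracy metrics in Figure 3 concrete: RMSExy and RMSEz are typically computed from independent check points, i.e., surveyed markers that are withheld from georeferencing and compared against the final model. The following minimal sketch (Python with NumPy, hypothetical residual values) illustrates the calculation.

```python
import numpy as np

# Hypothetical residuals (model minus check-point survey) in metres,
# one row per independent check point: [dx, dy, dz].
residuals = np.array([
    [ 0.021, -0.015,  0.043],
    [-0.008,  0.019, -0.027],
    [ 0.012,  0.004,  0.051],
    [-0.017, -0.022, -0.036],
])

dx, dy, dz = residuals[:, 0], residuals[:, 1], residuals[:, 2]

# Planar RMSE combines both horizontal components; vertical RMSE uses dz only.
rmse_xy = np.sqrt(np.mean(dx**2 + dy**2))
rmse_z = np.sqrt(np.mean(dz**2))

print(f"RMSE_xy = {rmse_xy:.3f} m, RMSE_z = {rmse_z:.3f} m")
```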
Figure 4. Proposed UAS environmental mapping workflow (n.b., QA refers to quality assurance).
Table 1. (Dis)advantages of different platforms.

| Platform | Advantages (+) and Disadvantages (−) | Flight Time / Coverage |
|---|---|---|
| Rotary-wing | + flexibility and ease of use; + stability; + possibility of low flight heights and low speeds; + ability to hover; − lower area coverage; − wind may affect vehicle stability | Flight time typically 20–40 min; coverage 5–30 × 10³ m², depending on flight altitude |
| Fixed-wing | + capacity to cover larger areas; + higher speed and reduced flight execution time; − take-off and landing require an experienced pilot; − faster vehicles may have difficulties mapping small objects or establishing sufficient overlap | Flight time up to hours; coverage e.g., >20 km², depending on flight altitude |
| Hybrid VTOL (Vertical Take-Off and Landing) | + ability to hover and to take off and land vertically; + ability to cover larger areas; − mechanically complex systems (i.e., tilting rotors or wings, mixed lifting and pushing motors) | Flight time up to hours, but usually less than fixed-wing; coverage on the order of 10⁶ m² |
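The coverage values in Table 1 scale with flight altitude and camera geometry. As a rough illustration only (the sensor parameters below are hypothetical and not tied to any specific platform), the nadir image footprint and ground sample distance (GSD) can be estimated from standard pinhole-camera relations:

```python
def footprint_and_gsd(altitude_m, focal_mm, sensor_w_mm, sensor_h_mm, img_w_px):
    """Approximate nadir footprint (m) and GSD (m/px) from pinhole-camera geometry."""
    scale = altitude_m / (focal_mm / 1000.0)        # ground units per focal-plane unit
    footprint_w = scale * (sensor_w_mm / 1000.0)    # across-track footprint (m)
    footprint_h = scale * (sensor_h_mm / 1000.0)    # along-track footprint (m)
    gsd = footprint_w / img_w_px                    # ground sample distance (m/px)
    return footprint_w, footprint_h, gsd

# Illustrative 1-inch-type RGB sensor flown at 100 m above ground (hypothetical values).
w, h, gsd = footprint_and_gsd(altitude_m=100, focal_mm=8.8,
                              sensor_w_mm=13.2, sensor_h_mm=8.8, img_w_px=5472)
print(f"Footprint = {w:.0f} m x {h:.0f} m, GSD = {gsd*100:.1f} cm/px")
```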
Table 2. Type of sensors mounted on UAS and their possible applications.

| Sensor Type | Specifics | Main Applications |
|---|---|---|
| RGB | Optical | Aerial photogrammetry, SfM-based 3D modeling, change detection, fluid flow tracking |
| Multispectral (<10–20 bands) | Multiple wavelengths | Vegetation mapping, water quality, classification studies |
| Hyperspectral (overlapping contiguous bands) | Analysis of the shape of the spectrum | Vegetation mapping, plant physiology, plant phenotyping, water quality, mineral mapping, pest detection |
| Thermal | Brightness surface temperature | Thermography, plant stress, thermal inertia, soil water content, urban heat island mapping, water temperature, animal detection |
| LiDAR (Light Detection and Ranging) | Surface structure | 3D reconstruction, digital terrain mapping, canopy height models, plant structure, erosion studies |
Table 3. List of UAS mission planning software and their main characteristics.

| Category | Name | Software (SW) Options | Operating System | Home Page | Type of License |
|---|---|---|---|---|---|
| Flight planning app | Pix4Dcapture | Planar flights; double-gridded flights; circular flights | Android/iOS/Windows | http://pix4d.com/product/pix4dcapture | Free to use |
| | DJI GS Pro | 3D mapping | iOS | http://dji.com/ground-station-pro | Free to use |
| | PrecisionFlight Free | Resume interrupted flights | Android | http://precisionhawk.com/precisionflight | Free to use |
| | DroneDeploy | Planar flights; cloud-based orthomosaics | Android/iOS | https://www.dronedeploy.com/ | Free to use |
| | Litchi | Computer vision algorithms; control of the gimbal and the drone's yaw axis | Android/iOS | https://flylitchi.com/ | Proprietary SW |
| | Phenofly Planning Tool | Photographic properties; GCP placement; viewing angle estimation | JavaScript (browser) | http://www.phenofly.net/PhenoFlyPlanningTool | Free to use & modify |
| Ground station software | MAVProxy | Loadable modules | Portable Operating System (POSIX) | https://ardupilot.github.io/MAVProxy/html/index.html | Free to use |
| | Mission Planner | Hardware-in-the-loop UAV simulator | Windows | http://ardupilot.org/planner | Free to use |
| | APM Planner 2/Mission Planner | Live data; initiate commands in flight | Linux/OS X/Windows | http://ardupilot.org/planner | Free to use |
| | QGroundControl GCS | Multiple vehicles | Android/iOS/Linux/OS X/Windows | http://www.qgroundcontrol.org/ | Free to use & modify |
| | UgCS | Photogrammetry; custom elevation data import; battery change option | OS X/Linux/Windows | https://www.ugcs.com/ | Proprietary SW |
| | mdCOCKPIT | Real-time telemetric data; flight analytics module | Android | http://microdrones.com/en/mdaircraft/software/mdcockpit | Proprietary SW |
| | UAV Toolbox | Telemetry data conversion | Android | http://uavtoolbox.com/ | Proprietary SW |
| | eMotion 3 | Supports both fixed-wing and multirotor operations; full 3D environment for flight management | Windows | http://sensefly.com/software/emotion-3.html | Proprietary SW |
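Whichever planning tool from Table 3 is used, the along-track photo base and cross-track flight-line spacing follow directly from the image footprint and the desired forward and side overlap. The sketch below (illustrative values, independent of any listed software) shows that calculation, reusing the hypothetical ~150 m × 100 m footprint from the sketch following Table 1.

```python
def spacing_from_overlap(footprint_along_m, footprint_across_m,
                         forward_overlap, side_overlap):
    """Photo base (m) and flight-line spacing (m) for requested overlaps (0-1)."""
    photo_base = footprint_along_m * (1.0 - forward_overlap)   # distance between exposures
    line_spacing = footprint_across_m * (1.0 - side_overlap)   # distance between flight lines
    return photo_base, line_spacing

# Hypothetical 100 m (along) x 150 m (across) footprint with 80 % forward / 70 % side overlap.
base, spacing = spacing_from_overlap(100.0, 150.0, forward_overlap=0.80, side_overlap=0.70)
n_lines = int(500.0 / spacing) + 1   # flight lines needed to span a 500 m wide block
print(f"Photo base = {base:.0f} m, line spacing = {spacing:.0f} m, ~{n_lines} lines for a 500 m block")
```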
Table 4. Impact of different correction methods on UAS-based radiometric observations.

| Correction Method | Sensor Resolution | Accuracy Assessment | Reference |
|---|---|---|---|
| Noise reduction; vignetting correction; lens distortion correction | 12 bands, 400–900 nm | UAS vs. ASD measurements: R² = 0.99 | [145] |
| Noise reduction; spectral smile correction; block adjustment | 48 bands, 400–900 nm | Average coefficient of variation for the radiometric tie points of 0.05–0.08 | [146] |
| Correction coefficient; noise reduction; vignetting correction | 125 bands, 450–950 nm | Average precision within the entire scene of 0.2% reflectance | [144] |
| Correction coefficient; noise reduction; spectral smile correction | 48 bands, 400–900 nm | Ratio of UAS radiance to reference measurements from 0.84 to 1.17 | [98] |
| Radiometric block adjustment | 240 bands, 400–900 nm | UAS to MODTRAN-predicted radiance agreement of 96.3% | [147] |
| Vignetting correction; RRV effect correction | 125 bands, 450–950 nm | Ratio of UAS radiance to reference measurements from 0.95 to 1.04 | [148] |
| Assessment of dark current and white reference consistency (spatially and temporally); assessment of spectral wavelength calibration; conversion from reflectance to radiance | 270 bands, 400–1000 nm | Dark current and white reference evaluations showed an insignificant increase over time; hyperspectral bands exhibited a slight shift of 1–3 nm; radiometric calibrations with R² > 0.99 | [21] |
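Several of the corrections in Table 4 ultimately convert at-sensor digital numbers to reflectance. One widely used, simple option is the empirical line method [130], which fits a per-band linear gain and offset from reference panels of known reflectance. The sketch below is only illustrative: the panel values are hypothetical and a random array stands in for a real image band.

```python
import numpy as np

# Hypothetical mean digital numbers (DN) extracted over two reference panels
# for a single band, and the panels' known reflectance factors.
panel_dn = np.array([8200.0, 41500.0])      # dark and bright panel DN (illustrative)
panel_reflectance = np.array([0.05, 0.50])  # laboratory-measured reflectance

# Empirical line: reflectance = gain * DN + offset (least-squares fit per band).
gain, offset = np.polyfit(panel_dn, panel_reflectance, 1)

# Apply to an image band (a small random DN array stands in for real data here).
band_dn = np.random.default_rng(0).uniform(5000, 45000, size=(4, 4))
band_reflectance = np.clip(gain * band_dn + offset, 0.0, 1.0)

print(f"gain = {gain:.3e}, offset = {offset:.3f}")
print(band_reflectance.round(3))
```

In practice, more than two panels spanning the expected reflectance range improve the fit, and the relation should be re-derived whenever illumination conditions change.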
