Review

A Technical Study on UAV Characteristics for Precision Agriculture Applications and Associated Practical Challenges

1 Department of Agriculture and Biosystems Engineering, North Dakota State University, Fargo, ND 58102, USA
2 College of Agriculture & Montana Agricultural Experiment Station, Montana State University, Bozeman, MT 59717, USA
* Author to whom correspondence should be addressed.
Remote Sens. 2021, 13(6), 1204; https://doi.org/10.3390/rs13061204
Submission received: 25 February 2021 / Revised: 15 March 2021 / Accepted: 18 March 2021 / Published: 22 March 2021
(This article belongs to the Section Remote Sensing in Agriculture and Vegetation)

Abstract
The incorporation of advanced technologies into Unmanned Aerial Vehicle (UAV) platforms has enabled many practical applications in Precision Agriculture (PA) over the past decade. These PA tools offer capabilities that increase agricultural productivity and input efficiency while minimizing operational costs. However, these platforms also have constraints that limit the application of UAVs in agricultural operations, including limitations in providing imagery of adequate spatial and temporal resolution, dependency on weather conditions, and geometric and radiometric correction requirements. In this paper, a practical guide to the technical characteristics of common types of UAVs used in PA is presented. This paper helps in selecting the most suitable UAVs and on-board sensors for different agricultural operations by considering all the possible constraints. Over a hundred research studies on UAV applications in PA and the practical challenges of monitoring and mapping field crops were reviewed. We conclude by providing suggestions and future directions for overcoming challenges and optimizing operational proficiency.

Graphical Abstract

1. Introduction

Precision agriculture (PA) is the use of large data sources in conjunction with advanced crop and environmental analytical tools to help farmers make better-informed decisions and accurately (right rates, right times, and right places) manage their fields, with the goal of achieving both economic and environmental targets [1,2]. Recently, there has been growing interest in PA globally as a promising step towards meeting an unprecedented demand to produce more food and energy of higher quality in a more sustainable manner by optimizing externalities [3]. PA practices have evolved significantly over recent years, with the global market now estimated to reach $43.4 billion by 2025 [4]. PA based on a prior map of any variable of interest can be performed using remote sensing (RS) technology. RS allows growers to collect, visualize, and evaluate crop and soil health conditions at various stages of production in a convenient and cost-effective manner [5]. It can serve as an early indicator of potential problems and provide opportunities to address them in a timely fashion.
In RS, the platform type dictates attributes such as deployment time and range, the sensor’s distance from the object of interest, the periodicity and timing of image acquisition, and the location and extent of coverage. The capabilities of Unmanned Aerial Vehicles (UAVs) offer the potential to revolutionize traditional RS platforms for real-time crop monitoring, weed detection, tree classification, water stress assessment, disease detection, yield estimation, and various pest and nutrient management strategies. Although UAVs have not yet entered the mainstream of PA practices, they are playing an increasingly important role in precision farming, helping agriculture professionals lead the way with sustainable farming practices while also protecting and increasing profitability [4]. UAVs provide growers, agronomists, and companies that serve the industry with high-quality images and data. In supporting PA, UAVs can conduct a broad range of agricultural operations, including soil health scanning, irrigation schedule planning, seed planting, fertilizer application, and weather analysis. UAVs provide distinct advantages over other RS platforms, including:
  • Rapid data collection with high quality, range, and resolution
  • Capability to combine 3D canopy height and orthophoto information
  • Ability to create canopy height models from Structure-From-Motion (SFM) point clouds
  • Capability to capture multi-angular data (particularly from snapshot cameras)
  • Capacity to operate multiple sensors at the same time.
A number of review studies on UAVs and their applications in various domains, underlining the popularity, importance, and explosion of this topic among researchers in the last decade, are listed in Table 1. These studies projected that crop monitoring for pests, weeds, and diseases will become the most prevalent application of UAV RS [6].
UAVs are commonly used for RS in agriculture for scouting crop fields and monitoring livestock [17]. For example, multispectral-camera-integrated UAVs have been used for crop yield assessment, crop height monitoring, crop weed mapping, and biomass monitoring [18,19,20,21]. Aerial data can be successfully acquired at high spatial and temporal resolution with UAVs and UAV-integrated subsystems [22]. Owing to recent developments in sensor technology, some of these subsystems are capable of measuring a variety of air pollutants [23]. RS is not the only application of UAVs: aerial spraying of herbicides or pesticides, aerial sensing of sound, and identifying changes in land structure for city planning are other recently emerged UAV applications [24,25]. UAVs are also considered effective emergency relief vehicles, with uses including, but not limited to, blood delivery, ambulance service for cardiac arrest, and disaster relief operations [26,27,28]. UAVs provide distinct advantages, such as the ability to rapidly and remotely travel to pre-defined locations that are difficult to access and to execute tasks efficiently at relatively lower cost and time. For instance, UAV-assisted aerial images of waterbodies can help visualize disturbances in water and provide enhanced spatial water quality monitoring data [29,30]. Topographic changes in watersheds can be monitored with UAV-attached high-resolution cameras and other sensors [31]. The specific coordinates of contamination can be acquired from these surveys and included in a water quality monitoring plan for further sampling. Besides RS, UAVs with special sub-systems can be used to make in situ water quality measurements of pH, dissolved oxygen, electrical conductivity, and temperature from surface waters [32].
Complementing the in situ measurements, water sample collection with custom-designed collection devices can enhance water quality monitoring in larger waterbodies [33,34].
Several investigations have been conducted on UAVs’ advantages and disadvantages over other RS platforms [7,34,35]. A concise comparison of technical specifications of the most frequently used RS platforms in Precision Agriculture (PA) is presented in Table 2. Equipped with proper hardware components, sensors, and the ability to collect high spatial and temporal resolution data for relatively short periods, UAVs could either substitute for or be complementary to many of these platforms [7,10,13,14,16]. Compared to traditional RS platforms:
  • UAVs are more flexible and can overcome shortcomings such as pilot fatigue and fuel limitation.
  • UAVs are locally accessible and excel at the ability to provide near-real-time imagery of high spatial (e.g., pixels < 3 cm), spectral, and temporal resolutions.
  • UAVs are fast survey platforms compared with proximal ground-based systems for non-destructive data collection [36].
  • UAVs are cost-effective, easier to operate, flexible to deploy where and when needed, and able to fly below the clouds [37].
Even though UAVs have advantages over traditional RS technology and can provide high-quality orthorectified images, several technical problems and limitations affect their agricultural applications [7,9,38]. There are challenges regarding engine power, automated takeoff and landing, short flight duration, difficulties in maintaining flight altitude, aircraft stability, reliability (sensitivity of communications, mechanical and electrical failure), susceptibility to weather conditions, regulations and restrictions on flying UAVs of 55 lb. and heavier (500 ft bubble and Blanket COA 5-3-2 airspace bubble) [39], maneuverability in winds and turbulence, platform failures, engine breakdown due to inadequate building materials, and payload capacity [34,40,41,42].
The potential for UAVs to improve PA is huge. The agriculture drone market is already predicted to be worth US$32.4 billion, an indication that growers, researchers, and industry are beginning to recognize UAVs' benefits over more traditional methods, such as ground mapping [4]. Thus, it is crucial to understand the capabilities and limitations of available UAVs for different agricultural operations. The main motivation for undertaking this study is the lack of a comprehensive survey focusing on UAVs' characteristics and their applications in PA. This review can be treated as a preliminary insight into the choice of UAVs by growers and interested researchers for PA operations.
The objective of this paper is to provide a technical study of the most widely used UAVs and their suitability for PA applications. In this review, over 100 peer-reviewed journal and conference studies and relevant websites were reviewed and reported. Using an automated search script, we searched for several keywords, including Unmanned Aerial Vehicle, UAVs Flight, Aerial, Fixed-Wing, Rotary-Wing, Drone, and Quadcopter. Manual screening was then performed by reading the abstracts and related-work sections to reject unrelated or outdated papers. Since this review focuses on crop field monitoring and agricultural spraying, we did not cover animal agriculture or non-agricultural applications. Additionally, we only surveyed references written in English. Finally, the benefits and significant challenges of using these platforms in different crop field environments were investigated, and practical solutions to overcome those limitations are also discussed in this paper. We did not restrict our studied references to any time period; however, according to the Web of Science, more than 50% (~3000) of the papers available on UAV use in agriculture were published after 2016, and almost 90% after 2013 [2]. Therefore, most of the references cited in this study fall within this period.
This paper's remaining contents are organized as follows: Section 2 addresses the most frequently used UAV platform designs in PA along with the technical features of each platform. Section 3 overviews the most common UAV on-board sensing systems. Section 4 is devoted to the practical challenges of using UAVs in PA. Section 5 reports the advantages and disadvantages of using UAVs in specific agricultural applications and discusses why UAVs are still not widely adopted among growers. Insights for future applications and possible improvements are discussed at the end.

2. Unmanned Aerial Vehicle Platforms

Small UAVs are powered aerial vehicles that weigh less than 25 kg (55 lb.) and can operate autonomously or be operated remotely by human pilots. Considering the sizes, configurations, and characteristics of UAVs, several classifications have been suggested in different studies [43,44,45,46,47]. UAVs that weigh 55 pounds or more at take-off are required to fly under a different set of regulations and restrictions [39]. Since the weight of a UAV determines which set of regulations it falls under, UAVs can be classified by take-off weight. Super-heavy UAVs weigh over 2 tons (4409 lb.), hold sufficiently large amounts of fuel, and fly at high altitudes serving military reconnaissance (e.g., Global Hawk). Heavy-weight UAVs weigh between 200 and 2000 kg (440 and 4409 lb.) (e.g., Fire Scout), and medium-weight UAVs weigh 50–200 kg (110–440 lb.); both classes can hold more fuel and travel long distances with greater endurance. Light-weight UAVs weigh 5–50 kg (11–110 lb.) and are useful in agricultural fields (e.g., RMAX, Yintong, Autocopter). Micro UAVs weigh less than 5 kg (11 lb.); they are quick to lift off, fly at relatively low altitudes (e.g., Raven), and, being portable, are the most common in agricultural fields (e.g., Precision Hawk's Lancaster, eBee, Swinglet, CropCam). Mini and Nano UAVs are two kinds of micro UAVs that are very small and have all the characteristics of miniature UAVs. The idea behind the Mini and Nano was to reduce the size, weight, and associated costs of UAV technology so that it can be carried by one person.
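The weight classes above can be sketched as a simple lookup. The class boundaries follow the ranges given in the text; the boundary handling (inclusive lower bounds) and the function name are assumptions of this illustration.

```python
def classify_uav_by_weight(takeoff_weight_kg: float) -> str:
    """Map a UAV take-off weight (kg) to the weight classes described
    in the text. Inclusive lower bounds are an assumption of this sketch."""
    if takeoff_weight_kg >= 2000:
        return "super-heavy"   # > 2 tons, e.g., Global Hawk
    if takeoff_weight_kg >= 200:
        return "heavy"         # 200-2000 kg, e.g., Fire Scout
    if takeoff_weight_kg >= 50:
        return "medium"        # 50-200 kg
    if takeoff_weight_kg >= 5:
        return "light"         # 5-50 kg, e.g., RMAX, Autocopter
    return "micro"             # < 5 kg, e.g., eBee, CropCam
```

For example, a 20 kg sprayer platform falls in the "light" class and stays under the 55 lb. regulatory threshold mentioned above.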
In this study, we focused on the light-weight UAVs most commonly used in agriculture, including mini model fixed-wing airplanes, rotary-wing helicopters, and hybrid Vertical Take-Off and Landing (VTOL) UAVs. Other types of UAVs, e.g., flapping wings, parafoil-wings, and blimps, which have not been used in agricultural operations [45], were not considered in this review. Figure 1 shows some of the types and models of small UAVs frequently used in PA applications.
Fixed-wing UAVs are typically launched by hand or from a launcher ramp and require runways or open areas for takeoff and landing (Figure 2). These UAVs can travel several kilometers from the launch point, fly at high cruise altitudes and speeds, cover larger areas, and achieve centimeter-level ground sample distance. These characteristics make them excellent platforms for obtaining fast and detailed information on agricultural field environments. Fixed-wing UAVs are a wise option for heavier/denser payloads. Because of their higher endurance and payload capacity, fixed-wing UAVs are ideal platforms for operations such as pesticide spot spraying that require longer flight endurance [48]. They can be operated fully autonomously and require hardly any piloting skills.
Rotary-wing UAVs can fly at lower altitudes and speeds, which results in sub-centimeter resolution of collected data. However, they cover smaller areas and require a longer flight time over the same area compared to fixed-wing UAVs. The VTOL systems make rotary-wing UAVs independent of a runway and allow a wide range of operational choices in different cropping situations, such as steep and uneven terrain. The VTOL and hovering capability make rotary-wing UAVs a good option for applications that require controlled low-speed flight in confined areas or areas with obstacles, such as low-altitude flying in orchards. Due to their simple mechanical structures, rotary-wing UAVs are ideal platforms for in-flight testing of stability, control augmentation systems, and autonomous navigation control strategies. Increasing the number of rotors results in lower crash risk, higher flight stability, and lower vibrations. Rotary-wing UAVs with six or eight rotors (hexacopters and octocopters, respectively) have recently been gaining more attention over quadcopters with four rotors due to several advantages:
  • Redundant actions provide safe flying and landing capability, especially in case of a malfunctioning motor without a complex control algorithm.
  • Higher payload capacity allows mounting of heavier hardware, such as a robotic arm for an aerial manipulation application (e.g., planting weeding, tissue sampling, or a 3D LiDAR sensor for a mapping application).
  • Holonomic flight capability (move horizontally without tilting motion) allows for more robust and precision flight against wind disturbance.
Rotary-wing UAVs are highly maneuverable yet lack flight efficiency due to short endurance. On the other hand, fixed-wings provide efficient flight due to longer endurance but lack maneuverability. Unlike fixed-wings, rotary-wings require constant downward airflow for hovering and they consume more power, thus reducing endurance.
Modification and customization of the structure of fixed-wings and rotary-wings are common practices to adapt UAVs to the unstructured environments of agricultural fields. One such customization applied a classical aircraft sizing methodology to an electric multi-rotor configuration [49]; this method was useful for first estimations of flight parameters, including takeoff gross weight and empty weight. Another customization modified the design and construction to inspect fruit orchards and vineyards [50]. In these environments, the UAV should have limited width and high agility to respond quickly to sudden external disturbances, such as gusts of wind, to minimize drift. The resulting hexacopter combines two large lift propellers, providing low disk loading and low power consumption, with four small control propellers for high agility. In another study, an engine-operated fixed-wing UAV was modified by removing redundant structural material to reduce mass and using carbon-fiber-composite replacement parts to increase strength [51]. The modified UAV showed superior takeoff and landing performance with longer endurance and area coverage; however, the unmodified UAV was more stable and required less expertise to fly.
Efforts were made to combine the fixed-wing and rotary types of UAVs into hybrid models to take advantage of their pros and compensate for the disadvantages [52,53]. Hybrid UAVs combine the VTOL ability of multi-rotor UAVs and cruise flight of fixed-wing UAVs [54]. While the ability to VTOL makes these UAVs capable of operating in almost any location, the fixed-wing design provides greater endurance, the ability to cover longer distances, and the option to fly faster. These UAVs are characterized by large control surfaces capable of large deflections and a high thrust-to-weight ratio. Hybrid UAVs are ideal for tasks requiring both endurance and maneuverability. Because of the transition between two modes of rotary-wing and fixed-wing during the flight, aerodynamics of the systems changes dramatically. This could make the controlling of such designs difficult.
Complete comparisons of fixed-wing, rotary-wing, and hybrid UAVs are presented in Table 3. While the popularity of fixed-wing UAVs in agricultural RS is increasing, rotary-wing UAVs are favored for their stable flight, larger payload capacity, ability to fly at low altitudes, and rapid deployment. Hybrid UAVs are costly compared to rotary-wings and fixed-wings. Since hybrid UAVs are still in the development stage, they are not yet optimal in either hovering or forward flight. However, with the arrival of modern autopilots, gyroscopes, accelerometers, and novel hybrid designs, including tail-sitters and tilt-rotors, these UAVs are expected to become much more popular options for PA applications in the near future.
In general, the desired characteristics and configurations of UAVs depend on the required application type. To select the most appropriate UAV platform for different applications, compromises must be made among energy consumption, desired image resolution, ease of flying, wind stability, handling flight failures, area covered, and takeoff/landing requirements. Flight characteristics could also be influenced by unforeseeable factors, like wind turbulence and temperature. Careful consideration of proper design combination when selecting a UAV helps better address the above factors, which is crucial for PA applications.

3. RS Systems on Unmanned Aerial Vehicles

Before selecting UAVs' on-board sensors, it is necessary to know the image resolution and spectral bands required to achieve the desired operational and application outcomes. Image resolution depends on the UAV's architecture, the camera detector's resolution, the Field of View (FOV) provided by the lens, camera exposure time, light intensity, image overlap, flight altitude, platform motion, ground speed, image pixel size, and flight stability. The effects of some of these factors on image quality are summarized in Table 4. A camera's resolution is determined by the number of photosites on the detector and the spatial sampling frequency. Camera exposure time and the UAV's ground speed determine the amount of pixel smear, which is essentially a reduction in resolution in the direction of travel. Image pixel size is determined by factors such as Above Ground Level (AGL) altitude, flight mode, number of Ground Control Points (GCPs), UAV architecture, the type of on-board camera, and light intensity (which is controlled by camera shutter and aperture) [55]. To maintain image resolution at a certain altitude, a balance between ground speed and camera exposure time is required: the distance traveled during the exposure must remain small in comparison to the spatial resolution.
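The ground-speed/exposure trade-off above can be quantified with a short sketch. The ground sample distance (GSD) relation follows the standard pinhole camera model; the numeric sensor and flight values are purely illustrative assumptions.

```python
def ground_sample_distance(pixel_pitch_m, focal_length_m, altitude_m):
    """Per-pixel ground footprint (m) from the pinhole camera model."""
    return pixel_pitch_m * altitude_m / focal_length_m

def pixel_smear(ground_speed_ms, exposure_s, gsd_m):
    """Ground distance traveled during the exposure, in pixels.
    Values well below 1 keep motion blur below the spatial resolution."""
    return ground_speed_ms * exposure_s / gsd_m

# Illustrative values: 3.75 um pixels, 8 mm lens, 100 m AGL, 10 m/s, 1/1000 s
gsd = ground_sample_distance(3.75e-6, 8e-3, 100.0)   # ~0.047 m/pixel
smear = pixel_smear(10.0, 1e-3, gsd)                 # ~0.21 pixel
```

With these assumed values the smear is about a fifth of a pixel; halving the altitude (halving the GSD) or doubling the exposure time would double it, which is the balance described above.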
The influence of quadcopter UAV flight parameters, including AGL altitude, flight mode, and number of GCPs, on the spatial resolution and spectral discrimination of multispectral orthomosaics was investigated in [56]. Cruise flying mode mitigated the negative impact of low AGL altitude on flight duration, while using 5 GCPs (4 at the corners and 1 in the center of the mapping area) improved the relative spatial resolution in stop mode. However, none of the tested AGL altitudes of 60, 80, or 100 m allowed spectral differentiation between crop and weed plants. UAV stability, particularly for fixed-wings, is also a crucial factor for collecting high-quality images. There are two common ways to ensure this stability:
  • Use of UAV controls that keep it level even in case of variable winds aloft and gust
  • Use of gimbaled cameras on UAVs, which maintain the on-nadir angle even when the UAV is not level.
Generally, payloads can be integrated with a UAV depending on their size, weight, and application. Table 5 summarizes the most commonly used sensors and payloads for various PA operations. RGB (Red-Green-Blue) and multispectral sensors can be considered low-cost systems compared to hyperspectral, thermal, and LiDAR sensors. Major disadvantages of hyperspectral sensors are high cost, large dataset size, and complex image processing. A thermal camera can be used in conjunction with any other system when thermal imagery is needed; however, these cameras are affected by environmental conditions such as humidity. Most off-the-shelf multispectral cameras provide image resolutions above 4200 × 2800 pixels, better than the resolution of thermal cameras (640 × 480 pixels) [57]. Consumer-grade cameras are widely used for RS. In the future, more research will be needed to evaluate these cameras and compare them with more sophisticated multispectral and hyperspectral imaging systems for PA.

4. Challenges with UAVs in Precision Agriculture Applications

The RS data must be accurate and reliable, with geometric, atmospheric, and radiometric corrections, in order to enable optimal UAV RS data acquisition. For instance, crop monitoring and stress detection in the field incorporating vegetation indices, time-series analyses, and change detection methods require absolute canopy reflectance calculated after applying radiometric and atmospheric corrections. To generate high-quality surface reflectance data, it is essential to have high-quality on-board sensors and comprehensive testing, calibration, and data processing methods. Calibration methods can be divided into absolute and relative methods. Absolute calibration converts the remotely sensed Digital Numbers (DN) output from a sensor into surface reflectance or reflected radiance; for this purpose, effects caused by atmospheric attenuation, topographic conditions, sensor noise, and other factors must be removed. In relative calibration, the detectors' outputs are normalized to a given average output from all detectors in the band; in other words, variation within a scene is removed or normalized, and intensities are normalized between images.
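The detector-normalization step of relative calibration can be sketched minimally as follows. Treating each image column as one detector and applying a gain-only adjustment (no offsets) are simplifying assumptions of this illustration.

```python
def relative_calibration(band):
    """Normalize each detector (here, each column) of a band to the
    band-wide mean DN, suppressing detector-to-detector striping.
    `band` is a list of rows of DN values. Gain-only normalization
    is an assumption of this sketch; offsets are ignored."""
    n_rows, n_cols = len(band), len(band[0])
    band_mean = sum(sum(row) for row in band) / (n_rows * n_cols)
    col_means = [sum(row[c] for row in band) / n_rows for c in range(n_cols)]
    return [[row[c] * band_mean / col_means[c] for c in range(n_cols)]
            for row in band]
```

A detector reading systematically high or low is scaled so its mean output matches the band average, which is the normalization described above.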
In this section, the atmospheric, radiometric, and geometric corrections of UAV-based data, as well as the challenges of these corrections, are discussed.

4.1. Atmospheric Correction

The sun emits electromagnetic (EM) energy in the direction of Earth. Some of this energy is absorbed and scattered by gases and aerosols in the atmosphere before hitting the Earth's surface [58]. Surface reflectance observations using aerial imagery are affected by a variety of interfering processes associated with the propagation of electromagnetic radiation in the atmosphere-surface system [59]. In clear-sky conditions, these processes are gaseous absorption, molecular scattering, aerosol scattering and absorption, and water surface reflection. In cloudy conditions, scattering by cloud droplets makes it very difficult to sense the surface (the cloud signal largely dominates), except when clouds are optically thin or occupy a small fraction of the pixel, i.e., their effect on pixel reflectance is less than 0.2 [59]. Atmospheric effects influence the quality of the information extracted from aerial image measurements, such as vegetation indices [60]; errors caused by atmospheric effects can increase the uncertainty by up to 10%, depending on the spectral channel [61]. Moreover, at visible wavelengths, most of the signal reaching an imaging sensor from a dark object, such as a water-stressed area, is contributed by the atmosphere, while near-infrared and middle-infrared image data are assumed to be free of atmospheric scattering effects [62]. Therefore, the pixels from dark targets are indicators of the amount of upwelling path radiance in that band: the atmospheric path radiance adds to the surface radiance of the dark target, giving the target radiance at the sensor [61]. The influence of the atmosphere and surface must be removed to give access to correct surface reflectance. An atmospheric correction model is required where Vegetation Indices (VIs) are adopted in vegetation monitoring, and in dark scenes where features such as water stress and drought, rain erosion, or overgrazing would otherwise be masked by atmospheric scattering.
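The dark-target reasoning above underlies the classic dark object subtraction (DOS) approach, sketched minimally here. Estimating the path radiance from the 1st-percentile DN and operating on a flat list of band DNs are simplifying assumptions of this illustration.

```python
def dark_object_subtraction(band, percentile=0.01):
    """Estimate path radiance as the DN of the darkest pixels in a band
    and subtract it from every pixel. `band` is a flat list of DN values.
    Using the 1st-percentile DN as the dark-object value is an assumption
    of this sketch; operational DOS variants refine this estimate."""
    pixels = sorted(band)
    dark_dn = pixels[max(0, int(len(pixels) * percentile) - 1)]
    return [max(dn - dark_dn, 0) for dn in band]
```

The darkest pixel is assumed to have near-zero surface radiance, so its DN approximates the additive atmospheric path radiance in that band; subtracting it everywhere removes that offset.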
Atmospheric correction removes effects of atmosphere, variable solar illumination, sensor viewing geometry, and terrain on reflectance values of images and determines their actual values. Supplying, calibrating, and adjusting for the atmospheric condition at the imaging time are essential atmospheric correction requirements.
At low altitudes of 150–200 m, the atmospheric contribution to surface reflectance has been argued to be negligible [63]; therefore, atmospheric corrections have often been neglected in low-altitude RS with UAVs. However, the path radiance of the low-altitude atmosphere distorts the apparent reflectance of target objects. The reflected light of ground objects collected by UAV sensors passes through a very thin atmosphere, not thicker than 120 m [64]. Atmospheric attenuation is composed of atmospheric absorption and atmospheric scattering. Absorption of visible light by nitrogen, oxygen, and carbon dioxide is negligible; thus, the atmospheric attenuation of visible light at low altitude is mainly caused by Rayleigh scattering [64]. Indeed, although Rayleigh scattering dominates at altitudes above ~35 km [65], it is crucial to consider its effects at low altitudes. It has been confirmed that some applications, such as thermal cameras using microbolometer sensors, require atmospheric corrections even at flight altitudes as low as 40 m [63].
Several methods have been applied (Table 6) to improve atmospheric correction models for UAV imagery. Off-the-shelf atmospheric correction algorithms adopted in satellite-based RS are typically ill-suited for UAV-based images due to the distinctly different altitudes and radiation transfer modes; atmospheric correction with these standard atmospheric profiles causes a loss of accuracy. Moreover, the data needed to apply an atmospheric calibration model in these methods are mostly costly and not always available.

4.2. Radiometric Calibration

Radiometric calibration involves determining the functional relationship between incoming radiation and sensor output, such as DN. Accurate radiometric calibration becomes necessary for change detection and interpretation if the images were from different dates or times, locations, or sensors. Radiometric calibration ensures that changes in the data are related to field changes, not variations in the image acquisition process or conditions (e.g., change in light intensity). Most image collections involving hyperspectral cameras (e.g., crop phenotyping, disease detection, and yield monitoring) require accurate radiometric calibrations.
Some potential solutions could reduce the radiometric variation. Light intensity varies over time because of changes in solar elevation, atmospheric transmittance, and clouds. Thus, conducting image collection flights in minimum solar elevation could reduce radiometric variation in collected data. The digital camera exposure settings should be selected carefully based on the overall light intensity, whether manually or automatically. However, none of these solutions can completely remove the effects of the radiometric variation from collected images.
In manned-aircraft imagery, radiometric correction is usually conducted by laying out large tarps with a known set of reflectance values spanning the range of object reflectance in the scene; each pixel in the image is then corrected according to a linear model. Since UAV images are mosaicked into large images, and each image covers a relatively small area, it is impractical to include a calibration reference in every image. Moreover, although operators set camera parameters for a certain average light intensity, the camera's exposure settings vary within each aerial photograph. Thus, a radiometric calibration model developed for one image will not be ideal for another.
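The tarp-based linear model described above is commonly known as the empirical line method; a minimal sketch follows. The panel DN and reflectance values are purely illustrative assumptions.

```python
def empirical_line(dn_panels, refl_panels):
    """Fit reflectance = gain * DN + offset from calibration panels with
    known reflectance (the tarp approach), using ordinary least squares.
    Returns (gain, offset)."""
    n = len(dn_panels)
    mx = sum(dn_panels) / n
    my = sum(refl_panels) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(dn_panels, refl_panels))
    sxx = sum((x - mx) ** 2 for x in dn_panels)
    gain = sxy / sxx
    offset = my - gain * mx
    return gain, offset

# Illustrative panels with 5%, 20%, and 50% reflectance
gain, offset = empirical_line([40, 160, 400], [0.05, 0.20, 0.50])
reflectance = gain * 240 + offset   # convert any pixel's DN to reflectance
```

Once fitted, the same (gain, offset) pair converts every pixel in that image; as noted above, a model fitted for one image is generally not ideal for another because exposure and illumination vary between frames.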
There are several UAVs-based radiometric calibration methods proposed for PA purposes (Table 7). The radiometric correction algorithms in commercially available data processing software packages (e.g., AgiSoft PhotoScan and Pix4DMapper) include options for empirical correction, color balancing, irradiance normalization, or sensor-information based calibration. Despite the large number of radiometric correction methods proposed in research studies (Table 7) or available with commercial software packages, most of these methods are laborious and do not provide quantitative radiometry and acceptable visual quality.

4.3. Geometric Correction

UAV-collected imagery for aerial mapping of crop fields or orchards contains some degree of geometric distortion. These distortions are due to sensor position variations, off-axis projection, screen placement, non-flat screen surfaces, platform motion, the Earth's rotation during image acquisition, Earth curvature, and terrain effects. These factors can be categorized as internal and external (Table 8).
Alternatively, they can be classified as systematic or random distortions. Systematic distortions, caused by the Earth’s rotation and camera angles, are predictable and can therefore be corrected systematically. Random distortions, caused by changing terrain and variations in sensor altitude, are not predictable and are more challenging to address.
Geometric correction compensates for these distortions and ultimately produces a corrected image with a high level of geometric integrity. In a distorted image, the size of the target area represented by a pixel may vary across the image, and the (x, y) coordinates of a given pixel may not match the correct geographic location. Geometric calibration recovers the intrinsic camera parameters, including focal distance, principal point coordinates, and lens radial distortion, to recreate a realistic representation of the scene.
Geometric correction digitally manipulates images so that their projection precisely matches a specific surface or shape. It is achieved by establishing the relationship between the image coordinate system and the geographic coordinate system using the sensor’s calibration data, measured positions and attitudes, a digital elevation model (DEM), ground control points (GCPs), atmospheric conditions, etc. Methods proposed in previous studies (Table 9) are based on either georeferencing or orthorectification. Georeferencing is the process of aligning geographic data to a known coordinate system and may involve shifting, rotating, scaling, skewing, and in some cases warping the data. Usually, GCPs are used to build a polynomial transformation that shifts the distorted dataset from its existing location to a spatially correct location; higher transformation orders can correct more complex distortions.
Orthorectification requires more information than georeferencing, e.g., knowledge of the sensor or camera model. It corrects for sensor tilt and the Earth’s terrain distortions using rational polynomial coefficients (RPCs) and an accurate DEM.

5. Precision Agriculture Applications with UAVs

Crop monitoring can occur during different growth stages, starting from germination, even before plants respond to soil characteristics. The purpose of crop monitoring is to record phenotypic characteristics that may reflect potential problems and to identify those problems at an early stage of development. Crop growth and performance can then be evaluated to support proper management decisions. Acquiring high-resolution aerial images can help growers evaluate soil conditions and crop growth status and provide useful information about plant health and crop stress. Such information supports sustainable crop management and risk management and plays an essential role in global markets, policy, and decision making. In this section, we review the benefits and challenges of using UAVs in agricultural operations.

5.1. Pesticide and Fertilizer Spraying

The application of pesticides and fertilizers in agricultural areas is of prime importance, as it maintains and ensures the quality and quantity of crop yield. However, several challenging issues affect these processes, including:
  • Uneven spread of chemicals due to broadcasting of pesticides and fertilizers
  • Skipping or overlapping crop areas while spraying
  • Applying pesticides only on the outer edge of the crop
  • Climatic impacts on spraying or broadcasting (e.g., wind intensity and direction), including spray drift onto neighboring fields
  • Damage to plants when using ground sprayers or spreaders
  • Operators’ exposure to chemicals.
UAVs’ abilities to hover, to modulate their distance from the ground as topography varies, to create high-resolution field maps, and to spray the correct amount of liquid in real time for even coverage provide the opportunity for patch spraying and accurate site-specific management in the PA industry. UAVs could reduce chemical pesticide and fertilizer application by 15–20% by using low or ultra-low spray volumes [86] and reduce the amount of chemicals penetrating into groundwater. It is estimated that UAV-based pesticide application could be up to five times faster than traditional machinery [87]. Because UAVs can operate very close to crops without causing damage, only low thrust is exerted, which markedly reduces the concern of chemical drift. Spraying pesticides and fertilizers with UAVs could also eliminate the operator exposure to chemicals that occurs in manual spraying.
Although UAVs have advantages for spraying pesticides and fertilizers, the effectiveness of these operations is still affected by several factors. Droplet deposition on target crops is a critical index of spray performance and is significantly influenced by the aerial nozzles; unfortunately, droplet deposition remains an essential concern in UAV spraying applications. Spraying effectiveness is also affected by the flight parameters: variable flight parameters during UAV operation affect the average droplet deposition. The rotors’ downward airflow determines the penetrability of the droplets; it can interact with the crop canopy and form a conical vortex in the crop, and the size of this vortex directly affects the outcome of the spraying operation. The efficacy of pesticide spraying also depends on the temperature during spraying. Because of the flight parameters, farmland size, and the number of stops for spot spraying, it is difficult for UAVs to traverse an entire field in a single flight. Recharging batteries and refilling sprayer tanks might be required multiple times during chemical application. These tasks are time-consuming, and the temperature may change while UAVs are prepared for the next flight.
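A rough back-of-envelope calculation shows why battery and tank cycles dominate UAV spraying logistics. The sketch below estimates the number of sorties a field requires from swath width, ground speed, endurance, and tank capacity; all sprayer parameters are hypothetical illustration values, not figures from the studies cited here.

```python
import math

# Sortie estimate for UAV spraying: how many battery/tank cycles a field
# requires. A flight ends when either the battery or the tank runs out,
# so the binding constraint is the smaller coverage of the two.

def sorties_needed(field_ha: float, swath_m: float, speed_ms: float,
                   endurance_min: float, tank_l: float,
                   rate_l_per_ha: float) -> int:
    # Area sprayable on one battery charge (swath * distance flown), in ha.
    area_per_flight_ha = swath_m * speed_ms * endurance_min * 60 / 10_000.0
    # Area sprayable on one tank of chemical, in ha.
    area_per_tank_ha = tank_l / rate_l_per_ha
    return math.ceil(field_ha / min(area_per_flight_ha, area_per_tank_ha))

# Hypothetical sprayer: 4 m swath, 5 m/s, 12 min endurance, 12 L tank, 15 L/ha.
flights = sorties_needed(field_ha=20.0, swath_m=4.0, speed_ms=5.0,
                         endurance_min=12.0, tank_l=12.0, rate_l_per_ha=15.0)
```

With these example numbers the tank, not the battery, is the binding constraint, and a 20 ha field already demands dozens of refill cycles, which is consistent with the time-consumption concern raised above.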
Flight stability and smoothness can also affect the efficiency of UAV spraying. Spraying the right targets at the right spots is achievable only in stable, smooth flights. For this purpose, sophisticated controller designs are required to improve UAV flight stability.
Some Federal Aviation Administration (FAA) regulations are also constraints on using UAVs for spraying. Despite the advent of novel UAV uses, FAA regulations on aerial pesticide application have not yet been updated to accommodate UAVs’ specific benefits and limitations. The FAA’s small UAS rule (Part 107) requires the UAV sprayer to be registered and the pilot to hold a remote pilot certificate [39]. However, Part 107 applies only to UAVs weighing less than 55 pounds (25 kg) at take-off, and hazardous material cannot be carried under this rule. A UAV sprayer can easily weigh more than 25 kg; for UAVs with a take-off weight over 25 kg (55 lbs.), a Section 333 Exemption is required. Obtaining this permission can be costly and time-consuming: it can cost up to $1500 and take up to four months for an FAA response [88], and the acceptance rate of these requests has been estimated at only 14% [89]. Carrying hazardous materials and certain pesticide active ingredients (e.g., allethrin, carbamates, and organophosphates) with UAVs is not allowed [90]. In addition to the exemption, an operator needs an agricultural aircraft operator certificate (Part 137). Agricultural aircraft operation means the operation of an aircraft for the purpose of dispensing any economic poison, dispensing any other substance intended for plant nourishment, soil treatment, propagation of plant life, or pest control, or engaging in dispensing activities directly affecting agriculture, horticulture, or forest preservation, but not including the dispensing of live insects [39]. These regulations could restrict UAV technologies, especially for widely used aerially applied pesticides such as naled, which is applied extensively to control mosquito populations. These limitations could be addressed through FAA waivers and updates to the regulations.

5.2. Vegetation Growth Monitoring for Yield Estimation

Crops do not generally grow evenly across fields; consequently, crop yield can vary significantly from one spot in a field to another. These growth differences may result from biotic (pests) or abiotic (soil, water, etc.) factors. Monitoring the canopy and assessing crop condition in real time can aid in devising suitable management strategies to improve yield [91].
Conventional crop monitoring methods involve tedious crop growth data collection and evaluation of crop yield over previous years or seasons. The latest trend on large farms is to adopt combine harvesters with electronic yield monitors and Global Positioning System (GPS) tagging [92]. In practice, farmers have to carefully select grain flow sensors, volumetric and mass flow detection, ground speed sensors, harvest controls, yield data collection points, and yield mapping. Combine harvesters originally gave only a final yield value per plot; current combine harvesters equipped with yield monitors record crop yield at a much finer resolution suitable for site-specific or precision farming techniques. Even so, field-based crop surveying and accurate crop estimates are usually costly and time-consuming.
Satellite-based crop yield prediction has raised possibilities for monitoring and understanding agricultural productivity in many regions of the world. However, until recently, the use of satellite data for yield prediction has been sparse and limited to large-scale monitoring and mapping due to the limited availability of satellite imagery with high spatial (<5 m) and temporal (daily) resolution [93]. For applications at more detailed scales and in small fields, the mixed nature of low-resolution pixels, such as those of Landsat, creates limitations [94]. This severely complicates the analysis, interpretation, and validation of the data and challenges the reliability of the derived information [95]. Obtaining spatiotemporal information and crop phenological status during critical periods of the growing season is also very challenging due to high cloud coverage.
Aerial images from UAVs could augment data such as multi-year combine harvester yield records, NDVI, and leaf chlorophyll content, to name a few. For example, NDVI data and disease/pest spots imaged by UAVs at different crop growth stages could be compared along with yield data for several seasons, helping to prepare maps depicting the standard deviation and coefficient of variation. It is possible to mark the highest- and lowest-yielding locations in the field for each of several seasons and crops. Compared to satellite imagery, there is no limitation on flying UAVs periodically over the same field, so yield maps of a field can be generated for multiple years. By examining multiple years of yield maps, management blocks can be marked to provide greater authenticity and yield stability [96]. UAVs’ ability to operate at low altitudes, below the clouds, also minimizes the effect of cloud cover on image quality.
UAV-based images can be transformed into Digital Surface Models (DSMs) or crop surface models at full canopy to assess crop health, growth rates, and yield forecasts [87]. These models are required to estimate biomass accumulation and plant height, and yield estimation combining these models with spectral data can be developed rapidly. The resulting maps can provide great detail on variations in vegetative growth and yield potential [97]. However, UAVs’ ability to map crops is not yet perfect, and there is considerable room to improve their ability to accurately show yield variations and pinpoint causes.
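The plant-height step behind these surface models is a simple per-pixel difference: subtract a bare-ground terrain model from the crop surface model. The sketch below uses illustrative elevation values, not data from any of the cited studies.

```python
import numpy as np

# Plant height from surface models: crop surface model (DSM, top of canopy)
# minus a bare-ground digital terrain model (DTM) gives per-pixel height.

dsm = np.array([[101.2, 101.5], [101.9, 100.4]])  # hypothetical elevations, m
dtm = np.array([[100.3, 100.3], [100.4, 100.4]])  # bare-soil reference, m

# Negative differences are photogrammetric noise; clip them to zero.
height = np.clip(dsm - dtm, 0.0, None)
mean_height = float(height.mean())
```

Plot-level statistics such as `mean_height` are what regression models against biomass or yield typically consume; the accuracy of the DTM therefore bounds the accuracy of every derived height.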
The main challenge in using UAV imagery to monitor plant health and yield potential is assessing the high-resolution multi-temporal data. It is essential to match the appropriate vegetation index or indices for fractional vegetation cover mapping to the crop’s growth stage. In other words, it is crucial to know which vegetation indices best estimate crop yield and to find the relationship between observed crop characteristics and yield. The accuracy of crop surface models required to calculate plant height should be determined before the flight, and the best time or growth stage to record data from each crop type for accurate yield prediction should be established.
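Of the vegetation indices discussed in this section, NDVI is the most common and illustrates the per-pixel band arithmetic involved. The sketch below computes it from co-registered red and near-infrared reflectance bands; the 2×2 reflectance patches are hypothetical.

```python
import numpy as np

# NDVI = (NIR - Red) / (NIR + Red), computed per pixel. Values near 1
# indicate dense, healthy vegetation; values near 0 indicate soil or
# senescent cover.

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Per-pixel NDVI from co-registered reflectance bands."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    denom = nir + red
    out = np.zeros_like(denom)          # dark (zero-sum) pixels stay at 0
    valid = denom != 0
    out[valid] = (nir[valid] - red[valid]) / denom[valid]
    return out

# Hypothetical 2x2 reflectance patches: vegetated (high NIR) vs. bare soil.
nir_band = np.array([[0.60, 0.55], [0.30, 0.25]])
red_band = np.array([[0.08, 0.10], [0.20, 0.22]])
vi = ndvi(nir_band, red_band)
```

Other indices mentioned in the literature differ only in which bands and coefficients enter this per-pixel formula, which is why choosing the index to match the crop’s growth stage matters more than the computation itself.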

5.3. Vegetation Health Monitoring and Pest Management

Stress symptoms in plants begin to appear in the infrared range several days before they appear in the visible range [98]. Combining visible and infrared data at different plant growth stages could therefore increase model prediction accuracy. Many farmers use ground-based methods, satellite imagery, and manned aircraft to monitor crop growth and density. Crop consultants or farmers may perform ground-based monitoring during the growing season [99], but such monitoring is tedious, often destructive, and does not deliver data at a rapid pace [100]. In these cases, farmers cannot monitor all farm areas, and some parts of the fields are neglected, which may lead to avoidable expenses when crop problems are addressed at a later stage. Additionally, although satellite imagery has improved over time in resolution, accuracy, and sharpness, it falls short on fine detail, close-ups, and rapid relay of pictures, and access to satellites fitted with multispectral sensors can be costly [92]. Scheduling manned aerial flights can take time, while diseases and invasive species spread quickly through fields [101].
UAVs’ ability to fly close to crops yields higher-resolution images that are less affected by cloud cover and low light conditions than satellite imagery. UAV imaging can produce a map of an entire field down to the millimeter, and there is no need to schedule flights far in advance. UAVs can be equipped with specialized imaging equipment (NDVI cameras, regular cameras, etc.) that provides detailed indicators of plant health such as temperature, chlorophyll levels, foreign contaminants, and leaf thickness, allowing farmers to monitor crops as they grow and respond to problems quickly.
UAV-based images used for pest detection should offer:
  • Detection of a symptom at its onset
  • Differentiation between different symptoms
  • The ability to separate pest symptoms from environmental stresses
  • Estimation of symptom severity.
In pest management using UAVs, it is crucial to find the most significant spectral channels for symptom detection in different crops. Good disease detection results using UAVs have been obtained when textural and spectral data were combined [102]. The combination of hyperspectral and thermal sensors has also shown promising results in early-stage disease detection [103,104,105]. However, fusing data from two sensor types can increase the difficulty of data processing.
The efficiency of some site-specific pest management operations, such as weed and disease detection based on UAV images, depends highly on image spectral, spatial, and temporal (flight frequency) resolutions. The impact of flight altitude on image pixel size is very important in discriminating between weeds and crops [106,107]. At lower altitudes, images have a higher spatial resolution, but the area covered is smaller; however, the highest spatial resolution at a low altitude does not always provide the highest accuracy [108]. It has been demonstrated that flight altitudes in the range of 10 to 50 m, with corresponding image resolutions of 3–15 mm pixel−1, did not significantly influence pest detection in a systematic way [109]. Regardless of flight parameters, the major challenges in crop and pest discrimination include the need for high spatial and temporal resolution of UAV imagery and the spectral similarity of pest symptoms and abiotic stresses on crops. This problem worsens in early detection, when crops with pest symptoms and healthy crops look similar and show analogous spectral patterns. Image collection time and the spectral characteristics of the field background can also strongly affect UAV-based aerial detection of pests [110,111,112].
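The altitude-to-pixel-size relationship behind these numbers is the ground sampling distance (GSD): altitude times sensor pixel pitch divided by focal length. The sketch below reproduces the 10–50 m / 3–15 mm pixel−1 range quoted above using a hypothetical camera (3 µm pixel pitch, 10 mm lens) chosen to make the arithmetic come out to those values.

```python
# Ground sampling distance (GSD): the ground footprint of one image pixel,
# which links flight altitude to the spatial resolution available for
# weed/pest discrimination. GSD = altitude * pixel_pitch / focal_length.

def gsd_mm(altitude_m: float, pixel_pitch_um: float,
           focal_length_mm: float) -> float:
    """Return GSD in mm/pixel for nadir imagery over flat terrain."""
    return altitude_m * 1000.0 * (pixel_pitch_um / 1000.0) / focal_length_mm

# Hypothetical camera: 3.0 um pixel pitch, 10 mm lens.
low = gsd_mm(10, 3.0, 10.0)    # 10 m altitude -> 3 mm/pixel
high = gsd_mm(50, 3.0, 10.0)   # 50 m altitude -> 15 mm/pixel
```

Because GSD scales linearly with altitude, doubling the flight height halves the resolution but quadruples the per-image ground coverage, which is the trade-off the studies above were probing.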
Using newly developed UAVs, biological pest management can be accomplished by releasing beneficial insects, predators, and parasites over crops [113]. This method could reduce the environmental impact of pesticide usage and address a growing labor shortage. UAV-based aerial biocontrol provides more efficient distribution of biocontrol agents than traditional application techniques, can suppress pests even where chemicals cannot reach, and can meaningfully reduce the use of chemical pesticides [113].
Although pest management using UAVs has several advantages over conventional methods, it faces several challenges, including sensor integration, multi-resolution solutions, optimizing flight altitude versus efficiency, image preprocessing, and information extraction [114]. UAVs are more appropriate for large areas such as golf courses and crop fields, while in small areas such as parks and gardens, ground-based methods are preferred because they are less expensive and more practical [115]. Although UAV-based pest control requires discriminating the various pest symptoms from each other and from environmental stresses, there are currently no accurate methods to accomplish this. This is an area that new data analytics strategies incorporating machine learning and artificial intelligence have the opportunity to address.

5.4. Irrigation Management and Water Stress

Water availability is a top production parameter affecting crop productivity, and soil moisture also influences the extent and efficiency of fertilizer use. When available soil water is insufficient to meet the evapotranspiration demand, crops experience water stress. Canopy temperature has been shown to be a good indicator of plant stress and can provide continuous information on water status, water use, and how a plant is functioning metabolically [116].
Farmers and companies have adopted several different crop monitoring and irrigation methods (e.g., sheet irrigation, furrow irrigation, drip irrigation) to observe real-time crop water status and schedule irrigation. All these methods require frequent observation with proximal sensors, especially when water is distributed onto fields, and such proximal sensing is costly, tedious, and time-consuming when field sampling is intense.
Field elevation data are useful in determining drainage patterns and wet/dry spots, which allows for more efficient watering techniques. When equipped with thermal and hyperspectral sensors, UAVs can capture very accurate data on the spatial variability of soils and crops in fields. Inherent variation in soil properties or non-uniform irrigation can contribute to spatial variation in crop growth and yield; knowledge of spatial and temporal variations in soil moisture therefore becomes critical for irrigation management. The spatial resolution of data collected for mapping crop water stress must be sufficient to separate plant pixels from the soil background and to avoid mixed soil/vegetation pixels. Thermal images can be influenced by various factors, including the characteristics of the thermal camera, weather conditions, and different sources of emitted and reflected thermal radiation. Thus, several factors, including sensor calibration and humidity correction, should be considered carefully for correct temperature retrieval. Crop water stress estimation using UAVs, particularly over large areas, should therefore be conducted carefully.
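One common way to turn UAV canopy-temperature maps into a stress indicator is the crop water stress index (CWSI), which normalizes canopy temperature between a well-watered (wet) and a fully stressed (dry) reference. This is a standard index from the thermal remote sensing literature rather than a method specific to the studies cited here, and the temperatures below are hypothetical.

```python
import numpy as np

# Crop water stress index: 0 = well-watered (canopy at the wet reference
# temperature), 1 = fully stressed (canopy at the dry reference).
# CWSI = (Tc - Twet) / (Tdry - Twet)

def cwsi(canopy_t, t_wet: float, t_dry: float) -> np.ndarray:
    """Per-pixel CWSI from a thermal image, clipped to [0, 1]."""
    canopy_t = np.asarray(canopy_t, dtype=float)
    return np.clip((canopy_t - t_wet) / (t_dry - t_wet), 0.0, 1.0)

# Hypothetical thermal pixels (deg C) with wet/dry references of 22 and 34 deg C.
canopy = np.array([[23.0, 25.0], [31.0, 33.5]])
stress = cwsi(canopy, t_wet=22.0, t_dry=34.0)
```

The wet and dry reference temperatures must come from calibrated measurements or models, which is why the sensor calibration and humidity correction noted above are prerequisites for a trustworthy stress map.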
UAVs can also monitor water flow in irrigation channels and the distribution of water by center-pivot sprinklers; indeed, UAVs could be of immense utility in monitoring and coordinating the movements of sprinklers on large farms [92]. Equipped with proper nozzles, UAVs could potentially sprinkle water and nutrients on small target fields or areas. For large farms, a group of UAVs (a swarm) would be required [92] to counter drought when it is suspected to severely affect crop growth, biomass, and grain formation. However, swarms of UAVs may not be cost-effective and require complex path planning and control system designs for each UAV. If swarm technologies can be made affordable, they could provide very valuable information about when and where to apply precise quantities of water to crops on large farms [117].

6. Adoption of UAVs in PA

UAVs have been influential in RS, and their contribution to PA has captured considerable attention from researchers. However, critical issues have been raised in different studies regarding the adoption of these new technologies in agriculture [118,119,120,121]. By 2005, only 9% of surveyed producers used RS of any kind, while 73% did not plan to use these technologies at all [122]. However, according to a 2018 study by Munich Reinsurance America Inc. (Princeton, NJ, USA) [123], nearly three-fourths of all U.S. farmers were using or considering adopting UAV technology to assess, monitor, and manage their farms. Munich surveyed 269 farmers on how they use or view UAVs. Most of the farmers surveyed (76%) had concerns related to UAV usage: privacy issues (23%) were the most common concern, followed by cybersecurity concerns over data captured via UAV (20%) and the possibility of injury or damage from the UAV (17%) [123]. Of those who currently use the technology, 49% contract with an outside company to operate their drones and 51% handle drone usage on their own; 83% of these farmers use UAVs on their farms daily or at least once a week. UAVs are used, or considered for use, for crop monitoring (73%), soil and field analysis (46%), and health assessment of crops and livestock (43%) [123].
Nevertheless, RS with UAVs is not widely adopted by farmers. Despite the incorporation of new technologies into UAVs, several factors and challenges prevent widespread usage. One reason for UAVs’ unpopularity among farmers is that UAVs collect a huge amount of complex data from various sensors, and big data analytics tools and cloud computing are required to extract information from those data. Data analysis and interpretation usually depend on software packages with different levels of complexity and cost; such software must be purchased if it is not open source, and it must be installed, configured, and maintained by users, sometimes requiring advanced coding or data training. Thus, utilizing UAVs requires non-expert users to invest time and money and to learn new skills with complex tools. Moreover, although UAVs do not require a trained on-board pilot, a remote pilot certificate is still required for commercial flight operations. Agricultural producers also require clear information on the economic benefits of precision agriculture technologies before adopting them. Other technical factors influencing the adoption of these technologies are:
  • Socioeconomic characteristics (e.g., farm size, farming experience, education, age, and access to the information)
  • Physical attributes of the farm (e.g., variability of soil types, productivity)
  • Location of the farm (e.g., broadband connectivity)
  • Initial financial investment
  • Interest in learning new skills and the learning curve
  • Compatibility of new technologies with the current practice
  • Potential benefits (reducing production costs, increasing yields, protecting the environment, providing massive amounts of information to help manage farms)
  • Complexity of the technology and ease of use
  • Complexity of data interpretation and decision making
  • Data transfer speed
  • Data privacy
  • Safety.
We have suggested solutions to some of the above-mentioned challenges, as presented in Table 10. These solutions require advanced designs and extensive laboratory and field verification experiments to address the respective challenges. In short, research, development, and validation are essential steps in addressing these challenges.
Because of the many challenges discussed in this paper, UAVs are not universally adopted by farmers, especially on small farms. Additional efforts are required to make these technologies more user-friendly and available to all types of end-users with different interests in precise crop management. Further research is required to design and implement special types of cameras and sensors on board UAVs for remote crop and soil monitoring and other agricultural applications in real-time scenarios. The goal should be a fully automated pipeline, including flight preparation (optimal flight height and pattern, sensor setup, etc.), flight execution (sensor calibration, ground control measurements, and the flight itself), and data processing with interpretation.

7. Conclusions

In this study, UAVs’ applications in different agricultural operations and the distribution of UAV types among researchers in this domain have been reviewed. This manuscript provides a perspective on the technical characteristics of UAVs in field-based agricultural RS, and the wide range of factors influencing the application of UAVs in PA is discussed. The types of available platforms and sensors, along with flight planning, image registration, calibration and correction, and the data products derived for particular applications, have been described. This study provides preliminary insight into selecting the most appropriate UAVs and procedures for PA operations for growers and interested researchers. Indeed, to leverage the full potential of UAV-based approaches, sensing technologies, measurement protocols, postprocessing techniques, retrieval algorithms, and evaluation techniques must be combined. The literature review carried out herein identified many challenging issues in using UAVs for agricultural monitoring and the variety of methodologies adopted, and it suggests that further integration of these technologies and methods is still needed. The challenges and shortcomings of UAVs in PA procedures were discussed, and several strategies were suggested to optimize their application. In particular, there is a need to develop standard experimental platforms to assess the reliability of different procedures and identify the most appropriate methodology for precision agriculture under different environmental conditions.

Author Contributions

Conceptualization, N.D. and X.S.; validation, C.K., X.S. and J.N.; investigation, N.D. and C.K.; resources, N.D.; writing—original draft preparation, N.D.; writing—review and editing, N.D., C.K.; supervision, X.S., J.N. and S.B.; funding acquisition, X.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Acknowledgments

The authors would like to thank the US Department of Agriculture, agreement number 58-6064-8-023. Any opinions, findings, conclusions or recommendations expressed in this publication are those of the author(s) and do not necessarily reflect the view of the US Department of Agriculture.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Pierce, F.J.; Nowak, P. Aspects of Precision Agriculture. In Advances in Agronomy; Elsevier: Amsterdam, The Netherlands, 1999; pp. 1–85.
  2. Raj, R.; Kar, S.; Nandan, R.; Jagarlapudi, A. Precision Agriculture and Unmanned Aerial Vehicles (UAVs). In Unmanned Aerial Vehicle: Applications in Agriculture and Environment; Springer International Publishing: Cham, Switzerland, 2020; pp. 7–23.
  3. Carolan, M. Publicising Food: Big Data, Precision Agriculture, and Co-Experimental Techniques of Addition. Sociol. Ruralis 2017, 57, 135–154.
  4. Hopkins, M. The Role of Drone Technology in Sustainable Agriculture. Available online: https://www.precisionag.com/in-field-technologies/drones-uavs/the-role-of-drone-technology-in-sustainable-agriculture/ (accessed on 11 March 2021).
  5. Brisco, B.; Brown, R.J.; Hirose, T.; McNairn, H.; Staenz, K. Precision Agriculture and the Role of Remote Sensing: A Review. Can. J. Remote Sens. 1998, 24, 315–327.
  6. Hunt, E.R., Jr.; Daughtry, C.S.T. What Good Are Unmanned Aircraft Systems for Agricultural Remote Sensing and Precision Agriculture? Int. J. Remote Sens. 2018, 39, 5345–5376.
  7. Zhang, C.; Kovacs, J.M. The Application of Small Unmanned Aerial Systems for Precision Agriculture: A Review. Precis. Agric. 2012, 13, 693–712.
  8. Watts, A.C.; Ambrosia, V.G.; Hinkley, E.A. Unmanned Aircraft Systems in Remote Sensing and Scientific Research: Classification and Considerations of Use. Remote Sens. 2012, 4, 1671–1692.
  9. Huang, Y.B.; Thomson, S.J.; Hoffmann, W.C.; Lan, Y.B.; Fritz, B.K. Development and Prospect of Unmanned Aerial Vehicle Technologies for Agricultural Production Management. IJABE 2013, 6, 1–10.
  10. Colomina, I.; Molina, P. Unmanned Aerial Systems for Photogrammetry and Remote Sensing: A Review. ISPRS J. Photogramm. Remote Sens. 2014, 92, 79–97.
  11. Salamí, E.; Barrado, C.; Pastor, E. UAV Flight Experiments Applied to the Remote Sensing of Vegetated Areas. Remote Sens. 2014, 6, 11051–11081.
  12. Nex, F.; Remondino, F. UAV for 3D Mapping Applications: A Review. Appl. Geomat. 2014, 6, 1–15.
  13. Pajares, G. Overview and Current Status of Remote Sensing Applications Based on Unmanned Aerial Vehicles (UAVs). Photogramm. Eng. Remote Sens. 2015, 81, 281–330.
  14. Pádua, L.; Vanko, J.; Hruška, J.; Adão, T.; Sousa, J.J.; Peres, E.; Morais, R. UAS, Sensors, and Data Processing in Agroforestry: A Review towards Practical Applications. Int. J. Remote Sens. 2017, 38, 2349–2391.
  15. Hunt, E.R., Jr.; Daughtry, C.S.T.; Walthall, C.L.; McMurtrey, J.E., III; Dulaney, W.P. Agricultural Remote Sensing Using Radio-Controlled Model Aircraft. In ASA Special Publications; American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America: Madison, WI, USA, 2015; pp. 197–205.
  16. Mogili, U.M.R.; Deepak, B.B.V.L. Review on Application of Drone Systems in Precision Agriculture. Procedia Comput. Sci. 2018, 133, 502–509.
  17. Freeman, P.K.; Freeland, R.S. Agricultural UAVs in the U.S.: Potential, Policy, and Hype. Remote Sens. Appl. Soc. Environ. 2015, 2, 35–43.
  18. Schut, A.G.T.; Traore, P.C.S.; Blaes, X.; de By, R.A. Assessing Yield and Fertilizer Response in Heterogeneous Smallholder Fields with UAVs and Satellites. Field Crops Res. 2018, 221, 98–107.
  19. Chang, A.; Jung, J.; Maeda, M.M.; Landivar, J. Crop Height Monitoring with Digital Imagery from Unmanned Aerial System (UAS). Comput. Electron. Agric. 2017, 141, 232–237.
  20. Pérez-Ortiz, M.; Peña, J.M.; Gutiérrez, P.A.; Torres-Sánchez, J.; Hervás-Martínez, C.; López-Granados, F. Selecting Patterns and Features for Between- and within-Crop-Row Weed Mapping Using UAV-Imagery. Expert Syst. Appl. 2016, 47, 85–94.
  21. Bendig, J.; Yu, K.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Gnyp, M.L.; Bareth, G. Combining UAV-Based Plant Height from Crop Surface Models, Visible, and Near Infrared Vegetation Indices for Biomass Monitoring in Barley. ITC J. 2015, 39, 79–87.
  22. Villa, T.F.; Gonzalez, F.; Miljievic, B.; Ristovski, Z.D.; Morawska, L. An Overview of Small Unmanned Aerial Vehicles for Air Quality Measurements: Present Applications and Future Prospectives. Sensors 2016, 16, 1072.
  23. Snaddon, J.; Petrokofsky, G.; Jepson, P.; Willis, K.J. Biodiversity Technologies: Tools as Change Agents. Biol. Lett. 2013, 9, 20121029.
  24. Koc, C. Design and Development of a Low-Cost UAV for Pesticide Applications. Gaziosmanpașa Üniv. Ziraat Fak. Derg. 2017, 34, 94–103.
  25. Gallacher, D. Drone Applications for Environmental Management in Urban Spaces: A Review. Int. J. Sustain. Land Use Urban Plan. 2017, 3.
  26. Rutkin, A. Blood Delivered by Drone. New Sci. 2016, 232, 24.
  27. Van de Voorde, P.; Gautama, S.; Momont, A.; Ionescu, C.M.; De Paepe, P.; Fraeyman, N. The Drone Ambulance [A-UAS]: Golden Bullet or Just a Blank? Resuscitation 2017, 116, 46–48.
  28. Rabta, B.; Wankmüller, C.; Reiner, G. A Drone Fleet Model for Last-Mile Distribution in Disaster Relief Operations. Int. J. Disaster Risk Reduct. 2018, 28, 107–112.
  29. Rusnák, M.; Sládek, J.; Kidová, A.; Lehotský, M. Template for High-Resolution River Landscape Mapping Using UAV Technology. Measurement 2018, 115, 139–151.
  30. Zeng, C.; Richardson, M.; King, D.J. The Impacts of Environmental Variables on Water Reflectance Measured Using a Lightweight Unmanned Aerial Vehicle (UAV)-Based Spectrometer System. ISPRS J. Photogramm. Remote Sens. 2017, 130, 217–230.
  31. Cook, K.L. An Evaluation of the Effectiveness of Low-Cost UAVs and Structure from Motion for Geomorphic Change Detection. Geomorphology 2017, 278, 195–208. [Google Scholar] [CrossRef]
  32. Koparan, C.; Koc, A.; Privette, C.; Sawyer, C. In Situ Water Quality Measurements Using an Unmanned Aerial Vehicle (UAV) System. Water 2018, 10, 264. [Google Scholar] [CrossRef] [Green Version]
  33. Koparan, C.; Koc, A.; Privette, C.; Sawyer, C.; Sharp, J. Evaluation of a UAV-Assisted Autonomous Water Sampling. Water 2018, 10, 655. [Google Scholar] [CrossRef] [Green Version]
  34. Laliberte, A.S.; Rango, A.; Herrick, J. Unmanned Aerial Vehicles for Rangeland Mapping and Monitoring: A Comparison of Two Systems. In Proceedings of the ASPRS Annual Conference, Tampa, FL, USA, 7–11 May 2007. [Google Scholar]
  35. Lucieer, A.; Malenovský, Z.; Veness, T.; Wallace, L. HyperUAS-Imaging Spectroscopy from a Multirotor Unmanned Aircraft System: HyperUAS-Imaging Spectroscopy from a Multirotor Unmanned. J. Field Robot. 2014, 31, 571–590. [Google Scholar] [CrossRef] [Green Version]
  36. Burkart, A.; Aasen, H.; Alonso, L.; Menz, G.; Bareth, G.; Rascher, U. Angular Dependency of Hyperspectral Measurements over Wheat Characterized by a Novel UAV Based Goniometer. Remote Sens. 2015, 7, 725–746. [Google Scholar] [CrossRef] [Green Version]
  37. Berni, J.; Zarco-Tejada, P.J.; Suarez, L.; Fereres, E. Thermal and Narrowband Multispectral Remote Sensing for Vegetation Monitoring from an Unmanned Aerial Vehicle. IEEE Trans. Geosci. Remote Sens. 2009, 47, 722–738. [Google Scholar] [CrossRef] [Green Version]
  38. Hardin, P.J.; Jensen, R.R. Small-Scale Unmanned Aerial Vehicles in Environmental Remote Sensing: Challenges and Opportunities. GIsci. Remote Sens. 2011, 48, 99–111. [Google Scholar] [CrossRef]
  39. Rupprecht, J. Drone Sprayers: Uses, Laws, & Money Saving Tips. 2021. Available online: https://jrupprechtlaw.com/drone-sprayer/ (accessed on 15 March 2021).
  40. Hardin, P.J.; Hardin, T.J. Small-Scale Remotely Piloted Vehicles in Environmental Research: Remotely Piloted Vehicles in Environmental Research. Geogr. Compass. 2010, 4, 1297–1311. [Google Scholar] [CrossRef]
  41. Hunt, E.R., Jr.; Hively, W.D.; Fujikawa, S.; Linden, D.; Daughtry, C.S.; McCarty, G. Acquisition of NIR-Green-Blue Digital Photographs from Unmanned Aircraft for Crop Monitoring. Remote Sens. 2010, 2, 290–305. [Google Scholar] [CrossRef] [Green Version]
  42. Jones, G.P., IV; Pearlstine, L.G.; Percival, H.F. An Assessment of Small Unmanned Aerial Vehicles for Wildlife Research. Wildl. Soc. Bull. 2006, 34, 750–758. [Google Scholar] [CrossRef]
  43. Gupta, S.G.; Ghonge, M.; Jawandhiya, P.M. Review of Unmanned Aircraft System (UAS). SSRN Electron. J. 2013. [Google Scholar] [CrossRef]
  44. Shakhatreh, H.; Sawalmeh, A.H.; Al-Fuqaha, A.; Dou, Z.; Almaita, E.; Khalil, I.; Othman, N.S.; Khreishah, A.; Guizani, M. Unmanned Aerial Vehicles (UAVs): A Survey on Civil Applications and Key Research Challenges. IEEE Access 2019, 7, 48572–48634. [Google Scholar] [CrossRef]
  45. Tsouros, D.C.; Bibi, S.; Sarigiannidis, P.G. A Review on UAV-Based Applications for Precision Agriculture. Information 2019, 10, 349. [Google Scholar] [CrossRef] [Green Version]
  46. Kim, K.; Davidson, J. Unmanned Aircraft Systems Used for Disaster Management. Transp. Res. Rec. 2015, 2532, 83–90. [Google Scholar] [CrossRef]
  47. Van Blyenburgh, P. RPAS Yearbook: Remotely Piloted Aircraft Systems: The Global Perspective 2013/2014; Technical Report; UVS International: Paris, France, 2013. [Google Scholar]
  48. Huang, Y.; Hoffmann, W.C.; Lan, Y.; Wu, W.; Fritz, B.K. Development of a spray system for an unmanned aerial vehicle platform. Appl. Eng. Agric. 2009, 25, 803–809. [Google Scholar] [CrossRef]
  49. Gatti, M.; Giulietti, F. Preliminary Design Analysis Methodology for Electric Multirotor. IFAC Proc. Vol. 2013, 46, 58–63. [Google Scholar] [CrossRef]
  50. Verbeke, J.; Hulens, D.; Ramon, H.; Goedeme, T.; De Schutter, J. The Design and Construction of a High Endurance Hexacopter Suited for Narrow Corridors. In 2014 International Conference on Unmanned Aircraft Systems (ICUAS); IEEE: New York, NY, USA, 2014. [Google Scholar]
  51. Hunt, E.R., Jr.; Cavigelli, M.; Daughtry, C.S.T.; Mcmurtrey, J.E., III; Walthall, C.L. Evaluation of Digital Photography from Model Aircraft for Remote Sensing of Crop Biomass and Nitrogen Status. Precis. Agric. 2005, 6, 359–378. [Google Scholar] [CrossRef]
  52. Bapst, R.; Ritz, R.; Meier, L.; Pollefeys, M. Design and Implementation of an Unmanned Tail-Sitter. In 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS); IEEE: New York, NY, USA, 2015. [Google Scholar]
  53. Saeed, A.S.; Younes, A.B.; Cai, C.; Cai, G. A Survey of Hybrid Unmanned Aerial Vehicles. Prog. Aerosp. Sci. 2018, 98, 91–105. [Google Scholar] [CrossRef]
  54. D’sa, R.; Jenson, D.; Henderson, T.; Kilian, J.; Schulz, B.; Calvert, M.; Heller, T.; Papanikolopoulos, N. SUAV: Q—An Improved Design for a Transformable Solar-Powered UAV. In 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS); IEEE: New York, NY, USA, 2016. [Google Scholar]
  55. Wang, J.; Ge, Y.; Heuvelink, G.B.M.; Zhou, C.; Brus, D. Effect of the Sampling Design of Ground Control Points on the Geometric Correction of Remotely Sensed Imagery. ITC J. 2012, 18, 91–100. [Google Scholar] [CrossRef]
  56. Mesas-Carrascosa, F.-J.; Torres-Sánchez, J.; Clavero-Rumbao, I.; García-Ferrer, A.; Peña, J.-M.; Borra-Serrano, I.; López-Granados, F. Assessing Optimal Flight Parameters for Generating Accurate Multispectral Orthomosaicks by UAV to Support Site-Specific Crop Management. Remote Sens. 2015, 7, 12793–12814. [Google Scholar] [CrossRef] [Green Version]
  57. Khanal, S.; Fulton, J.; Shearer, S. An Overview of Current and Potential Applications of Thermal Remote Sensing in Precision Agriculture. Comput. Electron. Agric. 2017, 139, 22–32. [Google Scholar] [CrossRef]
  58. What is Atmospheric Correction in Remote Sensing? Available online: https://gisgeography.com/atmospheric-correction/ (accessed on 3 January 2021).
  59. Frouin, R.J.; Franz, B.A.; Ibrahim, A.; Knobelspiesse, K.; Ahmad, Z.; Cairns, B.; Chowdhary, J.; Dierssen, H.M.; Tan, J.; Dubovik, O.; et al. Atmospheric Correction of Satellite Ocean-Color Imagery during the PACE Era. Front. Earth Sci. 2019, 7. [Google Scholar] [CrossRef] [Green Version]
  60. Agapiou, A.; Hadjimitsis, D.G.; Papoutsa, C.; Alexakis, D.D.; Papadavid, G. The Importance of Accounting for Atmospheric Effects in the Application of NDVI and Interpretation of Satellite Imagery Supporting Archaeological Research: The Case Studies of Palaepaphos and Nea Paphos Sites in Cyprus. Remote Sens. 2011, 3, 2605–2629. [Google Scholar] [CrossRef] [Green Version]
  61. Che, N.; Price, J.C. Survey of Radiometric Calibration Results and Methods for Visible and near Infrared Channels of NOAA-7, -9, and -11 AVHRRs. Remote Sens. Environ. 1992, 41, 19–27. [Google Scholar] [CrossRef]
  62. Lu, D.; Mausel, P.; Brondizio, E.; Moran, E. Assessment of Atmospheric Correction Methods for Landsat TM Data Applicable to Amazon Basin LBA Research. Int. J. Remote Sens. 2002, 23, 2651–2671. [Google Scholar] [CrossRef]
  63. Martínez, J.; Egea, G.; Agüera, J.; Pérez-Ruiz, M. A Cost-Effective Canopy Temperature Measurement System for Precision Agriculture: A Case Study on Sugar Beet. Precis. Agric. 2017, 18, 95–110. [Google Scholar] [CrossRef]
  64. Yu, X.; Liu, Q.; Liu, X.; Liu, X.; Wang, Y. A Physical-Based Atmospheric Correction Algorithm of Unmanned Aerial Vehicles Images and Its Utility Analysis. Int. J. Remote Sens. 2017, 38, 3101–3112. [Google Scholar] [CrossRef]
  65. Sox, L.; Wickwar, V.B.; Herron, J.P.; Barton, D.L.; Emerick, M.T. Ground-Based Observations with a Rayleigh-Mie-Raman Lidar from 15–120 km. In Fall National Space Grant Meeting, Charleston, SC, USA, October 2013; Graduate Student Posters, Paper 22. Available online: https://digitalcommons.usu.edu/graduate_posters/22 (accessed on 3 January 2021).
  66. Moro, G.D.; Halounova, L. Haze Removal for High-resolution Satellite Data: A Case Study. Int. J. Remote Sens. 2007, 28, 2187–2205. [Google Scholar] [CrossRef]
  67. Berk, A.; Bernstein, L.S.; Anderson, G.P.; Acharya, P.K.; Robertson, D.C.; Chetwynd, J.H.; Adler-Golden, S.M. MODTRAN Cloud and Multiple Scattering Upgrades with Application to AVIRIS. Remote Sens. Environ. 1998, 65, 367–375. [Google Scholar] [CrossRef]
  68. Zhang, L.; Yang, L.; Lin, H.; Liao, M. Automatic Relative Radiometric Normalization Using Iteratively Weighted Least Square Regression. Int. J. Remote Sens. 2008, 29, 459–470. [Google Scholar] [CrossRef]
  69. Chrysoulakis, N.; Diamandakis, M.; Prastacos, P. GIS based estimation and mapping of local level daily irradiation on inclined surfaces. In Proceedings of the 7th AGILE Conference on Geographic Information Science, Heraklion, Greece, 29 April–1 May 2004. [Google Scholar]
  70. Song, C.; Woodcock, C.E.; Seto, K.C.; Lenney, M.P.; Macomber, S.A. Classification and change detection using Landsat TM data: When and how to correct atmospheric effects? Remote Sens. Environ. 2001, 75, 230–244. [Google Scholar] [CrossRef]
  71. Chavez, P.S., Jr. An Improved Dark-Object Subtraction Technique for Atmospheric Scattering Correction of Multispectral Data. Remote Sens. Environ. 1988, 24, 459–479. [Google Scholar] [CrossRef]
  72. Alvarez, F.; Catanzarite, T.; Rodríguez-Pérez, J.R.; Nafría, D. Radiometric calibration and evaluation of UltraCamX and XP using portable reflectance targets and spectrometer data. Application to extract thematic data from imagery gathered by the national plan of aerial orthophotography (PNOA). In Proceedings of the International Calibration and Orientation Workshop EuroCOW, Castelldefels, Spain, 2010; Available online: https://www.isprs.org/proceedings/xxxviii/eurocow2010/euroCOW2010_files/papers/02.pdf (accessed on 21 March 2021).
  73. Hunt, E.; Walthall, C.L.; Daughtry, C.; Cavigelli, M.; Fujikawa, S.; Yoel, D.; Ng, T.; Tranchitella, M. High-resolution multispectral digital photography using unmanned airborne vehicles. In Biennial Workshop on Aerial Photography, Videography, and High Resolution Digital Imagery for Resource Assessment Proceedings; United States Department of Agriculture, Agricultural Research Service: Weslaco, TX, USA, 2006. [Google Scholar]
  74. González-Piqueras, J.; Hernández, D.; Felipe, B.; Odi, M.; Belmar, S.; Villa, G.; Domenech, E. Radiometric aerial triangulation approach. A case study for the Z/I DMC. In Proceedings of the EuroCOW; Castelldefels, Spain, 2010; Available online: https://www.isprs.org/proceedings/xxxviii/eurocow2010/euroCOW2010_files/papers/11.pdf (accessed on 21 March 2021).
  75. Hernández López, D.; Felipe García, B.; González Piqueras, J.; Alcázar, G.V. An Approach to the Radiometric Aerotriangulation of Photogrammetric Images. ISPRS J. Photogramm. Remote Sens. 2011, 66, 883–893. [Google Scholar] [CrossRef]
  76. Collings, S.; Caccetta, P.; Campbell, N.; Wu, X. Empirical Models for Radiometric Calibration of Digital Aerial Frame Mosaics. IEEE Trans. Geosci. Remote Sens. 2011, 49, 2573–2588. [Google Scholar] [CrossRef]
  77. Collings, S.; Caccetta, P. Radiometric Calibration of Very Large Digital Aerial Frame Mosaics. Int. J. Image Data Fusion 2013, 4, 214–229. [Google Scholar] [CrossRef]
  78. Yang, G.; Li, C.; Wang, Y.; Yuan, H.; Feng, H.; Xu, B.; Yang, X. The DOM Generation and Precise Radiometric Calibration of a UAV-Mounted Miniature Snapshot Hyperspectral Imager. Remote Sens. 2017, 9, 642. [Google Scholar] [CrossRef] [Green Version]
  79. Johansen, K.; Raharjo, T.; McCabe, M. Using Multi-Spectral UAV Imagery to Extract Tree Crop Structural Properties and Assess Pruning Effects. Remote Sens. 2018, 10, 854. [Google Scholar] [CrossRef] [Green Version]
  80. Deng, L.; Yan, Y.; Gong, H.; Duan, F.; Zhong, R. The Effect of Spatial Resolution on Radiometric and Geometric Performances of a UAV-Mounted Hyperspectral 2D Imager. ISPRS J. Photogramm. Remote Sens. 2018, 144, 298–314. [Google Scholar] [CrossRef]
  81. Xiang, H.; Tian, L. Development of a Low-Cost Agricultural Remote Sensing System Based on an Autonomous Unmanned Aerial Vehicle (UAV). Biosyst. Eng. 2011, 108, 174–190. [Google Scholar] [CrossRef]
  82. Kallimani, C.; Heidarian, R.; van Evert, F.K.; Rijk, B.; Kooistra, L. UAV-Based Multispectral & Thermal Dataset for Exploring the Diurnal Variability, Radiometric & Geometric Accuracy for Precision Agriculture. Open Data J. Agric. Res. 2020, 6, 1–7. [Google Scholar]
  83. Rocchini, D.; Di Rita, A. Relief Effects on Aerial Photos Geometric Correction. Appl. Geogr. 2005, 25, 159–168. [Google Scholar] [CrossRef]
  84. Price, R.R.; Alli, P. Development of a System to Automatically Geo-Rectify Images and Allow Quick Transformation into a Prescription Map. In Proceedings of the 2005 ASAE Annual Meeting, Tampa, FL, USA, 17–20 July 2005; American Society of Agricultural and Biological Engineers: St. Joseph, MI, USA, 2005. [Google Scholar]
  85. Xiang, H.; Tian, L. Method for Automatic Georeferencing Aerial Remote Sensing (RS) Images from an Unmanned Aerial Vehicle (UAV) Platform. Biosyst. Eng. 2011, 108, 104–113. [Google Scholar] [CrossRef]
  86. Mazur, M. Six Ways Drones Are Revolutionizing Agriculture. Available online: https://www.technologyreview.com/2016/07/20/158748/six-ways-drones-are-revolutionizing-agriculture/ (accessed on 20 July 2016).
  87. Xue, X.; Lan, Y.; Sun, Z.; Chang, C.; Hoffmann, W.C. Develop an Unmanned Aerial Vehicle Based Automatic Aerial Spraying System. Comput. Electron. Agric. 2016, 128, 58–66. [Google Scholar] [CrossRef]
  88. Antonelli, J. 2016: Most Section 333s Just $1500. Available online: https://dronelawsblog.com/2016-most-section-333s-just-1500/ (accessed on 15 January 2021).
  89. Regulations.Gov Beta. Available online: https://beta.regulations.gov/search?agencyIds=FAA&documentTypes=Notice&filter=333 (accessed on 15 January 2021).
  90. Petty, R.V. Drone Use in Aerial Pesticide Application Faces Outdated Regulatory Hurdles. Harv. J. Law Technol. Dig. 2018. Available online: https://jolt.law.harvard.edu/digest/drone-use-pesticide-application (accessed on 16 January 2018).
  91. Meng, J.H.; Wu, B.F. Study on the Crop Condition Monitoring Methods with Remote Sensing; The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences: Beijing, China, 2008; pp. 945–950. [Google Scholar]
  92. Krishna, K.R. Agricultural Drones: A Peaceful Pursuit; Apple Academic Press: Oakville, ON, Canada, 2018. [Google Scholar]
  93. Khanal, S.; Kc, K.; Fulton, J.P.; Shearer, S.; Ozkan, E. Remote Sensing in Agriculture—Accomplishments, Limitations, and Opportunities. Remote Sens. 2020, 12, 3783. [Google Scholar] [CrossRef]
  94. Rembold, F.; Atzberger, C.; Savin, I.; Rojas, O. Using Low Resolution Satellite Imagery for Yield Prediction and Yield Anomaly Detection. Remote Sens. 2013, 5, 1704–1733. [Google Scholar] [CrossRef] [Green Version]
  95. Rembold, F.; Korpi, K.; Rojas, O. Guidelines for Using Remote Sensing Derived Information in Support of the IPC Analysis; Ispra, Italy, 2011; Available online: http://www.ipcinfo.org/fileadmin/user_upload/ipcinfo/docs/1_RemoteSensedData_IPC_JRC_guidelines.pdf (accessed on 21 March 2021).
  96. Diker, K.; Heermann, D.F.; Brodahl, M.K. Frequency Analysis of Yield for Delineating Yield Response Zones. Precis. Agric. 2004, 5, 435–444. [Google Scholar] [CrossRef]
  97. DroneMapper. Available online: https://dronemapper.com/ (accessed on 23 February 2021).
  98. Chaerle, L.; Van Der Straeten, D. Imaging Techniques and the Early Detection of Plant Stress. Trends Plant Sci. 2000, 5, 495–501. [Google Scholar] [CrossRef]
  99. Grassi, M. 5 Actual Uses for Drones in Precision Agriculture Today. Available online: https://dronelife.com/2014/12/30/5-actual-uses-drones-precision-agriculture-today/ (accessed on 24 February 2021).
  100. Zaman-Allah, M.; Vergara, O.; Araus, J.L.; Tarekegne, A.; Magorokosho, C.; Zarco-Tejada, P.J.; Hornero, A.; Hernández Albà, A.; Das, B.; Craufurd, P.; et al. Unmanned Aerial Platform-Based Multi-Spectral Imaging for Field Phenotyping of Maize. Plant Methods 2015, 11, 35. Available online: https://plantmethods.biomedcentral.com/articles/10.1186/s13007-015-0078-2/ (accessed on 24 February 2021). [CrossRef] [Green Version]
  101. Chakravorty, S. Oil Field Drones: Monitoring Oil Pipelines with Drones; Angel Publishing: Baltimore, MD, USA, 2015; p. 275. [Google Scholar]
  102. Al-Saddik, H.; Simon, J.C.; Cointault, F. Development of Spectral Disease Indices for ‘Flavescence Dorée’ Grapevine Disease Identification. Sensors 2017, 17, 2772. [Google Scholar] [CrossRef] [Green Version]
  103. Calderón, R.; Navas-Cortés, J.A.; Lucena, C.; Zarco-Tejada, P.J. High-Resolution Airborne Hyperspectral and Thermal Imagery for Early Detection of Verticillium Wilt of Olive Using Fluorescence, Temperature and Narrow-Band Spectral Indices. Remote Sens. Environ. 2013, 139, 231–245. [Google Scholar] [CrossRef]
  104. López-López, M.; Calderón, R.; González-Dugo, V.; Zarco-Tejada, P.; Fereres, E. Early Detection and Quantification of Almond Red Leaf Blotch Using High-Resolution Hyperspectral and Thermal Imagery. Remote Sens. 2016, 8, 276. [Google Scholar] [CrossRef] [Green Version]
  105. Calderón, R.; Montes-Borrego, M.; Landa, B.B.; Navas-Cortés, J.A.; Zarco-Tejada, P.J. Detection of Downy Mildew of Opium Poppy Using High-Resolution Multi-Spectral and Thermal Imagery Acquired with an Unmanned Aerial Vehicle. Precis. Agric. 2014, 15, 639–661. [Google Scholar] [CrossRef]
  106. López-Granados, F.; Torres-Sánchez, J.; Serrano-Pérez, A.; de Castro, A.I.; Mesas-Carrascosa, F.-J.; Peña, J.-M. Early Season Weed Mapping in Sunflower Using UAV Technology: Variability of Herbicide Treatment Maps against Weed Thresholds. Precis. Agric. 2016, 17, 183–199. [Google Scholar] [CrossRef]
  107. Torres-Sánchez, J.; López-Granados, F.; de Castro, A.I.; Peña-Barragán, J.M. Configuration and Specifications of an Unmanned Aerial Vehicle (UAV) for Early Site Specific Weed Management. PLoS ONE 2013, 8, e58210. [Google Scholar] [CrossRef] [Green Version]
  108. Tamouridou, A.A.; Alexandridis, T.K.; Pantazi, X.E.; Lagopodi, A.L.; Kashefi, J.; Moshou, D. Evaluation of UAV Imagery for Mapping Silybum Marianum Weed Patches. Int. J. Remote Sens. 2017, 38, 2246–2259. [Google Scholar] [CrossRef]
  109. Rasmussen, J.; Nielsen, J.; Streibig, J.C.; Jensen, J.E.; Pedersen, K.S.; Olsen, S.I. Pre-Harvest Weed Mapping of Cirsium Arvense in Wheat and Barley with off-the-Shelf UAVs. Precis. Agric. 2019, 20, 983–999. [Google Scholar] [CrossRef]
  110. López-Granados, F. Weed Detection for Site-Specific Weed Management: Mapping and Real-Time Approaches: Weed Detection for Site-Specific Weed Management. Weed Res. 2011, 51, 1–11. [Google Scholar] [CrossRef] [Green Version]
  111. Peña, J.M.; Torres-Sánchez, J.; Serrano-Pérez, A.; de Castro, A.I.; López-Granados, F. Quantifying Efficacy and Limits of Unmanned Aerial Vehicle (UAV) Technology for Weed Seedling Detection as Affected by Sensor Resolution. Sensors 2015, 15, 5609–5626. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  112. Torres-Sánchez, J.; Peña, J.M.; de Castro, A.I.; López-Granados, F. Multi-Temporal Mapping of the Vegetation Fraction in Early-Season Wheat Fields Using Images from UAV. Comput. Electron. Agric. 2014, 103, 104–113. [Google Scholar] [CrossRef]
  113. Drone-Based Biocontrol Services to Protect your Crops • UAV-IQ. Available online: https://www.uaviq.com/en/biocontrol/ (accessed on 19 January 2021).
  114. He, Y.; Weng, Q. (Eds.) High Spatial Resolution Remote Sensing: Data, Analysis, and Applications; Productivity Press: New York, NY, USA, 2018. [Google Scholar]
  115. Caturegli, L.; Corniglia, M.; Gaetani, M.; Grossi, N.; Magni, S.; Migliazzi, M.; Angelini, L.; Mazzoncini, M.; Silvestri, N.; Fontanelli, M.; et al. Unmanned Aerial Vehicle to Estimate Nitrogen Status of Turfgrasses. PLoS ONE 2016, 11, e0158268. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  116. Drone Technology Tracks Water Needs. Available online: https://www.ksre.k-state.edu/reports/water/2018/08/crops-drones.html (accessed on 24 February 2021).
  117. Torres-Rua, A.; McKee, M. Technologies Will Tackle Irrigation Inefficiencies in Agriculture’s Drier Future. The Conversation. Available online: https://theconversation.com/technologies-will-tackle-irrigation-inefficiencies-in-agricultures-drier-future-40601 (accessed on 27 May 2015).
  118. Unay Gailhard, İ.; Bavorová, M.; Pirscher, F. Adoption of Agri-Environmental Measures by Organic Farmers: The Role of Interpersonal Communication. J. Agric. Educ. Ext. 2015, 21, 127–148. [Google Scholar] [CrossRef]
  119. Pierpaoli, E.; Carli, G.; Pignatti, E.; Canavari, M. Drivers of Precision Agriculture Technologies Adoption: A Literature Review. Procedia Technol. 2013, 8, 61–69. [Google Scholar] [CrossRef] [Green Version]
  120. Zheng, S.; Wang, Z.; Wachenheim, C.J. Technology Adoption among Farmers in Jilin Province, China: The Case of Aerial Pesticide Application. China Agric. Econ. Rev. 2019, 11, 206–216. [Google Scholar] [CrossRef]
  121. Kernecker, M.; Knierim, A.; Wurbs, A.; Kraus, T.; Borges, F. Experience versus Expectation: Farmers’ Perceptions of Smart Farming Technologies for Cropping Systems across Europe. Precis. Agric. 2020, 21, 34–50. [Google Scholar] [CrossRef]
  122. Adrian, A.M.; Norwood, S.H.; Mask, P.L. Producers’ Perceptions and Attitudes toward Precision Agriculture Technologies. Comput. Electron. Agric. 2005, 48, 256–271. [Google Scholar] [CrossRef]
  123. As Farmers Grow Drone Use, Privacy Issues Top List of Concerns. Available online: https://www.munichre.com/us-non-life/en/company/media-relations/press-releases/2018/2018-07-17-farmers-grow-drone-use.html (accessed on 10 March 2021).
Figure 1. UAVs’ types that are used in precision agriculture; (a) six rotary-winged, (b) large fixed-winged, (c) VTOL (Vertical Take-Off and Landing), (d) four rotary-winged, (e) mini fixed-winged, and (f) large VTOL.
Figure 1. UAVs’ types that are used in precision agriculture; (a) six rotary-winged, (b) large fixed-winged, (c) VTOL (Vertical Take-Off and Landing), (d) four rotary-winged, (e) mini fixed-winged, and (f) large VTOL.
Figure 2. A typical launcher for fixed-wing UAVs used in agriculture.
Table 1. Review studies on UAVs and their main focus areas.
Reference | Discussed Points
Zhang and Kovacs, 2012 [7]
  • Limitations of UAVs in Precision Agriculture (PA) applications
  • Commonly used cameras and image processing techniques
  • Farmers' likely interest in UAV adoption
  • Aviation regulations
Watts et al., 2012 [8]
  • Classification and characterization of UAV Platforms
  • Payloads and logistics requirements
  • UAV regulations
Huang et al., 2013 [9]
  • UAV applications over agricultural lands
  • Limitations of current agricultural UAVs
Colomina and Molina, 2014 [10]
  • Crucial high-level components of unmanned aircraft
  • Regulatory and data processing issues
  • UAV photogrammetry and RS applications
Salamí et al., 2014 [11]
  • Vegetation monitoring
  • Data processing
  • Payloads and flight characterization
Nex and Remondino, 2014 [12]
  • UAV applications in archeological site 3D mapping
  • Historical framework and regulations
  • UAV data acquisition and processing systems
Pajares, 2015 [13]
  • Unmanned aerial platforms, sensors, and technologies
  • Collaboration and cooperation of a fleet of UAVs
  • Several applications of UAVs
Pádua et al., 2017 [14]
  • UAV’s main characteristics and sensors
  • Data processing
  • Application recommendation towards UAVs platform selection
Hunt and Daughtry, 2018 [15]
  • Overview of precision agriculture and RS
  • Available sensors
  • Cost
Mogili and Deepak, 2018 [16]
  • UAV models and types
  • Different controller methodologies
  • Hardware components
  • Crop monitoring
  • Sprinkling system
Table 2. Technical comparison of common RS platforms used in precision agriculture.
Specification | Ground-Based | Satellite | Manned Aircraft | UAVs
Cost | Low | Highest | High | Lowest
Operating environment | Indoor/outdoor | Outdoor | Outdoor | Indoor/outdoor
Time requirement | Long | Shortest | Short | Short
Labor intensity | Highest | Low | High | Medium
Operational risk | Low | Moderate | High | Low
Trained pilot requirement | No | No | Yes | No
Automatic crop spraying | No | No | No | Yes
Spatial resolution | Highest | Low | Moderate | Highest
Spatial accuracy | Moderate | Low | High | High
Temporal advantage | No | No | No | Yes
Adaptability | Low | Low | Low | High
Maneuverability | Limited | Limited | Moderate | High
Deployability | Moderate | Difficult | Complex | Easy
Susceptibility to weather | Yes | Yes | Yes | No
Repeatability rate | Minutes | Days | Hours | Minutes
Feasibility for small areas | Yes | No | No | Yes
Autonomy and sociability | Low | Low | Low | High
Real-time data availability | No | No | Yes | Yes
Limited to specific hours | No | Yes | No | No
Running at low altitude | No | No | No | Yes
Ground coverage | Smallest | Large | Medium | Small
Observation range | Local | Worldwide | Regional | Local
Operational complexity | Simple | Complex | Complex | Simplest
Table 3. Frequently used UAV types and their technical characteristics.
Characteristics | Fixed-Wing | Rotary-Wing | Hybrid
Aerodynamic lift system | Predefined airfoil of static, fixed wings. | Several rotors with propellers pointing upwards to generate propulsive thrust. | Takes off vertically like a rotary system and flies like a fixed-wing.
Control system | Ailerons (to roll), elevators (to pitch), and rudder (to yaw) attached to the wings and tail. | Torque and thrust of the rotors, which control its movement (yaw, pitch, roll, and throttle). | Three different controllers, for horizontal, vertical, and transition modes.
Control complexity | Complex | Simple | Most complex
Flight system | Simple | Complex | Complex
Energy efficiency | Higher | Lower | Higher
Architecture and maintenance | Simple | Complex | Complex
Requires space for takeoff and landing | Yes | No | No
Hovering capability | No | Yes | Yes
High-speed flight capability | Yes | No | Yes
Turning-radius restriction when changing direction | Yes | No | Yes
Minimum/maximum flight-angle restrictions during landing and takeoff | Yes | No | No
Propeller motion affects images | No | Yes | No
Requires forward airspeed | Yes | No | Yes
Table 4. Factors that affect image quality under UAV operational conditions.
Factor | Caused Problems
Platform tilt, roll, and yaw motion | Image blur / geometric distortion
Platform vibration | Image blur / sensor damage
Changing flight speed | Inadequate image overlap
Changing flight altitude | Uneven resolution
Inadequate image overlap/sidelap | Orthorectification issues / geometric distortion
Changing illumination during flight | Spectral mismatch
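The overlap and resolution problems in Table 4 follow directly from flight geometry. The sketch below shows how altitude sets the ground sample distance (GSD) and how flight speed and camera trigger interval set forward overlap; the lens focal length, sensor size, and flight values are illustrative assumptions, not taken from the reviewed studies.

```python
def ground_sample_distance(altitude_m, focal_mm, sensor_width_mm, image_width_px):
    """Ground sample distance (m/pixel): ground footprint width / image width."""
    footprint_m = altitude_m * sensor_width_mm / focal_mm
    return footprint_m / image_width_px

def forward_overlap(altitude_m, focal_mm, sensor_height_mm, speed_m_s, trigger_s):
    """Fraction of the along-track footprint shared by consecutive frames."""
    footprint_m = altitude_m * sensor_height_mm / focal_mm
    advance_m = speed_m_s * trigger_s          # ground distance between shots
    return max(0.0, 1.0 - advance_m / footprint_m)

# Assumed example: 24 mm lens, 13.2 x 8.8 mm sensor, image 5472 pixels wide.
gsd = ground_sample_distance(120, 24, 13.2, 5472)   # ~1.2 cm/pixel at 120 m
olap = forward_overlap(120, 24, 8.8, 10, 2)         # ~55% at 10 m/s, 2 s interval
print(f"GSD: {gsd * 100:.1f} cm/px, forward overlap: {olap:.0%}")
```

Flying faster without shortening the trigger interval shrinks the overlap, which is exactly the inadequate-overlap failure listed in the table.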
Table 5. The most desired sensors for various PA operations and their applications.
UAV Payload | Description | Common Applications
RGB camera | Limited to the visible bands; provides red, green, and blue information.
  • Monitoring plant outer defects, greenness, and growth
  • Calculating a range of vegetation indices
  • Creating high-resolution digital elevation models (DEMs)
  • Producing vegetation height maps
Multispectral camera | Cameras with bandpass interference filters, typically five bands: red, green, blue, red-edge, and near-infrared.
  • Monitoring and mapping crop diseases and weeds
  • Estimating the vegetation state
  • Detecting nutrient deficiency
Hyperspectral camera | More spectral bands than a multispectral camera, sometimes as many as 2000.
  • Distinguishing plant species with similar spectral signatures
  • Identifying plant biochemical composition
  • Quantitative soil and vegetation analysis
  • Calculating chemical attributes
Thermal camera | Detects infrared radiation to form a heat-zone image, operating at wavelengths up to approximately 14,000 nm.
  • Evaluating water stress and assessing irrigation uniformity
  • Calculating vegetation indices
Consumer-grade cameras | Typically use a Bayer color filter mosaic to obtain true-color RGB images with a single sensor and provide only three visible bands. | Crop identification and pest detection
LIDAR sensors | Emit rapid laser pulses to map the Earth's surface.
  • Creating high-resolution digital surface, terrain, and elevation models
  • Measuring canopy height, coverage, tree density, and the location and height of individual trees
IMU, GPS, magnetometer | - | Localization of the UAV
Chemical sensors | - | Identifying chemical compositions and specific organic substances
Biological sensors | - | Identifying various kinds of microorganisms
Meteorological sensors | - | Measuring values such as wind speed, temperature, and humidity
Spraying system or similar payloads | - | Delivering specific objects or substances to specific destinations
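Several payloads in Table 5 are listed for calculating vegetation indices. As a concrete instance, a minimal NDVI (Normalized Difference Vegetation Index) computation from red and near-infrared reflectance bands; the array values are invented for illustration.

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """NDVI = (NIR - Red) / (NIR + Red); eps guards against divide-by-zero."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# Illustrative reflectance values: vegetated pixels reflect strongly in NIR.
red = np.array([[0.10, 0.30], [0.05, 0.20]])
nir = np.array([[0.50, 0.35], [0.45, 0.25]])
print(ndvi(nir, red))  # values near +1 indicate dense, healthy vegetation
```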
Table 6. Studies and methods applied for atmospheric correction in RS.
Reference | Atmospheric Correction Method
Moro and Halounova, 2007 [66] | Image-based atmospheric corrections as a subdivision of two-dimensional absolute corrections
Berk et al., 1998 [67] | Atmospheric optical conditions to calculate radiative transfer, as a subdivision of two-dimensional absolute corrections
Zhang et al., 2008 [68] | Two-dimensional relative corrections
Chrysoulakis et al., 2004 [69] | Three-dimensional corrections
Song et al., 2001 [70] | Darkest-object principle and relative atmospheric correction
Yu et al., 2017 [64] | Physical-based atmospheric correction algorithm
Chavez, 1988 [71] | Dark object subtraction (DOS) method
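The dark object subtraction (DOS) method in the last row [71] can be sketched in a few lines: the darkest pixels in a band are assumed to have near-zero reflectance, so any nonzero minimum is read as additive path radiance (haze) and subtracted. The percentile cutoff and band values below are illustrative assumptions.

```python
import numpy as np

def dark_object_subtraction(band, percentile=1.0):
    """Estimate the dark-object (haze) value from a low percentile and subtract it."""
    haze = np.percentile(band, percentile)
    return np.clip(band - haze, 0.0, None)   # corrected values stay non-negative

band = np.array([[52.0, 60.0, 75.0],
                 [50.0, 90.0, 120.0],
                 [55.0, 200.0, 180.0]])
corrected = dark_object_subtraction(band)
print(corrected.min())  # darkest pixels are pulled to (near) zero
```

A low percentile is used instead of the absolute minimum so that a single noisy dark pixel does not set the haze estimate.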
Table 7. Radiometric correction methods proposed in research studies.
Reference: Radiometric calibration method
Alverez et al., 2010 [72]: Calibration to ground reflectance using targets constructed from shade vinyl
Hunt et al., 2005 [73]: Five colored tarpaulins used to test the camera's spectral calibration
González-Piqueras et al., 2010 [74]: Bidirectional reflectance distribution function (BRDF) techniques
López et al., 2011 [75]: Calibration based on radiative transfer modeling of atmospheric effects, combined with kernel-based models for BRDF effects
Collings et al., 2011 [76]: Adjustment for potentially variable atmospheric and sensor acquisition parameters, combined with kernel-based models for BRDF effects
Collings and Caccetta, 2013 [77]: Large-scale acquisition calibrated to ground reflectance, based on measured calibration targets, for hyperspectral calibration
Yang et al., 2017 [78]: Radiometric calibration from cube (one spectral and two spatial dimensions) hyperspectral snapshots
Johansen et al., 2018 [79]: Empirical method that fits the relationship between the known reflectance of in-scene targets and their measured DN; the best-fit equation converts all DN in the image to at-surface reflectance values
Tu et al., 2018 [80]: Direct generation of an orthomosaicked reflectance image without an empirical radiometric correction
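The empirical line approach described by Johansen et al. [79] fits a linear relationship between the measured DN of in-scene calibration targets and their known reflectance, then applies that best-fit equation to every pixel. A minimal sketch with hypothetical target values (bright, grey, and dark tarps):

```python
import numpy as np

def empirical_line(dn_image, target_dn, target_reflectance):
    """Fit reflectance = gain * DN + offset from calibration targets,
    then convert a whole DN image to at-surface reflectance."""
    gain, offset = np.polyfit(target_dn, target_reflectance, 1)
    return gain * dn_image.astype(float) + offset

# Hypothetical tarps with lab-measured reflectance and in-image mean DN.
target_dn = np.array([30.0, 120.0, 210.0])
target_reflectance = np.array([0.05, 0.30, 0.55])
image = np.array([[30.0, 120.0], [210.0, 75.0]])
print(empirical_line(image, target_dn, target_reflectance))
```

In practice one fit is computed per spectral band, since gain and offset differ across bands.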
Table 8. Classification of internal and external factors that cause geometric distortion in image data.
Internal factors:
  • Lens distortion
  • Misalignment of detectors
  • Variations of sampling rate
External factors:
  • Variations of altitude
  • Position of the platform
  • Earth curvature, perspective, and geometry
Table 9. Selected studies of geometric correction methods.
Reference: Geometric correction method
Xiang and Tian, 2011 [81]: Manual georeferencing using ground-collected GCPs and photo matching, and automatic georeferencing using navigation data together with a camera lens distortion model
Kallimani et al., 2020 [82]: GCPs marked on the images manually
Rocchini and Rita, 2005 [83]: Polynomial functions to rectify the aerial image
Price and Alli, 2005 [84]: Images georectified using GP
Xiang and Tian, 2011 [85]: Automatic georeferencing with an on-board navigation system and camera lens distortion model
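Several of the methods above rectify imagery by fitting a polynomial transform to ground control points (GCPs). A minimal sketch of the first-order (affine) case, solving for the six coefficients by least squares; the GCP coordinate pairs below are hypothetical:

```python
import numpy as np

def fit_affine(pixel_xy, map_xy):
    """Least-squares first-order polynomial (affine) georeferencing:
    map_x = a0 + a1*col + a2*row;  map_y = b0 + b1*col + b2*row."""
    px = np.asarray(pixel_xy, float)
    A = np.column_stack([np.ones(len(px)), px[:, 0], px[:, 1]])
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(map_xy, float), rcond=None)
    return coeffs  # shape (3, 2): column 0 -> map_x, column 1 -> map_y

# Hypothetical GCPs: (col, row) image coordinates and (easting, northing).
pixel = [(0, 0), (100, 0), (0, 100), (100, 100)]
world = [(500000.0, 4000000.0), (500050.0, 4000000.0),
         (500000.0, 3999950.0), (500050.0, 3999950.0)]
c = fit_affine(pixel, world)
col, row = 50, 50
easting = c[0, 0] + c[1, 0] * col + c[2, 0] * row
northing = c[0, 1] + c[1, 1] * col + c[2, 1] * row
print(easting, northing)  # center of the four GCPs
```

Higher-order polynomial rectification, as in Rocchini and Rita [83], adds squared and cross terms to the design matrix and requires correspondingly more GCPs.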
Table 10. Unresolved challenges of UAV flights and some suggestions to address them.
Problems encountered and suggested solutions:
Imagery during daylight hours suffers from shadows and reflections, and the camera only sees fruits that are not hidden by leaves
  • Control the illumination
  • Backscatter X-ray imaging
Large amounts of pesticide and fertilizer drift outside the targeted area during spraying
  (no solution suggested)
Camera detectors' saturation and susceptibility to vignetting
  • Design cameras specifically for UAV applications
  • Micro four-thirds cameras with fixed, interchangeable lenses instead of retractable lenses
Vulnerability to weather conditions
  • Develop robust and stable UAV platforms that can change their functional characteristics (e.g., velocity and altitude) appropriately
Limited battery time of UAVs, especially for large fields
  • Develop additional solar-powered mechanisms
  • Adopt a collaborative group of UAVs
  • Develop wireless charging systems
  • Develop energy-efficient UAVs
Band-to-band offsets in multispectral and hyperspectral cameras due to multiple lenses
  • Improve radiometric, atmospheric, and geometric correction methods
Poor resolution of thermal cameras and their vulnerability to atmospheric moisture, shooting distance, and other sources of emitted and reflected thermal radiation
  • Careful calibration of aerial sensors
Downward wind force created by rotary-wing propellers in extremely close flight affects crop positions
  • Miniaturize the sensors and UAV
Variability in the strategies, methodologies, and sensors adopted for each specific application
  • Unify principles in UAV-based studies
High demand on data storage and processing capacity at high spatial resolution
  (no solution suggested)
Inconsistent ground sampling distance
  • Implement 3D flight paths that follow the surface
  • Increase flight height
Proper camera location information needed to correct aerial images
  • Combine structure from motion (SfM) with ground control points (GCPs) to generate precise digital terrain models (DTMs)
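The inconsistent ground sampling distance (GSD) noted above follows directly from the imaging geometry: GSD = (pixel pitch x flight height) / focal length, so it varies wherever the terrain rises or falls under a constant-altitude flight line. A small sketch with hypothetical camera parameters:

```python
def ground_sampling_distance(pixel_pitch_m, flight_height_m, focal_length_m):
    """GSD in metres per pixel, from the pinhole-camera similar-triangle relation."""
    return pixel_pitch_m * flight_height_m / focal_length_m

# Hypothetical camera: 3.75 um pixel pitch, 8 mm lens.
pitch, focal = 3.75e-6, 8e-3
# Height above ground varies as terrain changes under a fixed flight altitude.
for height in (60.0, 80.0, 120.0):
    gsd_cm = ground_sampling_distance(pitch, height, focal) * 100
    print(f"{height:5.0f} m AGL -> GSD {gsd_cm:.2f} cm")
```

This is why terrain-following 3D flight paths keep GSD consistent: they hold the height above ground, and therefore the ratio in the formula, approximately constant.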
Citation: Delavarpour, N.; Koparan, C.; Nowatzki, J.; Bajwa, S.; Sun, X. A Technical Study on UAV Characteristics for Precision Agriculture Applications and Associated Practical Challenges. Remote Sens. 2021, 13, 1204. https://doi.org/10.3390/rs13061204