Review

Automatic Identification and Monitoring of Plant Diseases Using Unmanned Aerial Vehicles: A Review

by Krishna Neupane and Fulya Baysal-Gurel *
Otis L. Floyd Nursery Research Center, Department of Agricultural and Environmental Sciences, Tennessee State University, McMinnville, TN 37110, USA
* Author to whom correspondence should be addressed.
Remote Sens. 2021, 13(19), 3841; https://doi.org/10.3390/rs13193841
Submission received: 21 July 2021 / Revised: 17 September 2021 / Accepted: 21 September 2021 / Published: 25 September 2021
(This article belongs to the Special Issue UAV Imagery for Precision Agriculture)

Abstract

Disease diagnosis is one of the major tasks for increasing food production in agriculture. Although precision agriculture (PA) takes less time and provides a more precise application of agricultural activities, the detection of disease using an Unmanned Aerial System (UAS) is a challenging task. Several Unmanned Aerial Vehicles (UAVs) and sensors have been used for this purpose. The UAVs’ platforms and their peripherals have their own limitations in accurately diagnosing plant diseases. Several types of image processing software are available for vignetting correction and orthorectification. The training and validation of datasets are important characteristics of data analysis. Currently, different algorithms and architectures of machine learning models are used to classify and detect plant diseases. These models help in image segmentation and feature extraction to interpret results. Researchers also use the values of vegetation indices, such as the Normalized Difference Vegetation Index (NDVI) and the Crop Water Stress Index (CWSI), acquired from different multispectral and hyperspectral sensors to fit into statistical models to deliver results. There are still various drawbacks in the automatic detection of plant diseases, as imaging sensors are limited by their spectral bandwidth, resolution, background noise of the image, etc. The future of crop health monitoring using UAVs should include a gimbal carrying multiple sensors, large datasets for training and validation, the development of site-specific irradiance systems, and so on. This review briefly highlights the advantages of automatic detection of plant diseases to the growers.

1. Introduction

Simply put, UAVs are aerial vehicles that are operated remotely, with no pilot on board. They are considered one of the important innovations of present-day precision agriculture [1,2,3,4,5,6,7,8]. Precision agriculture (PA) is a method to transform agriculture by reducing time and labor and increasing production and management efficiency [9]. With developments in technology and computational capacity, agricultural patterns have changed, such as the use of digital planters, harvesters, sprayers, etc. Agriculture has transformed over time from manual labor to mechanized labor through the adoption of technological change. Previously, plant diseases in agricultural fields were monitored visually by people with experience in scouting and monitoring plant diseases. This type of observation is subjective and prone to bias, optical illusion, and error [1]. This generated the need for external image-based tools that can replace unreliable human observations. Such tools also allow for extended coverage in a limited amount of time.
The potential of UAVs for conducting detailed surveys in precision agriculture has been demonstrated for a range of applications such as crop monitoring [10,11], field mapping [12], biomass estimation [13,14], weed management [15,16], plant population counting [17,18,19], and spraying [20]. A large amount of data and information is collected by UAVs to improve agricultural practices [21]. Different kinds of data recording instruments, cameras, and sensor installation equipment have been developed for agricultural purposes [4]. Some additional reasons for the increasing usage of UAVs and drones in agriculture [22] include gradually decreasing UAV prices [2,23], agricultural operations being carried out in areas with low population and activity [1], and the broad coverage and strong scouting capability of UAVs. Although images can be obtained from various sources, such as satellites and aircraft [4], and can cover a larger area than an unmanned aerial system, their resolution is often not sufficient for drawing significant conclusions. This is another advantage of using drones for high-resolution aerial imagery [1,24]. Increased efficiency, stability, accuracy, and productivity are further advantages of UAVs [25] that allow growers and experts to make better, more timely management decisions [26,27]. The application of PA technology not only increases economic profit but also provides social benefits measured in sustainability. A study carried out by Van Evert et al. [28] reported that the application of PA in potato cultivation increased economic profit by 21% and social profit by 26% compared to agricultural practices without precision agriculture.
The use of agricultural UAVs is hindered by many challenges such as battery efficiency [25,29], low flight time [7], communication distance, and payload [25,30]. The limited flight duration caused by increased payload and decreased battery efficiency is an obstacle when performing critical agricultural activities in larger fields, such as pesticide and nutrient application. Other challenges associated with their use are engine power, stability, maintaining altitude, and maneuverability in wind and turbulence [7,31].
UAVs have been used for a variety of purposes in agriculture. Traditionally, visual observations would determine crop nutrient status, pests, diseases, and environmental stress [32]. Currently, most UAVs are used for detecting stress in plants, quantifying biomass, classifying vegetation, estimating canopy cover, predicting yield, and assessing plant height and lodging. Agricultural UAVs have been used for mapping agricultural fields, spraying chemicals, planting, crop monitoring, irrigation, diagnosis of insects and pests, artificial pollination [25,33], and livestock population dynamics [34]. The combination of UAVs with hyperspectral and multispectral cameras has been predominant in a wide range of agricultural operations [33] for disease identification purposes. Recently, Chang, Zhou, Kira, Marri, Skovira, Gu, and Sun [32] used UAVs to measure solar-induced chlorophyll fluorescence and photochemical reflectance through a single bifurcated fiber and a motorized arm to measure radiance. Artificial Intelligence (AI) and deep learning (DL) are incorporated with UAVs to increase the precision of crop disease identification and monitoring. Penn State University, the Food and Agriculture Organization, the International Institute of Tropical Agriculture, the International Maize and Wheat Improvement Center, and others have developed PlantVillage Nuru to identify viral diseases in cassava plants [35]. Nowadays, AI and DL are combined to ease the process of plant disease detection. PlantVillage Nuru has been integrated with the West African viral epidemiology platform (WAVE 2) to track the spread of cassava brown streak disease.
Among these applications, the most extensive use of UAVs in agriculture has been for the detection and quantification of stress in plants. This might be because of the impact that early detection has on overall agricultural activity [1]. Early detection of a plant’s biotic and abiotic stresses helps to understand the changing physiology of the plant. If stresses are detected early, suitable treatments can be used to protect plants and reduce losses. This review paper intends to cover the basic areas of UAVs in the automatic detection of plant diseases, their components, image and data processing, and different models of data analysis. The major objective of this paper is to provide a holistic explanation of how UAVs can be used to automatically monitor the health status of crops in the field. It provides insight into the basic concepts of UAV peripherals, sensors, and cameras along with their limitations and applicability. In that regard, it further explains the performance, advantages, and limitations of different deep learning models. Nonetheless, in situ detection of plant diseases using UAVs is expanding while facing many challenges. This paper explores different UAV platforms with their limitations and advantages, as well as cameras and sensors with their spectral specifications for capturing images and acquiring data to monitor and detect plant diseases. It also reflects on different methods of processing the acquired data, the challenges in the process, and the prospects of autonomous identification of plant diseases using Unmanned Aerial Vehicles.

2. Types of UAVs, Their Platforms and Peripherals Used in Disease Monitoring and Identification

Nowadays, as discussed earlier, different agricultural activities are carried out by UAVs. Among these, UAVs are increasingly used for disease monitoring and identification. They work together with different components such as cameras, sensors, motors, rotors, controllers, etc. One of the basic uses of a UAV is to capture images. The information contained in images is extracted and transformed into useful information by image processing and deep learning tools [1,26]. Electromagnetic spectra also provide useful information, which is used to make decisions regarding plant physiological stress [33,36]. Comparing spectral responses in specific regions helps to assess the condition of the plants in real time under field conditions [33]. Plant disease is identified by observing physiological disturbances expressed as changes in foliar reflectance in the near-infrared portion of the spectrum in a UAV-captured image [36]. In addition, the disturbances in the photosynthetic activities of crops caused by many diseases are also observed as changes in reflectance in the red wavelength range.
There are various types of UAVs for different agricultural operation purposes. In relation to UAVs, these structures are called platforms [4]. Primarily, there are two types of agricultural UAV platforms: fixed wing and rotary wing [25]. A fixed-wing UAV is comparatively larger in size and used for large-area coverage [4,25,37,38]. Rotary-wing UAVs are further divided into two types: helicopter and multirotor. Helicopters have a large propeller at the top of the aircraft and are used for aerial photography and spraying. Similarly, multirotors come in different varieties depending upon the number of motors the aircraft possesses. Different types of multirotors are the quadcopter (four rotors) [39,40,41], hexacopter (six rotors) [42], and octocopter (eight rotors) [43].
For the purpose of plant disease monitoring and identification, both fixed-wing and rotary UAVs have been used. To monitor leaf stripe disease of grapevine, one of the diseases of the esca complex, Di Gennaro and his colleagues used a modified multirotor MikroKopter OktoXL (HiSystems GmbH, Moormerland, Germany) [36]. Similarly, target spot and powdery mildew in soybean have been identified using a popular quadcopter, the Phantom-3 (Shenzhen DJI Sciences and Technologies Ltd., Shenzhen, China) [26]. Xavier and colleagues (2019) used a multirotor UAV to identify Ramularia leaf blight disease in cotton [44]. A hexacopter, the DJI Matrice 600 Pro (Shenzhen DJI Sciences and Technologies Ltd., Shenzhen, China), was used to detect target spot and bacterial spot in tomato leaves [45]. In 2013, Garcia-Ruiz and colleagues used fixed-wing and multirotor UAV platforms to compare aerial imaging platforms for identifying citrus greening disease in Florida [33]. Gomez Selvaraj et al. [46] used a multicopter, the UAV Phantom 4 Pro (Shenzhen DJI Sciences and Technologies Ltd., Shenzhen, China), to identify major diseases in bananas. An Altura X8 octocopter (Aerialtronics DV B.V., Katwijk, The Netherlands) was used for monitoring fire blight in pear orchards in Belgium [47]. In a similar fashion, an octocopter (DJI S1000) (Shenzhen DJI Sciences and Technologies Ltd., Shenzhen, China) was used in spatio-temporal monitoring of yellow rust in wheat in China [48]. A helicopter was modified to carry a camera system, an autopilot, and sensors to obtain thermal and multispectral imagery over agricultural fields [49]. A package of sensors was built into the DJI S800 hexacopter (Shenzhen DJI Sciences and Technologies Ltd., Shenzhen, China) to identify citrus greening disease [50]. Özgüven [51] used a rotary UAV, the DJI Phantom 3 Advanced (Shenzhen DJI Sciences and Technologies Ltd., Shenzhen, China), to determine Cercospora leaf spot in sugar beet. Valasek et al. [52] used a rotary UAV to distinguish healthy peanuts from those diseased with leaf spot (Cercosporidium personatum). Sugiura et al. [53] used a multirotor UAV to identify late-blight-resistant cultivars of potato.
As we can see, both fixed-wing and rotary UAVs have been used for crop monitoring purposes. The choice of platform simply depends upon the area to be covered by the UAV, the payload, and the nature of the study. For example, fixed-wing UAVs can cover a larger area than rotary-wing UAVs in the same amount of time. Additionally, fixed-wing UAVs have faster flight speeds and can carry comparatively larger payloads than rotary UAVs. Rotary-wing UAVs are small and agile and can easily access congested areas. Rotary UAVs are mostly used for aerial photography and videography, whereas fixed-wing UAVs are best suited for aerial mapping. Because of their shape and size, fixed-wing UAVs cannot hover in place and need definite spots to launch and land. Moreover, multirotors are cost-effective and easier to handle compared to fixed-wing UAVs. Despite their larger size and more demanding control, larger fixed-wing UAVs can be effectively used in large fields because of their cost-efficiency in total field coverage per flight duration [54].

3. Cameras and Sensors

Aerial imaging is one of the most important factors when it comes to the application of UAVs. Usually, the required image quality determines the selection of the UAV. However, the type of sensor and the purpose of the study also dictate the choice of platform. Thus, UAV platforms are loaded with different cameras and sensors. For example, a DJI Mavic 2 UAV is upgraded with a Sentera single NDVI camera (Sentera, Saint Paul, MN, USA) to monitor drought stress in crapemyrtle plants (Figure 1). However, UAV platforms are limited by their payload. An increase in payload decreases the speed, stability, and flight time of the UAV [4,23]. The choice of sensors depends upon the purpose of the study. Drought stress is better observed using thermal sensors in the early stages [55,56], while multispectral and hyperspectral sensors are used for long-term results. Pathogen infections in crops are better diagnosed using hyperspectral and thermal sensors in the early stages, but RGB, multispectral, and hyperspectral sensors can be used to detect the severity of infection in later stages [57]. In the following subsections, we discuss the different cameras and sensors used in plant disease monitoring and observation.

3.1. RGB Camera

RGB (red–green–blue) cameras are a commonly used type of camera that produce images measuring the intensity of three colors, red, green, and blue, and define a value for each color in every pixel. RGB cameras are used to generate three-dimensional (3D) models of agricultural crops [58,59,60] and provide estimations of crop biomass [61,62,63]. RGB cameras are also used together with NIR and multispectral cameras to improve accuracy when calculating biomass [62]. If the near-infrared filter is replaced by a red filter, the camera is called a modified RGB camera [64,65]. Commercial RGB cameras are cheap but have poor spectral resolution [65]. However, Grüner, Astor, and Wachendorf [61] calculated the biomass of grassland with an RGB camera using only structure-from-motion (SfM) processing. RGB cameras have also been successfully used to identify diseases in plants. It is worth mentioning that RGB cameras only cover the electromagnetic spectrum range of 380 nm to 750 nm, and not all of these wavelengths are suitable for appropriate crop disease detection [66]. The optical properties and the spectral range of the camera are considerable factors in plant disease detection. Mattupalli and his colleagues confirmed Phymatotrichopsis root rot disease in an alfalfa field using RGB imaging with a maximum likelihood classification algorithm [67]. RGB images were used to detect banana bunchy top virus and banana Xanthomonas wilt diseases in the African landscape of the Congo [46]. Similarly, an RGB camera with 12 megapixels was used to classify leaf spot (Cercospora beticola Sacc.) severity in sugar beets in Turkey [51]. Grapevines were classified as diseased or healthy in France using RGB sensors [68]. Valasek, Thomasson, Balota, and Oakes [52] classified the leaf spot of peanut caused by Cercosporidium personatum using RGB cameras. Potato late-blight-resistant cultivars have been screened using RGB cameras [53]. In a study conducted by Ashourloo et al. [69], wheat leaf rust (caused by Puccinia triticina) was detected using RGB digital images with varying reflectance at 605, 695, and 455 nm wavelengths. RGB images also provide information in color spaces such as LAB (L = lightness, A and B are color-opponent dimensions), YCbCr (Y = luma component, Cb = blue-difference and Cr = red-difference chroma components), and HSV (hue, saturation, value), which are very helpful in recognizing plant diseases.
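As a hedged illustration of the color-space conversions mentioned above, the following Python sketch uses OpenCV to convert an RGB leaf image into HSV, L*a*b*, and YCbCr representations and to threshold the hue channel; the file name and threshold values are hypothetical placeholders, not settings from any of the cited studies.

```python
import cv2
import numpy as np

# Load a leaf image (path is a hypothetical placeholder); OpenCV loads images as BGR.
bgr = cv2.imread("leaf_plot_01.jpg")

# Convert to the color spaces commonly used for disease symptom segmentation.
hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)      # hue, saturation, value
lab = cv2.cvtColor(bgr, cv2.COLOR_BGR2LAB)      # lightness, a*, b* opponent channels
ycrcb = cv2.cvtColor(bgr, cv2.COLOR_BGR2YCrCb)  # luma and chroma components

# Example: mask pixels whose hue falls outside a typical green range,
# a rough proxy for chlorotic or necrotic tissue (threshold values are illustrative only).
hue = hsv[:, :, 0]
symptom_mask = np.logical_or(hue < 35, hue > 85).astype(np.uint8) * 255

cv2.imwrite("symptom_mask.png", symptom_mask)
```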
RGB cameras are readily available and are typically less expensive. They can also be used to collect high-resolution still images. However, they can only measure three bands (red, green and blue) of the electromagnetic spectrum. This causes RGB images to be less accurate than multispectral or hyperspectral images in terms of the spectral resolution of the camera system. Images from RGB cameras do not provide sufficient information to differentiate levels of sheath blight in rice [70]. However, one of the major advantages of RGB cameras is their ability to capture high spatial resolution images in comparison to multispectral systems and, in turn, provide finer spatial details for plant disease detection and monitoring. RGB cameras should be carefully operated in order to have uniform coloring and lighting in the images. Uniform images will reflect fewer errors in differentiating healthy and diseased plants.

3.2. Multispectral Cameras

Multispectral cameras are considered to be among the most appropriate sensors for agricultural analytics, as they have the capacity to capture images at high spatial resolution and determine reflectance in near-infrared bands [71]. Multispectral and NIR cameras produce vegetation indices that rely on near-infrared or other bands of light [72,73,74]. Multispectral cameras use different spectral bands: mostly red, blue, green, red edge, and near infrared. They can be differentiated into two groups based on bandwidth: narrowband and broadband [75]. Most aerial images for monitoring crop health issues are acquired with multispectral cameras [4], as they are used to calculate indices such as NDVI and others including NIR bands [63,71,73,76,77]. The absence of multispectral sensors on agricultural UAVs hinders the early detection of plant diseases. The evaluation of multispectral image bands captured in aerial images at different heights was carried out in 2019 by Xavier and colleagues to detect Ramularia leaf blight in cotton; however, the differentiation in the severity index was not significant [44]. In a study conducted by Abdulridha, Ampatzidis, Kakarla, and Roberts [45], 35 vegetation indices were used to detect target spot and bacterial spot in tomatoes under laboratory and field conditions. A fixed-wing UAV capable of capturing hyperspectral images and equipped with multiband sensors was used in 2012 in Florida to compare aerial imaging platforms in identifying citrus greening disease [33]. In the same study, both multispectral and hyperspectral cameras were installed on the UAV. The multispectral camera used was a six narrow-band camera (miniMCA6, Tetracam, Inc., Chatsworth, CA, USA) with six digital sensors with customizable bands of 10 nm arranged in a 3 × 2 array. Similarly, the hyperspectral camera used in the study was the AISA EAGLE Very Near-Infrared (VNIR) hyperspectral imaging sensor (Specim Ltd., Oulu, Finland). This imaging sensor has a spectral range of 398–998 nm and a resolution of around 5 nm, with 128 spectral bands in the VNIR region [33]. A combination of RGB images and multispectral images captured aerially using UAVs was used for pixel-based classification of banana diseases in the Democratic Republic of the Congo. The camera system, a MicaSense RedEdge (MicaSense, Inc., Seattle, WA, USA), was capable of acquiring 16-bit raw images in five narrow bands. Su, Liu, Hu, Xu, Guo, and Chen [48] used a multispectral camera, the RedEdge, having a resolution of 1280 × 960 pixels and five narrow bands. A six-band multispectral camera (MCA-6, Tetracam, Inc., Chatsworth, CA, USA) was used, having an image resolution of 1280 × 1024 pixels, 10-bit radiometric resolution, and an optical focal length of 8.5 mm [49]. An RGB-Depth (RGB-D) camera employed with two grayscale cameras (mvBlueFOX-MLC202bG), with light sources covered by polarizing films, and multispectral sensors mounted on a UAV platform were used to monitor orange orchards for detecting citrus greening disease in Florida [50]. Al-Saddik et al. [78] used multispectral sensors to differentiate Flavescence dorée diseased and healthy grapevines in a vineyard without using common vegetation indices. Several other studies also used multispectral cameras [36,48,70,79,80,81,82,83]. A combination of visible and infrared images was used to form a multispectral approach for the identification of vine diseases [84].
The platform combined an RGB camera and an infrared light sensor with a wavelength of 850 nm; both cameras had 16-megapixel high-resolution properties. In the study, the accuracy varied between 70% and 90% depending upon the surface area, with larger surfaces having higher accuracy. Similarly, a multispectral sensor, the OptRx (Ag Leader), was used at a visual range of 670 nm, red edge of 730 nm, and NIR of 775 nm to identify the esca complex and Flavescence dorée in the vineyard [85]. A multispectral camera consisting of blue (475 nm), green (560 nm), red edge (720 nm), and near-infrared (840 nm) bands was used to identify pine wilt disease caused by the pine wood nematode (Bursaphelenchus xylophilus) [86]. The accuracy observed in the study was 79%. Ye et al. [87] used a five-band multispectral camera having blue (465–485 nm), green (550–570 nm), red (653–673 nm), red edge (712–722 nm), and near-infrared (800–880 nm) spectral ranges to classify banana wilt disease and obtained an accuracy of 80%. Multispectral sensors have high practicability for new innovations in the automatic identification of plant diseases. They can capture images in both the visible and near-infrared regions, but they may be limited in detecting subtle changes in biophysical and biochemical parameters.
Various studies have reported that multispectral cameras are best suited to identify plant diseases and pests in the field. The principle behind their high accuracy is the use of multiple bands of the electromagnetic spectrum. They not only provide additional information in the acquired images but can also be used to derive vegetation indices. Vegetation indices are among the most important factors in the identification of crop diseases. There are a few disadvantages of multispectral cameras, which include their expense and the increased effort required for calibration for specific tasks, such as disease identification, image processing, etc.

3.3. Hyperspectral Cameras

The major difference between multispectral and hyperspectral cameras is that hyperspectral cameras collect light in many different narrow bands for every pixel in the captured image [72,88]. Though multispectral cameras are also able to capture light reflected by biomolecules, the difference lies in the bandwidth and placement of the bands, which helps to isolate responses from different molecules. Hyperspectral cameras have particular strengths in detecting light reflected by plant constituents such as chlorophyll [89,90], mesophyll [88], xanthophyll [91], and carotenoids [89,90]. The major drawbacks of using a hyperspectral camera are the high cost [72,92] and the huge amount of unnecessary data produced when the camera is not properly calibrated [88,93,94,95].
Abdulridha and colleagues [45] used a hyperspectral imaging system (a line-scan imager), the Pika L 2.4 (Resonon Inc., Bozeman, MT, USA) with a 23 mm lens, that had a spectral range of 380–1020 nm and 281 spectral channels to detect diseases in tomato leaves. Fire blight in pears was monitored using a hyperspectral camera (a frame-based system) (COSI-cam, VITO NV, Boeretang, Belgium) with a spectral range of 600–900 nm [47,96]. Images were captured in rapid succession at 340 frames/s in 8-bit mode [47,97]. The overall accuracy was found to be 52% for the detection of healthy and infected trees; however, the red wavelength (611 nm) and NIR (784 nm) were found to have an accuracy of 85% in distinguishing healthy and diseased trees [47]. Calderón et al. [98] and Sandino et al. [99] both used line-scan hyperspectral imagers in their experiments to monitor crop health. Thermal, multispectral, and hyperspectral cameras were able to successfully detect the crop crown temperature, structural indices, fluorescence, and health index of olive trees [100]. Similarly, in the study conducted by Sandino, Pegg, Gonzalez, and Smith [99], diseased trees were identified with 97% accuracy and healthy trees with 95% accuracy, whereas the global multiclass detection rate was 97%. The main reason hyperspectral cameras are used is to overcome the shortcomings of multispectral cameras. Hyperspectral cameras are used to capture details at finer spectral differences and to identify and discriminate target objects.
The major breakthrough that hyperspectral cameras have provided is that, unlike other cameras [101], they are able to detect plant stress along with its possible causative agents (pathogen/disease). Hyperspectral sensors measure several hundred bands of the electromagnetic spectrum to derive accurate results. They are able to measure the visible spectrum (400–700 nm), near infrared (700–1000 nm), and also short-wave infrared (1000–2500 nm). As a result, they collect not only images but also a huge amount of spectral data, which causes difficulties in extracting relevant information.
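To make the idea of working with hundreds of bands concrete, the short sketch below assumes a hyperspectral cube already loaded as a NumPy array of shape (rows, cols, bands) with a matching wavelength list, and reduces it to a few bands of interest; the array contents, wavelength range, and chosen bands are illustrative assumptions rather than data from the cited studies.

```python
import numpy as np

# Hypothetical hyperspectral cube: rows x cols x bands, with one wavelength (nm) per band.
cube = np.random.rand(100, 100, 281).astype(np.float32)    # placeholder reflectance data
wavelengths = np.linspace(380, 1020, cube.shape[2])        # e.g., a 380-1020 nm imager

def band_at(cube, wavelengths, target_nm):
    """Return the single band whose center wavelength is closest to target_nm."""
    idx = int(np.argmin(np.abs(wavelengths - target_nm)))
    return cube[:, :, idx]

# Reduce the full cube to a handful of informative bands to shrink the data volume
# before feeding them to an index calculation or a classifier.
selected = np.stack([band_at(cube, wavelengths, nm) for nm in (550, 670, 720, 800)], axis=-1)
print("Reduced cube shape:", selected.shape)   # (rows, cols, 4)
```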

3.4. Thermal Cameras

Thermal cameras capture infrared radiation in the range of 0.75 to 1000 µm and provide the temperatures of objects in the form of a thermal image [102]. The advantages of thermal cameras are their low cost compared to other spectral cameras and the fact that RGB cameras can be converted to thermal cameras with certain modifications [103]. Originally, thermal cameras were used for inspecting drought stress in crops [92,103,104,105]. Thermal images include the temperature of surrounding objects and have low resolution compared to images captured by other major camera types [102]. Thermal sensors are also used in detecting crop diseases and monitoring crops [80,98,106].
Thermal cameras are capable of identifying the agents responsible for plant stress. As a pathogen infects a plant, its structure and metabolism change, which can be detected by thermal sensors [107,108]. Baranowski et al. [109] reported that fungal stress caused by Alternaria in oilseed rape was identified using thermal imaging. Similarly, early detection of red leaf blotch on almonds has also been conducted using hyperspectral and thermal imagery [110]. Other studies carried out using thermal cameras include the detection of Huanglongbing disease of citrus [111], tobacco mosaic virus [112], tomato powdery mildew [110], Fusarium wilt of cucumber [113], etc. A thermal FLIR camera was used to detect disease-induced spots on banana leaves with an accuracy of 92.8% [114]. Similar technology was used by Raza, Prince, Clarkson, and Rajpoot [108] to detect tomato powdery mildew caused by Oidium neolycopersici.
This shows that the use of thermal cameras in crop health monitoring and disease identification is increasing. However, thermal imaging technology has not been fully explored, possibly because of the environmental factors that must be considered during image acquisition. Additionally, thermal images contain huge amounts of information, which leads to challenges in deriving relevant information. The advantages of thermal imaging are its low cost and its ability to support early identification together with causative agents. Yang et al. [115] developed a method for early detection of diseases in tea using thermal imagery.
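As a hedged illustration of how thermal imagery feeds into a stress index such as the CWSI mentioned in this review, the sketch below applies the widely used empirical formulation CWSI = (Tcanopy − Twet)/(Tdry − Twet); the temperature values, reference limits, and threshold are hypothetical placeholders, since in practice they come from wet/dry reference surfaces or field calibration.

```python
import numpy as np

# Hypothetical canopy temperature map (deg C) extracted from a radiometric thermal image.
canopy_temp = np.random.normal(loc=30.0, scale=2.0, size=(120, 160)).astype(np.float32)

# Reference temperatures (illustrative placeholders): a fully transpiring "wet" canopy
# and a non-transpiring "dry" canopy, e.g., measured from reference panels or leaves.
t_wet, t_dry = 26.0, 38.0

# Crop Water Stress Index: 0 = unstressed, 1 = fully stressed.
cwsi = np.clip((canopy_temp - t_wet) / (t_dry - t_wet), 0.0, 1.0)

stressed_fraction = float((cwsi > 0.6).mean())   # 0.6 is an arbitrary demonstration threshold
print(f"Fraction of pixels above CWSI 0.6: {stressed_fraction:.2f}")
```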

3.5. Depth Sensors

Depth sensors are common peripherals used in agricultural UAVs that add a depth feature to the RGB pixels. The depth measured by a depth sensor is defined as the distance between the sensor and the point of an object at the time of image capture [116]. The most prevalent depth-sensing technology is Light Detection and Ranging (LiDAR). The major difference between RGB-D sensors and LiDAR is that RGB-D sensors depend upon light reflection intensities, whereas LiDAR uses laser pulses to calculate distance [23]. RGB-D sensors are among the least commonly used sensors for plant health monitoring but are commonly used in spraying [4], 3D modelling [117,118], and phenotyping [116].
Nowadays, depth sensors are used to increase the accuracy of other sensors. Sarkar et al. [50] used RGB-D cameras for detecting citrus greening disease. Similarly, Xia et al. [119] used depth sensors to create 3D segmentations of individual leaves. There are many sensors available on the market that provide accurate 3D information, such as LiDAR and time-of-flight (ToF) sensors, but these are highly expensive and cannot be used for general agricultural practices. The use of depth cameras provides both the pixel intensity and the depth in the captured image to develop a classifier for disease identification. Moreover, depth sensors provide data on the intensity of the light reflected from stressed objects or plants [120]. Paulus et al. [121] used 3D laser technology to detect Fusarium spp. in kernels of wheat. The challenge of using depth sensors is that they may not detect objects beyond a certain distance, which may result in a lower intensity count. This can be mitigated by using a configurable camera and calibrating it with the depth sensors.
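As a small, hedged example of how a depth channel can complement RGB pixels, the sketch below masks out everything beyond an assumed working distance before classification; the arrays and the distance threshold are illustrative placeholders, not values from the cited studies.

```python
import numpy as np

# Hypothetical aligned RGB-D frames: color (H x W x 3) and per-pixel depth in meters.
rgb = np.random.randint(0, 256, size=(480, 640, 3), dtype=np.uint8)
depth = np.random.uniform(0.5, 10.0, size=(480, 640)).astype(np.float32)

# Keep only pixels closer than an assumed working distance, where depth readings are reliable.
max_range_m = 4.0
foreground = depth < max_range_m

# Zero out background pixels so a downstream classifier sees only the nearby canopy.
masked_rgb = rgb.copy()
masked_rgb[~foreground] = 0
print("Foreground fraction:", float(foreground.mean()))
```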

4. Image Pre-Processing

Image pre-processing involves a series of steps carried out before extracting data from the images. Its major objectives are to reduce errors and to prepare the images for data extraction. After high-spatial-resolution, geo-referenced aerial images are captured using UAVs, a large amount of data has to be extracted, so it is very important that the images are error-free. The images may have been degraded during capture by noise, shadow, etc. It is also very important to have critical knowledge of plant diseases when pre-processing the images and choosing an appropriate method to increase the accuracy of identification [122]. Pre-processing steps include image enhancement, segmentation, color space conversion, and filtering [123]. Sonka et al. [124] categorize the steps of image pre-processing as pixel brightness transformation, geometric transformation, local pre-processing, and image restoration. Pixel brightness transformation modifies the brightness of the image depending on the pixel values, with two classes: brightness correction and grey-scale transformation. Similarly, geometric transformation correlates the coordinates of the input image pixels with points in the output image and determines the brightness of the transformed point in the digital raster. Local pre-processing utilizes the neighborhood of a pixel to generate the new brightness value in the output image. Finally, image restoration is the pre-processing method used to discard degradation factors, such as lens defects, wrong focus, etc., in the image.
After image pre-processing, data from the images are extracted for data processing. The accuracy of image classification depends on the type of image pre-processing and extraction techniques used. Studies have reported that image processing helps to improve the information in the image, which can then be interpreted more easily than non-processed images [70]. Images are generally pre-processed using different available software packages to orthorectify spectral bands into reflectance. A variety of software is available to process raw images depending upon the sensors and cameras used. Ghosal et al. [125] used an unsupervised explanation framework to isolate symptoms during image pre-processing. In the study conducted by Ferentinos, an unexpected result was found while detecting diseases using open-dataset images: the overall accuracy for real images was higher than for pre-processed images [126]. Headwall SpectralView® (Headwall Photonics Inc., Bolton, MA, USA) and CSIRO|Data61 Scyven 1.3.0 (Scyllarus Inc., Canberra, Australia) software were used for isolating regions of interest and reflectance data by Sandino and colleagues [99,127,128]. The Simple Linear Iterative Clustering (SLIC) superpixels method was used by Tetila, Machado, Menezes, Da Silva Oliveira, Alvarez, Amorim, De Souza Belete, Da Silva, and Pistori [27] to segment leaves from UAV-captured images. Environment for Visualizing Images (ENVI) software (version 4.7, ITT VSI, White Plains, NY, USA) was used to extract reflectance bands from the UAV images by Garcia-Ruiz, Sankaran, Maja, Lee, Rasmussen, and Ehsani [33]. PixelWrench 2 software (Tetracam Inc., Chatsworth, CA, USA) was used for correcting radiometric distortion and vignetting in images for identifying Ramularia leaf blight [44] and grape leaf stripe disease [36]. Precision Hawk (Precision Hawk USA Inc., Raleigh, NC, USA) provides an image pre-processing facility on its cloud server [6]. Image processing for the detection of cotton root rot was carried out using Pix4D software (Pix4D SA, Lausanne, Switzerland).
The selection of image processing features depends upon the purpose of the study. For example, if the diseased leaf image contains an unwanted object, it can simply be removed by cropping to obtain the target image; this process is called image clipping, a part of image pre-processing. Similarly, image enhancement can be performed using the histogram equalization technique to distribute the color intensities evenly in the diseased plant images. Additionally, if the image contains shadow and has low contrast, it can be enhanced by removing blur [129]. Image pre-processing helps to identify target regions in the images and also helps to reduce noise. It increases the reliability of the optical inspection. Images captured from multispectral and hyperspectral sensors require atmospheric and radiometric calibration and correction in order to ensure consistency in the data extracted from the image.
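The following Python sketch, using OpenCV, illustrates two of the pre-processing steps described above, clipping a region of interest and equalizing the intensity histogram to even out lighting; the file name, crop coordinates, and filter size are hypothetical placeholders chosen only for demonstration.

```python
import cv2

# Load an aerial plot image (path and crop window are hypothetical placeholders).
image = cv2.imread("uav_plot.jpg")
roi = image[200:800, 300:900]          # image clipping: keep only the target plot

# Histogram equalization on the luminance channel to balance uneven illumination.
ycrcb = cv2.cvtColor(roi, cv2.COLOR_BGR2YCrCb)
ycrcb[:, :, 0] = cv2.equalizeHist(ycrcb[:, :, 0])
enhanced = cv2.cvtColor(ycrcb, cv2.COLOR_YCrCb2BGR)

# Mild smoothing to suppress noise before segmentation or feature extraction.
denoised = cv2.GaussianBlur(enhanced, (3, 3), 0)
cv2.imwrite("uav_plot_preprocessed.jpg", denoised)
```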

5. Data Processing

Data processing of UAV images is carried out using various tools. Most researchers use vegetation indices for data processing because they are simple and easy to interpret. Currently, different Artificial Neural Networks (ANNs) are used to demonstrate results. In many studies, researchers also take numerical values from the sensors and use different statistical tools such as K-means clustering, Receiver Operating Characteristic (ROC) analysis, and regression. The selection of data processing tools also depends upon the purpose of the study. Some of these data processing tools are discussed below.

5.1. Image Data Processing

Data processing of UAV imagery is conducted in several ways. After vignetting correction and orthorectification of the images, some results are derived using vegetation indices (VIs) alone. However, imagery data analysis involves more than that: it also includes image segmentation and result interpretation in the form of images. For these types of image data processing, various tools are used, including Artificial Neural Networks (ANNs), decision trees, k-means, k-nearest neighbors, Support Vector Machines (SVMs), and regression analysis. In this review, a short description of k-means clustering and regression analysis is given.

5.1.1. K-Means Clustering

Altas and his colleagues converted RGB images into L*a*b* color space to identify Cercospora leaf spot in sugar beet, where the a* and b* components store the information about diseases in the leaves. The colors in a*b* were classified using k-means clustering [51]. K-means clustering is a common clustering algorithm used in various application domains, such as image segmentation [130], which divides a dataset into k groups [131]. In k-means clustering, initial k cluster centers are defined, and then the algorithm repeatedly reassigns points to the nearest center and updates the centers within the dataset. In datasets for which the value of k is already known (such as unique client identifiers, e.g., customer IDs), that k value is used directly; where the k value is not known, it must be identified separately [132]. For the study to determine leaf spot in sugar beet, the k value was set to 3, since three images are obtained when the pixels in the RGB image are separated according to color [51].
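A minimal sketch of the k-means step described above, assuming an already loaded leaf image: the pixels are converted to L*a*b*, and the a*b* channels are clustered into k = 3 groups with scikit-learn. The file name is a placeholder and the code is illustrative, not the cited authors' implementation.

```python
import cv2
import numpy as np
from sklearn.cluster import KMeans

# Load a leaf image (hypothetical path) and convert to L*a*b* color space.
bgr = cv2.imread("sugar_beet_leaf.jpg")
lab = cv2.cvtColor(bgr, cv2.COLOR_BGR2LAB)

# Cluster only the a* and b* channels, where the color (disease) information resides.
ab = lab[:, :, 1:3].reshape(-1, 2).astype(np.float32)

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)   # k = 3 as in the sugar beet study
labels = kmeans.fit_predict(ab)

# Reshape cluster labels back to image form; each cluster can then be inspected
# to decide which one corresponds to diseased tissue.
label_image = labels.reshape(lab.shape[0], lab.shape[1])
print("Pixels per cluster:", np.bincount(labels))
```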

5.1.2. Regression Analysis

Regression analysis is one of the most common methods of analyzing UAV imagery. After orthorectification of the images, the values for each band, such as NIR, red, green, blue, etc., are extracted. A regression model is then fitted to those extracted values to investigate the spectral characteristics of the parameters of interest. Generally, different types of regression models are used depending upon the type of dataset acquired: linear or non-linear, and simple or multiple regression. At the end of a large dataset analysis, cross-validation of the regression analysis (RA) model is important, because when a model is chosen it is assumed that future observations will behave the same way, which may not always be the case [133]. Thus, the data are split into two portions: one set is used to fit the regression model, while the other set is reserved as future observations to test against the model. However, the two split portions of the dataset do not need to be equal [133].
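The sketch below illustrates the split-and-validate workflow just described, using scikit-learn: extracted band values serve as predictors of a disease severity score, with one portion of the data held out to check how well the fitted model generalizes. The arrays are synthetic placeholders rather than real reflectance data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Synthetic per-plot predictors (e.g., mean NIR, red, green, blue reflectance)
# and a disease severity score; real values would come from orthorectified imagery.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 4))
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(0.0, 0.1, size=200)

# Split the data: one portion fits the model, the held-out portion mimics future observations.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

model = LinearRegression().fit(X_train, y_train)
print("Held-out R^2:", r2_score(y_test, model.predict(X_test)))
```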

5.1.3. Vegetation Indices

Vegetation indices are among the major tools for analyzing aerial images. They are numeric representations of the relationship between different wavelengths of light reflected from the plant surface [90]. There are various vegetation indices to describe the status of plants, including NDVI [4], the Optimized Soil-Adjusted Vegetation Index (OSAVI) [134,135], and CWSI [104,136]. A detailed study of different remote sensing vegetation indices was conducted by Xue and Su [137]. These indices are correlated with different imaging sensors to derive conclusions. Excess Red (ExR), Excess Green (ExG), and Excess Blue (ExB) can be used with RGB imaging. The equations for ExR, ExG, and ExB are as follows:
ExR = 1.4R − G
ExG = 2G − R − B
ExB = 1.4B − G
where R represents red, G represents green, and B represents blue [4,15,18,138].
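Translating the three equations above into code, the following sketch computes ExR, ExG, and ExB per pixel from red, green, and blue channels scaled to [0, 1]; the optional normalization to chromatic coordinates shown in the comments is a common convention but is an assumption here, since the text does not specify it, and the demo values are placeholders.

```python
import numpy as np

def excess_indices(rgb):
    """Compute ExR, ExG, and ExB per pixel from an RGB array scaled to [0, 1].

    Channels are sometimes first normalized to chromatic coordinates
    (R/(R+G+B), etc.); that optional step is shown commented out.
    """
    rgb = rgb.astype(np.float32)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # total = r + g + b + 1e-6
    # r, g, b = r / total, g / total, b / total
    exr = 1.4 * r - g
    exg = 2.0 * g - r - b
    exb = 1.4 * b - g
    return exr, exg, exb

# Example with a synthetic 2 x 2 image (values are placeholders).
demo = np.array([[[0.2, 0.6, 0.1], [0.5, 0.3, 0.2]],
                 [[0.1, 0.7, 0.1], [0.4, 0.4, 0.3]]])
exr, exg, exb = excess_indices(demo)
print("ExG:\n", exg)
```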
The physiological and morphological properties of the plant, such as water content, biochemical composition, nutrient status, biomass content, and diseased tissues, are reflected in the values of vegetation indices [70,76,139]. The most common vegetation index used to determine diseased tissue is NDVI [104], calculated as NDVI = (NIR − R)/(NIR + R), where NIR is reflectance in the 780–800 nm range and R is reflectance in the 670–700 nm range, both derived from reflected light [62,73,104]. The intensities of these wavelengths are measured by multispectral or hyperspectral cameras and later formatted into an NDVI map. After formatting, each pixel of the NDVI map represents the NDVI value of the crop at that location. It is also important to note that vegetation indices can be combined to form other vegetation indices (VIs), so new VIs continue to evolve even as this paper is being written. Studies using VIs to process data include the detection of the esca complex in a grape vineyard using NDVI [36]. Similarly, Albetis et al. [140] used NDVI and 10 other VIs to differentiate symptomatic and asymptomatic Flavescence dorée in grapevines.
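Following the NDVI definition above, the sketch below computes a per-pixel NDVI map from red and near-infrared reflectance bands and flags low-vigor pixels; the band arrays and the 0.4 threshold are hypothetical placeholders, since healthy/diseased cut-offs are crop- and site-specific.

```python
import numpy as np

# Hypothetical co-registered reflectance bands (values in [0, 1]) from a multispectral sensor.
red = np.random.uniform(0.02, 0.20, size=(256, 256)).astype(np.float32)
nir = np.random.uniform(0.30, 0.60, size=(256, 256)).astype(np.float32)

# NDVI = (NIR - Red) / (NIR + Red); the small constant avoids division by zero.
ndvi = (nir - red) / (nir + red + 1e-6)

# Flag pixels below an illustrative vigor threshold for closer inspection.
low_vigor = ndvi < 0.4
print(f"Low-vigor pixels: {low_vigor.sum()} of {ndvi.size}")
```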
Analyzing and comparing vegetation indices between diseased and healthy crop samples is widely used in monitoring crop health. The use of VIs is considered simple, easy, and comparatively reliable in terms of disease identification and monitoring.

6. Deep Learning Models

Deep Learning (DL) is an approach within Machine Learning (ML) in which a computer model resembles the way the human brain processes information [141]. Deep learning makes use of artificial neural networks that contain many more processing layers than traditional neural networks [126]. In practice, it encompasses several steps, from data collection to the classification of images and the interpretation of results.

6.1. Artificial Neural Networks (ANNs)

The use of neural networks for disease recognition in agricultural crops is rapidly increasing [126,142,143]. A commonly used deep learning tool for image processing and classification is the Artificial Neural Network (ANN) [8]. ANNs are mathematical models that work in a similar fashion to the human brain, with neurons connected to each other by synapses [126]. A neural network is trained into a model using previously known data and is then applied to similar sets of data. There are different kinds of artificial neural networks for image classification: Recurrent Neural Networks (RNNs), Convolutional Neural Networks (CNNs), and Generative Adversarial Networks (GANs). The most common neural networks used for plant disease detection and classification are convolutional neural networks (CNNs).

6.2. Convolutional Neural Networks (CNNs)

CNNs are the basic deep learning tools for identifying plant diseases using aerial imagery [144]. They are powerful modelling techniques for performing complex pattern recognition on tasks with large amounts of data, such as image recognition [126]. Studies that do not have large amounts of data to train neural networks augment their data [27]. CNNs are the successors of earlier ANNs and were designed for applications with repeating patterns, such as image recognition of diseased plants. For the identification of diseases in plants using CNNs, several algorithms have been successfully applied, making crop health monitoring easier than before. The major CNN algorithms or architectures used are AlexNet [145,146], AlexNetOWTBn [147], GoogLeNet [146], Overfeat [148], and VGG [149]. The different studies that used CNNs to identify plant diseases are listed in Table 1 below.
The performance of deep architectures also depends upon the number of images used, minibatch size, and the weight and bias learning rates [158]. A study by Too et al. [159] reported that DenseNets showed higher accuracy, with no overfitting or performance degradation, when compared with VGG 16, Inception V4, and ResNet. Similarly, AlexNet had higher accuracy than SqueezeNet in classifying tomato diseases [160], whereas AlexNet and VGG16 had similar accuracy in classifying tomato diseases [158]. Mohanty, Hughes, and Salathé [151] used AlexNet and GoogLeNet to classify plant diseases and achieved an accuracy of 99.35%; however, the models performed poorly when tested on different sets of images.
Deep learning architectures are considered more accurate than previously used models such as SVM and random forest methods [158]. The correct prediction percentage of CNNs has been reported to be 1–4% higher than SVMs [161,162,163,164] and 6% higher than random forests [165]. However, Song et al. [166] reported that CNN models were 18% lower in correct prediction when compared using the Root Mean Square Error (RMSE). CNNs are widely used in the identification and classification of plant diseases using images. Crop classification using deep learning models helps with pest control, cropping activities, yield prediction, etc. [167]. Deep learning models have eased the work of growers in the sense that they can capture an image of a plant in the field and identify the disease by uploading it to the software. Feature engineering, a complex process, is eliminated in CNN models, as the important features are learned during the training of the dataset. However, different architectures have their own pros and cons. As the layers are extended, neural networks suffer from performance degradation, resulting in lower accuracy. Training is also time-consuming, as a large number of images are required. Deep networks experience an internal covariate shift, which causes disruption between the input data and the training layer. However, different techniques, such as skip connections [168], layer-wise training [169], transfer learning, initialization strategies, and batch normalization, are used these days to overcome those challenges [159].
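As a hedged sketch of the transfer-learning approach mentioned above, the code below fine-tunes a pretrained ResNet-18 in PyTorch for a small plant disease classification task; the directory layout, class structure, and hyperparameters are illustrative assumptions and do not reproduce the configuration of any cited study.

```python
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms
from torch.utils.data import DataLoader

# Images are expected in class-named folders, e.g., data/train/healthy, data/train/leaf_spot
# (paths and class names are hypothetical placeholders).
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])
train_set = datasets.ImageFolder("data/train", transform=preprocess)
train_loader = DataLoader(train_set, batch_size=16, shuffle=True)

# Start from ImageNet weights and replace the classifier head with one output per disease class.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

model.train()
for epoch in range(5):                       # a short demonstration run
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"Epoch {epoch + 1} finished, last batch loss = {loss.item():.4f}")
```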

7. Challenges of Automatic Plant Disease Identification Using UAVs

Automatic plant disease detection primarily means the identification of biotic injury caused by pathogens in plants without direct human involvement in the field. One way of automatically identifying plant disease is to deploy UAVs with machine learning algorithms installed on them. Information gathered by the algorithms helps not only in identifying diseases but also in estimating disease severity [17]. Plants are infected with hundreds of pathogens in the field, and many of them exhibit similar symptoms. Appropriate identification of plant disease is one of the basic but challenging tasks in agricultural activities. Manually identifying plant disease is subject to bias and optical illusion, which result in errors [170], and it involves intensive labor with economic costs. Nevertheless, automatic disease identification still requires expert opinion for disease confirmation in specific cases. Scouting each plant using laboratory and molecular techniques is not practically possible for disease identification over a large area. Large and complex data obtained from optical sensors make it possible to detect disease quickly and to distinguish between diseases, stresses, and disease intensities [120]. This incentivizes scientists and researchers to develop tools that are programmable and can read every plant through images to detect diseases. There has been progress in the automatic monitoring of crop health using UAVs and imagery. However, systems for the automatic detection and identification of plant diseases still struggle to achieve reliable accuracy. Some of the challenges in the automatic identification of plant diseases using UAVs are discussed by Barbedo [1,170]; these include background noise, unfavourable field conditions, sensor limitations, symptom variations, limitation of resources (peripherals and cameras), and training–validation discrepancy.
In addition, most crop health monitoring activities using UAVs are dominated by RGB and NIR sensors or their combination. The accurate detection of plant disease requires more advanced sensors with wider spectral ranges, such as hyperspectral sensors. These sensors have the potential to distinguish specific features of an object using several hundred narrow spectral bands [171]. Moreover, the platform used and the quality of the captured image are also of considerable importance for accurate crop health monitoring. There are different platforms and sources from which to acquire images. Satellite images such as Landsat are available for free; however, their resolution is too low to correctly detect plant disease. Even the satellite images that are available at a cost fall within the visible-NIR region. The development of sensors that provide high-quality spatial, temporal, and spectral information is ongoing [171,172]. Different new technologies have also been introduced to monitor crop phenology, such as sun-induced fluorescence and short-wave infrared for greenhouse gas monitoring. However, these sensors are highly affected by climatic conditions, such as clouds, sunlight, etc., and are only available at regional or global scales [173].
The limitations extend to image processing, segmentation, and classification as well. Different machine learning tools and architectures are available but have limitations. Some of the major limitations in image processing, classification, and segmentation are low-resolution images, low accuracy, the unavailability of large datasets to train the models, etc. Nguyen and Shah [174] found a large discrepancy in accuracy between their dataset and the PlantDisease dataset. The authors recommend a semi-supervised approach to classify disease, which will create more diverse images. Similarly, Arsenovic et al. [175] suggested an improvement in the decision-making process. In a nutshell, the process of automatic identification of plant diseases using UAVs and deep learning can be improved by choosing a high-quality image-capturing camera, appropriate sensors (RGB, multispectral, or hyperspectral) depending upon the purpose of the study, datasets large enough to accurately train the model, and an appropriate architecture for the deep learning model.
Apart from these challenges regarding image analysis and result interpretation during automatic detection of plant diseases using UAVs, challenges regarding their usage and application in the field also exist. The regulatory body controlling UAV application in the United States, the Federal Aviation Administration (FAA), has various regulations such as limits on height, operating areas, and zones. In 2018, the FAA applied legal clauses to the application of pesticides using UAVs, under which the operator must receive permission following three exemptions and a waiver process [176]. The privacy of the operating and surrounding areas is strictly addressed [177]. Regulations on UAV application and handling in the international context are driven by the International Civil Aviation Organization (ICAO). More details about the different authorities, protocols, and information about UAVs and their regulations in the global context are available in Stöcker et al. [178]. UAV technology is expanding and becoming cheaper over time but is still not very accessible to many smallholder growers. The addition of hyperspectral and multispectral sensors, which are of the utmost importance for monitoring plant health, to an already purchased UAV adds more than USD 10,000 to the cost [4,72]. Similarly, the affordability of trained manpower for a small-scale farmer is still unrealistic.

8. Future Considerations

The Unmanned Aerial Vehicle has been a boon to the automatic monitoring of crop status. Nonetheless, identifying the type of stress, whether biotic or abiotic, is still difficult. Researchers and scientists are working together with UAV system manufacturers to broaden the possibilities, but many challenges are limiting progress. The different platforms and types of sensors are discussed in the sections above; however, all of them have their own pros and cons. Lightweight UAVs with high-resolution cameras can capture better images, which helps in the proper detection of diseases and reduces the chance of error. Sensors also play a vital role in disease detection. Abdulridha, Ampatzidis, Kakarla, and Roberts [45] used a benchtop hyperspectral imaging system with a 23 mm lens having a spectral range of 380–1030 nm, 281 spectral channels, a 15.3° field of view, and a spectral resolution of 2.1 nm to detect powdery mildew in squash. The wider the spectral range, the better the differentiation of disease symptoms, which eventually helps to reduce error. The selection of sensors and their spectral range is guided by the nature of the disease. For example, Xu et al. [179] used NIR spectroscopy and found that the best wavelengths for disease monitoring in tomatoes were 1450 nm and 1900 nm. Similarly, Al-Ahmadi et al. [180] utilized the same technique in the range of 900–2400 nm to monitor charcoal rot (Macrophomina phaseolina) in soybean (Glycine max). Scientists are also working to develop hybrid UAVs that work as both fixed-wing and multirotor systems. The ALTI Transition is a system [181,182] that serves as both a fixed-wing and a multirotor system when and where needed. A combined platform consisting of multiple sensors, such as RGB, NIR, red edge, infrared, and many others, could appear in the future, which would decrease the payload and allow a variety of physiological parameters to be measured from the same sensor package [183]. Increased flight duration is the future of agricultural UAVs; with limited flight duration, it is challenging to determine the overall status of the crop. Interpreting results in the form of images rather than parametric values such as NDVI or other indices could make it easier for growers to learn and execute management practices, since most decisions regarding crop health are currently based on the values of vegetation indices. Recent developments in crop monitoring integrating UAVs and deep learning techniques offer concomitant crop counting, yield prediction, and crop disease and nutrient deficiency detection [184]. Nebiker, Lack, Abächerli, and Läderach [83] utilized low-weight multispectral UAV sensors to predict grain yield and diseases in rape and barley. However, most of these integrations of concomitant yield and disease monitoring are only prototypes and are not available for commercial purposes [185]. The universal reference spectrum developed by the American Society for Testing and Materials (ASTM) is called the G173 standard. The G173 standard is used by different software packages to derive vegetation indices such as NDVI, Green NDVI (GNDVI), the Soil-Adjusted Vegetation Index (SAVI), etc. However, the G173 standard assumes a sun-facing surface tilted at 37°, corresponding to the average latitude of the continental United States [186]. This motivates the development of site-specific irradiance systems that can provide a more precise description of vegetation indices and help to obtain more accurate results using the Simple Model of the Atmospheric Radiative Transfer of Sunshine (SMARTS).
SMARTS builds on the ASTM G173/G177 standards to provide more localized solar irradiance spectra [187]. Scientists from Israel and Italy are launching hyperspectral imaging sensors into Earth orbit under the Spaceborne Hyperspectral Applicative Land and Ocean Mission (SHALOM). SHALOM is expected to work in the field of environmental quality and to assist precision agriculture in Israel and Italy [9,188]. Similarly, the Fluorescence Explorer (FLEX) is a satellite to be launched by the European Space Agency (ESA) in 2022 [189,190]. FLEX comprises three instrument arrays measuring fluorescence, hyperspectral reflectance, and canopy temperature. This satellite is expected to observe the vegetative fluorescence of crops at the global level [189,191,192]. Further emphasis should be given to chemometric or spectral decomposition for derivative methods of analysis. Many researchers are creating their own datasets for training and validation; it has become important to develop agricultural datasets that will aid machine learning algorithms and help with accurate disease diagnosis [163]. In this review, we have seen that the future of automatic detection of plant diseases is a combination of agriculture with machine learning. The development of robotic arm facilities on UASs, which could retrieve samples and return them for confirmation in cases of uncertainty, would help better diagnose plant diseases. Thus, machine learning, agriculture, and UAVs can act together to extend the realm of food security by limiting food losses due to pests and diseases.

9. Conclusions

Rapid population growth and climate change are leading causes of food insecurity. Advances in UAVs and their systems for diagnosing crop stress, pests, and diseases have greatly benefitted growers. Increasing farm productivity and lowering production costs with advanced technology are helping growers improve yields and sustainability on their farms. Automatic detection of plant diseases using UAVs has emerged as a novel precision agriculture technology. UAVs are accurate and provide large amounts of data regarding crop status, which aids in making management decisions. However, there is still immense room for improvement in plant disease diagnosis. As discussed in the future considerations section, further development of machine learning algorithms and collaboration with other STEM disciplines will help reach this milestone.

Author Contributions

Writing—original draft preparation, K.N.; writing—review and editing, F.B.-G.; funding acquisition, F.B.-G. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the National Institute of Food and Agriculture, United States Department of Agriculture Capacity Building grant, under award number 2019-38821-29062.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Barbedo, J.G.A. A review on the use of unmanned aerial vehicles and imaging sensors for monitoring and assessing plant stresses. Drones 2019, 3, 40. [Google Scholar] [CrossRef] [Green Version]
  2. Barbedo, J.G.A.; Koenigkan, L.V. Perspectives on the use of unmanned aerial systems to monitor cattle. Outlook Agric. 2018, 47, 214–222. [Google Scholar] [CrossRef] [Green Version]
  3. Beloev, I.H. A review on current and emerging application possibilities for unmanned aerial vehicles. Acta Technol. Agric. 2016, 19, 70–76. [Google Scholar] [CrossRef] [Green Version]
  4. Hassler, S.C.; Baysal-Gurel, F. Unmanned aircraft system (UAS) technology and applications in agriculture. Agronomy 2019, 9, 618. [Google Scholar] [CrossRef] [Green Version]
  5. Hunt, E.R.; Daughtry, C.S.T.; Mirsky, S.B.; Hively, W.D. Remote sensing with simulated unmanned aircraft imagery for precision agriculture applications. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2014, 7, 4566–4571. [Google Scholar] [CrossRef]
  6. Shi, Y.; Thomasson, J.A.; Murray, S.C.; Pugh, N.A.; Rooney, W.L.; Shafian, S.; Rajan, N.; Rouze, G.; Morgan, C.L.; Neely, H.L.; et al. Unmanned aerial vehicles for high-throughput phenotyping and agronomic research. PLoS ONE 2016, 11, e0159781. [Google Scholar] [CrossRef] [Green Version]
  7. Zhang, C.; Kovacs, J.M. The application of small unmanned aerial systems for precision agriculture: A review. Precis. Agric. 2012, 13, 693–712. [Google Scholar] [CrossRef]
  8. Barbedo, J.G.A. Factors influencing the use of deep learning for plant disease recognition. Biosyst. Eng. 2018, 172, 84–91. [Google Scholar] [CrossRef]
  9. Singh, P.; Pandey, P.C.; Petropoulos, G.P.; Pavlides, A.; Srivastava, P.K.; Koutsias, N.; Deng, K.A.K.; Bao, Y. Hyperspectral remote sensing in precision agriculture: Present status, challenges, and future trends. In Hyperspectral Remote Sensing; Elsevier: Amsterdam, The Netherlands, 2020; pp. 121–146. [Google Scholar]
  10. Hashimoto, N.; Saito, Y.; Maki, M.; Homma, K. Simulation of reflectance and vegetation indices for unmanned aerial vehicle (UAV) monitoring of paddy fields. Remote Sens. 2019, 11, 2119. [Google Scholar] [CrossRef] [Green Version]
  11. Oliveira, H.C.; Guizilini, V.C.; Nunes, I.P.; Souza, J.R. Failure detection in row crops from UAV images using morphological operators. IEEE Geosci. Remote Sens. Lett. 2018, 15, 991–995. [Google Scholar] [CrossRef]
  12. Murugan, D.; Garg, A.; Singh, D. Development of an adaptive approach for precision agriculture monitoring with drone and satellite data. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2017, 10, 5322–5328. [Google Scholar] [CrossRef]
  13. Duan, S.-B.; Li, Z.-L.; Wu, H.; Tang, B.-H.; Ma, L.; Zhao, E.; Li, C. Inversion of the PROSAIL model to estimate leaf area index of maize, potato, and sunflower fields from unmanned aerial vehicle hyperspectral data. Int. J. Appl. Earth Obs. Geoinf. 2014, 26, 12–20. [Google Scholar] [CrossRef]
  14. Verger, A.; Vigneau, N.; Chéron, C.; Gilliot, J.-M.; Comar, A.; Baret, F. Green area index from an unmanned aerial system over wheat and rapeseed crops. Remote Sens. Environ. 2014, 152, 654–664. [Google Scholar] [CrossRef]
  15. Rasmussen, J.; Nielsen, J.; Garcia-Ruiz, F.; Christensen, S.; Streibig, J. Potential uses of small unmanned aircraft systems (UAS) in weed research. Weed Res. 2013, 53, 242–248. [Google Scholar] [CrossRef]
  16. Sandler, H.A. Weed management in cranberries: A historical perspective and a look to the future. Agriculture 2018, 8, 138. [Google Scholar] [CrossRef] [Green Version]
  17. Abdu, A.M.; Mokji, M.M.; Sheikh, U.U. Automatic vegetable disease identification approach using individual lesion features. Comput. Electron. Agric. 2020, 176, 105660. [Google Scholar] [CrossRef]
  18. She, Y.; Ehsani, R.; Robbins, J.; Nahún Leiva, J.; Owen, J. Applications of high-resolution imaging for open field container nursery counting. Remote Sens. 2018, 10, 2018. [Google Scholar] [CrossRef] [Green Version]
  19. Zortea, M.; Macedo, M.M.; Mattos, A.B.; Ruga, B.C.; Gemignani, B.H. Automatic citrus tree detection from UAV images based on convolutional neural networks. In Proceedings of the 2018 31st SIBGRAPI Conference on Graphics, Patterns and Images (SIBGRAPI), Paraná, Brazil, 29 October–1 November 2018. [Google Scholar]
  20. Yanliang, Z.; Qi, L.; Wei, Z. Design and test of a six-rotor Unmanned Aerial Vehicle (UAV) electrostatic spraying system for crop protection. Int. J. Agric. Biol. Eng. 2017, 10, 68–76. [Google Scholar] [CrossRef]
  21. Mulla, D.J. Twenty five years of remote sensing in precision agriculture: Key advances and remaining knowledge gaps. Biosyst. Eng. 2013, 114, 358–371. [Google Scholar] [CrossRef]
  22. Gabriel, J.L.; Zarco-Tejada, P.J.; López-Herrera, P.J.; Pérez-Martín, E.; Alonso-Ayuso, M.; Quemada, M. Airborne and ground level sensors for monitoring nitrogen status in a maize crop. Biosyst. Eng. 2017, 160, 124–133. [Google Scholar] [CrossRef]
  23. Chen, Y.; Stark, B.; Kelly, M.; Hogan, S.D. Unmanned aerial systems for agriculture and natural resources. Calif. Agric. 2017, 71, 5–14. [Google Scholar] [CrossRef] [Green Version]
  24. Anderson, K.; Gaston, K.J. Lightweight unmanned aerial vehicles will revolutionize spatial ecology. Front. Ecol. Environ. 2013, 11, 138–146. [Google Scholar] [CrossRef] [Green Version]
  25. Kim, J.; Kim, S.; Ju, C.; Son, H.I. Unmanned aerial vehicles in agriculture: A review of perspective of platform, control, and applications. IEEE Access 2019, 7, 105100–105115. [Google Scholar] [CrossRef]
  26. Castelao Tetila, E.; Brandoli Machado, B.; Belete, N.A.d.S.; Guimaraes, D.A.; Pistori, H. Identification of soybean foliar diseases using unmanned aerial vehicle images. IEEE Geosci. Remote Sens. Lett. 2017, 14, 2190–2194. [Google Scholar] [CrossRef]
  27. Tetila, E.C.; Machado, B.B.; Menezes, G.K.; Da Silva Oliveira, A.; Alvarez, M.; Amorim, W.P.; De Souza Belete, N.A.; Da Silva, G.G.; Pistori, H. Automatic recognition of soybean leaf diseases using UAV images and deep convolutional neural networks. IEEE Geosci. Remote Sens. Lett. 2020, 17, 903–907. [Google Scholar] [CrossRef]
  28. Van Evert, F.K.; Gaitán-Cremaschi, D.; Fountas, S.; Kempenaar, C. Can precision agriculture increase the profitability and sustainability of the production of potatoes and olives? Sustainability 2017, 9, 1863. [Google Scholar] [CrossRef] [Green Version]
  29. Lee, B.; Park, P.; Kim, C.; Yang, S.; Ahn, S. Power managements of a hybrid electric propulsion system for UAVs. J. Mech. Sci. Technol. 2012, 26, 2291–2299. [Google Scholar] [CrossRef]
  30. von Bueren, S.K.; Burkart, A.; Hueni, A.; Rascher, U.; Tuohy, M.P.; Yule, I.J. Deploying four optical UAV-based sensors over grassland: Challenges and limitations. Biogeosciences 2015, 12, 163–175. [Google Scholar] [CrossRef] [Green Version]
  31. Hardin, P.J.; Hardin, T.J. Small-scale remotely piloted vehicles in environmental research. Geogr. Compass 2010, 4, 1297–1311. [Google Scholar] [CrossRef]
  32. Chang, C.Y.; Zhou, R.; Kira, O.; Marri, S.; Skovira, J.; Gu, L.; Sun, Y. An Unmanned Aerial System (UAS) for concurrent measurements of solar-induced chlorophyll fluorescence and hyperspectral reflectance toward improving crop monitoring. Agric. For. Meteorol. 2020, 294, 108145. [Google Scholar] [CrossRef]
  33. Garcia-Ruiz, F.; Sankaran, S.; Maja, J.M.; Lee, W.S.; Rasmussen, J.; Ehsani, R. Comparison of two aerial imaging platforms for identification of Huanglongbing-infected citrus trees. Comput. Electron. Agric. 2013, 91, 106–115. [Google Scholar] [CrossRef]
  34. Wang, D.; Song, Q.; Liao, X.; Ye, H.; Shao, Q.; Fan, J.; Cong, N.; Xin, X.; Yue, H.; Zhang, H. Integrating satellite and Unmanned Aircraft System (UAS) imagery to model livestock population dynamics in the Longbao Wetland National Nature Reserve, China. Sci. Total Environ. 2020, 746, 140327. [Google Scholar] [CrossRef]
  35. Mrisho, L.M.; Mbilinyi, N.A.; Ndalahwa, M.; Ramcharan, A.M.; Kehs, A.K.; McCloskey, P.C.; Murithi, H.; Hughes, D.P.; Legg, J.P. Accuracy of a smartphone-based object detection model, PlantVillage Nuru, in identifying the foliar symptoms of the viral diseases of cassava–CMD and CBSD. Front. Plant Sci. 2020, 11, 1964. [Google Scholar] [CrossRef]
  36. Di Gennaro, S.F.; Battiston, E.; Di Marco, S.; Facini, O.; Matese, A.; Nocentini, M.; Palliotti, A.; Mugnai, L. Unmanned Aerial Vehicle (UAV)-based remote sensing to monitor grapevine leaf stripe disease within a vineyard affected by esca complex. Phytopathol. Mediterr. 2016, 55, 262–275. [Google Scholar]
  37. Pederi, Y.A.; Cheporniuk, H.S. Unmanned aerial vehicles and new technological methods of monitoring and crop protection in precision agriculture. In Proceedings of the 2015 IEEE 3rd International Conference Actual Problems of Unmanned Aerial Vehicles Developments (APUAVD), Kyiv, Ukraine, 13–15 October 2015; IEEE: Piscataway, NJ, USA, 2015; pp. 298–301. [Google Scholar]
  38. Zarco-Tejada, P.J.; Guillén-Climent, M.L.; Hernández-Clemente, R.; Catalina, A.; González, M.R.; Martín, P. Estimating leaf carotenoid content in vineyards using high resolution hyperspectral imagery acquired from an Unmanned Aerial Vehicle (UAV). Agric. For. Meteorol. 2013, 171, 281–294. [Google Scholar] [CrossRef] [Green Version]
  39. Gómez-Candón, D.; De Castro, A.I.; López-Granados, F. Assessing the accuracy of mosaics from Unmanned Aerial Vehicle (UAV) imagery for precision agriculture purposes in wheat. Precis. Agric. 2013, 15, 44–56. [Google Scholar] [CrossRef] [Green Version]
  40. Torres-Sanchez, J.; Lopez-Granados, F.; De Castro, A.I.; Pena-Barragan, J.M. Configuration and specifications of an Unmanned Aerial Vehicle (UAV) for early site specific weed management. PLoS ONE 2013, 8, e58210. [Google Scholar] [CrossRef] [Green Version]
  41. Torres-Sanchez, J.; Lopez-Granados, F.; Serrano, N.; Arquero, O.; Pena, J.M. High-throughput 3-D monitoring of agricultural-tree plantations with Unmanned Aerial Vehicle (UAV) technology. PLoS ONE 2015, 10, e0130479. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  42. Jannoura, R.; Brinkmann, K.; Uteau, D.; Bruns, C.; Joergensen, R.G. Monitoring of crop biomass using true colour aerial photographs taken from a remote controlled hexacopter. Biosyst. Eng. 2015, 129, 341–351. [Google Scholar] [CrossRef]
  43. Dai, B.; He, Y.; Gu, F.; Yang, L.; Han, J.; Xu, W. A vision-based autonomous aerial spray system for precision agriculture. In Proceedings of the IEEE International Conference on Robotics and Biomimetics, Macau, Macao, 5–8 December 2017. [Google Scholar]
  44. Xavier, T.W.F.; Souto, R.N.V.; Statella, T.; Galbieri, R.; Santos, E.S.; Suli, G.S.; Zeilhofer, P. Identification of Ramularia Leaf Blight cotton disease infection levels by multispectral, multiscale UAV imagery. Drones 2019, 3, 33. [Google Scholar] [CrossRef] [Green Version]
  45. Abdulridha, J.; Ampatzidis, Y.; Kakarla, S.C.; Roberts, P. Detection of target spot and bacterial spot diseases in tomato using UAV-based and benchtop-based hyperspectral imaging techniques. Precis. Agric. 2019, 21, 955–978. [Google Scholar] [CrossRef]
  46. Gomez Selvaraj, M.; Vergara, A.; Montenegro, F.; Alonso Ruiz, H.; Safari, N.; Raymaekers, D.; Ocimati, W.; Ntamwira, J.; Tits, L.; Omondi, A.B.; et al. Detection of banana plants and their major diseases through aerial images and machine learning methods: A case study in DR Congo and Republic of Benin. J. Photogramm. Remote Sens. 2020, 169, 110–124. [Google Scholar] [CrossRef]
  47. Schoofs, H.; Delalieux, S.; Deckers, T.; Bylemans, D. Fire Blight monitoring in pear orchards by Unmanned Airborne Vehicles (UAV) systems carrying spectral sensors. Agronomy 2020, 10, 615. [Google Scholar] [CrossRef]
  48. Su, J.; Liu, C.; Hu, X.; Xu, X.; Guo, L.; Chen, W.-H. Spatio-temporal monitoring of wheat yellow rust using UAV multispectral imagery. Comput. Electron. Agric. 2019, 167, 105035. [Google Scholar] [CrossRef]
  49. Berni, J.; Zarco-Tejada, P.J.; Suarez, L.; Fereres, E. Thermal and narrowband multispectral remote sensing for vegetation monitoring from an unmanned aerial vehicle. IEEE Trans. Geosci. Remote Sens. 2009, 47, 722–738. [Google Scholar] [CrossRef] [Green Version]
  50. Sarkar, S.K.; Das, J.; Ehsani, R.; Kumar, V. Towards autonomous phytopathology: Outcomes and challenges of citrus greening disease detection through close-range remote sensing. In Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–20 May 2016. [Google Scholar]
  51. Özgüven, M.M. Determination of sugar beet Leaf Spot disease level (Cercospora beticola Sacc.) with image processing technique by using drone. Curr. Investig. Agric. Curr. Res. 2018, 5, 621–631. [Google Scholar] [CrossRef]
  52. Valasek, J.; Thomasson, J.A.; Balota, M.; Oakes, J. Exploratory use of a UAV platform for variety selection in peanut. In Proceedings of the Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping, Baltimore, Maryland, 18–19 April 2016. 98660F. [Google Scholar] [CrossRef]
  53. Sugiura, R.; Tsuda, S.; Tamiya, S.; Itoh, A.; Nishiwaki, K.; Murakami, N.; Shibuya, Y.; Hirafuji, M.; Nuske, S. Field phenotyping system for the assessment of potato late blight resistance using RGB imagery from an unmanned aerial vehicle. Biosyst. Eng. 2016, 148, 1–10. [Google Scholar] [CrossRef]
  54. Rahman, M.F.F.; Fan, S.; Zhang, Y.; Chen, L. A comparative study on application of unmanned aerial vehicle systems in agriculture. Agriculture 2021, 11, 22. [Google Scholar] [CrossRef]
  55. Ludovisi, R.; Tauro, F.; Salvati, R.; Khoury, S.; Mugnozza Scarascia, G.; Harfouche, A. UAV-based thermal imaging for high-throughput field phenotyping of black poplar response to drought. Front. Plant Sci. 2017, 8, 1681. [Google Scholar] [CrossRef]
  56. Zhou, J.; Zhou, J.; Ye, H.; Ali, M.L.; Nguyen, H.T.; Chen, P. Classification of soybean leaf wilting due to drought stress using UAV-based imagery. Comput. Electron. Agric. 2020, 175, 105576. [Google Scholar] [CrossRef]
  57. Maes, W.H.; Steppe, K. Perspectives for remote sensing with unmanned aerial vehicles in precision agriculture. Trends Plant Sci. 2019, 24, 152–164. [Google Scholar] [CrossRef]
  58. Surový, P.; Almeida Ribeiro, N.; Panagiotidis, D. Estimation of positions and heights from UAV-sensed imagery in tree plantations in agrosilvopastoral systems. Int. J. Remote Sens. 2018, 39, 4786–4800. [Google Scholar] [CrossRef]
  59. Chang, A.; Jung, J.; Maeda, M.M.; Landivar, J. Crop height monitoring with digital imagery from Unmanned Aerial System (UAS). Comput. Electron. Agric. 2017, 141, 232–237. [Google Scholar] [CrossRef]
  60. Torres-Sánchez, J.; de Castro, A.I.; Peña, J.M.; Jiménez-Brenes, F.M.; Arquero, O.; Lovera, M.; López-Granados, F. Mapping the 3D structure of almond trees using UAV acquired photogrammetric point clouds and object-based image analysis. Biosyst. Eng. 2018, 176, 172–184. [Google Scholar] [CrossRef]
  61. Grüner, E.; Astor, T.; Wachendorf, M. Biomass prediction of heterogeneous temperate grasslands using an SfM approach based on UAV imaging. Agronomy 2019, 9, 54. [Google Scholar] [CrossRef] [Green Version]
  62. Roth, L.; Streit, B. Predicting cover crop biomass by lightweight UAS-based RGB and NIR photography: An applied photogrammetric approach. Precis. Agric. 2017, 19, 93–114. [Google Scholar] [CrossRef] [Green Version]
  63. Viljanen, N.; Honkavaara, E.; Näsi, R.; Hakala, T.; Niemeläinen, O.; Kaivosoja, J. A novel machine learning method for estimating biomass of grass swards using a photogrammetric canopy height model, images and vegetation indices captured by a drone. Agriculture 2018, 8, 70. [Google Scholar] [CrossRef] [Green Version]
  64. Berra, E.F.; Gaulton, R.; Barr, S. Commercial off-the-shelf digital cameras on unmanned aerial vehicles for multitemporal monitoring of vegetation reflectance and NDVI. IEEE Trans. Geosci. Remote Sens. 2017, 55, 4878–4886. [Google Scholar] [CrossRef] [Green Version]
  65. Nijland, W.; De Jong, R.; De Jong, S.M.; Wulder, M.A.; Bater, C.W.; Coops, N.C. Monitoring plant condition and phenology using infrared sensitive consumer grade digital cameras. Agric. For. Meteorol. 2014, 184, 98–106. [Google Scholar] [CrossRef] [Green Version]
  66. Bock, C.H.; Barbedo, J.G.; Del Ponte, E.M.; Bohnenkamp, D.; Mahlein, A.-K. From visual estimates to fully automated sensor-based measurements of plant disease severity: Status and challenges for improving accuracy. Phytopathol. Res. 2020, 2, 1–30. [Google Scholar] [CrossRef] [Green Version]
  67. Mattupalli, C.; Moffet, C.; Shah, K.; Young, C. Supervised classification of RGB aerial imagery to evaluate the impact of a root rot disease. Remote Sens. 2018, 10, 917. [Google Scholar] [CrossRef] [Green Version]
  68. Kerkech, M.; Hafiane, A.; Canals, R. Deep leaning approach with colorimetric spaces and vegetation indices for vine diseases detection in UAV images. Comput. Electron. Agric. 2018, 155, 237–243. [Google Scholar] [CrossRef]
  69. Ashourloo, D.; Mobasheri, M.R.; Huete, A. Developing two spectral disease indices for detection of wheat leaf rust (Pucciniatriticina). Remote Sens. 2014, 6, 4723–4740. [Google Scholar] [CrossRef] [Green Version]
  70. Zhang, D.; Zhou, X.; Zhang, J.; Lan, Y.; Xu, C.; Liang, D. Detection of rice sheath blight using an unmanned aerial system with high-resolution color and multispectral imaging. PLoS ONE 2018, 13, e0187470. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  71. Nhamo, L.; Ebrahim, G.Y.; Mabhaudhi, T.; Mpandeli, S.; Magombeyi, M.; Chitakira, M.; Magidi, J.; Sibanda, M. An assessment of groundwater use in irrigated agriculture using multi-spectral remote sensing. Phys. Chem. Earth Parts A/B/C 2020, 115, 102810. [Google Scholar] [CrossRef]
  72. Adão, T.; Hruška, J.; Pádua, L.; Bessa, J.; Peres, E.; Morais, R.; Sousa, J. Hyperspectral Imaging: A review on UAV-based sensors, data processing and applications for agriculture and forestry. Remote Sens. 2017, 9, 1110. [Google Scholar] [CrossRef] [Green Version]
  73. Geipel, J.; Link, J.; Wirwahn, J.; Claupein, W. A programmable aerial multispectral camera system for in-season crop biomass and nitrogen content estimation. Agriculture 2016, 6, 4. [Google Scholar] [CrossRef] [Green Version]
  74. Iqbal, F.; Lucieer, A.; Barry, K. Simplified radiometric calibration for UAS-mounted multispectral sensor. Eur. J. Remote Sens. 2018, 51, 301–313. [Google Scholar] [CrossRef]
  75. Deng, L.; Mao, Z.; Li, X.; Hu, Z.; Duan, F.; Yan, Y. UAV-based multispectral remote sensing for precision agriculture: A comparison between different cameras. J. Photogramm. Remote Sens. 2018, 146, 124–136. [Google Scholar] [CrossRef]
  76. Zaman-Allah, M.; Vergara, O.; Araus, J.L.; Tarekegne, A.; Magorokosho, C.; Zarco-Tejada, P.J.; Hornero, A.; Alba, A.H.; Das, B.; Craufurd, P.; et al. Unmanned aerial platform-based multi-spectral imaging for field phenotyping of maize. Plant Methods 2015, 11, 35. [Google Scholar] [CrossRef] [Green Version]
  77. Kalischuk, M.; Paret, M.L.; Freeman, J.H.; Raj, D.; Da Silva, S.; Eubanks, S.; Wiggins, D.J.; Lollar, M.; Marois, J.J.; Mellinger, H.C.; et al. An improved crop scouting technique incorporating unmanned aerial vehicle-assisted multispectral crop imaging into conventional scouting practice for gummy stem blight in watermelon. Plant Dis. 2019, 103, 1642–1650. [Google Scholar] [CrossRef]
  78. Al-Saddik, H.; Simon, J.C.; Brousse, O.; Cointault, F. Multispectral band selection for imaging sensor design for vineyard disease detection: Case of Flavescence dorée. Adv. Anim. Biosci. 2017, 8, 150–155. [Google Scholar] [CrossRef] [Green Version]
  79. Albetis, J.; Jacquin, A.; Goulard, M.; Poilvé, H.; Rousseau, J.; Clenet, H.; Dedieu, G.; Duthoit, S. On the potentiality of UAV multispectral imagery to detect Flavescence dorée and grapevine trunk diseases. Remote Sens. 2018, 11, 23. [Google Scholar] [CrossRef] [Green Version]
  80. Calderón, R.; Montes-Borrego, M.; Landa, B.B.; Navas-Cortés, J.A.; Zarco-Tejada, P.J. Detection of downy mildew of opium poppy using high-resolution multi-spectral and thermal imagery acquired with an unmanned aerial vehicle. Precis. Agric. 2014, 15, 639–661. [Google Scholar] [CrossRef]
  81. Dash, J.; Pearse, G.; Watt, M. UAV multispectral imagery can complement satellite data for monitoring forest health. Remote Sens. 2018, 10, 1216. [Google Scholar] [CrossRef] [Green Version]
  82. Khot, L.R.; Sankaran, S.; Carter, A.H.; Johnson, D.A.; Cummings, T.F. UAS imaging-based decision tools for arid winter wheat and irrigated potato production management. Int. J. Remote Sens. 2015, 37, 125–137. [Google Scholar] [CrossRef]
  83. Nebiker, S.; Lack, N.; Abächerli, M.; Läderach, S. Light-weight multispectral UAV sensors and their capabilities for predicting grain yield and detecting plant diseases. ISPRS -Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, 41, 963–970. [Google Scholar] [CrossRef] [Green Version]
  84. Kerkech, M.; Hafiane, A.; Canals, R. Vine disease detection in UAV multispectral images using optimized image registration and deep learning segmentation approach. Comput. Electron. Agric. 2020, 174, 105446. [Google Scholar] [CrossRef]
  85. Gallo, R.; Ristorto, G.; Daglio, G.; Berta, G.; Lazzari, M.; Mazzetto, F. New solutions for the automatic early detection of diseases in vineyards through ground sensing approaches integrating LiDAR and optical sensors. Chem. Eng. Trans. 2017, 58, 673–678. [Google Scholar]
  86. Qin, J.; Wang, B.; Wu, Y.; Lu, Q.; Zhu, H. Identifying pine wood nematode disease using UAV images and deep learning algorithms. Remote Sens. 2021, 13, 162. [Google Scholar] [CrossRef]
  87. Ye, H.; Huang, W.; Huang, S.; Cui, B.; Dong, Y.; Guo, A.; Ren, Y.; Jin, Y. Recognition of banana fusarium wilt based on UAV remote sensing. Remote Sens. 2020, 12, 938. [Google Scholar] [CrossRef] [Green Version]
  88. Lowe, A.; Harrison, N.; French, A.P. Hyperspectral image analysis techniques for the detection and classification of the early onset of plant disease and stress. Plant Methods 2017, 13, 80. [Google Scholar] [CrossRef]
  89. Cilia, C.; Panigada, C.; Rossini, M.; Meroni, M.; Busetto, L.; Amaducci, S.; Boschetti, M.; Picchi, V.; Colombo, R. Nitrogen status assessment for variable rate fertilization in maize through hyperspectral imagery. Remote Sens. 2014, 6, 6549–6565. [Google Scholar] [CrossRef] [Green Version]
  90. Gevaert, C.M.; Suomalainen, J.; Tang, J.; Kooistra, L. Generation of spectral–temporal response surfaces by combining multispectral satellite and hyperspectral UAV imagery for precision agriculture applications. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2015, 8, 3140–3146. [Google Scholar] [CrossRef]
  91. Proctor, C.; He, Y. Workflow for building a hyperspectral UAV: Challenges and opportunities. ISPRS-Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2015, 40, 415–419. [Google Scholar] [CrossRef] [Green Version]
  92. Deery, D.; Jimenez-Berni, J.; Jones, H.; Sirault, X.; Furbank, R. Proximal remote sensing buggies and potential applications for field-based phenotyping. Agronomy 2014, 4, 349–379. [Google Scholar] [CrossRef] [Green Version]
  93. Honkavaara, E.; Hakala, T.; Markelin, L.; Jaakkola, A.; Saari, H.; Ojanen, H.; Pölönen, I.; Tuominen, S.; Näsi, R.; Rosnell, T.; et al. Autonomous hyperspectral UAS photogrammetry for environmental monitoring applications. ISPRS-Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2014, 40, 155–159. [Google Scholar] [CrossRef] [Green Version]
  94. Honkavaara, E.; Saari, H.; Kaivosoja, J.; Pölönen, I.; Hakala, T.; Litkey, P.; Mäkynen, J.; Pesonen, L. Processing and assessment of spectrometric, stereoscopic imagery collected using a lightweight UAV spectral camera for precision agriculture. Remote Sens. 2013, 5, 5006–5039. [Google Scholar] [CrossRef] [Green Version]
  95. Saari, H.; Akujärvi, A.; Holmlund, C.; Ojanen, H.; Kaivosoja, J.; Nissinen, A.; Niemeläinen, O. Visible, very near IR and short wave IR hyperspectral drone imaging system for agriculture and natural water applications. ISPRS-Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, 42, 165–170. [Google Scholar] [CrossRef] [Green Version]
  96. Tack, N.; Lambrechts, A.; Soussan, P.; Haspeslagh, L. A compact, high-speed, and low-cost hyperspectral imager. In Proceedings of the Silicon Photonics VII, 8266, San Francisco, CA, USA, 21–26 January 2012. [Google Scholar]
  97. Sima, A.A.; Baeck, P.; Nuyts, D.; Delalieux, S.; Livens, S.; Blommaert, J.; Delauré, B.; Boonen, M. Compact hyperspectral imaging system (COSI) for Small Remotely Piloted Aircraft Systems (RPAS) – System overview and first performance evaluation results. ISPRS-Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, 41, 1157–1164. [Google Scholar] [CrossRef] [Green Version]
  98. Calderón, R.; Navas-Cortés, J.A.; Lucena, C.; Zarco-Tejada, P.J. High-resolution airborne hyperspectral and thermal imagery for early detection of Verticillium wilt of olive using fluorescence, temperature and narrow-band spectral indices. Remote Sens. Environ. 2013, 139, 231–245. [Google Scholar] [CrossRef]
  99. Sandino, J.; Pegg, G.; Gonzalez, F.; Smith, G. Aerial mapping of forests affected by pathogens using UAVs, hyperspectral sensors, and artificial intelligence. Sensors 2018, 18, 944. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  100. Calderón, R.; Navas-Cortés, J.; Lucena, C.; Zarco-Tejada, P. High-resolution hyperspectral and thermal imagery acquired from UAV platforms for early detection of Verticillium wilt using fluorescence, temperature and narrow-band indices. In Proceedings of the Workshop on UAV-basaed Remote Sensing Methods for Monitoring Vegetation, Cologne, Germany, 11–12 September 2013; p. 9. [Google Scholar]
  101. Thomas, S.; Kuska, M.T.; Bohnenkamp, D.; Brugger, A.; Alisaac, E.; Wahabzada, M.; Behmann, J.; Mahlein, A.-K. Benefits of hyperspectral imaging for plant disease detection and plant protection: A technical perspective. J. Plant Dis. Prot. 2018, 125, 5–20. [Google Scholar] [CrossRef]
  102. Costa, J.M.; Grant, O.M.; Chaves, M.M. Thermography to explore plant-environment interactions. J. Exp. Bot. 2013, 64, 3937–3949. [Google Scholar] [CrossRef] [PubMed]
  103. Mahajan, U.; Bundel, B.R. Drones for Normalized Difference Vegetation Index (NDVI), to estimate crop health for precision agriculture: A cheaper alternative for spatial satellite sensors. In International Conference on Innovative Research in Agriculture, Food Science, Forestry, Horticulture, Aquaculture, Animal Sciences, Biodiversity, Ecological Sciences and Climate Change; Krishi Sanskriti Publications: New Delhi, India, 2016. [Google Scholar]
  104. Gago, J.; Douthe, C.; Coopman, R.; Gallego, P.; Ribas-Carbo, M.; Flexas, J.; Escalona, J.; Medrano, H. UAVs challenge to assess water stress for sustainable agriculture. Agric. Water Manag. 2015, 153, 9–19. [Google Scholar] [CrossRef]
  105. Granum, E.; Pérez-Bueno, M.L.; Calderón, C.E.; Ramos, C.; de Vicente, A.; Cazorla, F.M.; Barón, M. Metabolic responses of avocado plants to stress induced by Rosellinia necatrix analysed by fluorescence and thermal imaging. Eur. J. Plant Pathol. 2015, 142, 625–632. [Google Scholar] [CrossRef]
  106. Smigaj, M.; Gaulton, R.; Barr, S.L.; Suárez, J.C. UAV-borne thermal imaging for forest health monitoring: Detection of disease-induced canopy temperature increase. ISPRS-Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2015, 40, 349–354. [Google Scholar] [CrossRef] [Green Version]
  107. Mahlein, A.-K.; Oerke, E.-C.; Steiner, U.; Dehne, H.-W. Recent advances in sensing plant diseases for precision crop protection. Eur. J. Plant Pathol. 2012, 133, 197–209. [Google Scholar] [CrossRef]
  108. Raza, S.-e.-A.; Prince, G.; Clarkson, J.P.; Rajpoot, N.M. Automatic detection of diseased tomato plants using thermal and stereo visible light images. PLoS ONE 2015, 10, e0123262. [Google Scholar]
  109. Baranowski, P.; Jedryczka, M.; Mazurek, W.; Babula-Skowronska, D.; Siedliska, A.; Kaczmarek, J. Hyperspectral and thermal imaging of oilseed rape (Brassica napus) response to fungal species of the genus Alternaria. PLoS ONE 2015, 10, e0122913. [Google Scholar] [CrossRef] [Green Version]
  110. López-López, M.; Calderón, R.; González-Dugo, V.; Zarco-Tejada, P.J.; Fereres, E. Early detection and quantification of almond red leaf blotch using high-resolution hyperspectral and thermal imagery. Remote Sens. 2016, 8, 276. [Google Scholar] [CrossRef] [Green Version]
  111. Sankaran, S.; Maja, J.M.; Buchanon, S.; Ehsani, R. Huanglongbing (citrus greening) detection using visible, near infrared and thermal imaging techniques. Sensors 2013, 13, 2117–2130. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  112. Xu, H.; Zhu, S.; Ying, Y.; Jiang, H. Early detection of plant disease using infrared thermal imaging. In Optics for Natural Resources, Agriculture, and Foods; International Society for Optics and Photonics: Bellingham, WA, USA, 2006; p. 638110. [Google Scholar]
  113. Wang, M.; Xiong, Y.; Ling, N.; Feng, X.; Zhong, Z.; Shen, Q.; Guo, S. Detection of the dynamic response of cucumber leaves to fusaric acid using thermal imaging. Plant Physiol. Biochem. 2013, 66, 68–76. [Google Scholar] [CrossRef] [PubMed]
  114. Anasta, N.; Setyawan, F.; Fitriawan, H. Disease detection in banana trees using an image processing-based thermal camera. In IOP Conference Series: Earth and Environmental Science; IOP Publishing: Bristol, UK, 2021; p. 012088. [Google Scholar]
  115. Yang, N.; Yuan, M.; Wang, P.; Zhang, R.; Sun, J.; Mao, H. Tea diseases detection based on fast infrared thermal image processing technology. J. Sci. Food Agric. 2019, 99, 3459–3466. [Google Scholar] [CrossRef] [PubMed]
  116. Vit, A.; Shani, G. Comparing RGB-D sensors for close range outdoor agricultural phenotyping. Sensors 2018, 18, 4413. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  117. Andujar, D.; Dorado, J.; Fernandez-Quintanilla, C.; Ribeiro, A. An approach to the use of depth cameras for weed volume estimation. Sensors 2016, 16, 972. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  118. Zollhöfer, M.; Stotko, P.; Görlitz, A.; Theobalt, C.; Nießner, M.; Klein, R.; Kolb, A. State of the art on 3D reconstruction with RGB-D cameras. Comput. Graph. Forum 2018, 37, 625–652. [Google Scholar] [CrossRef]
  119. Xia, C.; Wang, L.; Chung, B.-K.; Lee, J.-M. In situ 3D segmentation of individual plant leaves using a RGB-D camera for agricultural automation. Sensors 2015, 15, 20463–20479. [Google Scholar] [CrossRef]
  120. Mahlein, A.-K. Plant disease detection by imaging sensors–parallels and specific demands for precision agriculture and plant phenotyping. Plant Dis. 2016, 100, 241–251. [Google Scholar] [CrossRef] [Green Version]
  121. Paulus, S.; Dupuis, J.; Mahlein, A.-K.; Kuhlmann, H. Surface feature based classification of plant organs from 3D laserscanned point clouds for plant phenotyping. BMC Bioinform. 2013, 14, 1–12. [Google Scholar] [CrossRef] [Green Version]
  122. Singh, A.; Ganapathysubramanian, B.; Singh, A.K.; Sarkar, S. Machine learning for high-throughput stress phenotyping in plants. Trends Plant Sci. 2016, 21, 110–124. [Google Scholar] [CrossRef] [Green Version]
  123. Wallelign, S.; Polceanu, M.; Buche, C. Soybean plant disease identification using convolutional neural network. In Proceedings of the Thirty-First International Flairs Conference, Melbourne, FL, USA, 21–23 May 2018. [Google Scholar]
  124. Sonka, M.; Hlavac, V.; Boyle, R. Image pre-processing. In Image Processing, Analysis and Machine Vision; Springer: Berlin, Germany, 1993; pp. 56–111. [Google Scholar]
  125. Ghosal, S.; Blystone, D.; Singh, A.K.; Ganapathysubramanian, B.; Singh, A.; Sarkar, S. An explainable deep machine vision framework for plant stress phenotyping. Proc. Natl. Acad. Sci. USA 2018, 115, 4613–4618. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  126. Ferentinos, K.P. Deep learning models for plant disease detection and diagnosis. Comput. Electron. Agric. 2018, 145, 311–318. [Google Scholar] [CrossRef]
  127. Gu, L.; Robles-Kelly, A.A.; Zhou, J. Efficient estimation of reflectance parameters from imaging spectroscopy. IEEE Trans. Image Process. 2013, 22, 3648–3663. [Google Scholar]
  128. Habili, N.; Oorloff, J. Scyllarus ™: From research to commercial software. In Proceedings of the ASWEC 2015 24th Australasian Software Engineering Conference, New York, NY, USA, 28 September–1 October 2015; pp. 119–122. [Google Scholar]
  129. Choi, H.; Baraniuk, R. Analysis of wavelet-domain Wiener filters. In Proceedings of the IEEE-SP International Symposium on Time-Frequency and Time-Scale Analysis (Cat. No. 98TH8380), Philadelphia, PA, USA, 25–28 October 1994; pp. 613–616. [Google Scholar]
  130. Marroquin, J.L.; Girosi, F. Some extensions of the K-Means algorithm for image segmentation and pattern classification; Massachusetts Inst of Tech Cambridge Artificial Intelligence Lab: Cambridge, MA, USA, 1993. [Google Scholar]
  131. MacQueen, J. Some methods for classification and analysis of multivariate observations. In Proceedings of the Fifth Berkeley Symposium on Mathematical Statistics and Probability, Los Angeles, CA, USA, 27 December 1965–7 January 1966; pp. 281–297. [Google Scholar]
  132. Wagstaff, K.; Cardie, C.; Rogers, S.; Schrödl, S. Constrained k-means clustering with background knowledge. In Proceedings of the Eighteenth International Conference on Machine Learning, San Francisco, CA, USA, 28 June–1 July 2001; pp. 577–584. [Google Scholar]
  133. Picard, R.R.; Cook, R.D. Cross-validation of regression models. J. Am. Stat. Assoc. 1984, 79, 575–583. [Google Scholar] [CrossRef]
  134. Gupta, S.G.; Ghonge, D.; Jawandhiya, P.M. Review of Unmanned Aircraft System (UAS). Int. J. Adv. Res. Comput. Eng. Technol. 2013, 2, 1646–1658. [Google Scholar] [CrossRef]
  135. Marino, S.; Alvino, A. Detection of spatial and temporal variability of wheat cultivars by high-resolution vegetation indices. Agronomy 2019, 9, 226. [Google Scholar] [CrossRef] [Green Version]
  136. Ribeiro-Gomes, K.; Hernández-López, D.; Ortega, J.F.; Ballesteros, R.; Poblete, T.; Moreno, M.A. Uncooled thermal camera calibration and optimization of the photogrammetry process for UAV applications in agriculture. Sensors 2017, 17, 2173. [Google Scholar] [CrossRef]
  137. Xue, J.; Su, B. Significant remote sensing vegetation indices: A review of developments and applications. J. Sens. 2017, 2017, 1353691. [Google Scholar] [CrossRef] [Green Version]
  138. Woebbecke, D.M.; Meyer, G.E.; Von Bargen, K.; Mortensen, D.A. Color indices for weed identification under various soil, residue, and lighting conditions. Trans. ASAE 1995, 38, 259–269. [Google Scholar] [CrossRef]
  139. Patrick, A.; Pelham, S.; Culbreath, A.; Holbrook, C.C.; De Godoy, I.J.; Li, C. High throughput phenotyping of tomato spot wilt disease in peanuts using unmanned aerial systems and multispectral imaging. IEEE Instrum. Meas. Mag. 2017, 20, 4–12. [Google Scholar] [CrossRef]
  140. Albetis, J.; Duthoit, S.; Guttler, F.; Jacquin, A.; Goulard, M.; Poilvé, H.; Féret, J.-B.; Dedieu, G. Detection of Flavescence dorée grapevine disease using Unmanned Aerial Vehicle (UAV) multispectral imagery. Remote Sens. 2017, 9, 308. [Google Scholar] [CrossRef] [Green Version]
  141. McCulloch, W.S.; Pitts, W. A logical calculus of the ideas immanent in nervous activity. Bull. Math. Biophys. 1943, 5, 115–133. [Google Scholar] [CrossRef]
  142. Carranza-Rojas, J.; Goeau, H.; Bonnet, P.; Mata-Montero, E.; Joly, A. Going deeper in the automated identification of Herbarium specimens. BMC Evol. Biol. 2017, 17, 1–14. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  143. Yang, X.; Guo, T. Machine learning in plant disease research. Eur. J. Biomed. Res. 2017, 3, 6–9. [Google Scholar] [CrossRef] [Green Version]
  144. LeCun, Y.; Bottou, L.; Bengio, Y.; Haffner, P. Gradient-based learning applied to document recognition. Proc. IEEE 1998, 86, 2278–2324. [Google Scholar] [CrossRef] [Green Version]
  145. Zhang, K.; Wu, Q.; Liu, A.; Meng, X. Can deep learning identify tomato leaf disease? Adv. Multimed. 2018, 2018, 6710865. [Google Scholar] [CrossRef] [Green Version]
  146. Türkoğlu, M.; Hanbay, D. Plant disease and pest detection using deep learning-based features. Turk. J. Electr. Eng. Comput. Sci. 2019, 27, 1636–1651. [Google Scholar] [CrossRef]
  147. Krizhevsky, A. One weird trick for parallelizing convolutional neural networks. arXiv 2014, arXiv:1404.5997. [Google Scholar]
  148. Sermanet, P.; Eigen, D.; Zhang, X.; Mathieu, M.; Fergus, R.; LeCun, Y. Overfeat: Integrated recognition, localization and detection using convolutional networks. arXiv 2013, arXiv:1312.6229. [Google Scholar]
  149. Simonyan, K.; Zisserman, A. Very deep convolutional networks for large-scale image recognition. arXiv 2014, arXiv:1409.1556. [Google Scholar]
  150. Hughes, D.; Salathé, M. An open access repository of images on plant health to enable the development of mobile disease diagnostics. arXiv 2015, arXiv:1511.08060. [Google Scholar]
  151. Mohanty, S.P.; Hughes, D.P.; Salathé, M. Using deep learning for image-based plant disease detection. Front. Plant Sci. 2016, 7, 1419. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  152. Sibiya, M.; Sumbwanyambe, M. A computational procedure for the recognition and classification of maize leaf diseases out of healthy leaves using convolutional neural networks. AgriEngineering 2019, 1, 119–131. [Google Scholar] [CrossRef] [Green Version]
  153. Wang, T.; Thomasson, J.A.; Yang, C.; Isakeit, T.; Nichols, R.L. Automatic classification of cotton root rot disease based on UAV remote sensing. Remote Sens. 2020, 12, 1310. [Google Scholar] [CrossRef] [Green Version]
  154. Kerkech, M.; Hafiane, A.; Canals, R.; Ros, F. Vine disease detection by deep learning method combined with 3d depth information. In Proceedings of the International Conference on Image and Signal Processing, Marrakesh, Morocco, 4–6 June 2020; Springer: Cham, Germany, 2020; pp. 82–90. [Google Scholar]
  155. Gibson-Poole, S.; Humphris, S.; Toth, I.; Hamilton, A. Identification of the onset of disease within a potato crop using a UAV equipped with un-modified and modified commercial off-the-shelf digital cameras. Adv. Anim. Biosci. 2017, 8, 812–816. [Google Scholar] [CrossRef]
  156. Sugiura, R.; Tsuda, S.; Tsuji, H.; Murakami, N. Virus-infected plant detection in potato seed production field by UAV imagery. In Proceedings of the 2018 ASABE Annual International Meeting, Detroit, MI, USA, 29 July–1 August 2018; p. 1. [Google Scholar]
  157. Dang, L.M.; Hassan, S.I.; Suhyeon, I.; kumar Sangaiah, A.; Mehmood, I.; Rho, S.; Seo, S.; Moon, H. UAV based wilt detection system via convolutional neural networks. Sustain. Comput. Inform. Syst. 2018, 28, 100250. [Google Scholar] [CrossRef] [Green Version]
  158. Rangarajan, A.K.; Purushothaman, R.; Ramesh, A. Tomato crop disease classification using pre-trained deep learning algorithm. Procedia Comput. Sci. 2018, 133, 1040–1047. [Google Scholar] [CrossRef]
  159. Too, E.C.; Yujian, L.; Njuki, S.; Yingchun, L. A comparative study of fine-tuning deep learning models for plant disease identification. Comput. Electron. Agric. 2019, 161, 272–279. [Google Scholar] [CrossRef]
  160. Durmuş, H.; Güneş, E.O.; Kırcı, M. Disease detection on the leaves of the tomato plants by using deep learning. In Proceedings of the 2017 6th International Conference on Agro-Geoinformatics, Fairfax, VA, USA, 7–10 August 2017; pp. 1–5. [Google Scholar]
  161. Chen, Y.; Lin, Z.; Zhao, X.; Wang, G.; Gu, Y. Deep learning-based classification of hyperspectral data. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2014, 7, 2094–2107. [Google Scholar] [CrossRef]
  162. Grinblat, G.L.; Uzal, L.C.; Larese, M.G.; Granitto, P.M. Deep learning for plant identification using vein morphological patterns. Comput. Electron. Agric. 2016, 127, 418–424. [Google Scholar] [CrossRef] [Green Version]
  163. Kamilaris, A.; Prenafeta-Boldú, F.X. Deep learning in agriculture: A survey. Comput. Electron. Agric. 2018, 147, 70–90. [Google Scholar] [CrossRef] [Green Version]
  164. Lee, S.H.; Chan, C.S.; Wilkin, P.; Remagnino, P. Deep-plant: Plant identification with convolutional neural networks. In Proceedings of the 2015 IEEE international conference on image processing (ICIP), Quebec City, QC, Canada, 27–30 September 2015; pp. 452–456. [Google Scholar]
  165. Kussul, N.; Lavreniuk, M.; Skakun, S.; Shelestov, A. Deep learning classification of land cover and crop types using remote sensing data. IEEE Geosci. Remote Sens. Lett. 2017, 14, 778–782. [Google Scholar] [CrossRef]
  166. Song, X.; Zhang, G.; Liu, F.; Li, D.; Zhao, Y.; Yang, J. Modeling spatio-temporal distribution of soil moisture by deep learning-based cellular automata model. J. Arid Land 2016, 8, 734–748. [Google Scholar] [CrossRef] [Green Version]
  167. Zhu, N.; Liu, X.; Liu, Z.; Hu, K.; Wang, Y.; Tan, J.; Huang, M.; Zhu, Q.; Ji, X.; Jiang, Y. Deep learning for smart agriculture: Concepts, tools, applications, and opportunities. Int. J. Agric. Biol. Eng. 2018, 11, 32–44. [Google Scholar] [CrossRef]
  168. He, K.; Zhang, X.; Ren, S.; Sun, J. Identity mappings in deep residual networks. In Proceedings of the European Conference on Computer Vision, Amsterdam, The Netherlands, 8–16 October 2016; Springer: Cham, Germany, 2016; pp. 630–645. [Google Scholar]
  169. Yu, D.; Xiong, W.; Droppo, J.; Stolcke, A.; Ye, G.; Li, J.; Zweig, G. Deep convolutional neural networks with layer-wise context expansion and attention. In Proceedings of the 17th Annual Conference of the International Speech Communication Association, San Francisco, CA, USA, 8–12 September 2016; pp. 17–21. [Google Scholar]
  170. Barbedo, J.G.A. A review on the main challenges in automatic plant disease identification based on visible range images. Biosyst. Eng. 2016, 144, 52–60. [Google Scholar] [CrossRef]
  171. Khanal, S.; KC, K.; Fulton, J.P.; Shearer, S.; Ozkan, E. Remote sensing in agriculture—accomplishments, limitations, and opportunities. Remote Sens. 2020, 12, 3783. [Google Scholar] [CrossRef]
  172. Hulley, G.; Hook, S.; Fisher, J.; Lee, C. Ecostress, a Nasa Earth-Ventures Instrument for studying links between the water cycle and plant health over the diurnal cycle. In Proceedings of the 2017 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Fort Worth, TX, USA, 23–28 July 2017; pp. 5494–5496. [Google Scholar]
  173. Song, L.; Guanter, L.; Guan, K.; You, L.; Huete, A.; Ju, W.; Zhang, Y. Satellite sun-induced chlorophyll fluorescence detects early response of winter wheat to heat stress in the Indian Indo-Gangetic Plains. Glob. Chang. Biol. 2018, 24, 4023–4037. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  174. Nguyen, M.-T.; Shah, D. Improving Current Limitations of Deep Learning Based Plant Disease Identification; The Cooper Union: New York, NY, USA, 22 December 2019. [Google Scholar]
  175. Arsenovic, M.; Karanovic, M.; Sladojevic, S.; Anderla, A.; Stefanovic, D. Solving current limitations of deep learning based approaches for plant disease detection. Symmetry 2019, 11, 939. [Google Scholar] [CrossRef] [Green Version]
  176. Petty, R.V.; Chang, E.B.E. Drone use in aerial pesticide application faces outdated regulatory hurdles. Harvard J. Law Technol. Dig. 2018, 1–14. [Google Scholar]
  177. Stoica, A.-A. Emerging legal issues regarding civilian drone usage. Chall. Knowl. Soc. 2018, 692–699. [Google Scholar]
  178. Stöcker, C.; Bennett, R.; Nex, F.; Gerke, M.; Zevenbergen, J. Review of the current state of UAV regulations. Remote Sens. 2017, 9, 459. [Google Scholar] [CrossRef] [Green Version]
  179. Xu, H.; Ying, Y.; Fu, X.; Zhu, S. Near-infrared spectroscopy in detecting leaf miner damage on tomato leaf. Biosyst. Eng. 2007, 96, 447–454. [Google Scholar] [CrossRef]
  180. Al-Ahmadi, A.H.; Subedi, A.; Wang, G.; Choudhary, R.; Fakhoury, A.; Watson, D.G. Detection of charcoal rot (Macrophomina phaseolina) toxin effects in soybean (Glycine max) seedlings using hyperspectral spectroscopy. Comput. Electron. Agric. 2018, 150, 188–195. [Google Scholar] [CrossRef]
  181. Oosedo, A.; Abiko, S.; Konno, A.; Uchiyama, M. Optimal transition from hovering to level-flight of a quadrotor tail-sitter UAV. Auton. Robot. 2017, 41, 1143–1159. [Google Scholar] [CrossRef]
  182. Theys, B.; De Vos, G.; De Schutter, J. A control approach for transitioning VTOL UAVs with continuously varying transition angle and controlled by differential thrust. In Proceedings of the 2016 International Conference on Unmanned Aircraft Systems (ICUAS), Arlington, VA, USA, 7–10 June 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 118–125. [Google Scholar]
  183. Latif, M.A. An agricultural perspective on flying sensors: State of the art, challenges, and future directions. IEEE Geosci. Remote Sens. Mag. 2018, 6, 10–22. [Google Scholar] [CrossRef]
  184. Oghaz, M.M.D.; Razaak, M.; Kerdegari, H.; Argyriou, V.; Remagnino, P. Scene and environment monitoring using aerial imagery and deep learning. In Proceedings of the 2019 15th International Conference on Distributed Computing in Sensor Systems (DCOSS), Los Angeles, CA, USA, 29–31 May 2019; pp. 362–369. [Google Scholar]
  185. Boursianis, A.D.; Papadopoulou, M.S.; Diamantoulakis, P.; Liopa-Tsakalidi, A.; Barouchas, P.; Salahas, G.; Karagiannidis, G.; Wan, S.; Goudos, S.K. Internet of Things (IoT) and Agricultural Unmanned Aerial Vehicles (UAVs) in smart farming: A comprehensive review. Internet Things 2020, 100187. [Google Scholar] [CrossRef]
  186. Ernst, M.; Holst, H.; Winter, M.; Altermatt, P.P. SunCalculator: A program to calculate the angular and spectral distribution of direct and diffuse solar radiation. Sol. Energy Mater. Sol. Cells 2016, 157, 913–922. [Google Scholar] [CrossRef]
  187. Fernández, E.F.; Soria-Moya, A.; Almonacid, F.; Aguilera, J. Comparative assessment of the spectral impact on the energy yield of high concentrator and conventional photovoltaic technology. Sol. Energy Mater. Sol. Cells 2016, 147, 185–197. [Google Scholar] [CrossRef]
  188. Ben-Dor, E.; Schläpfer, D.; Plaza, A.J.; Malthus, T. Hyperspectral remote sensing. In Airborne Measurements for Environmental Research: Methods and Instruments; Wiley-VCH Verlag & Co. KGaA: Weinheim, Germany, 2013; Volume 413, p. 456. [Google Scholar]
  189. Colombo, R.; Celesti, M.; Bianchi, R.; Campbell, P.K.; Cogliati, S.; Cook, B.D.; Corp, L.A.; Damm, A.; Domec, J.C.; Guanter, L. Variability of sun-induced chlorophyll fluorescence according to stand age-related processes in a managed loblolly pine forest. Glob. Chang. Biol. 2018, 24, 2980–2996. [Google Scholar] [CrossRef] [Green Version]
  190. Middleton, E.M.; Rascher, U.; Huemmrich, K.F.; Cook, B.D.; Noormets, A.; Schickling, A.; Pinto, F.; Alonso, L.; Damm, A.; Guanter, L. The 2013 FLEX-US airborne campaign at the Parker Tract Loblolly Pine Plantation in North Carolina, USA. Remote Sens. 2017, 9, 612. [Google Scholar] [CrossRef] [Green Version]
  191. Bovensmann, H.; Bösch, H.; Brunner, D.; Ciais, P.; Crisp, D.; Dolman, H.; Hayman, G.; Houweling, S.; Lichtenberg, L. Report for Mission Selection: CarbonSat-An Earth Explorer to Observe Greenhouse Gases; European Space Agency: Noordwijk, The Netherlands, 2015. [Google Scholar]
  192. Pandley, P.; Manevski, K.; Srivastava, P.K.; Petropoulos, G.P. The use of hyperspectral earth observation data for land use/cover classification: Present status, challenges and future outlook. In Hyperspectral Remote Sensing of Vegetation, 1st ed.; Thenkabail, P., Ed.; CRC Press: Boca Raton, FL, USA, 2018; pp. 147–173. [Google Scholar]
Figure 1. DJI Mavic Air UAV with single NDVI camera over the crapemyrtle (Lagerstroemia spp.) field at Tennessee State University, Otis L. Floyd Nursery Research Center, McMinnville, TN for automatic detection of drought stress in the plants.
Table 1. Different deep learning CNN models used for the automatic identification of diseases in plants.

| References | Crop | Disease 1 | Accuracy 1,2 | Architectures |
| --- | --- | --- | --- | --- |
| [150,151] | Apple, Blueberry, Cherry, Corn, Grape, Orange, Peach, Bell Pepper, Potato, Raspberry, Soybean, Squash, Strawberry, and Tomato | Cedar-apple rust, apple scab, apple black rot, apple powdery mildew, gray leaf spot, corn common rust, northern corn leaf blight, grape black rot, esca complex, Pseudocercospora leaf spot, HLB, bacterial spot of peach and bell pepper, early and late blight of potato, squash powdery mildew, strawberry leaf scorch, early and late blight of tomato, target leaf spot, leaf mold, bacterial leaf spot, TMV, and yellow leaf curl virus of tomato | AlexNet (85.53%), GoogLeNet (99.34%) | AlexNet and GoogLeNet |
| [50] | Citrus | HLB | 93.3% | SVM |
| [33] | Citrus | HLB | SVM with kernel (85%) | SVM with kernel, SVM, LDA, and QDA |
| [152] | Corn | Corn leaf blight, common rust, and grey leaf spot | 92.85% | CNN, Neuroph Studio |
| [44] | Cotton | Ramularia blight | 79% | MLR, MLRb, SVM, RFT |
| [153] | Cotton | Cotton root rot | KMSEG (88.5%), KMSVM (87.77%) | KMSEG and KMSVM |
| [36] | Grape | Leaf stripe disease | - | ANOVA |
| [154] | Grape | Vine diseases | SegNet corrected (88.26%) | SegNet (fusion) and SegNet (corrected) |
| [68] | Grape | Esca complex | Varied according to patch size | CNN |
| [99] | Paperbark Tea | Myrtle rust | 97.35% | XGBoost |
| [155] | Potato | Blackleg disease | 91% | Thresholding |
| [53] | Potato | Late blight | - | Thresholding |
| [156] | Potato | Potato virus Y | 84% | User-defined CNN |
| [157] | Radish | Fusarium wilt | Over 90% | GoogLeNet |
| [125] | Soybean | Bacterial blight, bacterial pustule, SDS, Septoria brown spot, and frogeye leaf spot | 94.13% | AlexNet |
| [123] | Soybean | Septoria leaf blight, brown spot, frogeye leaf spot, and downy mildew | 99.35% | AlexNet and LeNet |
| [27] | Soybean | Asian rust, mildew, powdery mildew | Inception (99.04%) | InceptionV3, VGG-19, ResNet-50, and Xception |
| [145] | Tomato | Early blight, yellow leaf curl, Corynespora leaf spot, leaf mold, TMV, late blight, Septoria leaf spot | ResNet (97.28%) | AlexNet, GoogLeNet, and ResNet |
| [45] | Tomato | Target spot and bacterial spot | MLP (99%) | MLP and STDA |

1 Abbreviations used: Huanglongbing (HLB), Iron deficiency condition (IDC), K-means Segmentation (KMSEG), K-means SVM (KMSVM), Linear Discriminant Analysis (LDA), Multilayer Perceptron neural network (MLP), Multinominal Logistic Regression (MLR), Multinominal Logistic Regression with boosting (MLRb), Near Infra-red (NIR), Normalized Difference Vegetation Index (NDVI), Quadratic Discriminant Analysis (QDA), Random Forest Tree (RFT), Stepwise Discriminant Analysis (STDA), Sudden Death Syndrome (SDS), Support Vector Machine (SVM), and Tomato Mosaic Virus (TMV). 2 The accuracy of the model represents the percentage of disease correctly identified when classifying the images.
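Most of the CNN results in Table 1 follow a similar transfer-learning recipe: an ImageNet-pretrained backbone such as AlexNet, GoogLeNet, or ResNet is fine-tuned on a labeled set of leaf or canopy images split into training and validation subsets. The sketch below outlines that general workflow with a frozen ResNet-50 backbone in PyTorch; the image directory name, 80/20 split, and hyperparameters are placeholder assumptions, not settings taken from any study in the table.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, random_split
from torchvision import datasets, models, transforms

# Hypothetical folder of UAV or leaf images arranged as images/<class_name>/<file>.jpg.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])
dataset = datasets.ImageFolder("images", transform=preprocess)
n_val = int(0.2 * len(dataset))                      # 80/20 train/validation split
train_set, val_set = random_split(dataset, [len(dataset) - n_val, n_val])
train_loader = DataLoader(train_set, batch_size=16, shuffle=True)
val_loader = DataLoader(val_set, batch_size=16)

# Start from an ImageNet-pretrained ResNet-50 and replace the classification head.
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
for p in model.parameters():
    p.requires_grad = False                          # freeze the convolutional backbone
model.fc = nn.Linear(model.fc.in_features, len(dataset.classes))

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

for epoch in range(5):                               # brief fine-tuning of the new head
    model.train()
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()

    model.eval()
    correct = total = 0
    with torch.no_grad():
        for images, labels in val_loader:
            correct += (model(images).argmax(dim=1) == labels).sum().item()
            total += labels.numel()
    print(f"epoch {epoch + 1}: validation accuracy {correct / total:.3f}")
```

Freezing the backbone and retraining only the final layer is a common choice when the agricultural dataset is small, which is exactly the data limitation highlighted in the future considerations above.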
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
