Article

Identification of Ramularia Leaf Blight Cotton Disease Infection Levels by Multispectral, Multiscale UAV Imagery

Thomaz W. F. Xavier, Roberto N. V. Souto, Thiago Statella, Rafael Galbieri, Emerson S. Santos, George S. Suli and Peter Zeilhofer

1 Department of Geography, Federal University of Ceará, Fortaleza 60440-554, CE, Brazil
2 Federal Institute of Mato Grosso, Cuiabá 78043-400, MT, Brazil
3 Mato Grosso Cotton Institute, Primavera do Leste 78850-000, MT, Brazil
4 Department of Geography, Federal University of Mato Grosso, Cuiabá 78068-600, MT, Brazil
5 Department of Physics, Federal University of Mato Grosso, Cuiabá 78060-900, MT, Brazil
* Authors to whom correspondence should be addressed.
Drones 2019, 3(2), 33; https://doi.org/10.3390/drones3020033
Submission received: 19 February 2019 / Revised: 27 March 2019 / Accepted: 1 April 2019 / Published: 2 April 2019
(This article belongs to the Special Issue UAV/Drones for Agriculture and Forestry)

Abstract

Reducing the production costs and the negative environmental impacts of pesticide applications to control cotton diseases depends on knowing how infections are spatially distributed at the farm scale. Here, we evaluate the potential of three-band multispectral imagery from a multi-rotor unmanned airborne vehicle (UAV) platform for the detection of ramularia leaf blight at different flight heights over an experimental field. Increasing infection levels progressively degrade the spectral vegetation signal, but not strongly enough to differentiate all disease severity levels. From resolutions of ~5 cm (100 m flight height) and ~15 cm (300 m) up to a ground spatial resolution of ~25 cm (500 m flight height), two aggregated infection levels could be detected by the best performing of the four classifiers tested, with an overall accuracy of ~79% and a kappa index of ~0.51. Despite the limited classification performance, the results show the potential of low-cost multispectral systems for monitoring ramularia blight in cotton.

1. Introduction

Remote sensing has proven to be a key technology for monitoring cotton (Gossypium hirsutum L.) crop yields [1,2], nutrient status [3,4], water stress [5,6], and diseases [7,8,9,10,11]. Up to now, the focus has been mainly on the application of field spectroscopy at the canopy scale [8,12,13,14,15,16] and of manned airborne systems [2,17] to support precision farming approaches. However, specific cotton-related orbital remote sensing studies are rare [18,19,20]. Most satellite-based approaches that include the identification of cotton crops are broader land-use and land-cover studies or biomass estimates that do not characterize the physical, biological, or chemical conditions of the crop [21,22]. Furthermore, previous studies on remote pest detection mainly focused on cotton root rot [17,23] or on the identification of crop diseases at the leaf scale [24].
Ramularia leaf blight, also known as false or grey mildew, is caused by the fungus Ramularia areola Atk. and is the most important foliar cotton disease. In Central-west Brazil, yield losses of ~20–30% have been reported, and losses can exceed 70% without control measures in other production regions such as India [25]. Infection can cause boll abortion, malformation of bolls, and lower fiber quality. Effective control may require up to nine fungicide applications, which increase production costs and add to the negative ecological impact of the crop on natural resources.
As a quickly emerging remote sensing technology, triggered by the advanced miniaturization of electronic devices (sensors, modems, processors, servos, batteries, etc.) [26], low-cost unmanned airborne vehicles (UAVs) have been applied to fine-scale monitoring of crop development, soil fertility and physical soil conditions, precision farming, and pest control [27,28,29,30,31]. UAVs can be operated quickly and economically: they fly below the cloud layer and require few logistic resources and little planning time compared with photogrammetric aircraft campaigns. High spatial resolution imagery allows the examination of individual plants and sub-metric spatial patterns [32,33]. Besides size and weight, one of the most common classifications of UAVs is based on their flying principle: rotor-wing systems are more flexible in operation, whereas fixed-wing systems cover larger areas more quickly [34].
Crop diseases alter the multispectral reflectance of plants at different damage levels. Even in the absence of visual symptoms such as leaf lesions, plants can react to the presence of a pathogen with physiological mechanisms such as a reduced photosynthesis rate [35], increasing reflection in the visible wavelengths and decreasing near-infrared reflection. Despite this potential [30], and despite supporting results from non-imaging spectroscopy measurements of cotton plants [7], image classification of cotton leaves [36], and spectroscopic approaches at different scales for leaf disease identification in other crops [37,38,39], peer-reviewed studies on the application of UAVs for cotton disease monitoring are extremely scarce. In our review, no previous studies on the applicability of air- or spaceborne multispectral remote sensing techniques for the detection of ramularia blight could be identified.
Ramularia leaf blight was initially considered only a secondary phytosanitary problem occurring at the final stages of the production cycle, but highly productive cotton cultivars are now affected in the initial stages as well. In conjunction with a possibly increasing resistance against fungicides [40], effective corrective measures can demand up to ten repeated applications [25], making the detection and mapping of ramularia blight imperative as early as possible.
Here, we examine the performance of multispectral imagery from a low-cost, low-flight multi-rotor UAV platform for the differentiation of ramularia infection levels in cotton on the farm scale. Furthermore, we evaluate how different flight heights affect the separability measures between infection classes to guide potential users in balancing between adequate spatial resolution and terrain coverage during UAV operation.

2. Materials and Methods

2.1. Study Area

Field data were obtained from a research field station of the Mato Grosso Cotton Institute (IMAmt), in the Primavera do Leste municipality, in one of the most important cotton farming regions of the Neotropics, the Central Brazilian Highlands (Planalto Central). It is located at ~54°11′46″ W and 15°32′12″ S, ~210 km east of the Mato Grosso state capital, Cuiabá (Figure 1). The average annual temperature of the region is ~22 °C and the average annual precipitation is ~1650 mm, with a rainy season between November and April and a dry season from May through October. As in large parts of the crop farming regions of Central-western Brazil, the experimental sites lie on Tertiary sandstones in a gently rolling relief, where deep oxisols develop.

2.2. UAV System, Image Acquisition, and Data Preprocessing

The UAV system we used was composed of a vertical takeoff and landing, eight-rotor oktokopter (model MK Okto XL 6S12, HiSystems GmbH, Moormerland, Germany) and a lightweight (200 g) multispectral TetraCam ADC camera (Tetracam Inc., Gainesville, FL, USA) with a 3.2-megapixel CMOS sensor (2048 × 1536 pixels) and an 8.0 mm focal length lens (Figure 2). The MK Okto platform included an ARM-processor-equipped mainboard and a navigation board (NaviCtrl) with compass and GPS modules.
The NaviCtrl in conjunction with a pressure sensor enabled predefined autonomous flights. A lithium polymer battery with up to 6600 mAh capacity allowed flight durations of approximately 15 min. A standard 2.4 GHz transmitter remote control with additional channels for steering mechanisms and camera triggering was used for flight control. The ADC camera was mounted on a digital gimbal with biaxial roll and pitch control for flexible camera configurations.
The TetraCam ADC camera was sensitive in the spectral range between 520 nm and 920 nm and was operated with an 8-bit data depth. The three available bands simulated the Landsat TM sensor bands 2, 3, and 4 (OLI sensor bands 3, 4, and 5) at wavelengths of 520–600 nm (green band), 630–690 nm (red band), and 760–900 nm (near-infrared band).
A waypoint-programmed vertical flight was carried out on 30 May 2014, between approximately 14:00 and 15:00 local time, under clear weather conditions with an air temperature of 30.6 °C and a relative humidity of 51%. The cotton crops at the experimental site were in the maturity stage. Images were acquired at four flight heights, 100 m, 300 m, 500 m, and 700 m, corresponding to nominal spatial resolutions at ground (SRG) of ~5 cm, ~15 cm, ~25 cm, and ~35 cm, respectively. Image preprocessing corrected for vignetting and sensor-related radiometric distortions and was carried out in the PixelWrench 2 software (Tetracam Inc.).
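The relation between flight height and nominal SRG follows the usual pinhole-camera scaling. The minimal sketch below reproduces the reported values; the 8.0 mm focal length is given above, whereas the pixel pitch is an assumed value backed out from the stated ~5 cm per 100 m, not a sensor specification.

```python
# Minimal sketch: nominal ground sample distance (GSD) versus flight height.
# GSD = pixel_pitch * height / focal_length (pinhole-camera approximation).
FOCAL_LENGTH_M = 0.008   # 8.0 mm lens, as stated in the text
PIXEL_PITCH_M = 4.0e-6   # assumed pitch, chosen to reproduce ~5 cm GSD at 100 m

def ground_sample_distance(flight_height_m: float) -> float:
    """Nominal ground sample distance in metres per pixel."""
    return PIXEL_PITCH_M * flight_height_m / FOCAL_LENGTH_M

for h in (100, 300, 500, 700):
    print(f"{h} m flight height -> ~{100 * ground_sample_distance(h):.0f} cm GSD")
```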
During the flights, the Tetracam ADC camera was in automatic exposure mode, and the images were therefore acquired with different integration times: 0.860 ms (100 m flight height), 0.946 ms (300 m), and 1.032 ms (500 m and 700 m). To minimize image distortions, the UAV platform was programmed to remain stable for 30 s at each flight height. To compare the digital counts (or grey levels) of images acquired with different exposure times, we applied a time normalization by dividing each image by its exposure time, yielding the time-normalized data DC_exp in DC/s (digital counts per second), according to Equation (1):
DC_exp = DC_{x,a} / T, (1)
where DC_{x,a} is the digital count in a specific band x at flight height a, and T is the image exposure time in seconds.
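A minimal sketch of the exposure-time normalization in Equation (1), assuming the band image is available as a NumPy array; the array names in the usage comments are hypothetical.

```python
import numpy as np

def exposure_normalize(dc: np.ndarray, exposure_time_s: float) -> np.ndarray:
    """Equation (1): convert raw digital counts to digital counts per second (DC/s)."""
    return dc.astype(np.float64) / exposure_time_s

# Hypothetical usage with the integration times reported above (converted ms -> s):
# dc_exp_100 = exposure_normalize(band_100m, 0.860e-3)
# dc_exp_300 = exposure_normalize(band_300m, 0.946e-3)
# dc_exp_500 = exposure_normalize(band_500m, 1.032e-3)
```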
In addition to the time normalization, a radiometric normalization was required for the comparison because weather conditions, solar illumination geometry, and atmospheric scattering and absorption differed between the flight heights. For that purpose, we identified 26 semi-invariant radiometric sites in the scenes; the grey levels of such sites are assumed not to change from scene to scene, except for differences in atmospheric scattering, absorption, relief, or illumination and viewing geometry. We selected 26 bare soil sites spread over the study area and calculated their average digital count per second at each flight height. Using the 100 m flight height image as the reference, we calculated a radiometric normalization factor (f_a), according to Equation (2):
f_a = MED_100 / MED_a, (2)
where MED_100 is the average of the bare soil pixels of the 100 m flight height image, and MED_a is the average of the bare soil pixels at flight height a.
By multiplying each of the images acquired at flight heights of 300 m, 500 m, and 700 m by its respective factor, we radiometrically normalized them to the image acquired at the 100 m flight height. The time- and radiometrically normalized images in digital counts per second (DCnor) were thus obtained using Equation (3):
DC_nor = DC_exp_{x,a} × f_a, (3)
where DC_exp_{x,a} is the exposure-time-corrected digital count in a specific band x at flight height a, and f_a is the radiometric normalization factor from Equation (2).
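The two normalization steps of Equations (2) and (3) can be sketched as follows; `soil_100m`, `soil_300m`, and `dc_exp_300m` are hypothetical arrays holding the DC/s values of the 26 bare-soil sites and of an exposure-normalized band, respectively.

```python
import numpy as np

def normalization_factor(soil_ref: np.ndarray, soil_other: np.ndarray) -> float:
    """Equation (2): ratio of mean bare-soil DC/s values, 100 m flight as reference."""
    return float(np.mean(soil_ref) / np.mean(soil_other))

def radiometric_normalize(dc_exp: np.ndarray, f_a: float) -> np.ndarray:
    """Equation (3): scale an exposure-normalized band to the 100 m reference image."""
    return dc_exp * f_a

# Hypothetical usage, applied per spectral band:
# f_300 = normalization_factor(soil_100m, soil_300m)
# dc_nor_300 = radiometric_normalize(dc_exp_300m, f_300)
```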

2.3. Spatial Arrangement of Field Experiment and Assessment of Ramularia Blight Infection Levels

The field experiment for the assessment of ramularia blight infection levels at IMAmt covered an area of about 1.09 ha, subdivided into 78 plots of 16 m × 7 m each, separated by management corridors of 2 m (Figure 1b). Plots were further divided into nine parcels of 7 m × 1.8 m, each of which included two plant lines of identical genotype (exact specifications were kept under business confidentiality by IMAmt) and identical fertilizer and pesticide applications.
Single-crop cotton genotypes, which result in the highest yields and quality, are seeded in central Brazil at the end of October and harvested mostly in June or July. Infection commonly emerges during advanced phenological development. Small angular leaf lesions of 3–4 cm appear in the initial phases of infection, whereas lesions become necrotic with severe chlorosis during advanced stages, causing the premature death of leaves. In general, infection starts in the lower leaf layers on the leaf underside.
The technical staff of the IMAmt examined the cotton infection levels of ramularia blight at the experimental site. Individual cotton plants were classified using a five-level key (Figure 3).
For comparison with the UAV imagery, individual plant classifications were averaged by parcel and the average was rounded to the five-level key. Infection levels were initially spatialized by vectorizing parcel limits over a true-color orthophoto mosaic taken during a previous low-altitude flight at 50 m, processed in a photogrammetric pipeline based on bundle adjustment and georeferenced by 20 control points installed at the test site. Because the control point adjustment of the multispectral imagery from the different flights achieved an absolute spatial accuracy of only 0.44 m (RMS) on average, parcels had to be individually adjusted for every flight height to guarantee accurate signature extraction over the linear parcels. In order to exclude mixed pixels and reduce neighborhood effects of adjacent pixels from the management corridors, only pixels from a 50 cm inside buffer were considered in the data analyses (Figure 1c). The number of extracted pixels per inside buffer therefore varied between approximately 2500 at the 100 m flight height and approximately 50 at the 700 m flight height. Pixel values were then averaged for each buffered parcel and spectral band, resulting in 154 instances for each flight height. This number is lower than the number of parcels originally planted in the experiment, as some parcels were not entirely imaged at the 100 m flight height and others had to be excluded from analysis due to incomplete crop development caused by waterlogging and soil degradation.
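A minimal sketch of the buffered signature extraction described above, using GeoPandas and Rasterio rather than the tools actually employed by the authors (which are not named); the file and attribute names are hypothetical, and the parcel layer is assumed to share the coordinate system of the normalized mosaic.

```python
import geopandas as gpd
import numpy as np
import rasterio
import rasterio.mask

parcels = gpd.read_file("parcels_100m.shp")          # hypothetical parcel polygons
parcels["inner"] = parcels.geometry.buffer(-0.5)     # 50 cm inside buffer

signatures = []
with rasterio.open("dc_nor_100m.tif") as src:        # hypothetical 3-band DCnor image
    for _, row in parcels.iterrows():
        if row["inner"].is_empty:
            continue                                  # parcel too narrow for the buffer
        data, _ = rasterio.mask.mask(src, [row["inner"]], crop=True, filled=False)
        signatures.append({
            "parcel_id": row["parcel_id"],            # hypothetical attribute names
            "infection": row["infection_level"],
            "green": float(np.ma.median(data[0])),    # per-parcel, per-band median
            "red": float(np.ma.median(data[1])),
            "nir": float(np.ma.median(data[2])),
        })
```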

2.4. Data Analysis

The distribution of class-wise DCnor values was tested for normality using the Kolmogorov–Smirnov test for every spectral band and flight height. The separability between the DCnor cotton pixel values of each band across flight heights, and between the DCnor values of the ramularia blight infection classes for each band and flight height, was evaluated by exploratory data analysis and, in accordance with the normality tests, by the nonparametric, unpaired Mann–Whitney test (MWT). Adjacent remote sensing image pixels are autocorrelated [41] and cannot be considered independent samples; to avoid inflating the test statistics, we averaged DCnor values by parcel before testing group differences. Furthermore, multispectral classifications were conducted using four nonparametric classifiers as implemented in the WEKA data mining software [42]: (1) multinomial logistic regression (MLR) [43], (2) multinomial logistic regression with boosting (MLRb) [44], (3) support vector machine (SVM) [45], and (4) random forest tree (RFT). Our aim was not to compare classifier performance but to apply different algorithms in order to minimize the possibility that the obtained infection level classifications were caused by the specifications of a single classifier. The selected algorithms have been widely applied in remote sensing image classification [46,47,48] and are known to differ in their training data requirements and in the complexity of the parameter adjustment needed to achieve sufficient generalization performance.
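Before the classifier details below, the normality and separability tests just described can be sketched as follows; `signatures` is assumed to be a pandas DataFrame with one row per parcel and columns `infection`, `green`, `red`, and `nir` (e.g., built as in the extraction sketch above).

```python
from itertools import combinations
from scipy.stats import kstest, mannwhitneyu

def separability_pvalues(signatures, band: str) -> dict:
    """Kolmogorov-Smirnov normality check per class and pairwise Mann-Whitney tests."""
    for level, grp in signatures.groupby("infection"):
        z = (grp[band] - grp[band].mean()) / grp[band].std(ddof=1)
        print(f"KS normality, class {level}, {band}: p = {kstest(z, 'norm').pvalue:.3f}")
    pvals = {}
    for a, b in combinations(sorted(signatures["infection"].unique()), 2):
        res = mannwhitneyu(signatures.loc[signatures["infection"] == a, band],
                           signatures.loc[signatures["infection"] == b, band],
                           alternative="two-sided")
        pvals[(a, b)] = res.pvalue                    # pairwise p-values, as in Table 2
    return pvals
```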
The MLR uses a linear predictor function and requires a relatively small number of training samples to estimate the parameters necessary for classification; unlike, for example, the Bayes classifier, it does not require the predictors to be statistically independent of each other [49]. SVM techniques perform well in multispectral remote sensing classification even with modest amounts of training data [46], although the SVM may be subject to overfitting because it adjusts to nonlinear class–predictor relations [50]. The other two approaches are more affected by overfitting of the training data or demand larger amounts of training data: MLRb because of its underlying boosting [51], and RFT because it is an ensemble approach based on bootstrap aggregating (bagging) [52].
For classification and validation, the dataset (n = 154) was split into two-thirds for training and one-third for validation. The overall accuracy and kappa indices were then determined using a 10-fold cross-validation approach.
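A minimal sketch of the classification and validation step, using scikit-learn stand-ins for the WEKA implementations named above (the boosted logistic regression is omitted); `X` is assumed to hold the per-parcel GREEN/RED/NIR values and `y` the (possibly aggregated) infection levels.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, cohen_kappa_score
from sklearn.model_selection import cross_val_predict
from sklearn.svm import SVC

classifiers = {
    "MLR": LogisticRegression(max_iter=1000),                       # multinomial logistic regression
    "SVM": SVC(kernel="rbf"),
    "RFT": RandomForestClassifier(n_estimators=200, random_state=0),
}

def evaluate(X: np.ndarray, y: np.ndarray) -> None:
    """10-fold cross-validated overall accuracy and kappa index per classifier."""
    for name, clf in classifiers.items():
        pred = cross_val_predict(clf, X, y, cv=10)
        print(f"{name}: OA = {accuracy_score(y, pred):.3f}, "
              f"kappa = {cohen_kappa_score(y, pred):.3f}")
```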

3. Results

The empirical flight height normalization factors increased with flight height in both bands of the visible spectrum (GREEN, RED), whereas they decreased slightly or remained almost stable in the near-infrared (NIR) range (Table 1). The modest increase of f_a in the visible bands and its slight reduction in the NIR were caused by the higher sensor exposure times at higher flight heights, which reduced the DCnor estimates and thus compensated for the increasing path radiance.
The DCnor values of extracted cotton pixels increased slightly in the GREEN and RED bands with flight height and decreased in the NIR.
The dispersion decreased with flight height in all spectral bands, except for the RED band between 500 and 700 m. Due to the reduced spatial resolution, zero values, which were recorded at all flight heights, had a lower relative percentage at greater heights (700 m: 1.6%, 500 m: 2.4%) than at lower heights (100 m: 4.9%, 300 m: 3.2%), which relatively increased the central tendencies. The reduction in variability is a function of the spatial resolution: at lower flight heights, both higher maximum (e.g., pure open soil, directionally reflecting leaves) and lower minimum (e.g., shadows) DCnor values were registered. On the basis of the Kolmogorov–Smirnov normality test, the distribution of cotton DCnor values differed significantly from normality, mainly in the RED band, in which zero values occurred at all flight heights.
On average, the parcels of all infection levels showed a vegetation signal with expressive NIR values and higher values in the GREEN than in the RED band. As reported for cotton parcels with lower yields [2] and for canopies infected with verticillium wilt [7], the DCnor values decreased in the GREEN and NIR bands with increasing infection level (Figure 4) and increased in the RED band, corresponding to a reduction of the "red edge" in plant leaf reflectance [53]. In general, these characteristics were maintained at all flight heights. Absolute values within the classes followed the same trends at different flight heights and dispersion levels as observed for the entire dataset (Table 1), with values increasing slightly in the visible wavelength range, decreasing in the NIR, and variance decreasing with increasing flight height.
The median differences between successive infection levels are lower for the higher class pairs (2–3, 3–4) than for the lower ones (0–1, 1–2) throughout the spectral bands.
The visual impressions were underpinned by the separability analysis based on the p-values of the unpaired MWT (Table 2). Independent of flight height and spectral band, the separability of class pairs involving more severe infection levels (2–4, 3–4, 2–3) was higher than that of pairs at low infection levels (0–1, 1–2). This is directly linked to the classification key applied in the ramularia infection field surveys [54], in which the affected leaf areas are not scaled linearly across the classes (see Figure 3). Parcels without symptoms (class 0) could not be separated from those of class 1 (affected leaf area < 5%) at any flight height, and even separation from class 2 (affected leaf area < 15%) was mostly impossible at the 95% confidence level across bands and flight heights. Furthermore, partial leaf loss, which causes a stronger influence of reflection from the underlying soil, only occurred at the most severe level of ramularia blight infection (class 4). In general, most pairs with a two-level difference between their classes showed a significant difference in all bands and flight heights, with better performance for pairs at high (e.g., 2–4) than at low infection levels (e.g., 1–3).
Decreasing variability in the signatures partially improved class differentiation at intermediate flight heights. The cotton signal in the 100 m flight height imagery, with a nominal spatial resolution of ~5 cm, was influenced more by reflectance variations due to biotic and abiotic factors, such as reflection differences between plant parts and shadowing caused by the plant structure, than by symptoms caused by infection (leaf necrosis). Separability was slightly reduced at a height of 700 m.
Because band-wise separability was not viable, at least between the two lowest infection classes, test classifications were conducted for three aggregation keys (Figure 5): (i) a four-class key (classes 0 and 1 aggregated), (ii) a three-class key (classes 0 through 2 aggregated), and (iii) a two-class key (classes 0 through 2 aggregated and classes 3 and 4 aggregated).
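A minimal sketch of the three aggregation keys, mapping the original five field-survey levels onto four, three, and two classes; the label strings are illustrative.

```python
# Aggregation keys for the original five-level field ratings (0-4).
FOUR_CLASS  = {0: "0-1", 1: "0-1", 2: "2",   3: "3",   4: "4"}
THREE_CLASS = {0: "0-2", 1: "0-2", 2: "0-2", 3: "3",   4: "4"}
TWO_CLASS   = {0: "0-2", 1: "0-2", 2: "0-2", 3: "3-4", 4: "3-4"}

def aggregate(levels, key):
    """Map parcel infection levels onto an aggregated classification key."""
    return [key[int(v)] for v in levels]
```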
The overall classification accuracies ranged between ~30% and ~79% and the kappa indices between 0.02 and 0.51, respectively. As expected, the successive reduction from four to three and then to two classes improved both accuracy measures independently of classifier and imaging height. The classifiers performed similarly on average across flight heights and numbers of differentiated infection levels, with a slight advantage for the MLR, an inductive linear approach known to be less sensitive to small amounts of training data. The RFT, a machine learning ensemble classifier that tends to overfit, underperformed in the classifications with four infection levels and for the 300 m and 700 m flight heights of the three-level classification key. In the two-class aggregations, the three classifiers with a higher demand for training data (MLRb, RFT, SVM) performed at least as well as the MLR. The best single classification performance was obtained for the SVM when classifying the 500 m flight height imagery into two infection levels.

4. Discussion

Multispectral imagery from a vertical flight of a multi-rotor UAV platform was evaluated with respect to the applicability for the identification of ramularia blight infection levels in cotton. The DC signature separability and classification performance were based on a detailed controlled phytosanitary field experiment in one of the major regions for cotton cropping worldwide.
For the comparison of DC values from the four flight heights, a relative radiometric adjustment was conducted to normalize the different camera exposure times and to compensate for atmospheric interferences in the imagery. This procedure was chosen because the reliable transformation of DC values from moderately priced, lightweight CCD cameras carried by UAVs, such as the Tetracam ADC Lite, to reflectance values is not trivial, due to the lack of exact information on sensor calibration and on the viewing geometry caused by platform instability. In fact, standard tools that do not require ground measurements for the transformation of DC to reflectance values are not available [55]. After normalization, the DCnor distributions from the different flight heights were not found to be significantly different.
From a practical viewpoint, the applied procedure appears much more reliable than extensive and time-demanding radiometric preprocessing and transformation to reflectance values, which would additionally require input data that are difficult to obtain (exact flight geometry, atmosphere parameters, etc.) and may be less viable for operational application by farmers. Nevertheless, future developments in the radiometric calibration and correction of UAV-borne imagery, together with equipment refinements, would allow the calculation of comparable vegetation indices throughout the phenological development. Such indices are known to be important for early disease detection, for the differentiation of diseases, for the better differentiation of disease severity [7,56,57], and for upscaling to satellite imagery for large-scale monitoring. More recently released multispectral camera systems, such as the Parrot Sequoia+ (Parrot Drones SAS) or the Micasense RedEdge MX (MicaSense, Inc.), allow absolute reflectance calibration with or without a calibration target.
The classification accuracies were similar to those obtained in vegetation mapping [34] and modest in comparison with results obtained in the assessment of other cotton diseases. In a study on the mapping of cotton root rot from airborne multispectral cameras [23], overall classification accuracies of more than 90% were achieved for the differentiation between infected and non-infected zones. For a binary key, we obtained our best classification with an overall accuracy of 79.1% (SVM, 500 m flight height). Unlike ramularia blight, which gradually affects the leaf canopy, plants infected with root rot die within several days, drastically changing the multispectral reflection patterns and improving the differentiation from non-affected crops. The authors further emphasize that detection is much less effective for newly infected plants with a small level of damage, and that newly infected plants without notable symptoms may not be detectable at all. Similar findings have been obtained for the detection of leaf diseases in other crops [57]. However, as shown for sugar beet leaves [58], changes in spectral reflectance may result from impairments of the leaf structure and from the chemical composition of the tissue, which are specific to the pathogenesis of different leaf diseases, or may even be due to abiotic stress [57]; these aspects were not examined in the present study. As shown, for example, for barley leaves diseased with net blotch, rust, and powdery mildew [57], spectral alterations of diseased leaves are prominent in well-defined wavelengths. Therefore, hyperspectral approaches [59], or at least the use of camera systems with additional and/or narrower spectral bands [60], such as in the "red edge" (available, for example, through the Parrot Sequoia+ or Micasense RedEdge MX sensors), will be imperative to improve detection performance.
Spatial resolution has a known influence on pathogen mapping, and improved spatial resolution is generally expected to improve detection [61]. Leaf disease mapping in other crops, such as powdery mildew of wheat, has shown that detection by multispectral satellite imagery is not trivial [62]: using multispectral SPOT 6 data, the authors achieved a dichotomous classification of healthy and infected stands with overall accuracy and kappa values of ~77% and 0.55, respectively, similar to those obtained in our study.
Classification performances for the 100 m (SRG = ~5 cm), 300 m (SRG = ~15 cm), and 500 m (SRG = ~25 cm) flight height imagery were almost equivalent, whereas performance decreased for the 700 m flight height (SRG = ~35 cm). As observed in UAV-based weed seedling mapping from flight heights between 40 m and 100 m [63], the best detection performance does not always occur at the highest spatial resolution and can improve at higher altitudes if the plants are large. The authors further emphasized that tailored object-based image classification schemes must be elaborated to take full advantage of very high spatial resolution imagery. The non-equal spacing of the percentages of affected leaf area in the applied infection level key decreased the performance of class differentiation and classification: the first three of the five classes include plants with up to only 15% of their total leaf area affected by necrosis (see Figure 3). Because similarly spaced keys are used in other farming regions, such as India [64], it should be evaluated whether alternative field-scale keys, better suited to infection detection by remote sensing methods, could be developed. In addition, the spatial arrangement of the field experiment was not specifically designed for remote sensing studies. The spatial units with unique infection levels had an unfavorable length-to-width ratio of ~1:3.9; in principle, this linear design leads to a higher percentage of mixed pixels and border effects at higher flight heights than approximately square parcels would.
Furthermore, it must be pointed out that the applied validation approach, as imposed by the available phytosanitary field survey, tends to underestimate the real classification accuracies. Inside the parcels, individual cotton plants, or even different leaves of a single plant, show different leaf necrosis levels, which are not expressed by the single classification assigned to each parcel. Considering the 60–90 cm diameter of a fully developed cotton plant and nominal spatial resolutions between 5 cm and 35 cm, one specimen is represented by several pixels and, at the lowest flight heights, one pixel may represent a single leaf. Therefore, possible false positives or negatives partially originate from the generalized ground truth.

5. Conclusions

The potential of three-band multispectral imagery from a multi-rotor UAV platform for the detection of ramularia blight from different flight heights was evaluated. Increasing infection levels led to a progressive degradation of the spectral vegetation signal, which, however, was not sufficient to differentiate finer-scaled disease severity levels. The findings that separability and classification accuracies did not decrease up to a monitoring height of 500 m, and that an empirical, relative radiometric adjustment keeps multispectral DC signatures similar to those of the flight height with almost no atmospheric interference (100 m), have practical relevance. Higher flight heights used in property-scale disease monitoring and precision farming can thus compensate for the major limitation of multi-rotor mini UAVs, namely their restricted autonomy and coverage compared with fixed-wing systems, without biasing foliar disease detection. The limited classification performance has motivated our ongoing efforts to apply a camera system with a higher spectral resolution (Micasense RedEdge M), combined with a thermal imaging system (FLIR 420T, FLIR Commercial Systems). Recent field campaigns include very low altitude imaging (<100 m) for the acquisition of improved spatial resolution imagery and multitemporal approaches for mapping ramularia blight and other diseases in cotton.

Author Contributions

Conceptualization, T.W.F.X. and P.Z.; data curation, T.W.F.X. and E.S.S.; methodology, T.W.F.X., E.S.S., T.S., and P.Z.; investigation, T.W.F.X., E.S.S., T.S., and P.Z.; resources, R.G., R.N.V.S., E.S.S., and G.S.S.; project administration, R.N.V.S.; formal analysis, T.W.F.X. and P.Z.; writing—original draft preparation, validation, P.Z.; and visualization, T.W.F.X. and P.Z.

Funding

This research was funded by the Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq), grant number 469363/2014-2, Brazil, and supported by a M.Sc. studentship of the Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES), grant number 1519921, Brazil.

Acknowledgments

To the Instituto Mato-grossense do Algodão (IMAmt) for their partnership in the research through the preparation of the experimental field.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Domenikiotis, C.; Spiliotopoulos, M.; Tsiros, E.; Dalezios, N.R. Early cotton yield assessment by the use of the NOAA/AVHRR derived Vegetation Condition Index (VCI) in Greece. Int. J. Remote Sens. 2004, 25, 2807–2819. [Google Scholar] [CrossRef]
  2. Zarco-Tejada, P.J.; Ustin, S.L.; Whiting, M.L. Temporal and Spatial Relationships between Within-Field Yield Variability in Cotton and High-Spatial Hyperspectral Remote Sensing Imagery. Agron. J. 2005, 97, 641–653. [Google Scholar] [CrossRef]
  3. Read, J.J.; Tarpley, L.; McKinion, J.M.; Reddy, K.R. Narrow-Waveband Reflectance Ratios for Remote Estimation of Nitrogen Status in Cotton. J. Environ. Qual. 2002, 31, 1442–1452. [Google Scholar] [CrossRef] [PubMed]
  4. Zhao, D.; Reddy, K.R.; Kakani, V.G.; Read, J.J.; Koti, S. Selection of optimum reflectance ratios for estimating leaf nitrogen and chlorophyll concentrations of field-grown cotton. Agron. J. 2005, 97, 89–98. [Google Scholar] [CrossRef]
  5. Bowman, W.D. The relationship between leaf water status, gas exchange, and spectral reflectance in cotton leaves. Remote Sens. Environ. 1989, 30, 249–255. [Google Scholar] [CrossRef]
  6. Torrion, J.A.; Maas, S.J.; Guo, W.; Bordovsky, J.P.; Cranmer, A.M. A three-dimensional index for characterizing crop water stress. Remote Sens. 2014, 6, 4025–4042. [Google Scholar] [CrossRef]
  7. Chen, B.; Wang, K.; Li, S.; Wang, J.; Bai, J.; Xiao, C.; Lai, J. Spectrum Characteristics of Cotton Canopy Infected with Verticillium Wilt and Inversion of Severity Level. In Computer and Computing Technologies in Agriculture, Volume II, Proceedings of the First IFIP TC 12 International Conference on Computer and Computing Technologies in Agriculture (CCTA 2007), Wuyishan, China, 18–20 August 2007; Li, D., Ed.; Springer: Boston, MA, USA, 2008; pp. 1169–1180. [Google Scholar]
  8. Prabhakar, M.; Prasad, Y.G.; Thirupathi, M.; Sreedevi, G.; Dharajothi, B.; Venkateswarlu, B. Use of ground based hyperspectral remote sensing for detection of stress in cotton caused by leafhopper (Hemiptera: Cicadellidae). Comput. Electron. Agric. 2011, 79, 189–198. [Google Scholar] [CrossRef]
  9. Oosterhuis, D.M.; Coomer, T.; Raper, T.B.; Espinoza, L. Use of Remote Sensing in Cotton to Accurately Predict the Onset of Nutrient Stress for Foliar Alleviation for Optimizing Yield and Quality; Fluid Fertilizer Foundation: Fayetteville, NC, USA, 2015. [Google Scholar]
  10. Ranjitha, G.; Srinivasan, M.R. Hyperspectral radiometry for the detection and discrimination of damage caused by sucking pests of cotton. Curr. Biot. 2014, 8, 5–12. [Google Scholar]
  11. Wang, Q.; Chen, B.; Wang, J.; Wang, F.Y.; Han, H.Y.; Li, S.K.; Wang, K.R.; Xiao, C.H.; Dai, J.G. Four supervised classification methods for monitoring cotton field of verticillium wilt using TM image. J. Anim. Plant Sci. 2015, 25, 5–12. [Google Scholar]
  12. Kostrzewski, M.; Waller, P.; Guertin, P.; Haberland, J.; Colaizzi, P.; Barnes, E.; Thompson, T.; Clarke, T.; Riley, E.; Choi, C.; et al. Ground-based remote sensing of water and nitrogen stress. Trans. ASAE 2003, 46, 29–38. [Google Scholar] [CrossRef]
  13. Hunsaker, D.J.; Pinter, P.J.; Barnes, E.M.; Kimball, B.A. Estimating cotton evapotranspiration crop coefficients with a multispectral vegetation index. Irrig. Sci. 2003, 22, 95–104. [Google Scholar] [CrossRef]
  14. Raper, T.B.; Varco, J.J. Canopy-scale wavelength and vegetative index sensitivities to cotton growth parameters and nitrogen status. Precis. Agric. 2014, 16, 62–76. [Google Scholar] [CrossRef]
  15. Delegido, J.; Verrelst, J.; Meza, C.M.; Rivera, J.P.; Alonso, L.; Moreno, J. A red-edge spectral index for remote sensing estimation of green LAI over agroecosystems. Eur. J. Agron. 2013, 46, 42–52. [Google Scholar] [CrossRef]
  16. Sui, R.; Wilkerson, J.B.; Hart, W.E.; Wilhelm, L.R.; Howard, D.D. Multi-spectral sensor for detection of nitrogen status in cotton. Appl. Eng. Agric. 2005, 21, 167. [Google Scholar] [CrossRef]
  17. Yang, C.; Everitt, J.H.; Fernandez, C.J. Comparison of airborne multispectral and hyperspectral imagery for mapping cotton root rot. Biosyst. Eng. 2010, 107, 131–139. [Google Scholar] [CrossRef]
  18. Maas, S.J. Linear mixture modeling approach for estimating cotton canopy ground cover using satellite multispectral imagery. Remote Sens. Environ. 2000, 72, 304–308. [Google Scholar] [CrossRef]
  19. French, A.N.; Hunsaker, D.J.; Thorp, K.R. Remote sensing of evapotranspiration over cotton using the TSEB and METRIC energy balance models. Remote Sens. Environ. 2015, 158, 281–294. [Google Scholar] [CrossRef]
  20. Liu, H.; Meng, L.; Zhang, X.; Ustin, S.; Ning, D.; Sun, S. Estimation model of cotton yield with time series Landsat images. Trans. Chin. Soc. Agric. Eng. 2015, 31, 223–228. [Google Scholar]
  21. Zheng, B.; Myint, S.W.; Thenkabail, P.S.; Aggarwal, R.M. A support vector machine to identify irrigated crop types using time-series Landsat NDVI data. Int. J. Appl. Earth Obs. Geoinf. 2015, 34, 103–112. [Google Scholar] [CrossRef]
  22. Marshall, M.; Thenkabail, P.; Biggs, T.; Post, K. Hyperspectral narrowband and multispectral broadband indices for remote sensing of crop evapotranspiration and its components (transpiration and soil evaporation). Agric. For. Meteorol. 2016, 218–219, 122–134. [Google Scholar] [CrossRef]
  23. Yang, C.; Odvody, G.N.; Fernandez, C.J.; Landivar, J.A.; Minzenmayer, R.R.; Nichols, R.L. Evaluating unsupervised and supervised image classification methods for mapping cotton root rot. Precis. Agric. 2015, 16, 201–215. [Google Scholar] [CrossRef]
  24. Camargo, A.; Smith, J.S. An image-processing based algorithm to automatically identify plant disease visual symptoms. Biosyst. Eng. 2009, 102, 9–21. [Google Scholar] [CrossRef]
  25. Galbieri, R.; Cia, E.; Morello, C.D.L.; Fanan, S.; Andrade Junior, E.R.; Kobayasti, L. Ramularia areola sporulation potential in Brazilian cotton. Summa Phytopathol. 2015, 41, 233–235. [Google Scholar] [CrossRef]
  26. Salamí, E.; Barrado, C.; Pastor, E. UAV flight experiments applied to the remote sensing of vegetated areas. Remote Sens. 2014, 6, 11051–11081. [Google Scholar] [CrossRef]
  27. Swain, K.C.; Thomson, S.J.; Jayasuriya, H.P. Adoption of an Unmanned Helicopter for Low-Altitude Remote Sensing To Estimate Yield and Total Biomass of a Rice Crop. Am. Soc. Agric. Biol. Eng. 2010, 53, 21–27. [Google Scholar] [CrossRef]
  28. Zaks, D.P.M.; Kucharik, C.J. Data and monitoring needs for a more ecological agriculture. Environ. Res. Lett. 2011, 6, 014017. [Google Scholar] [CrossRef]
  29. Xiang, H.; Tian, L. Development of a low-cost agricultural remote sensing system based on an autonomous unmanned aerial vehicle (UAV). Biosyst. Eng. 2011, 108, 174–190. [Google Scholar] [CrossRef]
  30. Huang, Y.; Thomson, S.J.; Hoffmann, W.C.; Lan, Y.; Fritz, B.K. Development and prospect of unmanned aerial vehicle technologies for agricultural production management. Int. J. Agric. Biol. Eng. 2013, 6, 1–10. [Google Scholar]
  31. Gómez-Candón, D.; De Castro, A.I.; López-Granados, F. Assessing the accuracy of mosaics from unmanned aerial vehicle (UAV) imagery for precision agriculture purposes in wheat. Precis. Agric. 2014, 15, 44–56. [Google Scholar] [CrossRef]
  32. Laliberte, A.S.; Rango, A. Image Processing and Classification Procedures for Analysis of Sub-decimeter Imagery Acquired with an Unmanned Aircraft over Arid Rangelands. GISci. Remote Sens. 2011, 48, 4–23. [Google Scholar] [CrossRef]
  33. Colomina, I.; Molina, P. Unmanned aerial systems for photogrammetry and remote sensing: A review. ISPRS J. Photogramm. Remote Sens. 2014, 92, 79–97. [Google Scholar] [CrossRef]
  34. Marcaccio, J.V.; Markle, C.E.; Chow-fraser, P. Use of fixed-wing and multi-rotor unmanned aerial vehicles to map dynamic changes in a freshwater. J. Unmanned Veh. Syst. 2016, 4, 193–202. [Google Scholar] [CrossRef]
  35. Martinelli, F.; Scalenghe, R.; Davino, S.; Panno, S.; Scuderi, G.; Ruisi, P.; Villa, P.; Stroppiana, D.; Boschetti, M.; Goulart, L.R.; et al. Advanced methods of plant disease detection. A review. Agron. Sustain. Dev. 2015, 35, 1–25. [Google Scholar] [CrossRef]
  36. Bernardes, A.A.; Rogeri, J.G.; Oliveira, R.B.; Marranghello, N.; Pereira, A.S.; Araujo, A.F.; Tavares, J.M.R.S. Identification of foliar diseases in cotton crop. In Lecture Notes in Computational Vision and Biomechanics; Tavares, J.M.R.S., Jorge, R.N., Eds.; Springer Publishing: Dordrecht, The Netherlands, 2013; Volume 8, pp. 67–85. [Google Scholar]
  37. Mahlein, A.K.; Rumpf, T.; Welke, P.; Dehne, H.W.; Plümer, L.; Steiner, U.; Oerke, E.C. Development of spectral indices for detecting and identifying plant diseases. Remote Sens. Environ. 2012, 128, 21–30. [Google Scholar] [CrossRef]
  38. Berdugo, C.A.; Zito, R.; Paulus, S.; Mahlein, A.K. Fusion of sensor data for the detection and differentiation of plant diseases in cucumber. Plant Pathol. 2014, 63, 1344–1356. [Google Scholar] [CrossRef]
  39. Oberti, R.; Marchi, M.; Tirelli, P.; Calcante, A.; Iriti, M.; Borghese, A.N. Automatic detection of powdery mildew on grapevine leaves by image analysis: Optimal view-angle range to increase the sensitivity. Comput. Electron. Agric. 2014, 104, 1–8. [Google Scholar] [CrossRef]
  40. Lopes, L.O.; Lacerda, J.J.; Mielezrski, F.; Ratke, R.F.; Lira, D.N.; Pacheco, L.P. Efeito de fungicidas para o controle da Ramularia areola na cultura do algodoeiro. Summa Phytopathol. 2017, 43, 229–235. [Google Scholar] [CrossRef]
  41. Wulder, M.; Boots, B. Local spatial autocorrelation characteristics of remotely sensed imagery assessed with the Getis statistic. Int. J. Remote Sens. 1998, 19, 2223–2231. [Google Scholar] [CrossRef]
  42. Witten, I.; Frank, E.; Hall, M.; Pal, C. Data Mining: Practical Machine Learning Tools and Techniques, 4th ed.; Morgan Kaufmann: Burlington, MA, USA, 2016. [Google Scholar]
  43. Le Cessie, S.; Van Houwelingen, J.C. Ridge Estimators in Logistic Regression. J. R. Stat. Soc. Ser. C Appl. Stat. 1992, 41, 191–201. [Google Scholar] [CrossRef]
  44. Landwehr, N. Logistic Model Trees. Mach. Learn. 2005, 59, 161–205. [Google Scholar] [CrossRef]
  45. Platt, J.C. Advances in kernel methods. In Advances in Kernel Methods: Support Vector Learning; Schölkopf, B., Burges, C.J.C., Smola, A.J., Eds.; MIT Press: Cambridge, MA, USA, 1999; pp. 185–208. [Google Scholar]
  46. Mountrakis, G.; Im, J.; Ogole, C. Support vector machines in remote sensing: A review. ISPRS J. Photogramm. Remote Sens. 2011, 66, 247–259. [Google Scholar] [CrossRef]
  47. Belgiu, M.; Drăgu, L. Random forest in remote sensing: A review of applications and future directions. ISPRS J. Photogramm. Remote Sens. 2016, 114, 24–31. [Google Scholar] [CrossRef]
  48. Long, J.A.; Lawrence, R.L. Mapping percent tree mortality due to mountain pine beetle damage. For. Sci. 2016, 62, 392–402. [Google Scholar] [CrossRef]
  49. Hosmer, D.W., Jr.; Lemeshow, S.; Sturdivant, R.X. Applied Logistic Regression, 3rd ed.; Wiley: Hoboken, NJ, USA, 2013. [Google Scholar]
  50. Abe, S. Support Vector Machines for Pattern Classification; Advances in Pattern Recognition; Springer London: London, UK, 2010; Volume 26. [Google Scholar]
  51. Long, P.M.; Servedio, R.A. Random classification noise defeats all convex potential boosters. Mach. Learn. 2010, 78, 287–304. [Google Scholar] [CrossRef]
  52. Bekkar, M.; Akrouf Alitouche, T. Imbalanced Data Learning Approaches Review. Int. J. Data Min. Knowl. Manag. Process 2013, 3, 15–33. [Google Scholar] [CrossRef]
  53. Horler, D.N.H.; Dockray, M.; Barber, J. The red edge of plant leaf reflectance. Int. J. Remote Sens. 1983, 4, 273–288. [Google Scholar] [CrossRef]
  54. Chitarra, L.G.; Galbieri, R. Controle de doenças no algodeiro em Mato Grosso. In Manual de Boas Práticas de Manejo do Algodeiro em Mato Grosso; Belot, J.L., Ed.; IMAmt: Cuiabá, Brazil, 2015; pp. 166–177. [Google Scholar]
  55. Candiago, S.; Remondino, F.; De Giglio, M.; Dubbini, M.; Gattelli, M. Evaluating multispectral images and vegetation indices for precision farming applications from UAV images. Remote Sens. 2015, 7, 4026–4047. [Google Scholar] [CrossRef]
  56. Chen, B.; Li, S.; Wang, K.; Zhou, G.; Bai, J. Evaluating the severity level of cotton Verticillium using spectral signature analysis. Int. J. Remote Sens. 2012, 33, 2706–2724. [Google Scholar] [CrossRef]
  57. Mahlein, A.-K. Plant Disease Detection by Imaging Sensors—Parallels and Specific Demands for Precision Agriculture and Plant Phenotyping. Plant Dis. 2016, 100, 241–251. [Google Scholar] [CrossRef]
  58. Mahlein, A.K.; Steiner, U.; Dehne, H.W.; Oerke, E.C. Spectral signatures of sugar beet leaves for the detection and differentiation of diseases. Precis. Agric. 2010, 11, 413–431. [Google Scholar] [CrossRef]
  59. Calderón, R.; Navas-Cortés, J.A.; Lucena, C.; Zarco-Tejada, P.J. High-resolution airborne hyperspectral and thermal imagery for early detection of Verticillium wilt of olive using fluorescence, temperature and narrow-band spectral indices. Remote Sens. Environ. 2013, 139, 231–245. [Google Scholar] [CrossRef]
  60. Bravo, C.; Moshou, D.; West, J.; McCartney, A.; Ramon, H. Early Disease Detection in Wheat Fields using Spectral Reflectance. Biosyst. Eng. 2003, 84, 137–145. [Google Scholar] [CrossRef]
  61. Navrozidis, I.; Alexandridis, T.K.; Dimitrakos, A.; Lagopodi, A.L.; Moshou, D.; Zalidis, G. Identification of purple spot disease on asparagus crops across spatial and spectral scales. Comput. Electron. Agric. 2018, 148, 322–329. [Google Scholar] [CrossRef]
  62. Yuan, L.; Pu, R.; Zhang, J. Using high spatial resolution satellite imagery for mapping powdery mildew at a regional scale. Precis. Agric. 2016, 17, 332–348. [Google Scholar] [CrossRef]
  63. Peña, J.; Torres-Sánchez, J.; Serrano-Pérez, A.; de Castro, A.; López-Granados, F. Quantifying Efficacy and Limits of Unmanned Aerial Vehicle (UAV) Technology for Weed Seedling Detection as Affected by Sensor Resolution. Sensors 2015, 15, 5609–5626. [Google Scholar] [CrossRef] [PubMed]
  64. Raj, S. Grading System for Cotton Diseases; Technical Bulletin; Central Institute for Cotton Research (CICR): Nagpur, India, 1988; pp. 1–7. [Google Scholar]
Figure 1. (a) Experimental test site of the IMAmt, (b) parcel structure with inner buffer, and (c) plant lines in a SL camera true-color image and multispectral RGB-composites with inner buffers used for signature extraction as obtained from four flight heights with the value of spatial resolutions at ground (SRG) of each image.
Figure 2. An eight-rotor oktokopter, model MK Okto XL 6S12, and a docked TetraCam ADC Lite multispectral camera.
Figure 3. Classification key for the ramularia blight infection severity of cotton leaves. For comparison with unmanned airborne vehicle (UAV) imagery, individual plant classification was averaged, rounded by parcel, and labeled with the same key. (a) 0: without symptom; (b) 1: ≤5% of the leaf area infected, without incidences in the middle layer; (c) 2: 5–25% of the leaf area infected, incidences in the middle layer; (d) 3: 25–50% of the leaf area infected, incidences in the upper layer; and (e) 4: >50% of the leaf area infected, incidences in the upper layer, leaf loss.
Figure 4. Band-wise DCnor values at different UAV flight heights, (a) green, (b) red, and (c) near-infrared.
Figure 5. Overall accuracy and kappa index for three-band classifications of four, three, and two ramularia blight infection levels. The cotton pixel DCnor values have been averaged by parcel (median) for the classifications.
Table 1. Spatial resolution at ground (SRG), atmospheric normalization factor (f_a), and descriptive statistics of the cotton pixel values after normalization (DCnor) for the three spectral bands and different flight heights. The trends of the statistical parameters do not change when an identical pixel number (n = 6102) is drawn for all flight heights by random sampling; according to the Mann–Whitney test (MWT), the distributions of the band-specific DCnor from the different heights then do not differ significantly (p > 0.05).

| Band | Height (m) | SRG (cm) | f_a | N | Mean | Median | SD | Amplitude | Min | Max |
|---|---|---|---|---|---|---|---|---|---|---|
| GRE | 100 | ~5 | 1.000 | 332,345 | 43.16 | 43.02 | 8.40 | 62.79 | 13.95 | 76.74 |
| GRE | 300 | ~15 | 1.011 | 36,647 | 43.57 | 43.82 | 6.54 | 49.16 | 22.44 | 71.60 |
| GRE | 500 | ~25 | 1.027 | 12,949 | 43.84 | 43.79 | 5.88 | 38.81 | 24.88 | 63.70 |
| GRE | 700 | ~35 | 1.085 | 6102 | 45.64 | 45.20 | 5.74 | 38.90 | 27.33 | 66.23 |
| RED | 100 | ~5 | 1.000 | 332,345 | 8.23 | 8.14 | 6.30 | 75.58 | 0.00 | 75.58 |
| RED | 300 | ~15 | 1.066 | 36,647 | 9.29 | 9.01 | 6.19 | 46.18 | 0.00 | 46.18 |
| RED | 500 | ~25 | 1.150 | 12,949 | 9.67 | 8.92 | 5.89 | 33.44 | 0.00 | 33.44 |
| RED | 700 | ~35 | 1.334 | 6102 | 12.65 | 12.93 | 7.07 | 49.12 | 0.00 | 49.12 |
| NIR | 100 | ~5 | 1.000 | 332,345 | 122.96 | 124.42 | 23.97 | 160.47 | 41.86 | 202.33 |
| NIR | 300 | ~15 | 0.994 | 36,647 | 121.47 | 122.92 | 18.28 | 119.77 | 57.78 | 177.56 |
| NIR | 500 | ~25 | 0.969 | 12,949 | 116.08 | 117.41 | 15.67 | 101.44 | 68.56 | 170.00 |
| NIR | 700 | ~35 | 0.963 | 6102 | 110.58 | 111.95 | 14.53 | 83.03 | 69.97 | 153.00 |
Table 2. p-values of the unpaired MWT for the comparison between ramularia areola infection levels at the different flight heights. Rows give the reference class and flight height; column groups give the compared class and spectral band.

| Class | Height | 1: GRE | 1: RED | 1: NIR | 2: GRE | 2: RED | 2: NIR | 3: GRE | 3: RED | 3: NIR | 4: GRE | 4: RED | 4: NIR |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 100 m | 0.564 | 0.974 | 0.519 | 0.094 | 0.011 | 0.103 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 |
| 0 | 300 m | 0.393 | 0.205 | 0.208 | 0.058 | 0.304 | 0.017 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 |
| 0 | 500 m | 0.167 | 0.390 | 0.411 | 0.015 | 0.255 | 0.063 | 0.000 | 0.004 | 0.000 | 0.000 | 0.000 | 0.000 |
| 0 | 700 m | 0.255 | 0.339 | 0.623 | 0.248 | 0.380 | 0.759 | 0.000 | 0.006 | 0.000 | 0.000 | 0.000 | 0.000 |
| 1 | 100 m | – | – | – | 0.152 | 0.013 | 0.142 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 |
| 1 | 300 m | – | – | – | 0.501 | 0.817 | 0.598 | 0.001 | 0.016 | 0.001 | 0.000 | 0.000 | 0.000 |
| 1 | 500 m | – | – | – | 0.522 | 0.855 | 0.724 | 0.000 | 0.043 | 0.004 | 0.000 | 0.000 | 0.000 |
| 1 | 700 m | – | – | – | 0.844 | 0.688 | 0.881 | 0.016 | 0.205 | 0.016 | 0.000 | 0.000 | 0.000 |
| 2 | 100 m | – | – | – | – | – | – | 0.000 | 0.014 | 0.000 | 0.000 | 0.000 | 0.000 |
| 2 | 300 m | – | – | – | – | – | – | 0.000 | 0.001 | 0.000 | 0.000 | 0.000 | 0.000 |
| 2 | 500 m | – | – | – | – | – | – | 0.000 | 0.029 | 0.000 | 0.000 | 0.000 | 0.000 |
| 2 | 700 m | – | – | – | – | – | – | 0.000 | 0.025 | 0.000 | 0.000 | 0.000 | 0.000 |
| 3 | 100 m | – | – | – | – | – | – | – | – | – | 0.009 | 0.001 | 0.016 |
| 3 | 300 m | – | – | – | – | – | – | – | – | – | 0.009 | 0.000 | 0.006 |
| 3 | 500 m | – | – | – | – | – | – | – | – | – | 0.003 | 0.003 | 0.003 |
| 3 | 700 m | – | – | – | – | – | – | – | – | – | 0.006 | 0.000 | 0.002 |
