Early Detection of Wheat Yellow Rust Disease and Its Impact on Terminal Yield with Multi-Spectral UAV-Imagery
Abstract
1. Introduction
2. Study Area and Data Collection
2.1. Study Area
2.2. Aerial Data Collection
2.3. Ground Data Collection
3. Method
3.1. Descriptive Statistics
3.2. Feature Engineering for Machine Learning
3.3. A Machine Learning Pipeline
3.4. Deep Convolutional Neural Networks
3.5. Model Evaluation and Assessment of Disease Impact on Yield Loss
4. Results
4.1. Temporal and Spectral Dimensions by Disease Status
4.2. Dimensional Reduction by Principal Component Analysis
4.3. Classification Performance of Fused Data and Classifiers
4.4. Spatial Mapping of Disease Status
4.5. Spatial Assessment of Disease Impact on Yield Loss
5. Discussion
5.1. Importance of Spectral, Texture, and Temporal Information
5.2. Prediction Performance of Multimodality Fusion
5.3. Comparison between Traditional Machine Learning and Deep Learning 3D CNN
5.4. Impact of Disease on Permanent Yield Loss
5.5. Limitations of UAV Aerial Data
6. Conclusions
- Red-edge (RE, 690–740 nm) and near-infrared (NIR, 740–1000 nm) were the critical spectral bands for distinguishing healthy wheat from severely yellow-rust-infected wheat. None of the raw spectral bands obtained from the UAV could differentiate healthy from mildly infected wheat.
- Among the engineered features, the carotenoid reflectance index 2 (CRI2) [70] and the soil-adjusted vegetation index 2 (SAVI2) [48] were the two most influential vegetation indices for early detection of infection. Similarly, GLCM contrast texture computed from the airborne multispectral images at an optimal distance of d = 5 pixels and angular direction of θ = 135° contributed the most predictive value for disease sensing.
- The AI-powered wheat disease monitoring achieved 60% detection accuracy as early as 40 days after sowing (DAS), during tillering, and increased to 71% and 77% at the later booting and flowering stages (100–120 DAS). Accuracy peaked at 79% for the spectral-spatio-temporal fused data model.
- Because it learns directly from imagery through an automated learning process rather than relying on hand-crafted features and human judgement, the 3D convolutional neural network (3D-CNN) outperformed the feature-based machine learning methods in detecting the wheat disease invasion early (a minimal illustrative sketch follows this list).
- Severe yellow rust attacks caused significant and permanent yield loss. Successful early sensing with low-cost multispectral UAVs could preserve an estimated 3–7% of yield at the 95% confidence level.
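The following is a minimal sketch of a 3D-CNN that classifies spectro-temporal plot patches into the three health classes, written in Keras. The patch dimensions, layer sizes, and training settings are illustrative assumptions rather than the architecture reported here; the sketch only shows how the temporal, spatial, and spectral dimensions can be convolved jointly.

```python
# Illustrative 3D-CNN for spectro-temporal patch classification (assumed sizes).
import numpy as np
from tensorflow.keras import layers, models

T, H, W, B = 5, 16, 16, 5      # assumed: 5 flight dates, 16x16-pixel plot patches, 5 bands
NUM_CLASSES = 3                # healthy, mild infection, severe infection

model = models.Sequential([
    layers.Input(shape=(T, H, W, B)),
    layers.Conv3D(16, kernel_size=(2, 3, 3), padding="same", activation="relu"),
    layers.BatchNormalization(),
    layers.MaxPooling3D(pool_size=(1, 2, 2)),
    layers.Conv3D(32, kernel_size=(2, 3, 3), padding="same", activation="relu"),
    layers.BatchNormalization(),
    layers.GlobalAveragePooling3D(),
    layers.Dropout(0.5),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Hypothetical training call: X holds plot patches, y holds class ids 0-2.
X = np.random.rand(32, T, H, W, B).astype("float32")
y = np.random.randint(0, NUM_CLASSES, size=32)
model.fit(X, y, epochs=2, batch_size=8, verbose=0)
```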
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Yuan, L.; Huang, Y.; Loraamm, R.W.; Nie, C.; Wang, J.; Zhang, J. Spectral analysis of winter wheat leaves for detection and differentiation of diseases and insects. Field Crop. Res. 2014, 156, 199–207. [Google Scholar] [CrossRef] [Green Version]
- Moshou, D.; Bravo, C.; West, J.; Wahlen, S.; McCartney, A.; Ramon, H. Automatic detection of ‘yellow rust’ in wheat using reflectance measurements and neural networks. Comput. Electron. Agric. 2004, 44, 173–188. [Google Scholar] [CrossRef]
- Duan, T.; Chapman, S.; Guo, Y.; Zheng, B. Dynamic monitoring of NDVI in wheat agronomy and breeding trials using an unmanned aerial vehicle. Field Crop. Res. 2017, 210, 71–80. [Google Scholar] [CrossRef]
- Su, J.; Liu, C.; Coombes, M.; Hu, X.; Wang, C.; Xu, X.; Li, Q.; Guo, L.; Chen, W.-H. Wheat yellow rust monitoring by learning from multispectral UAV aerial imagery. Comput. Electron. Agric. 2018, 155, 157–166. [Google Scholar] [CrossRef]
- De Castro, A.; Ehsani, R.; Ploetz, R.; Crane, J.; Abdulridha, J. Optimum spectral and geometric parameters for early detection of laurel wilt disease in avocado. Remote Sens. Environ. 2015, 171, 33–44. [Google Scholar] [CrossRef]
- Su, J.; Liu, C.; Hu, X.; Xu, X.; Guo, L.; Chen, W.-H. Spatio-temporal monitoring of wheat yellow rust using UAV multispectral imagery. Comput. Electron. Agric. 2019, 167, 105035. [Google Scholar] [CrossRef]
- Zheng, Q.; Huang, W.; Cui, X.; Dong, Y.; Shi, Y.; Ma, H.; Liu, L. Identification of wheat yellow rust using optimal three-band spectral indices in different growth stages. Sensors 2018, 19, 35. [Google Scholar] [CrossRef] [Green Version]
- Chen, X. Epidemiology and control of stripe rust [Puccinia striiformis f. sp. tritici] on wheat. Can. J. Plant Pathol. 2005, 27, 314–337. [Google Scholar] [CrossRef]
- Nguyen, C.; Sagan, V.; Maimaitiyiming, M.; Maimaitijiang, M.; Bhadra, S.; Kwasniewski, M.T. Early detection of plant viral disease using hyperspectral imaging and deep learning. Sensors 2021, 21, 742. [Google Scholar] [CrossRef]
- Sankaran, S.; Mishra, A.; Maja, J.M.; Ehsani, R. Visible-near infrared spectroscopy for detection of Huanglongbing in citrus orchards. Comput. Electron. Agric. 2011, 77, 127–134. [Google Scholar] [CrossRef]
- Devadas, R.; Lamb, D.; Simpfendorfer, S.; Backhouse, D. Evaluating ten spectral vegetation indices for identifying rust infection in individual wheat leaves. Precis. Agric. 2009, 10, 459–470. [Google Scholar] [CrossRef]
- Bagheri, N.; Mohamadi-Monavar, H.; Azizi, A.; Ghasemi, A. Detection of Fire Blight disease in pear trees by hyperspectral data. Eur. J. Remote Sens. 2018, 51, 1–10. [Google Scholar] [CrossRef] [Green Version]
- Ashourloo, D.; Mobasheri, M.R.; Huete, A. Evaluating the effect of different wheat rust disease symptoms on vegetation indices using hyperspectral measurements. Remote Sens. 2014, 6, 5107–5123. [Google Scholar] [CrossRef] [Green Version]
- Franke, J.; Menz, G.; Oerke, E.-C.; Rascher, U. Comparison of multi-and hyperspectral imaging data of leaf rust infected wheat plants. In Remote Sensing for Agriculture, Ecosystems, and Hydrology VII; SPIE: Philadelphia, PA, USA, 2005; pp. 349–359. [Google Scholar]
- Huang, W.; Lamb, D.W.; Niu, Z.; Zhang, Y.; Liu, L.; Wang, J. Identification of yellow rust in wheat using in-situ spectral reflectance measurements and airborne hyperspectral imaging. Precis. Agric. 2007, 8, 187–197. [Google Scholar] [CrossRef]
- Gamon, J.A.; Peñuelas, J.; Field, C.B. A narrow-waveband spectral index that tracks diurnal changes in photosynthetic efficiency. Remote Sens. Environ. 1992, 41, 35–44. [Google Scholar] [CrossRef]
- Rouse, J.W.; Haas, R.H.; Schell, J.A.; Deering, D.W. Monitoring vegetation systems in the Great Plains with ERTS. Nasa Spec. Publ. 1974, 351, 309. [Google Scholar]
- Kavdır, I.; Guyer, D. Comparison of artificial neural networks and statistical classifiers in apple sorting using textural features. Biosyst. Eng. 2004, 89, 331–344. [Google Scholar] [CrossRef]
- Pydipati, R.; Burks, T.; Lee, W. Identification of citrus disease using color texture features and discriminant analysis. Comput. Electron. Agric. 2006, 52, 49–59. [Google Scholar] [CrossRef]
- Lu, J.; Zhou, M.; Gao, Y.; Jiang, H. Using hyperspectral imaging to discriminate yellow leaf curl disease in tomato leaves. Precis. Agric. 2018, 19, 379–394. [Google Scholar] [CrossRef]
- Guo, A.; Huang, W.; Ye, H.; Dong, Y.; Ma, H.; Ren, Y.; Ruan, C. Identification of wheat yellow rust using spectral and texture features of hyperspectral images. Remote Sens. 2020, 12, 1419. [Google Scholar] [CrossRef]
- Guo, A.; Huang, W.; Dong, Y.; Ye, H.; Ma, H.; Liu, B.; Wu, W.; Ren, Y.; Ruan, C.; Geng, Y. Wheat Yellow Rust Detection Using UAV-Based Hyperspectral Technology. Remote Sens. 2021, 13, 123. [Google Scholar] [CrossRef]
- Shafi, U.; Mumtaz, R.; Haq, I.U.; Hafeez, M.; Iqbal, N.; Shaukat, A.; Zaidi, S.M.H.; Mahmood, Z. Wheat Yellow Rust Disease Infection Type Classification Using Texture Features. Sensors 2022, 22, 146. [Google Scholar] [CrossRef] [PubMed]
- Al-Saddik, H.; Laybros, A.; Billiot, B.; Cointault, F. Using image texture and spectral reflectance analysis to detect Yellowness and Esca in grapevines at leaf-level. Remote Sens. 2018, 10, 618. [Google Scholar] [CrossRef] [Green Version]
- Bohnenkamp, D.; Behmann, J.; Mahlein, A.-K. In-Field Detection of Yellow Rust in Wheat on the Ground Canopy and UAV Scale. Remote Sens. 2019, 11, 2495. [Google Scholar] [CrossRef] [Green Version]
- Zhao, Y.; Qin, F.; Xu, F.; Ma, J.; Sun, Z.; Song, Y.; Zhao, L.; Li, J.; Wang, H. Identification of Tilletia foetida, Ustilago tritici, and Urocystis tritici based on near-infrared spectroscopy. J. Spectrosc. 2019, 2019, 9753829. [Google Scholar] [CrossRef] [Green Version]
- Duarte-Carvajalino, J.M.; Alzate, D.F.; Ramirez, A.A.; Santa-Sepulveda, J.D.; Fajardo-Rojas, A.E.; Soto-Suárez, M. Evaluating late blight severity in potato crops using unmanned aerial vehicles and machine learning algorithms. Remote Sens. 2018, 10, 1513. [Google Scholar] [CrossRef] [Green Version]
- Zhang, X.; Han, L.; Dong, Y.; Shi, Y.; Huang, W.; Han, L.; González-Moreno, P.; Ma, H.; Ye, H.; Sobeih, T. A Deep Learning-Based Approach for Automated Yellow Rust Disease Detection from High-Resolution Hyperspectral UAV Images. Remote Sens. 2019, 11, 1554. [Google Scholar] [CrossRef] [Green Version]
- Fukushima, K. Neocognitron: A self-organizing neural network model for a mechanism of pattern recognition unaffected by shift in position. Biol. Cybern. 1980, 36, 193–202. [Google Scholar] [CrossRef]
- LeCun, Y.; Bengio, Y.; Hinton, G. Deep learning. Nature 2015, 521, 436–444. [Google Scholar] [CrossRef]
- LeCun, Y.; Bottou, L.; Bengio, Y.; Haffner, P. Gradient-based learning applied to document recognition. Proc. IEEE 1998, 86, 2278–2324. [Google Scholar] [CrossRef] [Green Version]
- Su, J.; Yi, D.; Su, B.; Mi, Z.; Liu, C.; Hu, X.; Xu, X.; Guo, L.; Chen, W.-H. Aerial visual perception in smart farming: Field study of wheat yellow rust monitoring. IEEE Trans. Ind. Inform. 2020, 17, 2242–2249. [Google Scholar] [CrossRef] [Green Version]
- Pan, Q.; Gao, M.; Wu, P.; Yan, J.; Li, S. A Deep-Learning-Based Approach for Wheat Yellow Rust Disease Recognition from Unmanned Aerial Vehicle Images. Sensors 2021, 21, 6540. [Google Scholar] [CrossRef] [PubMed]
- Sagan, V.; Maimaitijiang, M.; Bhadra, S.; Maimaitiyiming, M.; Brown, D.R.; Sidike, P.; Fritschi, F.B. Field-scale crop yield prediction using multi-temporal WorldView-3 and PlanetScope satellite data and deep learning. ISPRS J. Photogramm. Remote Sens. 2021, 174, 265–281. [Google Scholar] [CrossRef]
- Nguyen, C.; Sagan, V.; Bhadra, S.; Moose, S. UAV Multisensory Data Fusion and Multi-Task Deep Learning for High-Throughput Maize Phenotyping. Sensors 2023, 23, 1827. [Google Scholar] [CrossRef]
- Gitelson, A.A.; Merzlyak, M.N. Remote sensing of chlorophyll concentration in higher plant leaves. Adv. Space Res. 1998, 22, 689–692. [Google Scholar] [CrossRef]
- Birth, G.S.; McVey, G.R. Measuring the color of growing turf with a reflectance spectrophotometer 1. Agron. J. 1968, 60, 640–643. [Google Scholar] [CrossRef]
- Gitelson, A.A.; Gritz, Y.; Merzlyak, M.N. Relationships between leaf chlorophyll content and spectral reflectance and algorithms for non-destructive chlorophyll assessment in higher plant leaves. J. Plant Physiol. 2003, 160, 271–282. [Google Scholar] [CrossRef]
- Jordan, C.F. Derivation of leaf-area index from quality of light on the forest floor. Ecology 1969, 50, 663–666. [Google Scholar] [CrossRef]
- Tucker, C.J. Red and photographic infrared linear combinations for monitoring vegetation. Remote Sens. Environ. 1979, 8, 127–150. [Google Scholar] [CrossRef] [Green Version]
- Huete, A.R. A soil-adjusted vegetation index (SAVI). Remote Sens. Environ. 1988, 25, 295–309. [Google Scholar] [CrossRef]
- Qi, J.; Chehbouni, A.; Huete, A.R.; Kerr, Y.H.; Sorooshian, S. A modified soil adjusted vegetation index. Remote Sens. Environ. 1994, 48, 119–126. [Google Scholar] [CrossRef]
- Rondeaux, G.; Steven, M.; Baret, F. Optimization of soil-adjusted vegetation indices. Remote Sens. Environ. 1996, 55, 95–107. [Google Scholar] [CrossRef]
- Roujean, J.-L.; Breon, F.-M. Estimating PAR absorbed by vegetation from bidirectional reflectance measurements. Remote Sens. Environ. 1995, 51, 375–384. [Google Scholar] [CrossRef]
- Broge, N.H.; Leblanc, E. Comparing prediction power and stability of broadband and hyperspectral vegetation indices for estimation of green leaf area index and canopy chlorophyll density. Remote Sens. Environ. 2001, 76, 156–172. [Google Scholar] [CrossRef]
- Baret, F.; Guyot, G.; Major, D. Crop biomass evaluation using radiometric measurements. Photogrammetria 1989, 43, 241–256. [Google Scholar] [CrossRef]
- Richardson, A.J.; Wiegand, C. Distinguishing vegetation from soil background information. Photogramm. Eng. Remote Sens. 1977, 43, 1541–1552. [Google Scholar]
- Major, D.J.; Baret, F.; Guyot, G. A ratio vegetation index adjusted for soil brightness. Int. J. Remote Sens. 1990, 11, 727–740. [Google Scholar] [CrossRef]
- Baret, F.; Guyot, G. Potentials and limits of vegetation indices for LAI and APAR assessment. Remote Sens. Environ. 1991, 35, 161–173. [Google Scholar] [CrossRef]
- McFeeters, S.K. The use of the Normalized Difference Water Index (NDWI) in the delineation of open water features. Int. J. Remote Sens. 1996, 17, 1425–1432. [Google Scholar] [CrossRef]
- Peñuelas, J.; Gamon, J.; Fredeen, A.; Merino, J.; Field, C. Reflectance indices associated with physiological changes in nitrogen-and water-limited sunflower leaves. Remote Sens. Environ. 1994, 48, 135–146. [Google Scholar] [CrossRef]
- Shibayama, M.; Akiyama, T. Seasonal visible, near-infrared and mid-infrared spectra of rice canopies in relation to LAI and above-ground dry phytomass. Remote Sens. Environ. 1989, 27, 119–127. [Google Scholar] [CrossRef]
- Daughtry, C.S.; Walthall, C.; Kim, M.; De Colstoun, E.B.; McMurtrey Iii, J. Estimating corn leaf chlorophyll concentration from leaf and canopy reflectance. Remote Sens. Environ. 2000, 74, 229–239. [Google Scholar] [CrossRef]
- Haboudane, D.; Miller, J.R.; Pattey, E.; Zarco-Tejada, P.J.; Strachan, I.B. Hyperspectral vegetation indices and novel algorithms for predicting green LAI of crop canopies: Modeling and validation in the context of precision agriculture. Remote Sens. Environ. 2004, 90, 337–352. [Google Scholar] [CrossRef]
- Eitel, J.; Long, D.; Gessler, P.; Smith, A. Using in-situ measurements to evaluate the new RapidEye™ satellite series for prediction of wheat nitrogen status. Int. J. Remote Sens. 2007, 28, 4183–4190. [Google Scholar] [CrossRef]
- Huete, A.; Didan, K.; Miura, T.; Rodriguez, E.P.; Gao, X.; Ferreira, L.G. Overview of the radiometric and biophysical performance of the MODIS vegetation indices. Remote Sens. Environ. 2002, 83, 195–213. [Google Scholar] [CrossRef]
- Datt, B. A new reflectance index for remote sensing of chlorophyll content in higher plants: Tests using Eucalyptus leaves. J. Plant Physiol. 1999, 154, 30–36. [Google Scholar] [CrossRef]
- Marshak, A.; Knyazikhin, Y.; Davis, A.; Wiscombe, W.; Pilewskie, P. Cloud-vegetation interaction: Use of normalized difference cloud index for estimation of cloud optical thickness. Geophys. Res. Lett. 2000, 27, 1695–1698. [Google Scholar] [CrossRef] [Green Version]
- Merzlyak, M.N.; Solovchenko, A.E. Photostability of pigments in ripening apple fruit: A possible photoprotective role of carotenoids during plant senescence. Plant Sci. 2002, 163, 881–888. [Google Scholar] [CrossRef]
- Penuelas, J.; Baret, F.; Filella, I. Semi-empirical indices to assess carotenoids/chlorophyll a ratio from leaf spectral reflectance. Photosynthetica 1995, 31, 221–230. [Google Scholar]
- Vincini, M.; Frazzi, E.; D’Alessio, P. Angular dependence of maize and sugar beet VIs from directional CHRIS/Proba data. In Proceedings of the 4th ESA CHRIS PROBA Workshop, Frascati, Italy, 19–21 September 2006; pp. 19–21. [Google Scholar]
- Haboudane, D.; Miller, J.R.; Tremblay, N.; Zarco-Tejada, P.J.; Dextraze, L. Integrated narrow-band vegetation indices for prediction of crop chlorophyll content for application to precision agriculture. Remote Sens. Environ. 2002, 81, 416–426. [Google Scholar] [CrossRef]
- Xu, M.; Liu, R.; Chen, J.M.; Liu, Y.; Shang, R.; Ju, W.; Wu, C.; Huang, W. Retrieving leaf chlorophyll content using a matrix-based vegetation index combination approach. Remote Sens. Environ. 2019, 224, 60–73. [Google Scholar] [CrossRef]
- Barnes, E.; Clarke, T.; Richards, S.; Colaizzi, P.; Haberland, J.; Kostrzewski, M.; Waller, P.; Choi, C.; Riley, E.; Thompson, T. Coincident detection of crop water stress, nitrogen status and canopy density using ground based multispectral data. In Proceedings of the Fifth International Conference on Precision Agriculture, Bloomington, MN, USA, 16–19 July 2000. [Google Scholar]
- Dash, J.; Curran, P. The MERIS terrestrial chlorophyll index. Int. J. Remote Sens. 2004, 25, 5403–5413. [Google Scholar] [CrossRef]
- Jiang, Z.; Huete, A.R.; Didan, K.; Miura, T. Development of a two-band enhanced vegetation index without a blue band. Remote Sens. Environ. 2008, 112, 3833–3845. [Google Scholar] [CrossRef]
- Woebbecke, D.M.; Meyer, G.E.; Von Bargen, K.; Mortensen, D.A. Color indices for weed identification under various soil, residue, and lighting conditions. Trans. ASAE 1995, 38, 259–269. [Google Scholar] [CrossRef]
- Strong, C.J.; Burnside, N.G.; Llewellyn, D. The potential of small-Unmanned Aircraft Systems for the rapid detection of threatened unimproved grassland communities using an Enhanced Normalized Difference Vegetation Index. PLoS ONE 2017, 12, e0186193. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Gitelson, A.A.; Merzlyak, M.N.; Chivkunova, O.B. Optical properties and nondestructive estimation of anthocyanin content in plant leaves. Photochem. Photobiol. 2001, 74, 38–45. [Google Scholar] [CrossRef]
- Gitelson, A.A.; Zur, Y.; Chivkunova, O.B.; Merzlyak, M.N. Assessing carotenoid content in plant leaves with reflectance spectroscopy. Photochem. Photobiol. 2002, 75, 272–281. [Google Scholar] [CrossRef]
- Haralick, R.M.; Shanmugam, K.; Dinstein, I.H. Textural features for image classification. IEEE Trans. Syst. Man Cybern. 1973, 3, 610–621. [Google Scholar] [CrossRef] [Green Version]
- Cortes, C.; Vapnik, V. Support-vector networks. Mach. Learn. 1995, 20, 273–297. [Google Scholar] [CrossRef]
- Breiman, L. Random forests. Mach. Learn. 2001, 45, 5–32. [Google Scholar] [CrossRef] [Green Version]
- Hinton, G.E. Connectionist learning procedures. In Machine Learning; Elsevier: Amsterdam, The Netherlands, 1990; pp. 555–610. [Google Scholar]
- Glorot, X.; Bengio, Y. Understanding the difficulty of training deep feedforward neural networks. In Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, Sardinia, Italy, 13–15 May 2010; pp. 249–256. [Google Scholar]
- Kingma, D.P.; Ba, J. Adam: A method for stochastic optimization. arXiv 2014, arXiv:1412.6980. [Google Scholar]
- Kohavi, R. A study of cross-validation and bootstrap for accuracy estimation and model selection. In Proceedings of the 14th International Joint Conference on Artificial Intelligence, Montreal, QC, Canada, 20–25 August 1995; pp. 1137–1145. [Google Scholar]
- Ioffe, S.; Szegedy, C. Batch normalization: Accelerating deep network training by reducing internal covariate shift. In Proceedings of the International Conference on Machine Learning, Lille, France, 6–11 July 2015; pp. 448–456. [Google Scholar]
- Nair, V.; Hinton, G.E. Rectified linear units improve restricted boltzmann machines. In Proceedings of the 27th International Conference on International Conference on Machine Learning, Madison, WI, USA, 21–24 June 2010. [Google Scholar]
- Srivastava, N.; Hinton, G.; Krizhevsky, A.; Sutskever, I.; Salakhutdinov, R. Dropout: A simple way to prevent neural networks from overfitting. J. Mach. Learn. Res. 2014, 15, 1929–1958. [Google Scholar]
- Zou, H.; Hastie, T. Regularization and variable selection via the elastic net. J. R. Stat. Soc. Ser. B (Stat. Methodol.) 2005, 67, 301–320. [Google Scholar] [CrossRef] [Green Version]
- Tukey, J.W. Comparing individual means in the analysis of variance. Biometrics 1949, 5, 99–114. [Google Scholar] [CrossRef]
- Maimaitijiang, M.; Sagan, V.; Sidike, P.; Hartling, S.; Esposito, F.; Fritschi, F.B. Soybean yield prediction from UAV using multimodal data fusion and deep learning. Remote Sens. Environ. 2020, 237, 111599. [Google Scholar] [CrossRef]
- Hartling, S.; Sagan, V.; Maimaitijiang, M. Urban tree species classification using UAV-based multi-sensor data fusion and machine learning. GIScience Remote Sens. 2021, 58, 1250–1275. [Google Scholar] [CrossRef]
- Hartling, S.; Sagan, V.; Sidike, P.; Maimaitijiang, M.; Carron, J. Urban Tree Species Classification Using a WorldView-2/3 and LiDAR Data Fusion Approach and Deep Learning. Sensors 2019, 19, 1284. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Thenkabail, P.S.; Lyon, J.G. Hyperspectral Remote Sensing of Vegetation; CRC Press: Boca Raton, FL, USA, 2016. [Google Scholar]
- Neupane, K.; Baysal-Gurel, F. Automatic Identification and Monitoring of Plant Diseases Using Unmanned Aerial Vehicles: A Review. Remote Sens. 2021, 13, 3841. [Google Scholar] [CrossRef]
- Xiao, Y.; Dong, Y.; Huang, W.; Liu, L.; Ma, H. Wheat fusarium head blight detection using UAV-based spectral and texture features in optimal window size. Remote Sens. 2021, 13, 2437. [Google Scholar] [CrossRef]
- Raschka, S. Python Machine Learning; Packt Publishing Ltd.: Birmingham, UK, 2015. [Google Scholar]
- Rapilly, F. Yellow rust epidemiology. Annu. Rev. Phytopathol. 1979, 17, 59–73. [Google Scholar] [CrossRef]
- Chen, X. Integration of cultivar resistance and fungicide application for control of wheat stripe rust. Can. J. Plant Pathol. 2014, 36, 311–326. [Google Scholar] [CrossRef]
- Sharma, R.C.; Nazari, K.; Amanov, A.; Ziyaev, Z.; Jalilov, A.U. Reduction of winter wheat yield losses caused by stripe rust through fungicide management. J. Phytopathol. 2016, 164, 671–677. [Google Scholar] [CrossRef] [Green Version]
- Zhang, J.; Wang, C.; Yang, C.; Xie, T.; Jiang, Z.; Hu, T.; Luo, Z.; Zhou, G.; Xie, J. Assessing the effect of real spatial resolution of in situ UAV multispectral images on seedling rapeseed growth monitoring. Remote Sens. 2020, 12, 1207. [Google Scholar] [CrossRef] [Green Version]
- Dash, J.P.; Watt, M.S.; Pearse, G.D.; Heaphy, M.; Dungey, H.S. Assessing very high resolution UAV imagery for monitoring forest health during a simulated disease outbreak. ISPRS J. Photogramm. Remote Sens. 2017, 131, 1–14. [Google Scholar] [CrossRef]
Dates | 19 June | 30 August | 24 September | 5 October | 7 October | 26 October | 15 November | 17 November | 16 December
---|---|---|---|---|---|---|---|---|---
Days After Sowing (DAS) | 0 | 40 | 65 | 77 | 79 | 98 | 119 | 121 | 150
Crop Growing Stages | Planting | Tillering | Stem Extension | Stem Extension/Heading | Stem Extension/Heading | Heading/Flowering | Flowering | Flowering | Harvesting
Data Collection Types | | UAV | UAV | UAV | UAV | UAV | Health Status Annotation | UAV | Dry Grain Yield
Of the 700 field plots in total, 592 plots had an identified health status (healthy, mild infection, or severe infection), and the remaining 108 plots had an unknown health status.

Statistic | Healthy | Mild Infection | Severe Infection | Unknown Health Status | Total Plots
---|---|---|---|---|---
No. of plots | 106 | 430 | 56 | 108 | 700
Mean | 4046.35 | 4126.53 | 3482.70 | 4190.85 | 4072.80
STD | 466.55 | 469.37 | 608.73 | 469.57 | 512.59
CV (%) | 11.53 | 11.39 | 17.48 | 11.20 | 12.58
Min | 3012 | 2196 | 2174 | 2277 | 2174
25% | 3715.25 | 3871 | 3054.75 | 4038.75 | 3801.75
50% | 4126 | 4166.50 | 3550 | 4251 | 4143
75% | 4415.75 | 4435.75 | 3964.75 | 4450 | 4420.25
Max | 5094 | 5279 | 4539 | 5124 | 5279
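Per-status descriptive statistics of this kind (count, mean, STD, CV, and quartiles) can be reproduced from plot-level records with pandas. The sketch below builds a small hypothetical DataFrame; the column names and values are placeholders, not the published dataset.

```python
# Minimal sketch: per-group descriptive statistics with a coefficient of variation.
import pandas as pd

# Hypothetical plot-level records (one yield value per plot).
df = pd.DataFrame({
    "status": ["Healthy", "Healthy", "Mild", "Mild", "Severe", "Severe"],
    "yield":  [4046, 4126, 4166, 4435, 3482, 3550],
})

stats = df.groupby("status")["yield"].describe()          # count, mean, std, min, 25%, 50%, 75%, max
stats["cv_percent"] = 100 * stats["std"] / stats["mean"]  # coefficient of variation (%)
print(stats.round(2))
```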
No. | Index Names | Index Formula | Reference |
---|---|---|---|
1 | Normalized difference vegetation index | ndvi = (NIR − R)/(NIR + R) | [17] |
2 | Green normalized difference vegetation index | gndvi = (NIR − G)/(NIR + G) | [36] |
3 | Ratio vegetation index | rvi_1 = NIR/R | [37] |
4 | Green chlorophyll index | gci = (NIR/G) − 1.0 | [38] |
5 | Red-green ratio vegetation index | rgvi = R/G | [39] |
6 | Difference vegetation index | dvi = NIR − R | [40] |
7 | Soil-adjusted vegetation index | L = 0.5; savi = ((NIR − R)/(NIR + R + L)) × (1.0 + L) | [41]
8 | Modified soil-adjusted vegetation index | msavi = 0.5 × (2.0 × NIR + 1.0 − ((2.0 × NIR + 1.0)² − 8.0 × (NIR − R))^0.5) | [42]
9 | Optimized soil-adjusted vegetation index | osavi = (NIR − R)/(NIR + R + 0.16) | [43] |
10 | Renormalized difference vegetation index | rdvi = ((NIR − R)²/(NIR + R))^0.5 | [44]
11 | Triangular vegetation index | tvi = 60.0 × (NIR − G) − 100.0 × (R − G) | [45]
12 | Transformed soil-adjusted vegetation index | a, b = 0.96916, 0.084726 tsavi = (a × (NIR − a × R − b))/(a × NIR + R − a × b) | [46] |
13 | Perpendicular vegetation index | a, b = 0.96916, 0.084726; pvi = (NIR − a × R − b)/(1 + a²)^0.5 | [47]
14 | Soil-adjusted vegetation index 2 | a, b = 0.96916, 0.084726 savi_2 = NIR/(R − (b/a)) | [48] |
15 | Adjusted transformed soil-adjusted vegetation index | a, b, X = 0.96916, 0.084726, 0.08; atsavi = (a × (NIR − a × R − b))/(a × NIR + R − a × b + X × (1 + a²)) | [49]
16 | Normalized difference water index | ndwi = (G − NIR)/(G + NIR) | [50] |
17 | normalized total pigment to chlorophyll a ratio index | npci = (R − B)/(R + B) | [51] |
18 | Simple ratio pigment index | srpi = B/R | [51] |
19 | Ratio vegetation index | rvi_2 = NIR/G | [52] |
20 | Modified chlorophyll absorption ratio index | mcari = (RE − R − 0.2 × (RE − G)) × (RE/R) | [53] |
21 | Modified chlorophyll absorption ratio index 1 | mcari_1 = 1.2 × (2.5 × (NIR − R) − 1.3 × (NIR − G)) | [54] |
22 | Modified chlorophyll absorption ratio index 2 | mcari_2 = 1.5 × (2.5 × (NIR − R) − 1.3 × (NIR − G))/((2.0 × NIR + 1)² − (6.0 × NIR − 5.0 × R^0.5) − 0.5)^0.5 | [54]
23 | Modified triangular vegetation index 1 | mtvi_1 = 1.2 × (1.2 × (NIR − G) − 2.5 × (R − G)) | [54] |
24 | Modified triangular vegetation index 2 | mtvi_2 = 1.5 × (1.2 × (NIR − G) − 2.5 × (R − G))/((2 × NIR + 1)² − (6.0 × NIR − 5.0 × R^0.5) − 0.5)^0.5 | [54]
25 | Ratio of Modified chlorophyll absorption ratio index and Modified triangular vegetation index 2 | r_mcari_mtvi2 = mcari/mtvi_2 (indices 20 and 24) | [55]
26 | Enhanced vegetation index | evi = (NIR − R)/(NIR + 6.0 × R − 7.5 × B + 1.0) | [56] |
27 | Datt’s chlorophyll content | datt = (NIR − RE)/(NIR − R) | [57] |
28 | Normalized difference cloud index | ndci = (RE − G)/(RE + G) | [58] |
29 | Plant senescence reflectance index | psri = (R − G)/RE | [59] |
30 | Structure insensitive pigment index | sipi = (NIR − B)/(NIR + R) | [60] |
31 | Spectral polygon vegetation index | spvi = 0.4 × 3.7 × (NIR − R) − 1.2 × |G − R| | [61] |
32 | Transformed chlorophyll absorption in reflectance index | tcari = 3.0 × ((RE − R) − 0.2 × (RE − G) × (RE/R)) | [62] |
33 | Ratio of TCARI and OSAVI | r_tcari_osavi = (3.0 × ((RE − R) − 0.2 × (RE − G) × (RE/R)))/((NIR − R)/(NIR + R + 0.16)) | [62] |
34 | Red edge relative index | reri = (RE − R)/NIR | [63] |
35 | Normalized difference red edge index | ndre = (NIR − RE)/(NIR + RE) | [64] |
36 | MERIS terrestrial chlorophyll index | mtci = (NIR − RE)/(RE − R) | [65] |
37 | Enhanced vegetation index 2 | evi_2 = 2.5 × ((NIR − R)/(NIR + 2.4 × R + 1.0)) | [66] |
38 | Red edge chlorophyll index | reci = (NIR/RE) − 1 | [38] |
39 | Normalized excess green index | nexg = (2 × G − R − B)/(G + R + B) | [67] |
40 | Normalized green-red difference index | ngrdi = (G − R)/(G + R) | [40] |
41 | Enhanced normalized difference vegetation index | endvi = (NIR + G − 2.0 × B)/(NIR + G + 2.0 × B) | [68] |
42 | Anthocyanin reflectance index 2 | ari_2 = NIR × ((1.0/G) − (1.0/RE)) | [69] |
43 | Carotenoid reflectance index 2 | cri_2 = (1.0/G) − (1.0/RE) | [70] |
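As a worked illustration of the formulas listed above, the sketch below computes a few of the indices (NDVI, SAVI, SAVI2, CRI2) from per-band reflectance arrays with NumPy. The band variable names and the small eps guard against division by zero are assumptions added for the example.

```python
# Minimal sketch: compute selected vegetation indices from reflectance bands.
import numpy as np

def vegetation_indices(B, G, R, RE, NIR, eps=1e-9):
    # B is accepted for a uniform call signature even though these indices do not use it.
    ndvi = (NIR - R) / (NIR + R + eps)                      # index 1
    L = 0.5
    savi = ((NIR - R) / (NIR + R + L + eps)) * (1.0 + L)    # index 7
    a, b = 0.96916, 0.084726
    savi_2 = NIR / (R - (b / a) + eps)                      # index 14
    cri_2 = (1.0 / (G + eps)) - (1.0 / (RE + eps))          # index 43
    return {"ndvi": ndvi, "savi": savi, "savi_2": savi_2, "cri_2": cri_2}

# Usage with hypothetical 100 x 100 reflectance rasters scaled to [0, 1]:
bands = [np.random.rand(100, 100) for _ in range(5)]        # B, G, R, RE, NIR
vis = vegetation_indices(*bands)
print({name: round(float(v.mean()), 3) for name, v in vis.items()})
```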
Texture Attributes | Neighboring Distances | Angles | Spectral Bands | Total
---|---|---|---|---
Contrast | 3 values (1, 3, 5) | 4 angles (0°, 45°, 90°, 135°) | 5 bands (B, G, R, RE, NIR) | 60
Dissimilarity | 3 values (1, 3, 5) | 4 angles (0°, 45°, 90°, 135°) | 5 bands (B, G, R, RE, NIR) | 60
Homogeneity | 3 values (1, 3, 5) | 4 angles (0°, 45°, 90°, 135°) | 5 bands (B, G, R, RE, NIR) | 60
Angular second moment (ASM) | 3 values (1, 3, 5) | 4 angles (0°, 45°, 90°, 135°) | 5 bands (B, G, R, RE, NIR) | 60
Energy | 3 values (1, 3, 5) | 4 angles (0°, 45°, 90°, 135°) | 5 bands (B, G, R, RE, NIR) | 60
Correlation | 3 values (1, 3, 5) | 4 angles (0°, 45°, 90°, 135°) | 5 bands (B, G, R, RE, NIR) | 60
Total Texture Attributes | | | | 360
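The 360 GLCM texture features summarized above (6 attributes × 3 distances × 4 angles × 5 bands) can be generated with scikit-image. The sketch below is an assumed implementation of that scheme; the 64-level quantization and the helper name are choices made for illustration, not taken from the paper.

```python
# Minimal sketch: GLCM texture attributes per band, distance, and angle.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

DISTANCES = [1, 3, 5]
ANGLES = [0, np.pi / 4, np.pi / 2, 3 * np.pi / 4]   # 0, 45, 90, 135 degrees
PROPS = ["contrast", "dissimilarity", "homogeneity", "ASM", "energy", "correlation"]

def glcm_features(band, levels=64):
    """Quantize one reflectance band (values in [0, 1]) and compute GLCM attributes."""
    q = np.clip(band * (levels - 1), 0, levels - 1).astype(np.uint8)
    glcm = graycomatrix(q, DISTANCES, ANGLES, levels=levels, symmetric=True, normed=True)
    feats = {}
    for prop in PROPS:
        vals = graycoprops(glcm, prop)               # shape: (3 distances, 4 angles)
        for i, d in enumerate(DISTANCES):
            for j, deg in enumerate([0, 45, 90, 135]):
                feats[f"{prop}_d{d}_a{deg}"] = vals[i, j]
    return feats

# Usage: five bands of one plot image -> 6 x 3 x 4 x 5 = 360 texture features.
plot = {name: np.random.rand(32, 32) for name in ["B", "G", "R", "RE", "NIR"]}
features = {f"{b}_{k}": v for b, img in plot.items() for k, v in glcm_features(img).items()}
print(len(features))  # 360
```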
Group 1 | Group 2 | Mean Difference | p Value | Lower CI | Upper CI | Reject |
---|---|---|---|---|---|---|
Healthy | Mild | 80.18 | 0.2785 | −43.05 | 203.403 | False |
Healthy | Severe | −563.65 | <0.001 | −751.37 | −375.93 | True |
Mild | Severe | −643.83 | <0.001 | −805.26 | −482.40 | True |
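The pairwise comparison above can be reproduced with the Tukey HSD test in statsmodels. In the sketch below, the yield samples are hypothetical draws based on the group means and standard deviations from the descriptive-statistics table, not the study data, so the exact numbers will differ.

```python
# Minimal sketch: pairwise Tukey HSD comparison of yield across health-status groups.
import numpy as np
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(0)
yields = np.concatenate([
    rng.normal(4046, 467, 106),   # healthy plots (hypothetical draws)
    rng.normal(4127, 469, 430),   # mild infection
    rng.normal(3483, 609, 56),    # severe infection
])
groups = ["Healthy"] * 106 + ["Mild"] * 430 + ["Severe"] * 56

result = pairwise_tukeyhsd(endog=yields, groups=groups, alpha=0.05)
print(result)  # mean differences, adjusted p-values, confidence intervals, reject flags
```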
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Nguyen, C.; Sagan, V.; Skobalski, J.; Severo, J.I. Early Detection of Wheat Yellow Rust Disease and Its Impact on Terminal Yield with Multi-Spectral UAV-Imagery. Remote Sens. 2023, 15, 3301. https://doi.org/10.3390/rs15133301