Monitoring the Rice Panicle Blast Control Period Based on UAV Multispectral Remote Sensing and Machine Learning
Abstract
1. Introduction
2. Materials and Methods
2.1. Field Experiment
2.2. Acquisition and Pre-Processing of UAV Image Data
2.2.1. Acquisition of UAV Image Data
2.2.2. Pre-Processing of Multispectral Data
2.3. Selection of VIs
2.4. Data Processing Methods
2.4.1. NN and SVR
2.4.2. RF
2.4.3. GBDT
2.4.4. AdaBoost Algorithm
2.5. Data Strategy Analysis
3. Research Results
3.1. Rice Heading Rate
3.2. Spectral Reflectance
3.3. Regression Models of Different Bands, Different VIs, and Heading Rates
3.3.1. Regression Models of Rice Canopy Spectral Information and Heading Rates at Different Bands
3.3.2. Regression Modes of Rice Single VIs and Heading Rate
3.3.3. Regression Models of Rice Heading Rate Based on Multi-VIs
4. Discussion
5. Conclusions
- The color of the rice canopy changed at the heading stage, which in turn affected canopy reflectance. The fitting results of the regression models between single-band reflectance and heading rate indicated that the 650 nm and 730 nm bands were the most sensitive at the rice heading stage.
- Inversion models of the rice heading rate were built from single bands, single VIs, and multi-VI combinations; the multi-VI models were superior to the single-band and single-VI models. The AdaBoost-based inversion model of the rice heading rate performed best, with an R² of 0.94 and an RMSE of 0.12. Among the methods compared, the ensemble learning algorithms further improved the accuracy and robustness of the inversion model relative to the traditional machine learning algorithms.
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Zahra, N.; Hafeez, M.B.; Nawaz, A.; Farooq, M. Rice production systems and grain quality. J. Cereal Sci. 2022, 105, 103463. [Google Scholar] [CrossRef]
- Huang, S.; Sun, C.; Qi, L.; Ma, X.; Wang, W. Rice panicle blast identification method based on deep convolution neural network. Trans. Chin. Soc. Agric. Eng. 2017, 33, 169–176. (In Chinese) [Google Scholar]
- Wen, X.; Xie, M.; Jiang, J.; Yang, B.; Shao, Y.; He, W.; Liu, L.; Zhao, Y. Advances in research on control method of rice blast. Chin. Agric. Sci. Bull. 2013, 29, 190–195. (In Chinese) [Google Scholar]
- Asibi, A.E.; Chai, Q.; Coulter, J.A. Rice Blast: A Disease with Implications for Global Food Security. Agronomy 2019, 9, 451. [Google Scholar] [CrossRef]
- Sriwanna, K. Weather-based rice blast disease forecasting. Comput. Electron. Agric. 2022, 193, 106658. [Google Scholar] [CrossRef]
- Feng, S.; Zhao, D.; Guan, Q.; Li, J.; Liu, Z.; Jin, Z.; Li, G.; Xu, T. A deep convolutional neural network-based wavelength selection method for spectral characteristics of rice blast disease. Comput. Electron. Agric. 2022, 199, 107199. [Google Scholar] [CrossRef]
- Kongcharoen, N.; Kaewsalong, N.; Dethoup, T. Efficacy of fungicides in controlling rice blast and dirty panicle diseases in Thailand. Sci. Rep. 2020, 10, 16233. [Google Scholar] [CrossRef]
- Du, Y.; Qi, Z.; Yu, J.; Yu, M.; Cao, H.; Zhang, R.; Yong, M.; Yin, X.; Pan, X.; Song, T.; et al. Effects of panicle development stage and temperature on rice panicle blast infection by Magnaporthe oryzae and visualization of its infection process. Plant Pathol. 2021, 70, 1436–1444. [Google Scholar] [CrossRef]
- Yang, G.; Liu, J.; Zhao, C.; Li, Z.; Huang, Y.; Yu, H.; Xu, B.; Yang, X.; Zhu, D.; Zhang, X.; et al. Unmanned Aerial Vehicle Remote Sensing for Field-Based Crop Phenotyping: Current Status and Perspectives. Front. Plant Sci. 2017, 8, 1111. [Google Scholar] [CrossRef]
- Zhang, J.; Tian, H.; Yang, J.; Pan, S. Improving Representation of Crop Growth and Yield in the Dynamic Land Ecosystem Model and Its Application to China. J. Adv. Model Earth Syst. 2018, 10, 1680–1707. [Google Scholar] [CrossRef]
- Hou, Y.; He, L.; Jin, N.; Zheng, C.; Liu, W.; Zhang, L. Establishment and application of crop growth simulating and monitoring system in China. Trans. Chin. Soc. Agric. Eng. 2018, 34, 165–175. (In Chinese) [Google Scholar]
- Duan, B.; Fang, S.; Gong, Y.; Peng, Y.; Wu, X.; Zhu, R. Remote estimation of grain yield based on UAV data in different rice cultivars under contrasting climatic zone. Field Crop. Res. 2021, 267, 108148. [Google Scholar] [CrossRef]
- Liu, T.; Zhang, H.; Wang, Z.; He, C.; Zhang, Q.; Jiao, Y. Estimation of the leaf area index and chlorophyll content of wheat using UAV multi-spectrum images. Trans. Chin. Soc. Agric. Eng. 2021, 37, 65–72. (In Chinese) [Google Scholar]
- Chen, Z.; Ren, J.; Tang, H.; Shi, Y.; Leng, P.; Liu, J.; Wang, L.; Wu, W.; Yao, Y.; Hasiyuya, P. Progress and perspectives on agricultural remote sensing research and applications in China. J. Remote Sens. 2016, 20, 748–767. (In Chinese) [Google Scholar]
- Liu, S.; Li, L.; Gao, W.; Zhang, Y.; Liu, Y.; Wang, S.; Lu, J. Diagnosis of nitrogen status in winter oilseed rape (Brassica napus L.) using in-situ hyperspectral data and unmanned aerial vehicle (UAV) multispectral images. Comput. Electron. Agric. 2018, 151, 185–195. [Google Scholar] [CrossRef]
- Aslan, M.F.; Durdu, A.; Sabanci, K.; Ropelewska, E.; Gültekin, S.S. A Comprehensive Survey of the Recent Studies with UAV for Precision Agriculture in Open Fields and Greenhouses. Appl. Sci. 2022, 12, 1047. [Google Scholar] [CrossRef]
- Li, S.; Yuan, F.; Ata-Ui-Karim, S.T.; Zheng, H.; Cheng, T.; Liu, X.; Tian, Y.; Zhu, Y.; Cao, W.; Cao, Q. Combining Color Indices and Textures of UAV-Based Digital Imagery for Rice LAI Estimation. Remote Sens. 2019, 11, 1763. [Google Scholar] [CrossRef]
- Ban, S.; Liu, W.; Tian, M.; Wang, Q.; Yuan, T.; Chang, Q.; Li, L. Rice Leaf Chlorophyll Content Estimation Using UAV-Based Spectral Images in Different Regions. Agronomy 2022, 12, 2832. [Google Scholar] [CrossRef]
- Colorado, J.D.; Cera-Bornacelli, N.; Caldas, J.S.; Petro, E.; Rebolledo, M.C.; Cuellar, D.; Calderon, F.; Mondragon, I.F.; Jaramillo-Botero, A. Estimation of Nitrogen in Rice Crops from UAV-Captured Images. Remote Sens. 2020, 12, 3396. [Google Scholar] [CrossRef]
- Jordan, M.I.; Mitchell, T.M. Machine learning: Trends, perspectives, and prospects. Science 2015, 349, 255–260. [Google Scholar] [CrossRef]
- Chen, C.; Cao, G.; Li, Y.; Liu, D.; Ma, B.; Zhang, J.; Li, L.; Hu, J. Research on Monitoring Methods for the Appropriate Rice Harvest Period Based on Multispectral Remote Sensing. Discrete Dyn. Nat. Soc. 2022, 2022, 1519667. [Google Scholar]
- Wan, L.; Cen, H.; Zhu, J.; Zhang, J.; Zhu, Y.; Sun, D.; Du, X.; Zhai, L.; Weng, H.; Li, Y.; et al. Grain yield prediction of rice using multi-temporal UAV-based RGB and multispectral images and model transfer—A case study of small farmlands in the South of China. Agric. For Meteorol. 2020, 291, 108096. [Google Scholar] [CrossRef]
- Xu, S.; Xu, X.; Blacker, C.; Gaulton, R.; Zhu, Q.; Yang, M.; Yang, G.; Zhang, J.; Yang, Y.; Yang, M.; et al. Estimation of Leaf Nitrogen Content in Rice Using Vegetation Indices and Feature Variable Optimization with Information Fusion of Multiple-Sensor Images from UAV. Remote Sens. 2023, 15, 854. [Google Scholar] [CrossRef]
- Tucker, C.J.; Elgin, J., Jr.; McMurtrey III, J.; Fan, C. Monitoring corn and soybean crop development with hand-held radiometer spectral data. Remote Sens. Environ. 1979, 8, 237–248. [Google Scholar] [CrossRef]
- Gitelson, A.A.; Kaufman, Y.J.; Merzlyak, M.N. Use of a green channel in remote sensing of global vegetation from EOS-MODIS. Remote Sens. Environ. 1996, 58, 289–298. [Google Scholar] [CrossRef]
- Daughtry, C.S.; Walthall, C.; Kim, M.; De Colstoun, E.B.; McMurtrey III, J. Estimating corn leaf chlorophyll concentration from leaf and canopy reflectance. Remote Sens. Environ. 2000, 74, 229–239. [Google Scholar] [CrossRef]
- Gong, P.; Pu, R.; Biging, G.S.; Larrieu, M.R. Estimation of forest leaf area index using vegetation indices derived from Hyperion hyperspectral data. IEEE Trans. Geosci. Remote Sens. 2003, 41, 1355–1362. [Google Scholar] [CrossRef]
- Gitelson, A.A.; Kaufman, Y.J.; Stark, R.; Rundquist, D. Novel algorithms for remote estimation of vegetation fraction. Remote Sens. Environ. 2002, 80, 76–87. [Google Scholar] [CrossRef]
- Jordan, C.F. Derivation of leaf-area index from quality of light on the forest floor. Ecology 1969, 50, 663–666. [Google Scholar] [CrossRef]
- Broge, N.H.; Leblanc, E. Comparing prediction power and stability of broadband and hyperspectral vegetation indices for estimation of green leaf area index and canopy chlorophyll density. Remote Sens. Environ. 2001, 76, 156–172. [Google Scholar] [CrossRef]
- Birth, G.S.; McVey, G.R. Measuring the color of growing turf with a reflectance spectrophotometer. Agron. J. 1968, 60, 640–643. [Google Scholar] [CrossRef]
- Huete, A.; Didan, K.; Miura, T.; Rodriguez, E.P.; Gao, X.; Ferreira, L.G. Overview of the radiometric and biophysical performance of the MODIS vegetation indices. Remote Sens. Environ. 2002, 83, 195–213. [Google Scholar] [CrossRef]
- Roujean, J.-L.; Breon, F.-M. Estimating PAR absorbed by vegetation from bidirectional reflectance measurements. Remote Sens. Environ. 1995, 51, 375–384. [Google Scholar] [CrossRef]
- Cao, Y.; Miao, Q.; Liu, J.; Gao, L. Advance and prospects of AdaBoost algorithm. Acta Autom. Sin. 2013, 39, 745–758. (In Chinese) [Google Scholar] [CrossRef]
- Benami, E.; Jin, Z.; Carter, M.R.; Ghosh, A.; Hijmans, R.J.; Hobbs, A.; Kenduiywo, B.; Lobell, D.B. Uniting remote sensing, crop modelling and economics for agricultural risk management. Nat. Rev. Earth Env. 2021, 2, 140–159. [Google Scholar] [CrossRef]
- Orynbaikyzy, A.; Gessner, U.; Conrad, C. Crop type classification using a combination of optical and radar remote sensing data: A review. Int. J. Remote Sens. 2019, 40, 6553–6595. [Google Scholar] [CrossRef]
- Luo, S.; Jiang, X.; Jiao, W.; Yang, K.; Li, Y.; Fang, S. Remotely Sensed Prediction of Rice Yield at Different Growth Durations Using UAV Multispectral Imagery. Agriculture 2022, 12, 1447. [Google Scholar] [CrossRef]
- Miao, X.; Miao, Y.; Liu, Y.; Tao, S.; Zheng, H.; Wang, J.; Wang, W.; Tang, Q. Measurement of nitrogen content in rice plant using near infrared spectroscopy combined with different PLS algorithms. Spectrochim Acta A Mol. Biomol. Spectrosc. 2023, 284, 121733. [Google Scholar] [CrossRef]
- Yang, F.; Fan, Y.; Li, J.; Qian, Y.; Wang, Y.; Zhang, J. Estimating LAI and CCD of rice and wheat using hyperspectral remote sensing data. Trans. Chin. Soc. Agric. Eng. 2010, 26, 237–243. (In Chinese) [Google Scholar]
- Ganaie, M.A.; Hu, M.; Malik, A.K.; Tanveer, M.; Suganthan, P.N. Ensemble deep learning: A review. Eng. Appl. Artif. Intell. 2022, 115, 105151. [Google Scholar] [CrossRef]
- Mohammed, A.; Kora, R. A Comprehensive Review on Ensemble Deep Learning: Opportunities and Challenges. J. King Saud. Univ. Com. 2023. [Google Scholar] [CrossRef]
Vegetation Index | Calculation Formula | Reference |
---|---|---|
Normalized Difference Vegetation Index (NDVI) | (NIR − R)/(NIR + R) | Tucker et al. [24] |
Green Normalized Difference Vegetation Index (GNDVI) | (NIR − G)/(NIR + G) | Gitelson et al. [25] |
Normalized Difference Red-Edge Index (NDRE) | (NIR − RE)/(NIR + RE) | Daughtry et al. [26] |
Modified Nonlinear Vegetation Index (MNVI) | 1.5(NIR² − R)/(NIR² + R + 0.5) | Gong et al. [27] |
Red-Edge Chlorophyll Index (CIred edge) | NIR/RE − 1 | Gitelson et al. [28] |
Difference Vegetation Index (DVI) | NIR − R | Jordan [29] |
Triangular Vegetation Index (TVI) | 60(RE − G) − 100(R − G) | Broge et al. [30] |
Ratio Vegetation Index (RVI) | NIR/R | Birth et al. [31] |
Enhanced Vegetation Index (EVI) | 2.5(NIR − R)/(NIR + 6R − 7.5B + 1) | Huete et al. [32] |
Optimized Soil Adjusted Vegetation Index (OSAVI) | 1.16(NIR − R)/(NIR + R + 0.16) | Roujean et al. [33] |
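
To make the index definitions above concrete, the short sketch below evaluates each formula with NumPy from per-plot mean band reflectances. The reflectance values, array layout, and variable names are illustrative assumptions and do not reproduce data from this study.

```python
import numpy as np

# Illustrative per-plot mean reflectances (0-1) at 450, 560, 650, 730, and 840 nm;
# each array holds one value per plot (placeholder numbers, not measured data).
B   = np.array([0.27, 0.32])   # 450 nm (blue)
G   = np.array([0.26, 0.33])   # 560 nm (green)
R   = np.array([0.18, 0.26])   # 650 nm (red)
RE  = np.array([0.39, 0.43])   # 730 nm (red edge)
NIR = np.array([0.49, 0.50])   # 840 nm (near infrared)

vis = {
    "NDVI":       (NIR - R) / (NIR + R),
    "GNDVI":      (NIR - G) / (NIR + G),
    "NDRE":       (NIR - RE) / (NIR + RE),
    "MNVI":       1.5 * (NIR**2 - R) / (NIR**2 + R + 0.5),
    "CIred edge": NIR / RE - 1,
    "DVI":        NIR - R,
    "TVI":        60 * (RE - G) - 100 * (R - G),
    "RVI":        NIR / R,
    "EVI":        2.5 * (NIR - R) / (NIR + 6 * R - 7.5 * B + 1),
    "OSAVI":      1.16 * (NIR - R) / (NIR + R + 0.16),
}
for name, value in vis.items():
    print(f"{name}: {np.round(value, 3)}")
```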
Date | Variety | 450 nm | 560 nm | 650 nm | 730 nm | 840 nm |
---|---|---|---|---|---|---|
11 September | Nanjing 5055 | 27.03% | 26.15% | 17.92% | 38.65% | 48.52% |
11 September | Changxiangjing 1813 | 31.84% | 33.28% | 26.15% | 43.15% | 50.14% |
11 September | Zixiangnuo | 24.25% | 23.81% | 14.54% | 38.15% | 46.58% |
12 September | Nanjing 5055 | 27.12% | 27.46% | 18.08% | 39.78% | 47.97% |
12 September | Changxiangjing 1813 | 32.60% | 33.93% | 26.04% | 44.42% | 50.26% |
12 September | Zixiangnuo | 24.66% | 23.67% | 15.02% | 39.27% | 46.74% |
16 September | Nanjing 5055 | 29.26% | 29.19% | 21.08% | 41.23% | 50.75% |
16 September | Changxiangjing 1813 | 35.45% | 34.16% | 27.48% | 47.28% | 55.67% |
16 September | Zixiangnuo | 25.16% | 24.49% | 15.58% | 41.25% | 50.53% |
17 September | Nanjing 5055 | 30.66% | 29.79% | 21.17% | 41.92% | 51.06% |
17 September | Changxiangjing 1813 | 36.16% | 32.14% | 24.85% | 47.21% | 56.58% |
17 September | Zixiangnuo | 25.78% | 25.47% | 14.99% | 42.82% | 51.37% |
18 September | Nanjing 5055 | 28.86% | 28.83% | 21.00% | 38.84% | 54.32% |
18 September | Changxiangjing 1813 | 40.04% | 36.14% | 30.44% | 47.12% | 63.43% |
18 September | Zixiangnuo | 25.47% | 26.59% | 15.89% | 42.31% | 60.97% |
19 September | Nanjing 5055 | 28.54% | 27.84% | 20.66% | 37.71% | 56.81% |
19 September | Changxiangjing 1813 | 33.70% | 31.43% | 24.44% | 43.36% | 62.22% |
19 September | Zixiangnuo | 21.78% | 23.06% | 13.62% | 38.70% | 56.24% |
20 September | Nanjing 5055 | 27.22% | 27.34% | 19.78% | 35.98% | 47.68% |
20 September | Changxiangjing 1813 | 37.56% | 34.06% | 28.78% | 42.23% | 53.42% |
20 September | Zixiangnuo | 26.63% | 28.47% | 17.21% | 40.70% | 52.93% |
21 September | Nanjing 5055 | 29.08% | 28.54% | 21.08% | 38.40% | 49.99% |
21 September | Changxiangjing 1813 | 37.31% | 35.64% | 28.55% | 46.73% | 58.96% |
21 September | Zixiangnuo | 26.08% | 29.98% | 16.71% | 48.36% | 57.62% |
22 September | Nanjing 5055 | 25.90% | 25.94% | 18.56% | 34.83% | 46.33% |
22 September | Changxiangjing 1813 | 36.04% | 33.74% | 25.96% | 46.25% | 56.84% |
22 September | Zixiangnuo | 25.22% | 27.84% | 16.51% | 44.50% | 55.63% |
25 September | Nanjing 5055 | 30.95% | 32.12% | 24.10% | 39.67% | 49.67% |
25 September | Changxiangjing 1813 | 38.48% | 36.64% | 29.03% | 47.26% | 58.03% |
25 September | Zixiangnuo | 24.28% | 25.66% | 16.35% | 41.90% | 58.73% |
29 September | Nanjing 5055 | 31.94% | 33.29% | 25.80% | 41.90% | 54.78% |
29 September | Changxiangjing 1813 | 36.58% | 35.02% | 28.40% | 46.84% | 64.04% |
29 September | Zixiangnuo | 26.88% | 23.80% | 18.90% | 37.38% | 62.24% |
2 October | Nanjing 5055 | 27.79% | 28.19% | 20.98% | 39.87% | 46.93% |
2 October | Changxiangjing 1813 | 31.10% | 30.79% | 25.01% | 41.41% | 49.71% |
2 October | Zixiangnuo | 24.67% | 24.26% | 17.69% | 34.59% | 46.02% |
Input Band (nm) | Fitting Model | R² | RMSE |
---|---|---|---|
450 | NN | 0.40 | 0.30 |
450 | SVR | 0.31 | 0.30 |
450 | RF | 0.52 | 0.28 |
450 | GBDT | 0.59 | 0.24 |
450 | AdaBoost | 0.63 | 0.27 |
560 | NN | 0.43 | 0.30 |
560 | SVR | 0.33 | 0.35 |
560 | RF | 0.53 | 0.27 |
560 | GBDT | 0.51 | 0.34 |
560 | AdaBoost | 0.65 | 0.18 |
650 | NN | 0.39 | 0.33 |
650 | SVR | 0.40 | 0.33 |
650 | RF | 0.67 | 0.34 |
650 | GBDT | 0.63 | 0.37 |
650 | AdaBoost | 0.72 | 0.29 |
730 | NN | 0.68 | 0.27 |
730 | SVR | 0.51 | 0.25 |
730 | RF | 0.63 | 0.21 |
730 | GBDT | 0.66 | 0.18 |
730 | AdaBoost | 0.79 | 0.20 |
840 | NN | 0.36 | 0.26 |
840 | SVR | 0.56 | 0.29 |
840 | RF | 0.43 | 0.24 |
840 | GBDT | 0.48 | 0.21 |
840 | AdaBoost | 0.53 | 0.17 |
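
The single-band results above come from pairing one reflectance feature with each of the five regressors (NN, SVR, RF, GBDT, AdaBoost) and evaluating R² and RMSE. The following is a minimal sketch of how such a comparison could be run with scikit-learn; the synthetic data, hyperparameters, and train/test split are assumptions for illustration and do not reproduce the settings or results reported here.

```python
import numpy as np
from sklearn.ensemble import AdaBoostRegressor, GradientBoostingRegressor, RandomForestRegressor
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR

# X: per-plot reflectance at one band (e.g., 730 nm); y: observed heading rate (0-1).
# Synthetic placeholder data; the study's field measurements are not reproduced here.
rng = np.random.default_rng(0)
X = rng.uniform(0.30, 0.50, size=(120, 1))
y = np.clip(2.5 * X[:, 0] - 0.6 + rng.normal(0, 0.05, 120), 0, 1)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "NN": MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0),
    "SVR": SVR(kernel="rbf", C=10),
    "RF": RandomForestRegressor(n_estimators=200, random_state=0),
    "GBDT": GradientBoostingRegressor(n_estimators=200, random_state=0),
    "AdaBoost": AdaBoostRegressor(n_estimators=200, random_state=0),
}

for name, model in models.items():
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    rmse = mean_squared_error(y_te, pred) ** 0.5
    print(f"{name}: R2={r2_score(y_te, pred):.2f}, RMSE={rmse:.2f}")
```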
Input VI | Fitting Model | R² | RMSE |
---|---|---|---|
NDVI | NN | 0.59 | 0.31 |
NDVI | SVR | 0.76 | 0.32 |
NDVI | RF | 0.70 | 0.29 |
NDVI | GBDT | 0.75 | 0.22 |
NDVI | AdaBoost | 0.83 | 0.19 |
GNDVI | NN | 0.41 | 0.33 |
GNDVI | SVR | 0.60 | 0.31 |
GNDVI | RF | 0.44 | 0.27 |
GNDVI | GBDT | 0.51 | 0.41 |
GNDVI | AdaBoost | 0.54 | 0.37 |
NDRE | NN | 0.25 | 0.24 |
NDRE | SVR | 0.43 | 0.26 |
NDRE | RF | 0.39 | 0.25 |
NDRE | GBDT | 0.52 | 0.19 |
NDRE | AdaBoost | 0.64 | 0.11 |
MNVI | NN | 0.52 | 0.26 |
MNVI | SVR | 0.61 | 0.28 |
MNVI | RF | 0.47 | 0.27 |
MNVI | GBDT | 0.44 | 0.35 |
MNVI | AdaBoost | 0.63 | 0.30 |
CIred edge | NN | 0.34 | 0.26 |
CIred edge | SVR | 0.42 | 0.29 |
CIred edge | RF | 0.39 | 0.25 |
CIred edge | GBDT | 0.46 | 0.28 |
CIred edge | AdaBoost | 0.41 | 0.37 |
DVI | NN | 0.27 | 0.19 |
DVI | SVR | 0.35 | 0.30 |
DVI | RF | 0.54 | 0.24 |
DVI | GBDT | 0.51 | 0.33 |
DVI | AdaBoost | 0.63 | 0.31 |
Fitting Model | R² | RMSE |
---|---|---|
NN | 0.81 | 0.17 |
SVR | 0.79 | 0.15 |
RF | 0.89 | 0.33 |
GBDT | 0.91 | 0.25 |
AdaBoost | 0.94 | 0.12 |
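
For the multi-VI models above, the individual indices are stacked into one feature matrix before the ensemble regressor is fitted. The sketch below illustrates this with AdaBoost; the choice of six input VIs, the placeholder data, and the hyperparameters are assumptions for illustration and do not reproduce the study's configuration or results.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import AdaBoostRegressor
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

# Placeholder feature table: one row per plot/date, one column per VI, plus the
# observed heading rate (0-1). Which VIs enter the combination, and all values,
# are assumptions for illustration only.
rng = np.random.default_rng(0)
n = 120
heading_rate = rng.uniform(0, 1, n)
vi_table = pd.DataFrame({
    "NDVI": 0.4 + 0.4 * heading_rate + rng.normal(0, 0.03, n),
    "GNDVI": 0.3 + 0.3 * heading_rate + rng.normal(0, 0.04, n),
    "NDRE": 0.1 + 0.2 * heading_rate + rng.normal(0, 0.03, n),
    "MNVI": 0.2 + 0.3 * heading_rate + rng.normal(0, 0.05, n),
    "CIred_edge": 0.2 + 0.5 * heading_rate + rng.normal(0, 0.05, n),
    "DVI": 0.2 + 0.3 * heading_rate + rng.normal(0, 0.04, n),
    "heading_rate": heading_rate,
})

# Stack all VIs into one feature matrix and fit the AdaBoost regressor.
X = vi_table.drop(columns="heading_rate").to_numpy()
y = vi_table["heading_rate"].to_numpy()
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

model = AdaBoostRegressor(n_estimators=300, learning_rate=0.1, random_state=0).fit(X_tr, y_tr)
pred = model.predict(X_te)
print(f"R2={r2_score(y_te, pred):.2f}, RMSE={mean_squared_error(y_te, pred) ** 0.5:.2f}")
```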