Corn Grain Yield Estimation from Vegetation Indices, Canopy Cover, Plant Density, and a Neural Network Using Multispectral and RGB Images Acquired with Unmanned Aerial Vehicles
Abstract
1. Introduction
2. Materials and Methods
2.1. Study Site
2.2. Acquisition and Analysis of Data from Remote Sensors Mounted on Unmanned Aerial Vehicles
2.3. Plant Count, Determination of Plant Density, and Estimation of Canopy Cover
2.4. Segmentation of Vegetation Pixels and Extraction of Index Values
2.5. Development and Training of the Feed-Forward Neural Network
2.6. Statistical Analysis
2.7. Relative Importance of Variables in Yield Estimation Using Garson’s Algorithm
3. Results and Discussion
3.1. Corn Plant Vegetation Indices
3.2. Plant Density, Canopy Cover, and Yield
3.3. Vegetation Indices and Yield
3.4. Training, Validation, and Testing of the Artificial Neural Network for Estimating Yield
3.5. Relative Importance of Variables in Yield Estimation Using Garson’s Algorithm
4. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
- USDA-Office of the Chief Economist. Available online: https://www.usda.gov/oce/commodity/wasde/ (accessed on 28 April 2020).
- ASERCA. CIMA. Available online: https://www.cima.aserca.gob.mx/ (accessed on 4 March 2020).
- Domínguez Mercado, C.A.; de Jesús Brambila Paz, J.; Carballo Carballo, A.; Quero Carrillo, A.R. Red de valor para maíz con alta calidad de proteína. Rev. Mex. Cienc. Agríc. 2014, 5, 391–403. [Google Scholar]
- Mercer, K.L.; Perales, H.R.; Wainwright, J.D. Climate change and the transgenic adaptation strategy: Smallholder livelihoods, climate justice, and maize landraces in Mexico. Glob. Environ. Chang. 2012, 22, 495–504. [Google Scholar] [CrossRef]
- Tarancón, M.; Díaz-Ambrona, C.H.; Trueba, I. ¿Cómo alimentar a 9.000 millones de personas en el 2050? In Proceedings of the XV Congreso Internacional de Ingeniería de Proyectos, Huesca, Spain, 6–8 July 2011. [Google Scholar]
- Cervantes, R.A.; Angulo, G.V.; Tavizón, E.F.; González, J.R. Impactos potenciales del cambio climático en la producción de maíz [Potential impacts of climate change on maize production]. Investigación Ciencia 2014, 22, 48–53. [Google Scholar]
- Moore, F.C.; Lobell, D.B. Reply to Gonsamo and Chen: Yield findings independent of cause of climate trends. Proc. Natl. Acad. Sci. USA 2015, 112, E2267. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Ruiz Corral, J.A.; Medina García, G.; Ramírez Díaz, J.L.; Flores López, H.E.; Ramírez Ojeda, G.; Manríquez Olmos, J.D.; Zarazúa Villaseñor, P.; González Eguiarte, D.R.; Díaz Padilla, G.; Mora Orozco, C.D.L. Cambio climático y sus implicaciones en cinco zonas productoras de maíz en México. Rev. Mex. Cienc. Agríc. 2011, 2, 309–323. [Google Scholar]
- Tinoco-Rueda, J.A.; Gómez-Díaz, J.D.; Monterroso-Rivas, A.I.; Tinoco-Rueda, J.A.; Gómez-Díaz, J.D.; Monterroso-Rivas, A.I. Efectos del cambio climático en la distribución potencial del maíz en el estado de Jalisco, México. Terra Latinoam. 2011, 29, 161–168. [Google Scholar]
- Bolaños, H.O.; Vázquez, M.H.; Juárez, G.G.; González, G.S. Cambio climático: Una percepción de los productores de maíz de temporal en el estado de Tlaxcala, México. CIBA Rev. Iberoam. Las Cienc. Biológicas Agropecu. 2019, 8, 1–26. [Google Scholar] [CrossRef] [Green Version]
- Khaki, S.; Wang, L.; Archontoulis, S.V. A CNN-RNN Framework for Crop Yield Prediction. Front. Plant Sci. 2020, 10. [Google Scholar] [CrossRef]
- Dahikar, S.S.; Rode, S.V. Agricultural Crop Yield Prediction Using Artificial Neural Network Approach. Int. J. Innov. Res. Electr. Electron. Instrum. Control Eng. 2014, 2, 683–686. [Google Scholar]
- Li, A.; Liang, S.; Wang, A.; Qin, J. Estimating Crop Yield from Multi-temporal Satellite Data Using Multivariate Regression and Neural Network Techniques. Photogramm. Eng. Remote Sens. 2007, 73, 1149–1157. [Google Scholar] [CrossRef] [Green Version]
- Assefa, Y.; Vara Prasad, P.V.; Carter, P.; Hinds, M.; Bhalla, G.; Schon, R.; Jeschke, M.; Paszkiewicz, S.; Ciampitti, I.A. Yield Responses to Planting Density for US Modern Corn Hybrids: A Synthesis-Analysis. Crop. Sci. 2016, 56, 2802–2817. [Google Scholar] [CrossRef]
- Kitano, B.T.; Mendes, C.C.T.; Geus, A.R.; Oliveira, H.C.; Souza, J.R. Corn Plant Counting Using Deep Learning and UAV Images. IEEE Geosci. Remote Sens. Lett. 2019, 1–5. [Google Scholar] [CrossRef]
- Lindblom, J.; Lundström, C.; Ljung, M.; Jonsson, A. Promoting sustainable intensification in precision agriculture: Review of decision support systems development and strategies. Precis. Agric. 2017, 18, 309–331. [Google Scholar] [CrossRef] [Green Version]
- Geipel, J.; Link, J.; Claupein, W. Combined Spectral and Spatial Modeling of Corn Yield Based on Aerial Images and Crop Surface Models Acquired with an Unmanned Aircraft System. Remote Sens. 2014, 6, 10335–10355. [Google Scholar] [CrossRef] [Green Version]
- Tsouros, D.C.; Bibi, S.; Sarigiannidis, P.G. A Review on UAV-Based Applications for Precision Agriculture. Information 2019, 10, 349. [Google Scholar] [CrossRef] [Green Version]
- Olson, D.; Chatterjee, A.; Franzen, D.W.; Day, S.S. Relationship of Drone-Based Vegetation Indices with Corn and Sugarbeet Yields. Agron. J. 2019, 111, 2545–2557. [Google Scholar] [CrossRef]
- Xue, J.; Su, B. Significant Remote Sensing Vegetation Indices: A Review of Developments and Applications. J. Sens. 2017. [Google Scholar] [CrossRef] [Green Version]
- Peñuelas, J.; Filella, I. Visible and near-infrared reflectance techniques for diagnosing plant physiological status. Trends Plant Sci. 1998, 3, 151–156. [Google Scholar] [CrossRef]
- Serrano, L.; Filella, I.; Peñuelas, J. Remote Sensing of Biomass and Yield of Winter Wheat under Different Nitrogen Supplies. Crop. Sci. 2000, 40, 723–731. [Google Scholar] [CrossRef] [Green Version]
- Buchaillot, M.; Gracia-Romero, A.; Vergara-Diaz, O.; Zaman-Allah, M.A.; Tarekegne, A.; Cairns, J.E.; Prasanna, B.M.; Araus, J.L.; Kefauver, S.C. Evaluating Maize Genotype Performance under Low Nitrogen Conditions Using RGB UAV Phenotyping Techniques. Sensors 2019, 19, 1815. [Google Scholar] [CrossRef] [Green Version]
- Kefauver, S.C.; El-Haddad, G.; Vergara-Diaz, O.; Araus, J.L. RGB picture vegetation indexes for High-Throughput Phenotyping Platforms (HTPPs). In Remote Sensing for Agriculture, Ecosystems, and Hydrology XVII; International Society for Optics and Photonics: Toulouse, France, 2015; Volume 9637, p. 96370. [Google Scholar] [CrossRef]
- Vergara-Diaz, O.; Kefauver, S.C.; Elazab, A.; Nieto-Taladriz, M.T.; Araus, J.L. Grain yield losses in yellow-rusted durum wheat estimated using digital and conventional parameters under field conditions. Crop. J. 2015, 3, 200–210. [Google Scholar] [CrossRef] [Green Version]
- Jeong, J.H.; Resop, J.P.; Mueller, N.D.; Fleisher, D.H.; Yun, K.; Butler, E.E.; Timlin, D.J.; Shim, K.M.; Gerber, J.S.; Reddy, V.R.; et al. Random Forests for Global and Regional Crop Yield Predictions. PLoS ONE 2016, 11. [Google Scholar] [CrossRef] [PubMed]
- Oguntunde, P.G.; Lischeid, G.; Dietrich, O. Relationship between rice yield and climate variables in southwest Nigeria using multiple linear regression and support vector machine analysis. Int. J. Biometeorol. 2018, 62, 459–469. [Google Scholar] [CrossRef] [PubMed]
- Pantazi, X.E.; Moshou, D.; Alexandridis, T.; Whetton, R.L.; Mouazen, A.M. Wheat yield prediction using machine learning and advanced sensing techniques. Comput. Electron. Agric. 2016, 121, 57–65. [Google Scholar] [CrossRef]
- Panda, S.S.; Panigrahi, S.; Ames, D.P. Crop Yield Forecasting from Remotely Sensed Aerial Images with Self-Organizing Maps. Trans. ASABE 2010, 53, 323–338. [Google Scholar] [CrossRef]
- Schwalbert, R.A.; Amado, T.; Corassa, G.; Pott, L.P.; Prasad, P.V.V.; Ciampitti, I.A. Satellite-based soybean yield forecast: Integrating machine learning and weather data for improving crop yield prediction in southern Brazil. Agric. For. Meteorol. 2020, 284, 107886. [Google Scholar] [CrossRef]
- Waheed, T.; Bonnell, R.B.; Prasher, S.O.; Paulet, E. Measuring performance in precision agriculture: CART—A decision tree approach. Agric. Water Manag. 2006, 84, 173–185. [Google Scholar] [CrossRef]
- Ashapure, A.; Oh, S.; Marconi, T.G.; Chang, A.; Jung, J.; Landivar, J.; Enciso, J. Unmanned aerial system based tomato yield estimation using machine learning. In Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV; International Society for Optics and Photonics: Baltimore, MD, USA, 2019; Volume 11008, p. 110080O. [Google Scholar] [CrossRef]
- Fu, Z.; Jiang, J.; Gao, Y.; Krienke, B.; Wang, M.; Zhong, K.; Cao, Q.; Tian, Y.; Zhu, Y.; Cao, W.; et al. Wheat Growth Monitoring and Yield Estimation based on Multi-Rotor Unmanned Aerial Vehicle. Remote Sens. 2020, 12, 508. [Google Scholar] [CrossRef] [Green Version]
- Khaki, S.; Wang, L. Crop Yield Prediction Using Deep Neural Networks. In Smart Service Systems, Operations Management, and Analytics; Yang, H., Qiu, R., Chen, W., Eds.; Springer Proceedings in Business and Economics; Springer: Berlin, Germany, 2020; pp. 139–147. [Google Scholar] [CrossRef] [Green Version]
- Kim, N.; Ha, K.-J.; Park, N.-W.; Cho, J.; Hong, S.; Lee, Y.-W. A Comparison between Major Artificial Intelligence Models for Crop Yield Prediction: Case Study of the Midwestern United States, 2006–2015. ISPRS Int. J. Geo Inf. 2019, 8, 240. [Google Scholar] [CrossRef] [Green Version]
- Wang, A.X.; Tran, C.; Desai, N.; Lobell, D.; Ermon, S. Deep Transfer Learning for Crop Yield Prediction with Remote Sensing Data. In Proceedings of the 1st ACM SIGCAS Conference on Computing and Sustainable Societies (COMPASS ’18), New York, NY, USA, 20–22 June 2018; pp. 1–5. [Google Scholar] [CrossRef]
- You, J.; Li, X.; Low, M.; Lobell, D.; Ermon, S. Deep Gaussian Process for Crop Yield Prediction Based on Remote Sensing Data. In Proceedings of the Thirty-First AAAI Conference on Artificial Intelligence, San Francisco, CA, USA, 4–9 February 2017; Available online: https://aaai.org/ocs/index.php/AAAI/AAAI17/paper/view/14435 (accessed on 19 February 2020).
- Fieuzal, R.; Marais Sicre, C.; Baup, F. Estimation of corn yield using multi-temporal optical and radar satellite data and artificial neural networks. Int. J. Appl. Earth Obs. Geoinf. 2017, 57, 14–23. [Google Scholar] [CrossRef]
- Reisi-Gahrouei, O.; Homayouni, S.; McNairn, H.; Hosseini, M.; Safari, A. Crop Biomass Estimation Using Multi Regression Analysis and Neural Networks from Multitemporal L-Band Polarimetric Synthetic Aperture Radar Data. Available online: https://pubag.nal.usda.gov/catalog/6422744 (accessed on 27 February 2020).
- Han, L.; Yang, G.; Dai, H.; Xu, B.; Yang, H.; Feng, H.; Li, Z.; Yang, X. Modeling maize above-ground biomass based on machine learning approaches using UAV remote-sensing data. Plant Methods 2019, 15, 10. [Google Scholar] [CrossRef] [Green Version]
- Michelon, G.K.; Menezes, P.L.; de Bazzi, C.L.; Jasse, E.P.; Magalhães, P.S.G.; Borges, L.F. Artificial neural networks to estimate the productivity of soybeans and corn by chlorophyll readings. J. Plant Nutr. 2018, 41, 1285–1292. [Google Scholar] [CrossRef]
- Khaki, S.; Khalilzadeh, Z.; Wang, L. Predicting yield performance of parents in plant breeding: A neural collaborative filtering approach. PLoS ONE 2020, 15, e0233382. [Google Scholar] [CrossRef] [PubMed]
- Allan, B.M.; Ierodiaconou, D.; Hoskins, A.J.; Arnould, J.P.Y. A Rapid UAV Method for Assessing Body Condition in Fur Seals. Drones 2019, 3, 24. [Google Scholar] [CrossRef] [Green Version]
- Lucieer, A.; de Jong, S.M.; Turner, D. Mapping landslide displacements using Structure from Motion (SfM) and image correlation of multi-temporal UAV photography. Prog. Phys. Geogr. Earth Environ. 2014, 38, 97–116. [Google Scholar] [CrossRef]
- James, M.R.; Robson, S.; d’Oleire-Oltmanns, S.; Niethammer, U. Optimising UAV topographic surveys processed with structure-from-motion: Ground control quality, quantity and bundle adjustment. Geomorphology 2017, 280, 51–66. [Google Scholar] [CrossRef] [Green Version]
- Franzini, M.; Ronchetti, G.; Sona, G.; Casella, V. Geometric and Radiometric Consistency of Parrot Sequoia Multispectral Imagery for Precision Agriculture Applications. Appl. Sci. 2019, 9, 5314. [Google Scholar] [CrossRef] [Green Version]
- Radiometric Corrections. Support. Available online: http://support.pix4d.com/hc/en-us/articles/202559509 (accessed on 16 June 2020).
- Hunt, E.R.; Doraiswamy, P.C.; McMurtrey, J.E.; Daughtry, C.S.T.; Perry, E.M.; Akhmedov, B. A visible band index for remote sensing leaf chlorophyll content at the canopy scale. Int. J. Appl. Earth Obs. Geoinf. 2013, 21, 103–112. [Google Scholar] [CrossRef] [Green Version]
- Woebbecke, D.M.; Meyer, G.E.; Von Bargen, K.; Mortensen, D.A. Color indices for weed identification under various soil, residue, and lighting conditions. Trans. ASAE 1995, 38, 259–269. [Google Scholar] [CrossRef]
- Gitelson, A.A.; Kaufman, Y.J.; Stark, R.; Rundquist, D. Novel Algorithms for Remote Estimation of Vegetation Fraction. Pap. Nat. Resour. 2002, 80, 76–87. [Google Scholar] [CrossRef] [Green Version]
- Rouse, J.W. Monitoring Vegetation Systems in the Great Plains with ERTS. 1974. Available online: https://ntrs.nasa.gov/search.jsp?R=19740022614 (accessed on 19 February 2020).
- Barnes, E.M.; Clarke, T.R.; Richards, S.E.; Colaizzi, P.D.; Haberland, J.; Kostrzewski, M.; Waller, P.; Choi, C.; Riley, E.; Thompson, T.; et al. Coincident detection of crop water stress, nitrogen status and canopy density using ground-based multispectral data. In Proceedings of the Fifth International Conference on Precision Agriculture and Other Resource Management, ASA–CSSA–SSSA, Bloomington, MN, USA, 16–19 July 2000; Precision Agriculture Center, University of Minnesota, ASA-CSSA-SSSA: Madison, WI, USA, 2000; pp. 16–19. [Google Scholar]
- Gitelson, A.A. Wide Dynamic Range Vegetation Index for Remote Quantification of Biophysical Characteristics of Vegetation. J. Plant Physiol. 2004, 161, 165–173. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- García-Martínez, H.; Flores-Magdaleno, H.; Khalil-Gardezi, A.; Ascencio-Hernández, R.; Tijerina-Chávez, L.; Vázquez-Peña, M.A.; Mancilla-Villa, O.R. Digital Count of Corn Plants Using Images Taken by Unmanned Aerial Vehicles and Cross Correlation of Templates. Agronomy 2020, 10, 469. [Google Scholar] [CrossRef] [Green Version]
- Conrad, O.; Bechtel, B.; Bock, M.; Dietrich, H.; Fischer, E.; Gerlitz, L.; Wehberg, J.; Wichmann, V.; Böhner, J. System for Automated Geoscientific Analyses (SAGA) v. 2.1.4. Geosci. Model Dev. 2015, 8, 1991–2007. [Google Scholar] [CrossRef] [Green Version]
- Rubin, J. Optimal classification into groups: An approach for solving the taxonomy problem. J. Theor. Biol. 1967, 15, 103–144. [Google Scholar] [CrossRef]
- Kumar, A.; Tiwari, A. A Comparative Study of Otsu Thresholding and K-means Algorithm of Image Segmentation. Int. J. Eng. Technol. Res. 2019, 9, 2454–4698. [Google Scholar] [CrossRef]
- Liu, D.; Yu, J. Otsu Method and K-means. In Proceedings of the 2009 Ninth International Conference on Hybrid Intelligent Systems, Shenyang, China, 12–14 August 2009; Volume 1, pp. 344–349. [Google Scholar] [CrossRef]
- Benz, U.C.; Hofmann, P.; Willhauck, G.; Lingenfelder, I.; Heynen, M. Multi-resolution, object-oriented fuzzy analysis of remote sensing data for GIS-ready information. ISPRS J. Photogramm. Remote Sens. 2004, 58, 239–258. [Google Scholar] [CrossRef]
- eCognition Suite Dev RB. Available online: https://docs.ecognition.com/v9.5.0/Page%20collection/eCognition%20Suite%20Dev%20RB.htm (accessed on 21 April 2020).
- Chandola, V.; Banerjee, A.; Kumar, V. Anomaly detection: A survey. ACM Comput. Surv. CSUR 2009, 41, 15:1–15:58. [Google Scholar] [CrossRef]
- Torres, J. Deep Learning—Introducción Práctica con Keras. Jordi TORRES.AI. Available online: https://torres.ai/deep-learning-inteligencia-artificial-keras/ (accessed on 24 February 2020).
- Pedamonti, D. Comparison of Non-Linear Activation Functions for Deep Neural Networks on MNIST Classification Task. arXiv 2018, arXiv:1804.02763. Available online: http://arxiv.org/abs/1804.02763 (accessed on 16 June 2020).
- Sharma, S. Activation functions in neural networks. Data Sci. 2017, 6, 310–316. [Google Scholar]
- Levenberg, K. A method for the solution of certain non-linear problems in least squares. Q. Appl. Math. 1944, 2, 164–168. [Google Scholar] [CrossRef] [Green Version]
- Marquardt, D.W. An Algorithm for Least-Squares Estimation of Nonlinear Parameters. J. Soc. Ind. Appl. Math. 1963, 11, 431–441. [Google Scholar] [CrossRef]
- Vogl, T.P.; Mangis, J.K.; Rigler, A.K.; Zink, W.T.; Alkon, D.L. Accelerating the convergence of the back-propagation method. Biol. Cybern. 1988, 59, 257–263. [Google Scholar] [CrossRef]
- Zhang, Z.; Beck, M.W.; Winkler, D.A.; Huang, B.; Sibanda, W.; Goyal, H. Opening the black box of neural networks: Methods for interpreting neural network models in clinical applications. Ann. Transl. Med. 2018, 6. [Google Scholar] [CrossRef] [PubMed]
- Garson, G.D. Interpreting neural-network connection weights. AI Expert 1991, 6, 46–51. [Google Scholar]
- Goh, A.T. Back-propagation neural networks for modeling complex systems. Artif. Intell. Eng. 1995, 9, 143–151. [Google Scholar] [CrossRef]
- Gitelson, A.A.; Viña, A.; Arkebauer, T.J.; Rundquist, D.C.; Keydan, G.; Leavitt, B. Remote estimation of leaf area index and green leaf biomass in maize canopies. Geophys. Res. Lett. 2003, 30. [Google Scholar] [CrossRef] [Green Version]
- Maresma, Á.; Ariza, M.; Martínez, E.; Lloveras, J.; Martínez-Casasnovas, J.A. Analysis of Vegetation Indices to Determine Nitrogen Application and Yield Prediction in Maize (Zea mays L.) from a Standard UAV Service. Remote Sens. 2016, 8, 973. [Google Scholar] [CrossRef] [Green Version]
- Zhang, Y.; Han, W.; Niu, X.; Li, G. Maize Crop Coefficient Estimated from UAV-Measured Multispectral Vegetation Indices. Sensors 2019, 19, 5250. [Google Scholar] [CrossRef] [Green Version]
- Zhang, M.; Zhou, J.; Sudduth, K.A.; Kitchen, N.R. Estimation of maize yield and effects of variable-rate nitrogen application using UAV-based RGB imagery. Biosyst. Eng. 2020, 189, 24–35. [Google Scholar] [CrossRef]
- Zhou, X.; Zheng, H.B.; Xu, X.Q.; He, J.Y.; Ge, X.K.; Yao, X.; Cheng, T.; Zhu, Y.; Cao, W.X.; Tian, Y.C. Predicting grain yield in rice using multi-temporal vegetation indices from UAV-based multispectral and digital imagery. ISPRS J. Photogramm. Remote Sens. 2017, 130, 246–255. [Google Scholar] [CrossRef]
- Fernández, E.; Gorchs, G.; Serrano, L. Use of consumer-grade cameras to assess wheat N status and grain yield. PLoS ONE 2019, 14. [Google Scholar] [CrossRef] [Green Version]
DAS (Days after Sowing) | Development Stage | Date | Sensor | No. of Images | Area (m²) | Ground Sampling Distance (GSD) (cm)
---|---|---|---|---|---|---
47 | V8 | 21 May | DJI FC6310 | 272 | 10,067 | 0.49 |
47 | V8 | 21 May | Parrot Sequoia | 752 | 9409 | 2.15 |
79 | R0 | 22 June | DJI FC6310 | 188 | 7720 | 0.49 |
79 | R0 | 22 June | Parrot Sequoia | 752 | 9409 | 2.15 |
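The ground sampling distances listed above follow the standard photogrammetric relation between flight altitude, sensor pixel pitch, and lens focal length. The minimal sketch below illustrates that relation with hypothetical camera parameters; it does not reproduce the actual flight heights or the exact specifications of the DJI FC6310 or Parrot Sequoia sensors used in this study.

```python
def gsd_cm(altitude_m: float, pixel_pitch_um: float, focal_length_mm: float) -> float:
    """Ground sampling distance (cm/pixel) = flight height x pixel pitch / focal length."""
    pixel_pitch_m = pixel_pitch_um * 1e-6
    focal_length_m = focal_length_mm * 1e-3
    return altitude_m * pixel_pitch_m / focal_length_m * 100.0

# Hypothetical example values (not taken from the flights in the table above):
print(gsd_cm(altitude_m=20.0, pixel_pitch_um=3.75, focal_length_mm=4.0))  # ~1.9 cm/pixel
```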
Vegetation Index | Formula | Reference |
---|---|---|
Triangular greenness index | TGI = RGreen − 0.39RRed − 0.61RBlue | [48] |
Excess green index | EXG = 2 g − r − b | [49] |
Visible atmospherically resistant index | VARI = (RGreen − RRed)/(RGreen + RRed − RBlue) | [50]
Normalized difference vegetation index | NDVI = (RNir − RRed)/(RNir + RRed) | [51] |
Normalized difference red edge | NDRE = (RNir − RRE)/(RNir + RRE) | [52] |
Wide dynamic range vegetation index * | WDRVI = (α⋅RNir − RRed)/(α⋅RNir + RRed) | [53] |
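Once the RGB and multispectral bands are co-registered and radiometrically calibrated, the indices in the table above can be computed pixel-wise. The sketch below is an illustrative implementation, assuming reflectance band arrays named blue, green, red, red_edge, and nir; EXG is computed from normalized chromatic coordinates (r, g, b), and the WDRVI weighting coefficient α (marked with * in the table) is left as a free parameter, commonly set between 0.1 and 0.2, without assuming the exact value used in the study.

```python
import numpy as np

# Illustrative sketch of the indices in the table, assuming co-registered,
# reflectance-calibrated band arrays. "alpha" is the WDRVI weighting coefficient.
eps = 1e-9  # guard against division by zero

def vegetation_indices(blue, green, red, red_edge, nir, alpha=0.1):
    total = red + green + blue + eps
    r, g, b = red / total, green / total, blue / total  # chromatic coordinates
    return {
        "TGI":   green - 0.39 * red - 0.61 * blue,
        "EXG":   2 * g - r - b,
        "VARI":  (green - red) / (green + red - blue + eps),
        "NDVI":  (nir - red) / (nir + red + eps),
        "NDRE":  (nir - red_edge) / (nir + red_edge + eps),
        "WDRVI": (alpha * nir - red) / (alpha * nir + red + eps),
    }
```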
DAS | Treatment | NDVI Mean | NDVI Std Dev | NDRE Mean | NDRE Std Dev | WDRVI Mean | WDRVI Std Dev | EXG Mean | EXG Std Dev | TGI Mean | TGI Std Dev | VARI Mean | VARI Std Dev | Yield Mean (g plant−1) | Yield Std Dev (g plant−1)
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
47 | N140 | 0.46 | 0.05 | 0.19 | 0.02 | −0.55 | 0.04 | 0.40 | 0.03 | 0.40 | 0.03 | 0.12 | 0.02 | 109.3 | 19.6
47 | N200 | 0.48 | 0.04 | 0.19 | 0.01 | −0.53 | 0.04 | 0.42 | 0.03 | 0.42 | 0.03 | 0.13 | 0.02 | 134.0 | 2.7
47 | N260 | 0.46 | 0.05 | 0.18 | 0.02 | −0.55 | 0.05 | 0.41 | 0.02 | 0.41 | 0.02 | 0.13 | 0.02 | 135.6 | 2.7
47 | N320 | 0.46 | 0.04 | 0.18 | 0.01 | −0.55 | 0.04 | 0.42 | 0.04 | 0.42 | 0.04 | 0.13 | 0.02 | 137.1 | 3.6
47 | N380 | 0.47 | 0.05 | 0.19 | 0.02 | −0.53 | 0.05 | 0.43 | 0.02 | 0.43 | 0.03 | 0.14 | 0.01 | 138.3 | 5.3
79 | N140 | 0.90 | 0.02 | 0.26 | 0.02 | −0.13 | 0.06 | 0.53 | 0.03 | 0.54 | 0.01 | 0.13 | 0.02 | 109.3 | 19.6
79 | N200 | 0.91 | 0.01 | 0.26 | 0.03 | −0.10 | 0.06 | 0.54 | 0.03 | 0.54 | 0.01 | 0.14 | 0.02 | 134.0 | 2.7
79 | N260 | 0.90 | 0.01 | 0.25 | 0.02 | −0.13 | 0.06 | 0.52 | 0.03 | 0.54 | 0.01 | 0.13 | 0.02 | 135.6 | 2.7
79 | N320 | 0.90 | 0.01 | 0.25 | 0.02 | −0.12 | 0.05 | 0.53 | 0.04 | 0.54 | 0.01 | 0.13 | 0.02 | 137.1 | 3.6
79 | N380 | 0.90 | 0.01 | 0.25 | 0.02 | −0.11 | 0.05 | 0.53 | 0.03 | 0.54 | 0.01 | 0.13 | 0.02 | 138.3 | 5.3
Input Variables (D = plant density, C = canopy cover) | R Training (47 DAS) | R Validation (47 DAS) | R Test (47 DAS) | R Total (47 DAS) | MAE (t ha−1) (47 DAS) | RMSE (t ha−1) (47 DAS) | R Training (79 DAS) | R Validation (79 DAS) | R Test (79 DAS) | R Total (79 DAS) | MAE (t ha−1) (79 DAS) | RMSE (t ha−1) (79 DAS)
---|---|---|---|---|---|---|---|---|---|---|---|---
NDVI, NDRE, WDRVI, D, C | 0.96 | 0.97 | 0.98 | 0.97 | 0.285 | 0.414 | 0.96 | 0.97 | 0.98 | 0.96 | 0.242 | 0.337 |
NDVI, NDRE, WDRVI, TGI, EXG, VARI, D, C | 0.98 | 0.93 | 0.98 | 0.97 | 0.307 | 0.400 | 0.97 | 0.90 | 0.99 | 0.97 | 0.249 | 0.425 |
NDRE, D, C | 0.97 | 0.99 | 0.95 | 0.97 | 0.256 | 0.365 | 0.97 | 0.88 | 0.97 | 0.96 | 0.278 | 0.437 |
EXG, D, C | 0.97 | 0.98 | 0.99 | 0.98 | 0.252 | 0.354 | 0.95 | 0.97 | 0.99 | 0.96 | 0.280 | 0.431 |
TGI, EXG, VARI, D, C | 0.97 | 0.94 | 0.96 | 0.96 | 0.331 | 0.470 | 0.92 | 0.97 | 0.96 | 0.94 | 0.292 | 0.562 |
C, D | 0.97 | 0.93 | 0.95 | 0.96 | 0.298 | 0.441 | 0.96 | 0.92 | 0.95 | 0.96 | 0.298 | 0.441 |
NDVI, D, C | 0.96 | 0.99 | 0.93 | 0.97 | 0.280 | 0.425 | 0.96 | 0.94 | 0.91 | 0.96 | 0.300 | 0.449 |
TGI, D, C | 0.96 | 0.97 | 0.98 | 0.97 | 0.304 | 0.414 | 0.94 | 0.95 | 0.97 | 0.95 | 0.312 | 0.459 |
VARI, D, C | 0.97 | 0.99 | 0.93 | 0.97 | 0.279 | 0.395 | 0.97 | 0.93 | 0.90 | 0.95 | 0.347 | 0.571 |
NDVI, NDRE, WDRVI, TGI, EXG, VARI, C | 0.92 | 0.86 | 0.96 | 0.92 | 0.512 | 0.643 | 0.94 | 0.97 | 0.88 | 0.94 | 0.381 | 0.538 |
NDVI, NDRE, WDRVI, TGI, EXG, VARI | 0.80 | 0.94 | 0.92 | 0.86 | 0.622 | 0.809 | 0.84 | 0.91 | 0.90 | 0.86 | 0.528 | 0.876 |
EXG, C | 0.81 | 0.81 | 0.84 | 0.81 | 0.733 | 0.938 | 0.80 | 0.81 | 0.84 | 0.80 | 0.623 | 0.947 |
NDVI, NDRE, WDRVI | 0.74 | 0.65 | 0.85 | 0.73 | 0.811 | 1.093 | 0.82 | 0.85 | 0.87 | 0.85 | 0.641 | 0.837 |
NDVI, C | 0.83 | 0.82 | 0.90 | 0.84 | 0.597 | 0.883 | 0.80 | 0.86 | 0.88 | 0.82 | 0.649 | 0.884 |
VARI, C | 0.86 | 0.94 | 0.51 | 0.85 | 0.604 | 0.836 | 0.86 | 0.79 | 0.92 | 0.86 | 0.653 | 0.917 |
NDVI, NDRE, WDRVI, C | 0.81 | 0.93 | 0.83 | 0.84 | 0.618 | 0.874 | 0.87 | 0.87 | 0.62 | 0.84 | 0.672 | 0.856 |
WDRVI, C | 0.87 | 0.62 | 0.95 | 0.87 | 0.584 | 0.784 | 0.82 | 0.64 | 0.56 | 0.79 | 0.689 | 0.971 |
TGI, EXG, VARI | 0.69 | 0.64 | 0.82 | 0.67 | 0.908 | 1.189 | 0.83 | 0.72 | 0.72 | 0.80 | 0.720 | 0.951 |
TGI, EXG, VARI, C | 0.86 | 0.90 | 0.87 | 0.86 | 0.629 | 0.817 | 0.78 | 0.93 | 0.71 | 0.78 | 0.746 | 1.015 |
WDRVI, D, C | 0.99 | 0.99 | 0.99 | 0.99 | 0.028 | 0.125 | 0.96 | 0.99 | 0.89 | 0.96 | 0.209 | 0.449 |
NDRE, C | 0.75 | 0.98 | 0.97 | 0.81 | 0.527 | 0.986 | 0.68 | 0.68 | 0.55 | 0.65 | 0.774 | 1.179 |
TGI, C | 0.85 | 0.62 | 0.78 | 0.82 | 0.701 | 0.909 | 0.50 | 0.62 | 0.29 | 0.54 | 1.017 | 1.380 |
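As a companion to the table above, the sketch below shows how a single-hidden-layer feed-forward network can map plot-level predictors (vegetation indices, plant density D, and canopy cover C) to grain yield, and how Garson’s algorithm derives relative importance from the trained connection weights. The architecture, the optimizer (scikit-learn’s default rather than the Levenberg–Marquardt scheme cited in the references), and the synthetic data are illustrative assumptions, not the configuration used in this study.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

# Synthetic, hypothetical plot data: 5 predictors and a yield response (t/ha).
rng = np.random.default_rng(0)
feature_names = ["NDVI", "NDRE", "WDRVI", "D", "C"]
n_plots = 120
X = rng.uniform(size=(n_plots, len(feature_names)))
y = 4.0 + 3.0 * X[:, 0] + 2.0 * X[:, 3] + 1.5 * X[:, 4] \
    + rng.normal(scale=0.2, size=n_plots)

# Single hidden layer with tanh activation, as a stand-in feed-forward network.
Xs = StandardScaler().fit_transform(X)
net = MLPRegressor(hidden_layer_sizes=(10,), activation="tanh",
                   max_iter=5000, random_state=0).fit(Xs, y)

def garson_importance(W_ih, W_ho):
    """Garson (1991): relative importance from |input->hidden| x |hidden->output| weights."""
    contrib = np.abs(W_ih) * np.abs(W_ho).ravel()   # shape: (n_inputs, n_hidden)
    contrib /= contrib.sum(axis=0, keepdims=True)   # normalize per hidden neuron
    importance = contrib.sum(axis=1)
    return importance / importance.sum()

rel_imp = garson_importance(net.coefs_[0], net.coefs_[1])
for name, ri in zip(feature_names, rel_imp):
    print(f"{name}: {ri:.2%}")
```

On this synthetic example the importance shares simply reflect the coefficients used to generate the data; with real plot data the same procedure is applied to the network trained on the measured predictors and yields.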
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).