Prediction of Seedling Oilseed Rape Crop Phenotype by Drone-Derived Multimodal Data
Abstract
1. Introduction
2. Materials and Methods
2.1. Field Experiment and Biomass Sampling
2.2. Collection and Processing of Rapeseed Phenotype Information
2.3. UAV Systems and Flight Missions
2.4. Image Processing and Feature Extraction
2.4.1. Image Processing
2.4.2. VIs
2.4.3. GCFs
2.5. Non-Remote Sensing Auxiliary Data
3. Multimodal Data Fusion
3.1. Image Fusion
3.2. Machine Learning
4. Results and Discussion
4.1. Correlation Analysis
4.2. PCA Data Dimensionality Reduction
4.3. Phenotypic Prediction of Rapeseed Crops during Seedling Stage
4.3.1. Four Multimodal Data Model Frameworks
4.3.2. Forecast Results
5. Conclusions
Supplementary Materials
Author Contributions
Funding
Conflicts of Interest
References
- Jin, X.; Yang, W.; Doonan, J.H.; Atzberger, C. Crop phenotyping studies with application to crop monitoring. Crop J. 2022, 10, 1221–1223. [Google Scholar] [CrossRef]
- Huang, J.; Sedano, F.; Huang, Y.; Ma, H.; Li, X.; Liang, S.; Tian, L.; Zhang, X.; Fan, J.; Wu, W. Assimilating a synthetic Kalman filter leaf area index series into the WOFOST model to improve regional winter wheat yield estimation. Agric. Forest Meteorol. 2016, 216, 188–202. [Google Scholar] [CrossRef]
- Dobermann, A.; Pampolino, M.F. Indirect leaf area index measurement as a tool for characterizing rice growth at the field scale. Commun. Soil Sci. Plant Anal. 1995, 26, 1507–1523. [Google Scholar] [CrossRef]
- Wang, J.-J.; Li, Z.; Jin, X.; Liang, G.; Struik, P.C.; Gu, J.; Zhou, Y. Phenotyping flag leaf nitrogen content in rice using a three-band spectral index. Comput. Electron. Agric. 2019, 162, 475–481. [Google Scholar] [CrossRef]
- Zhao, C.; Zhang, Y.; Du, J.; Guo, X.; Wen, W.; Gu, S.; Wang, J.; Fan, J. Crop Phenomics: Current Status and Perspectives. Front. Plant Sci. 2019, 10, 714. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Hussain, S.; Gao, K.; Din, M.; Gao, Y.; Shi, Z.; Wang, S. Assessment of UAV-Onboard Multispectral Sensor for non-destructive site-specific rapeseed crop phenotype variable at different phenological stages and resolutions. Remote Sens. 2020, 12, 397. [Google Scholar] [CrossRef] [Green Version]
- Wang, T.; Liu, Y.; Wang, M.; Fan, Q.; Tian, H.; Qiao, X.; Li, Y. Applications of UAS in crop biomass monitoring: A review. Front. Plant Sci. 2021, 12, 616689. [Google Scholar] [CrossRef]
- Aasen, H.; Bolten, A. Multi-temporal high-resolution imaging spectroscopy with hyperspectral 2D imagers–From theory to application. Remote Sens. Environ. 2018, 205, 374–389. [Google Scholar] [CrossRef]
- Bhadra, S.; Sagan, V.; Maimaitijiang, M.; Maimaitiyiming, M.; Newcomb, M.; Shakoor, N.; Mockler, T.C. Quantifying leaf chlorophyll concentration of sorghum from hyperspectral data using derivative calculus and machine learning. Remote Sens. 2020, 12, 2082. [Google Scholar] [CrossRef]
- Padalia, H.; Sinha, S.K.; Bhave, V.; Trivedi, N.K.; Kumar, A.S. Estimating canopy LAI and chlorophyll of tropical forest plantation (North India) using Sentinel-2 data. Adv. Space Res. 2020, 65, 458–469. [Google Scholar] [CrossRef]
- Tanabe, R.; Matsui, T.; Tanaka, T.S. Winter wheat yield prediction using convolutional neural networks and UAV-based multispectral imagery. Field Crops Res. 2023, 291, 108786. [Google Scholar] [CrossRef]
- Yang, G.; Liu, J.; Zhao, C.; Li, Z.; Huang, Y.; Yu, H.; Xu, B.; Yang, X.; Zhu, D.; Zhang, X. Unmanned aerial vehicle remote sensing for field-based crop phenotyping: Current status and perspectives. Front. Plant Sci. 2017, 8, 1111. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Xie, C.; Yang, C. A review on plant high-throughput phenotyping traits using UAV-based sensors. Comput. Electron. Agric. 2020, 178, 105731. [Google Scholar] [CrossRef]
- Deng, L.; Mao, Z.; Li, X.; Hu, Z.; Duan, F.; Yan, Y. UAV-based multispectral remote sensing for precision agriculture: A comparison between different cameras. ISPRS J. Photogramm. Remote Sens. 2018, 146, 124–136. [Google Scholar] [CrossRef]
- Johansen, K.; Morton, M.J.; Malbeteau, Y.; Aragon, B.; Al-Mashharawi, S.; Ziliani, M.G.; Angel, Y.; Fiene, G.; Negrão, S.; Mousa, M.A. Predicting biomass and yield in a tomato phenotyping experiment using UAV imagery and random forest. Front. Artif. Intell. 2020, 3, 28. [Google Scholar] [CrossRef] [PubMed]
- Lee, H.; Wang, J.; Leblon, B. Intra-field canopy nitrogen retrieval from unmanned aerial vehicle imagery for wheat and corn fields. Can. J. Remote Sens. 2020, 46, 454–472. [Google Scholar] [CrossRef]
- Gilabert, M.; Moreno, A.; Maselli, F.; Martínez, B.; Chiesi, M.; Sánchez-Ruiz, S.; García-Haro, F.; Pérez-Hoyos, A.; Campos-Taberner, M.; Pérez-Priego, O. Daily GPP estimates in Mediterranean ecosystems by combining remote sensing and meteorological data. ISPRS J. Photogramm. Remote Sens. 2015, 102, 184–197. [Google Scholar] [CrossRef]
- Sun, C.; Bian, Y.; Zhou, T.; Pan, J. Using of multi-source and multi-temporal remote sensing data improves crop-type mapping in the subtropical agriculture region. Sensors 2019, 19, 2401. [Google Scholar] [CrossRef] [Green Version]
- Zhang, Z.; An, Y. Ocean application conception of sky, earth, and sea multi base collaborative multi source fusion. Satell. Appl. 2019, 2, 24–29. [Google Scholar]
- Pawłowski, M.; Wróblewska, A.; Sysko-Romańczuk, S. Effective Techniques for Multimodal Data Fusion: A Comparative Analysis. Sensors 2023, 23, 2381. [Google Scholar] [CrossRef]
- Zhai, G.; Min, X. Perceptual image quality assessment: A survey. Sci. China Inf. Sci. 2020, 63, 211301. [Google Scholar] [CrossRef]
- Liu, Y.S.; Jiang, M.Y.; Liao, C.Z. Multifocus Image Fusion Based on Multiresolution Transform and Particle Swarm Optimization. Adv. Mater. Res. 2013, 756, 3281–3285. [Google Scholar] [CrossRef] [Green Version]
- Lu, J.; Eitel, J.U.; Engels, M.; Zhu, J.; Ma, Y.; Liao, F.; Zheng, H.; Wang, X.; Yao, X.; Cheng, T. Improving Unmanned Aerial Vehicle (UAV) remote sensing of rice plant potassium accumulation by fusing spectral and textural information. Int. J. Appl. Earth Obs. Geoinf. 2021, 104, 102592. [Google Scholar] [CrossRef]
- Maimaitijiang, M.; Sagan, V.; Sidike, P.; Hartling, S.; Esposito, F.; Fritschi, F.B. Soybean yield prediction from UAV using multimodal data fusion and deep learning. Remote Sens. Environ. 2020, 237, 111599. [Google Scholar] [CrossRef]
- Fei, S.; Hassan, M.A.; Xiao, Y.; Su, X.; Chen, Z.; Cheng, Q.; Duan, F.; Chen, R.; Ma, Y. UAV-based multi-sensor data fusion and machine learning algorithm for yield prediction in wheat. Precis. Agric. 2023, 24, 187–212. [Google Scholar] [CrossRef]
- Min, X.; Zhai, G.; Zhou, J.; Farias, M.C.; Bovik, A.C. Study of Subjective and Objective Quality Assessment of Audio-Visual Signals. IEEE Trans. Image Process. 2020, 29, 6054–6068. [Google Scholar] [CrossRef]
- Min, X.; Zhai, G.; Zhou, J.; Zhang, X.P.; Yang, X.; Guan, X. A Multimodal Saliency Model for Videos with High Audio-Visual Correspondence. IEEE Trans. Image Process. 2020, 29, 3805–3819. [Google Scholar] [CrossRef]
- Zheng, H.; Cheng, T.; Zhou, M.; Li, D.; Yao, X.; Tian, Y.; Cao, W.; Zhu, Y. Improved estimation of rice aboveground biomass combining textural and spectral analysis of UAV imagery. Precis. Agric. 2019, 20, 611–629. [Google Scholar] [CrossRef]
- Jin, X.; Jiang, Q.; Yao, S.; Zhou, D.; Nie, R.; Lee, S.-J.; He, K. Infrared and visual image fusion method based on discrete cosine transform and local spatial frequency in discrete stationary wavelet transform domain. Infrared Phys. Technol. 2018, 88, 1–12. [Google Scholar] [CrossRef]
- Tan, W.; Xiang, P.; Zhang, J.; Zhou, H.; Qin, H. Remote sensing image fusion via boundary measured dual-channel PCNN in multi-scale morphological gradient domain. IEEE Access 2020, 8, 42540–42549. [Google Scholar] [CrossRef]
- Torgbor, B.A.; Rahman, M.M.; Brinkhoff, J.; Sinha, P.; Robson, A. Integrating Remote Sensing and Weather Variables for Mango Yield Prediction Using a Machine Learning Approach. Remote Sens. 2023, 15, 3075. [Google Scholar] [CrossRef]
- Thenkabail, P.S.; Biradar, C.M.; Noojipady, P.; Dheeravath, V.; Li, Y.; Velpuri, M.; Gumma, M.; Gangalakunta, O.R.P.; Turral, H.; Cai, X. Global irrigated area map (GIAM), derived from remote sensing, for the end of the last millennium. Int. J. Remote Sens. 2009, 30, 3679–3733. [Google Scholar] [CrossRef]
- Zhou, J.; Zhou, J.; Ye, H.; Ali, M.L.; Chen, P.; Nguyen, H.T. Yield estimation of soybean breeding lines under drought stress using unmanned aerial vehicle-based imagery and convolutional neural network. Biosyst. Eng. 2021, 204, 90–103. [Google Scholar] [CrossRef]
- Yang, Q.; Shi, L.; Han, J.; Zha, Y.; Zhu, P. Deep convolutional neural networks for rice grain yield estimation at the ripening stage using UAV-based remotely sensed images. Field Crops Res. 2019, 235, 142–153. [Google Scholar] [CrossRef]
- Cui, Z.; Kerekes, J.P. Potential of Red Edge Spectral Bands in Future Landsat Satellites on Agroecosystem Canopy Green Leaf Area Index Retrieval. Remote Sens. 2018, 10, 1458. [Google Scholar] [CrossRef] [Green Version]
- Cui, Z.; Kerekes, J.P. Impact of Wavelength Shift in Relative Spectral Response at High Angles of Incidence in Landsat-8 Operational Land Imager and Future Landsat Design Concepts. IEEE Trans. Geosci. Remote Sens. 2018, 56, 5873–5883. [Google Scholar] [CrossRef]
- Min, X.; Zhai, G.; Gu, K.; Yang, X.; Guan, X. Objective Quality Evaluation of Dehazed Images. IEEE Trans. Intell. Transp. Syst. 2019, 20, 2879–2892. [Google Scholar] [CrossRef]
- Lukas, V.; Huňady, I.; Kintl, A.; Mezera, J.; Hammerschmiedt, T.; Sobotková, J.; Brtnický, M.; Elbl, J. Using UAV to Identify the Optimal Vegetation Index for Yield Prediction of Oil Seed Rape (Brassica napus L.) at the Flowering Stage. Remote Sens. 2022, 14, 4953. [Google Scholar] [CrossRef]
- Rouse, J.W.; Haas, R.H.; Schell, J.A.; Deering, D.W.; Harlan, J.C. Monitoring the vernal advancement and retrogradation (green wave effect) of natural vegetation. In Type III Final Report RSC 1978-4; Remote Sensing Center, Texas A&M University: College Station, TX, USA, 1974; pp. 1–93. [Google Scholar]
- Schleicher, T.D.; Bausch, W.C.; Delgado, J.A.; Ayers, P.D. Evaluation and Refinement of the Nitrogen Reflectance Index (NRI) for Site-Specific Fertilizer Management; 2001 ASAE Annual Meeting; American Society of Agricultural and Biological Engineers: St. Joseph, MI, USA, 2001; Volume 1. [Google Scholar]
- Gitelson, A.A.; Zur, Y.; Chivkunova, O.B.; Merzlyak, M.N. Assessing carotenoid content in plant leaves with reflectance spectroscopy. Photochem. Photobiol. 2002, 75, 272–281. [Google Scholar] [CrossRef]
- Gitelson, A.A.; Viña, A.; Arkebauer, T.J.; Rundquist, D.C.; Keydan, G.; Leavitt, B. Remote estimation of leaf area index and green leaf biomass in maize canopies. Geophys. Res. Lett. 2003, 30, 52. [Google Scholar] [CrossRef] [Green Version]
- Baret, F.; Guyot, G. Potentials and limits of vegetation indices for LAI and APAR assessment. Remote Sens. Environ. 1991, 35, 161–173. [Google Scholar] [CrossRef]
- Goel, N.S.; Qin, W. Influences of canopy architecture on relationships between various vegetation indices and LAI and FPAR: A computer simulation. Remote Sens. Rev. 1994, 10, 309–347. [Google Scholar] [CrossRef]
- Chen, J.M. Evaluation of vegetation indices and a modified simple ratio for boreal applications. Can. J. Remote Sens. 1996, 22, 229–242. [Google Scholar] [CrossRef]
- de la Iglesia Martinez, A.; Labib, S. Demystifying normalized difference vegetation index (NDVI) for greenness exposure assessments and policy interventions in urban greening. Environ. Res. 2023, 220, 115155. [Google Scholar] [CrossRef]
- Zhao, F.; Yang, G.; Yang, H.; Long, H.; Xu, W.; Zhu, Y.; Meng, Y.; Han, S.; Liu, M. A Method for Prediction of Winter Wheat Maturity Date Based on MODIS Time Series and Accumulated Temperature. Agriculture 2022, 12, 945. [Google Scholar] [CrossRef]
- Li, S.; Kang, X.; Hu, J. Image fusion with guided filtering. IEEE Trans. Image Process. 2013, 22, 2864–2875. [Google Scholar]
- May, J.O.; Looney, S.W. Sample size charts for Spearman and Kendall coefficients. J. Biom. Biostat. 2020, 11, 1–7. [Google Scholar]
- Fırat, H.; Asker, M.E.; Hanbay, D. Classification of hyperspectral remote sensing images using different dimension reduction methods with 3D/2D CNN. Remote Sens. Appl. 2022, 25, 100694. [Google Scholar] [CrossRef]
- Lapajne, J.; Knapič, M.; Žibrat, U. Comparison of Selected Dimensionality Reduction Methods for Detection of Root-Knot Nematode Infestations in Potato Tubers Using Hyperspectral Imaging. Sensors 2022, 22, 367. [Google Scholar] [CrossRef]
- Jiang, Y.; Wei, H.; Hou, S.; Yin, X.; Wei, S.; Jiang, D. Estimation of Maize Yield and Protein Content under Different Density and N Rate Conditions Based on UAV Multi-Spectral Images. Agronomy 2023, 13, 421. [Google Scholar] [CrossRef]
- de Oliveira, R.P.; Rodrigues, B.J.M.; Alves, P.A.; Pereira, O.J.L.; Cristiano, Z.; Angeli, F.C.E. Predicting Sugarcane Biometric Parameters by UAV Multispectral Images and Machine Learning. Agronomy 2022, 12, 1992. [Google Scholar] [CrossRef]
- Mohidem, N.A.; Jaafar, S.; Rosle, R.; Che’Ya, N.N.; Arif, S.J.; Fazlil, I.W.F.; Ismail, M.R. Application of multispectral UAV for paddy growth monitoring in Jitra, Kedah, Malaysia. IOP Conf. Ser. Earth Environ. Sci. 2022, 1038, 012053. [Google Scholar] [CrossRef]
- Zhang, X.; Zhang, K.; Sun, Y.; Zhao, Y.; Zhuang, H.; Ban, W.; Chen, Y.; Fu, E.; Chen, S.; Liu, J.; et al. Combining Spectral and Texture Features of UAS-Based Multispectral Images for Maize Leaf Area Index Estimation. Remote Sens. 2022, 14, 331. [Google Scholar] [CrossRef]
- Zheng, H.; Ma, J.; Zhou, M.; Li, D.; Yao, X.; Cao, W.; Zhu, Y.; Cheng, T. Enhancing the Nitrogen Signals of Rice Canopies across Critical Growth Stages through the Integration of Textural and Spectral Information from Unmanned Aerial Vehicle (UAV) Multispectral Imagery. Remote Sens. 2020, 12, 957. [Google Scholar] [CrossRef] [Green Version]
- Lu, L.; Wang, F.; Jung, C. LRINet: Long-range imaging using multispectral fusion of RGB and NIR images. Inf. Fusion 2023, 92, 177–189. [Google Scholar] [CrossRef]
- Zhou, C.; Gong, Y.; Fang, S.; Yang, K.; Peng, Y.; Wu, X.; Zhu, R. Combining spectral and wavelet texture features for unmanned aerial vehicles remote estimation of rice leaf area index. Front. Plant Sci. 2022, 13, 957870. [Google Scholar] [CrossRef]
- Usha, S.G.A.; Vasuki, S. Significance of texture features in the segmentation of remotely sensed images. Optik 2022, 249, 168241. [Google Scholar] [CrossRef]
- Saini, P.; Kumar, A. Effect of Fusion of Statistical and Texture Features on HSI based Leaf Images with Both Dorsal and Ventral Sides. Int. J. Adv. Comput. Sci. Appl. 2018, 9, 305–312. [Google Scholar] [CrossRef] [Green Version]
- Islam, M.D.; Di, L.; Qamer, F.M.; Shrestha, S.; Guo, L.; Lin, L.; Mayer, T.J.; Phalke, A.R. Rapid Rice Yield Estimation Using Integrated Remote Sensing and Meteorological Data and Machine Learning. Remote Sens. 2023, 15, 2374. [Google Scholar] [CrossRef]
- Aswed, G.K.; Ahmed, M.N.; Mohammed, H.A. Predicting initial duration of project using linear and nonlinear regression models. Int. J. Adv. Technol. Eng. Explor. 2022, 9, 1730. [Google Scholar]
- Fu, T.T.; Sieng, Y.W. A comparative study between PCR, PLSR, and LW-PLS on the predictive performance at different data splitting ratios. Chem. Eng. Commun. 2022, 209, 1439–1456. [Google Scholar]
Sensor Category | Spectral Bands (μm) | Resolution (pixels) | Field of View (H × V) |
---|---|---|---|
Visible light (RGB) | N/A | 1600 × 1200 | 56° × 84° |
Multispectral | Green: 0.560; Red: 0.650; Red edge: 0.730; NIR: 0.860 | 800 × 600 | 47.2° × 73.9° |
Field Sampling Period | Image Acquisition Date | Flight Time | Flight Height (m) | Heading/Side Overlap |
---|---|---|---|---|
19 November 2022–21 November 2022 | 21 November 2022 | 11:30–12:00 | 40 | 75%/70% |
8 December 2022–10 December 2022 | 8 December 2022 | 12:00–12:30 | 40 | 75%/70% |
9 January 2023–11 January 2023 | 10 January 2023 | 12:00–12:30 | 40 | 75%/70% |
29 January 2023–31 January 2023 | 30 January 2023 | 13:30–14:00 | 40 | 75%/70% |
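To relate the sensor and flight parameters above to on-ground image coverage, the sketch below estimates the ground footprint and ground sampling distance (GSD) of a nadir image from the listed fields of view, pixel counts, and the 40 m flight height. It assumes a simple pinhole-camera geometry, and the pairing of FOV axes with pixel axes is an illustrative assumption rather than a detail taken from the paper.

```python
import math

def ground_footprint(fov_h_deg: float, fov_v_deg: float, height_m: float) -> tuple[float, float]:
    """Ground footprint (width, depth) in metres for a nadir-pointing camera."""
    width = 2.0 * height_m * math.tan(math.radians(fov_h_deg) / 2.0)
    depth = 2.0 * height_m * math.tan(math.radians(fov_v_deg) / 2.0)
    return width, depth

def gsd_cm(footprint_m: float, pixels: int) -> float:
    """Ground sampling distance (cm/pixel) along one image axis."""
    return footprint_m / pixels * 100.0

# Multispectral camera from the sensor table, flown at the 40 m height listed above.
# Pairing the 47.2° axis with 800 px and the 73.9° axis with 600 px is an assumption.
w, d = ground_footprint(47.2, 73.9, 40.0)
print(f"footprint ≈ {w:.1f} m × {d:.1f} m")
print(f"GSD ≈ {gsd_cm(w, 800):.1f} cm/px and {gsd_cm(d, 600):.1f} cm/px")

# Shutter spacing along track for 75% heading overlap (assuming the 73.9° axis is along-track).
print(f"shutter spacing ≈ {d * (1.0 - 0.75):.1f} m")
```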
Spectral Index | Abbreviation | Calculation Formula | Source |
---|---|---|---|
Normalized difference vegetation index | NDVI | NDVI = (NIR − R)/(NIR + R) | [39] |
Nitrogen reflectance index | NRI | NRI = (G − R)/(G + R) | [40] |
Green normalized difference vegetation index | GNDVI | GNDVI = (NIR − G)/(NIR + G) | [41,42] |
Ratio vegetation index | RVI | RVI = NIR/R | [43] |
Non-linear vegetation index | NLI | NLI = (NIR² − R)/(NIR² + R) | [44] |
Modified simple ratio index | MSR | MSR = (NIR/R − 1)/(√(NIR/R) + 1) | [45] |
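As a minimal sketch of how the spectral indices in the table can be computed per pixel, the snippet below applies the formulas to NumPy reflectance arrays for the green, red, and NIR bands. The array names and the synthetic example values are illustrative; real inputs would be the co-registered, radiometrically calibrated band rasters.

```python
import numpy as np

def vegetation_indices(green, red, nir):
    """Per-pixel vegetation indices from reflectance arrays scaled to [0, 1]."""
    eps = 1e-6  # guard against division by zero over soil/shadow pixels
    ndvi = (nir - red) / (nir + red + eps)
    nri = (green - red) / (green + red + eps)
    gndvi = (nir - green) / (nir + green + eps)
    rvi = nir / (red + eps)
    nli = (nir**2 - red) / (nir**2 + red + eps)
    msr = (nir / (red + eps) - 1.0) / (np.sqrt(nir / (red + eps)) + 1.0)
    return {"NDVI": ndvi, "NRI": nri, "GNDVI": gndvi, "RVI": rvi, "NLI": nli, "MSR": msr}

# Small synthetic reflectance patches, only to show the call pattern.
rng = np.random.default_rng(0)
g, r, n = (rng.uniform(0.02, 0.6, (4, 4)) for _ in range(3))
vis = vegetation_indices(g, r, n)
print({name: float(v.mean()) for name, v in vis.items()})
```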
Model Framework | Data Source | Model Inputs |
---|---|---|
MFM1 | Original MS images | Spectral indices VIs (NDVI; NRI; MSR) |
MFM2 | Original MS images | Spectral indices VIs (NDVI; NRI; MSR); texture features GCFs (Vcv; Scv; Ra) |
MFM3 | Fused HR_MS images | Spectral indices VIs_F (NDVI_F; NRI_F; MSR_F); texture features GCFs_F (Vcv_F; Scv_F; Ra_F) |
MFM4 | Fused HR_MS images; meteorological data | Spectral indices VIs_F (NDVI_F; NRI_F; MSR_F); texture features GCFs_F (Vcv_F; Scv_F; Ra_F); meteorological data MDs (Ae; Aar) |
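The four framework definitions above translate directly into four feature subsets. The sketch below shows one way to assemble them from a plot-level feature table; the pandas DataFrame and column names mirror the table but are an illustrative stand-in, not the paper's dataset.

```python
import pandas as pd

# Feature names follow the framework table above; the DataFrame passed in is an
# illustrative stand-in for the plot-level features extracted upstream.
FEATURE_SETS = {
    "MFM1": ["NDVI", "NRI", "MSR"],
    "MFM2": ["NDVI", "NRI", "MSR", "Vcv", "Scv", "Ra"],
    "MFM3": ["NDVI_F", "NRI_F", "MSR_F", "Vcv_F", "Scv_F", "Ra_F"],
    "MFM4": ["NDVI_F", "NRI_F", "MSR_F", "Vcv_F", "Scv_F", "Ra_F", "Ae", "Aar"],
}

def build_model_input(features: pd.DataFrame, framework: str) -> pd.DataFrame:
    """Select the columns that define one multimodal framework (MFM1–MFM4)."""
    return features[FEATURE_SETS[framework]].copy()

# Usage (hypothetical plot-level table): X4 = build_model_input(plot_features, "MFM4")
```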
Prediction accuracy for the four physiological indicators of oilseed rape growth (LAI, AGB, LNC, and SPAD), evaluated by R² and MSE under each model framework; asterisks mark the best average R² and MSE across all frameworks and models.

Model Framework | ML Model | Metric | LAI | AGB | LNC | SPAD | Average |
---|---|---|---|---|---|---|---|
MFM1 | SVR | R² | 0.6773 | 0.3864 | 0.4729 | 0.7237 | 0.5651 |
MFM1 | SVR | MSE | 0.2068 | 32.1681 | 6.0256 | 15.2862 | 13.4217 |
MFM1 | PLSR | R² | 0.4528 | 0.3391 | 0.1971 | 0.2183 | 0.3018 |
MFM1 | PLSR | MSE | 0.4042 | 28.7874 | 11.6831 | 41.9419 | 20.7042 |
MFM1 | BPNN | R² | 0.4233 | 0.3215 | 0.5304 | 0.8277 | 0.5257 |
MFM1 | BPNN | MSE | 0.3649 | 15.2748 | 4.1654 | 9.1541 | 7.7398 |
MFM1 | NMR | R² | 0.3654 | 0.4482 | 0.7043 | 0.7741 | 0.5730 |
MFM1 | NMR | MSE | 0.4262 | 16.5012 | 3.8428 | 12.3704 | 8.2852 |
MFM2 | SVR | R² | 0.7029 | 0.4697 | 0.4829 | 0.7784 | 0.6085 |
MFM2 | SVR | MSE | 0.2249 | 20.8486 | 7.2676 | 12.0857 | 10.1067 |
MFM2 | PLSR | R² | 0.5212 | 0.3406 | 0.2846 | 0.4586 | 0.4013 |
MFM2 | PLSR | MSE | 0.3601 | 39.9055 | 7.8961 | 29.3944 | 19.3890 |
MFM2 | BPNN | R² | 0.6253 | 0.4477 | 0.57875 | 0.8881 | 0.6350 |
MFM2 | BPNN | MSE | 0.2641 | 17.9229 | 4.996 | 6.8762 | 7.5148 |
MFM2 | NMR | R² | 0.4281 | 0.5019 | 0.6239 | 0.8079 | 0.5905 |
MFM2 | NMR | MSE | 0.3731 | 38.6885 | 5.7252 | 8.0058 | 13.1982 |
MFM3 | SVR | R² | 0.7802 | 0.5909 | 0.6371 | 0.8651 | 0.7183 |
MFM3 | SVR | MSE | 0.1471 | 17.8331 | 3.7256 | 7.6178 | 7.3309 |
MFM3 | PLSR | R² | 0.6298 | 0.4514 | 0.4145 | 0.5975 | 0.5233 |
MFM3 | PLSR | MSE | 0.2919 | 29.2122 | 5.8248 | 19.7959 | 13.7812 |
MFM3 | BPNN | R² | 0.6505 | 0.4703 | 0.5991 | 0.8935 | 0.6534 |
MFM3 | BPNN | MSE | 0.2962 | 27.1946 | 4.4556 | 5.2076 | 9.2885 |
MFM3 | NMR | R² | 0.4466 | 0.5918 | 0.7027 | 0.8202 | 0.6403 |
MFM3 | NMR | MSE | 0.3853 | 17.6365 | 4.129 | 9.4111 | 7.8155 |
MFM4 | SVR | R² | 0.8071 | 0.6356 | 0.6646 | 0.8742 | 0.7454 * |
MFM4 | SVR | MSE | 0.1411 | 17.4372 | 3.3715 | 5.5718 | 6.6630 * |
MFM4 | PLSR | R² | 0.5973 | 0.4903 | 0.4494 | 0.6526 | 0.5474 |
MFM4 | PLSR | MSE | 0.3251 | 18.7233 | 5.7181 | 15.7756 | 10.1355 |
MFM4 | BPNN | R² | 0.7702 | 0.4438 | 0.6351 | 0.8852 | 0.6836 |
MFM4 | BPNN | MSE | 0.1222 | 33.0606 | 6.1542 | 5.6177 | 11.2387 |
MFM4 | NMR | R² | 0.6045 | 0.5602 | 0.6915 | 0.8266 | 0.6707 |
MFM4 | NMR | MSE | 0.2539 | 24.1886 | 3.5773 | 8.5108 | 9.1327 |
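For reference, the sketch below shows how R² and MSE values of the kind reported above can be obtained with scikit-learn for three of the four regressors (SVR, PLSR, and a back-propagation neural network approximated by MLPRegressor). The NMR model is omitted because its exact form is not specified in this excerpt, and the hyperparameters and train/test split are illustrative assumptions rather than the paper's settings.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.cross_decomposition import PLSRegression
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

def evaluate_models(X, y, seed=42):
    """Fit candidate regressors and report R²/MSE on a held-out split.
    Hyperparameters below are illustrative defaults, not the paper's settings."""
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=seed)
    models = {
        "SVR": make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0)),
        "PLSR": PLSRegression(n_components=min(5, X.shape[1])),
        "BPNN": make_pipeline(StandardScaler(),
                              MLPRegressor(hidden_layer_sizes=(32, 16),
                                           max_iter=5000, random_state=seed)),
    }
    results = {}
    for name, model in models.items():
        model.fit(X_tr, y_tr)
        pred = np.ravel(model.predict(X_te))  # PLSRegression returns a 2D array
        results[name] = {"R2": r2_score(y_te, pred), "MSE": mean_squared_error(y_te, pred)}
    return results

# Usage (with the hypothetical MFM4 design matrix from the earlier sketch):
# print(evaluate_models(X4.values, lai_values))
```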
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Citation: Yang, Y.; Wei, X.; Wang, J.; Zhou, G.; Wang, J.; Jiang, Z.; Zhao, J.; Ren, Y. Prediction of Seedling Oilseed Rape Crop Phenotype by Drone-Derived Multimodal Data. Remote Sens. 2023, 15, 3951. https://doi.org/10.3390/rs15163951