Estimating Grass Sward Quality and Quantity Parameters Using Drone Remote Sensing with Deep Neural Networks
Abstract
1. Introduction
2. Materials and Methods
2.1. Test Area and References
2.2. Drone Remote Sensing Datasets
2.3. Neural Network Architectures for Estimating Grass Quality and Quantity Parameters
2.4. Performance Assessment
3. Results
3.1. Fresh and Dry Matter Yield Estimation
3.2. D-Value
3.3. NDF and iNDF
3.4. WSC
3.5. Ncont
3.6. NU
Parameter | Primary Growth | Regrowth
---|---|---
FY | VGG RGB, VGG RGB_Refl | 3D HSI + CHM
DMY | VGG RGB | ViT RGB + CHM
D-value | ViT RGB + CHM, VGG RGB_Refl | ViT RGB + CHM
iNDF | VGG RGB_Refl | ViT RGB + CHM
NDF | VGG HSI + CHM | 2D HSI + CHM
WSC | 3D HSI + CHM | VGG HSI
Ncont | 3D HSI + CHM | 3D HSI + CHM
NU | 3D HSI | 2D HSI + CHM, 3D HSI + CHM
4. Discussion
4.1. Performance of Different Remote Sensing Technologies
4.2. Data and Application Area-Related Aspects
4.3. Assessment of the Results and Future Research
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
Dataset | Statistic | FY (kg/ha) | DMY (kg DM/ha) | D-Value (g/kg DM) | iNDF (g/kg DM) | NDF (g/kg DM) | WSC (g/kg DM) | Ncont (g N/kg DM) | NU (kg N/ha)
---|---|---|---|---|---|---|---|---|---
PG train | Mean | 11,046.0 | 2652.6 | 710.1 | 62.0 | 537.8 | 138.8 | 22.1 | 53.6
 | Min | 1022.2 | 336.0 | 632.0 | 11.0 | 435.0 | 76.0 | 11.8 | 7.6
 | Max | 25,459.3 | 6135.1 | 770.0 | 128.0 | 614.0 | 246.0 | 40.8 | 105.7
 | Std | 6955.0 | 1578.8 | 39.7 | 31.8 | 56.1 | 42.2 | 7.6 | 26.7
PG test | Mean | 11,205.7 | 2408.4 | 720.8 | 41.4 | 591.0 | 108.4 | 21.7 | 54.4
 | Min | 4796.2 | 1255.9 | 708.0 | 29.0 | 573.0 | 79.0 | 16.0 | 21.1
 | Max | 18,783.5 | 3822.4 | 741.0 | 49.0 | 605.0 | 160.0 | 27.8 | 104.0
 | Std | 4440.3 | 816.3 | 11.8 | 5.6 | 10.5 | 31.4 | 4.3 | 26.0
RG train | Mean | 16,658.3 | 3542.9 | 689.6 | 65.9 | 546.0 | 137.1 | 20.2 | 70.5
 | Min | 1379.3 | 368.0 | 618.0 | 13.0 | 426.0 | 50.0 | 13.0 | 7.5
 | Max | 32,390.6 | 7228.7 | 756.0 | 137.0 | 612.0 | 294.0 | 33.1 | 132.9
 | Std | 8043.4 | 1611.1 | 34.4 | 30.5 | 44.4 | 64.4 | 4.9 | 32.3
RG test | Mean | 17,144.0 | 3898.6 | 677.4 | 81.0 | 530.5 | 141.6 | 20.3 | 76.5
 | Min | 7843.1 | 2073.3 | 631.0 | 57.0 | 464.0 | 89.0 | 14.7 | 37.8
 | Max | 27,911.1 | 7346.0 | 714.0 | 111.0 | 575.0 | 207.0 | 28.5 | 104.9
 | Std | 7259.5 | 1621.7 | 28.6 | 18.6 | 36.6 | 50.4 | 5.0 | 27.4
 | Sample Plots (Training) | 1 m × 1 m Plots (Training) | Sample Plots (Testing) | 1 m × 1 m Plots (Testing)
---|---|---|---|---
Primary growth | 96 | 192 | 8 | 104
Regrowth | 108 | 216 | 8 | 2392
2D-CNN

Layer | K | S | F | Output
---|---|---|---|---
Conv2D | 3 × 3 | 1 | 32 | 32, 125, 125
BN2D | | | | 32, 125, 125
ReLU | | | | 32, 125, 125
MaxPool2D | 2 × 2 | 2 | | 32, 62, 62
Conv2D | 3 × 3 | 1 | 64 | 64, 62, 62
BN2D | | | | 64, 62, 62
ReLU | | | | 64, 62, 62
MaxPool2D | 2 × 2 | 2 | | 64, 31, 31
Conv2D | 3 × 3 | 1 | 64 | 64, 31, 31
BN2D | | | | 64, 31, 31
ReLU | | | | 64, 31, 31
AdaptiveAvgPool2D | | | | 64, 6, 6
Linear | | | | 64
ReLU | | | | 64
Dropout | | | | 64
Linear | | | | 1

3D-CNN (37 input channels)

Layer | K | S | F | Output
---|---|---|---|---
Conv3D | 3 × 3 × 3 | 1 | 32 | 32, 37, 125, 125
BN3D | | | | 32, 37, 125, 125
ReLU | | | | 32, 37, 125, 125
MaxPool3D | 2 × 2 × 2 | 2 | | 32, 18, 62, 62
Conv3D | 3 × 3 × 3 | 1 | 64 | 64, 18, 62, 62
BN3D | | | | 64, 18, 62, 62
ReLU | | | | 64, 18, 62, 62
MaxPool3D | 2 × 2 × 2 | 2 | | 64, 9, 31, 31
Conv3D | 3 × 3 × 3 | 1 | 64 | 64, 9, 31, 31
BN3D | | | | 64, 9, 31, 31
ReLU | | | | 64, 9, 31, 31
AdaptiveAvgPool3D | | | | 64, 3, 3, 3
Linear | | | | 64
ReLU | | | | 64
Dropout | | | | 64
Linear | | | | 1
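The output shapes in the architecture tables above follow from standard convolution and pooling arithmetic. A minimal sketch (plain Python shape bookkeeping, not the authors' training code) that reproduces the 2D-CNN column, assuming 3 × 3 convolutions with padding 1 and 125 × 125 input patches (both inferred from the Output column, not stated explicitly in the table):

```python
def conv_out(n, k=3, s=1, p=1):
    # Spatial output size of a convolution: floor((n + 2p - k) / s) + 1
    return (n + 2 * p - k) // s + 1

def pool_out(n, k=2, s=2):
    # Output size of a max-pooling layer with no padding
    return (n - k) // s + 1

n = 125                 # assumed spatial size of the input patch
n = conv_out(n)         # Conv2D 3x3, s=1, p=1 -> 125
n = pool_out(n)         # MaxPool2D 2x2, s=2  -> 62
n = conv_out(n)         # -> 62
n = pool_out(n)         # -> 31
n = conv_out(n)         # -> 31  (matches the "64, 31, 31" row)

# The same arithmetic explains the 3D-CNN depth dimension:
d = 37                  # 37 hyperspectral input channels
d = pool_out(conv_out(d))   # 37 -> 18
d = pool_out(conv_out(d))   # 18 -> 9

print(n, d)             # 31 9
```

The adaptive average-pooling layers then map whatever spatial size arrives to a fixed 6 × 6 (2D) or 3 × 3 × 3 (3D) grid before the linear regression head.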
Random Forest | FY | DMY | D-Value | iNDF | NDF | WSC | Ncont | NU |
---|---|---|---|---|---|---|---|---|
Primary Growth NRMSE% | ||||||||
3D + RGB_b + RGB_i | 40.8 | 50.6 | 5.3 | 99.8 | 6.7 | 35.8 | 17.2 | 25.5 |
3D + RGB_b + RGB_i + MS_i | 24.1 | 35.7 | 4.0 | 78.4 | 6.7 | 37.9 | 12.5 | 19.0
HS_b + HS_i | 36.8 | 31.6 | 1.4 | 37.8 | 12.9 | 39.9 | 19.7 | 31.3 |
3D + HS_b + HS_i | 20.8 | 23.0 | 2.8 | 67.0 | 9.3 | 41.8 | 17.8 | 31.2 |
Regrowth NRMSE% | ||||||||
3D + RGB_b + RGB_i | 36.4 | 27.8 | 2.5 | 24.2 | 4.2 | 32.6 | 24.6 | 34.5 |
3D + RGB_b + RGB_i + MS_i | 26.6 | 25.9 | 2.8 | 27.0 | 3.6 | 17.0 | 19.0 | 26.3
HS_b + HS_i | 29.2 | 30.7 | 4.9 | 41.4 | 4.0 | 28.4 | 14.0 | 29.3 |
3D + HS_b + HS_i | 30.6 | 30.8 | 4.9 | 41.2 | 2.5 | 26.8 | 13.5 | 30.4 |
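The NRMSE% figures in the comparison tables are normalized root-mean-square errors. A minimal sketch of the metric, assuming normalization by the mean of the reference measurements (the normalization basis is an assumption here; the paper's exact definition is given in Section 2.4):

```python
import math

def nrmse_percent(predicted, reference):
    """RMSE normalized by the mean of the reference values, in percent.

    Mean-based normalization is assumed; range-based normalization is
    also common in the literature.
    """
    rmse = math.sqrt(
        sum((p - r) ** 2 for p, r in zip(predicted, reference)) / len(reference)
    )
    return 100.0 * rmse / (sum(reference) / len(reference))

# Hypothetical DMY values (kg DM/ha), for illustration only
print(round(nrmse_percent([2500, 3100, 2800], [2400, 3300, 2700]), 1))  # 5.1
```

A lower NRMSE% means a better estimator, so e.g. the 1.4% D-value row indicates a far tighter fit than the yield rows in the same table.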
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Karila, K.; Alves Oliveira, R.; Ek, J.; Kaivosoja, J.; Koivumäki, N.; Korhonen, P.; Niemeläinen, O.; Nyholm, L.; Näsi, R.; Pölönen, I.; et al. Estimating Grass Sward Quality and Quantity Parameters Using Drone Remote Sensing with Deep Neural Networks. Remote Sens. 2022, 14, 2692. https://doi.org/10.3390/rs14112692