Under-Canopy Drone 3D Surveys for Wild Fruit Hotspot Mapping
Abstract
1. Introduction
2. Related Works
3. Platforms and Data
3.1. UAV Platforms
3.2. Test Area
3.3. Data Acquisition
4. Methodology
4.1. Stereo Image Sequence Processing
4.2. LiDAR Data Processing
4.3. Data Visualization: The Picker App
5. Results
5.1. Photogrammetric 3D Reconstruction
5.2. LiDAR Point Cloud Segmentation
5.3. Mobile App Visualization
6. Discussion
7. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Granshaw, S.I. RPV, UAV, UAS, RPAS… or just drone? Photogramm. Rec. 2018, 33, 160–170.
- Colomina, I.; Molina, P. Unmanned aerial systems for photogrammetry and remote sensing: A review. ISPRS J. Photogramm. Remote Sens. 2014, 92, 79–97.
- Nex, F.; Remondino, F. UAV for 3D mapping applications: A review. Appl. Geomat. 2014, 6, 1–15.
- Giordan, D.; Hayakawa, Y.; Nex, F.; Remondino, F.; Tarolli, P. The use of remotely piloted aircraft systems (RPASs) for natural hazards monitoring and management. Nat. Hazards Earth Syst. Sci. 2018, 18, 1079–1096.
- Nex, F.; Armenakis, C.; Cramer, M.; Cucci, D.A.; Gerke, M.; Honkavaara, E.; Kukko, A.; Persello, C.; Skaloud, J. UAV in the advent of the twenties: Where we stand and what is next. ISPRS J. Photogramm. Remote Sens. 2022, 184, 215–242.
- Fletcher, S.; Oostveen, A.M.; Chippendale, P.; Couceiro, M.S.; Ballester, L.S. Developing unmanned aerial robotics to support wild berry harvesting in Finland: Human factors, standards and ethics. In Proceedings of the 8th International Conference on Robot Ethics and Standards (ICRES 2023), Utrecht, The Netherlands, 17–18 July 2023.
- Yalçinkaya, B.; Couceiro, M.S.; Soares, S.P.; Valente, A. Human-aware collaborative robots in the wild: Coping with uncertainty in activity recognition. Sensors 2023, 23, 3388.
- Yalcinkaya, B.; Couceiro, M.S.; Pina, L.; Soares, S.; Valente, A.; Remondino, F. Towards Enhanced Human Activity Recognition for Real-World Human-Robot Collaboration. In Proceedings of the 2024 IEEE International Conference on Robotics and Automation (ICRA), Yokohama, Japan, 13–17 May 2024; pp. 7909–7915.
- Riz, L.; Povoli, S.; Caraffa, A.; Boscaini, D.; Mekhalfi, M.L.; Chippendale, P.; Turtiainen, M.; Partanen, B.; Ballester, L.S.; Noguera, F.B.; et al. Wild Berry image dataset collected in Finnish forests and peatlands using drones. arXiv 2024, arXiv:2405.07550.
- Shamshiri, R.; Kalantari, F.; Ting, K.; Thorp, K.R.; Hameed, I.A.; Weltzien, C.; Ahmad, D.; Shad, Z.M. Advances in greenhouse automation and controlled environment agriculture: A transition to plant factories and urban agriculture. Int. J. Agric. Biol. Eng. 2018, 11, 1–22.
- Vougioukas, S.G. Agricultural robotics. Annu. Rev. Control. Robot. Auton. Syst. 2019, 2, 365–392.
- van Henten, E.J.; Tabb, A.; Billingsley, J.; Popovic, M.; Deng, M.; Reid, J. Agricultural robotics and automation. IEEE Robot. Autom. Mag. 2022, 29, 145–147.
- Pearson, S.; Camacho-Villa, T.C.; Valluru, R.; Gaju, O.; Rai, M.C.; Gould, I.; Brewer, S.; Sklar, E. Robotics and autonomous systems for net zero agriculture. Curr. Robot. Rep. 2022, 3, 57–64.
- Oliveira, L.F.; Moreira, A.P.; Silva, M.F. Advances in agriculture robotics: A state-of-the-art review and challenges ahead. Robotics 2021, 10, 52.
- Agrafiotis, P.; Skarlatos, D.; Georgopoulos, A.; Karantzalos, K. Shallow Water Bathymetry Mapping from UAV Imagery based on Machine Learning. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, 48, 9–16.
- Ayamga, M.; Akaba, S.; Nyaaba, A.A. Multifaceted applicability of drones: A review. Technol. Forecast. Soc. Chang. 2021, 167, 120677.
- Shukla, V.; Morelli, L.; Remondino, F.; Micheli, A.; Tuia, D.; Risse, B. Towards Estimation of 3D Poses and Shapes of Animals from Oblique Drone Imagery. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2024, 48, 379–386.
- Trybała, P.; Rigon, S.; Remondino, F.; Banasiewicz, A.; Wróblewski, A.; Macek, A.; Kujawa, P.; Romańczukiewicz, K.; Redondo, C.; Espada, F. Optimizing Mining Ventilation Using 3D Technologies. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2024, 48, 427–434.
- Han, L.; Yang, G.; Yang, H.; Xu, B.; Li, Z.; Yang, X. Clustering field-based maize phenotyping of plant-height growth and canopy spectral dynamics using a UAV remote-sensing approach. Front. Plant Sci. 2018, 9, 1638.
- Su, W.; Zhang, M.; Bian, D.; Liu, Z.; Huang, J.; Wang, W.; Wu, J.; Guo, H. Phenotyping of corn plants using unmanned aerial vehicle (UAV) images. Remote Sens. 2019, 11, 2021.
- Xie, C.; Yang, C. A review on plant high-throughput phenotyping traits using UAV-based sensors. Comput. Electron. Agric. 2020, 178, 105731.
- Feng, L.; Chen, S.; Zhang, C.; Zhang, Y.; He, Y. A comprehensive review on recent applications of unmanned aerial vehicle remote sensing with various sensors for high-throughput plant phenotyping. Comput. Electron. Agric. 2021, 182, 106033.
- Herrero-Huerta, M.; Gonzalez-Aguilera, D.; Yang, Y. Structural component phenotypic traits from individual maize skeletonization by UAS-based structure-from-motion photogrammetry. Drones 2023, 7, 108.
- Johansen, K.; Morton, M.; Malbeteau, Y.; Aragon, B.; Al-Mashharawi, S.; Ziliani, M.; Angel, Y.; Fiene, G.; Negrão, S.; Mousa, M.; et al. Predicting biomass and yield at harvest of salt-stressed tomato plants using UAV imagery. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, 42, 407–411.
- Yang, Q.; Shi, L.; Han, J.; Zha, Y.; Zhu, P. Deep convolutional neural networks for rice grain yield estimation at the ripening stage using UAV-based remotely sensed images. Field Crop. Res. 2019, 235, 142–153.
- Qu, H.; Zheng, C.; Ji, H.; Barai, K.; Zhang, Y.J. A fast and efficient approach to estimate wild blueberry yield using machine learning with drone photography: Flight altitude, sampling method and model effects. Comput. Electron. Agric. 2024, 216, 108543.
- Tang, Y.; Chen, M.; Wang, C.; Luo, L.; Li, J.; Lian, G.; Zou, X. Recognition and localization methods for vision-based fruit picking robots: A review. Front. Plant Sci. 2020, 11, 510.
- Li, D.; Sun, X.; Elkhouchlaa, H.; Jia, Y.; Yao, Z.; Lin, P.; Li, J.; Lu, H. Fast detection and location of longan fruits using UAV images. Comput. Electron. Agric. 2021, 190, 106465.
- Hyyppä, E.; Hyyppä, J.; Hakala, T.; Kukko, A.; Wulder, M.A.; White, J.C.; Pyörälä, J.; Yu, X.; Wang, Y.; Virtanen, J.P.; et al. Under-canopy UAV laser scanning for accurate forest field measurements. ISPRS J. Photogramm. Remote Sens. 2020, 164, 41–60.
- Wang, Y.; Kukko, A.; Hyyppä, E.; Hakala, T.; Pyörälä, J.; Lehtomäki, M.; El Issaoui, A.; Yu, X.; Kaartinen, H.; Liang, X.; et al. Seamless integration of above- and under-canopy unmanned aerial vehicle laser scanning for forest investigation. For. Ecosyst. 2021, 8, 10.
- Tian, Y.; Liu, K.; Ok, K.; Tran, L.; Allen, D.; Roy, N.; How, J.P. Search and rescue under the forest canopy using multiple UAVs. Int. J. Robot. Res. 2020, 39, 1201–1221.
- Yao, H.; Liang, X. Autonomous Exploration Under Canopy for Forest Investigation Using LiDAR and Quadrotor. IEEE Trans. Geosci. Remote Sens. 2024, 62, 5704719.
- Liang, X.; Yao, H.; Qi, H.; Wang, X. Forest in situ observations through a fully automated under-canopy unmanned aerial vehicle. Geo-Spat. Inf. Sci. 2024, 27, 983–999.
- Gupta, A.; Fernando, X. Simultaneous Localization and Mapping (SLAM) and Data Fusion in Unmanned Aerial Vehicles: Recent Advances and Challenges. Drones 2022, 6, 85.
- Zhuang, L.; Zhong, X.; Xu, L.; Tian, C.; Yu, W. Visual SLAM for Unmanned Aerial Vehicles: Localization and Perception. Sensors 2024, 24, 2980.
- Morelli, L.; Ioli, F.; Beber, R.; Menna, F.; Remondino, F.; Vitti, A. COLMAP-SLAM: A framework for visual odometry. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2023, 48, 317–324.
- Schonberger, J.L.; Frahm, J.M. Structure-from-Motion Revisited. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 4104–4113.
- Krisanski, S.; Taskhiri, M.S.; Turner, P. Enhancing methods for under-canopy unmanned aircraft system based photogrammetry in complex forests for tree diameter measurement. Remote Sens. 2020, 12, 1652.
- Zhang, Y.; Onda, Y.; Kato, H.; Feng, B.; Gomi, T. Understory biomass measurement in a dense plantation forest based on drone-SfM data by a manual low-flying drone under the canopy. J. Environ. Manag. 2022, 312, 114862.
- Agisoft LLC. Agisoft Metashape. Available online: https://www.agisoft.com/ (accessed on 8 October 2024).
- Karjalainen, V.; Koivumäki, N.; Hakala, T.; George, A.; Muhojoki, J.; Hyyppä, E.; Suomalainen, J.; Honkavaara, E. Autonomous robotic drone system for mapping forest interiors. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2024, 48, 167–172.
- Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You only look once: Unified, real-time object detection. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 779–788.
- Kirillov, A.; Mintun, E.; Ravi, N.; Mao, H.; Rolland, C.; Gustafson, L.; Xiao, T.; Whitehead, S.; Berg, A.C.; Lo, W.Y.; et al. Segment anything. In Proceedings of the IEEE/CVF International Conference on Computer Vision, Paris, France, 1–6 October 2023; pp. 4015–4026.
- Radford, A.; Kim, J.W.; Hallacy, C.; Ramesh, A.; Goh, G.; Agarwal, S.; Sastry, G.; Askell, A.; Mishkin, P.; Clark, J.; et al. Learning transferable visual models from natural language supervision. In Proceedings of the International Conference on Machine Learning, Virtual, 18–24 July 2021; pp. 8748–8763.
- Caron, M.; Touvron, H.; Misra, I.; Jégou, H.; Mairal, J.; Bojanowski, P.; Joulin, A. Emerging properties in self-supervised vision transformers. In Proceedings of the IEEE/CVF International Conference on Computer Vision, Montreal, QC, Canada, 10–17 October 2021; pp. 9650–9660.
- Zhuang, F.; Qi, Z.; Duan, K.; Xi, D.; Zhu, Y.; Zhu, H.; Xiong, H.; He, Q. A comprehensive survey on transfer learning. Proc. IEEE 2020, 109, 43–76.
- Tian, Y.; Yang, G.; Wang, Z.; Wang, H.; Li, E.; Liang, Z. Apple detection during different growth stages in orchards using the improved YOLO-V3 model. Comput. Electron. Agric. 2019, 157, 417–426.
- Li, W.; Zhu, L.; Liu, J. PL-DINO: An Improved Transformer-Based Method for Plant Leaf Disease Detection. Agriculture 2024, 14, 691.
- Balasundaram, A.; Sharma, A.; Swaathy, K.; Shaik, A.; Kavitha, M.S. An Improved Normalized Difference Vegetation Index (NDVI) Estimation using Grounded Dino and Segment Anything Model for Plant Health Classification. IEEE Access 2024, 12, 75907–75919.
- Feuer, B.; Joshi, A.; Cho, M.; Chiranjeevi, S.; Deng, Z.K.; Balu, A.; Singh, A.K.; Sarkar, S.; Merchant, N.; Singh, A.; et al. Zero-shot insect detection via weak language supervision. Plant Phenome J. 2024, 7, e20107.
- Zhou, X.; Girdhar, R.; Joulin, A.; Krähenbühl, P.; Misra, I. Detecting twenty-thousand classes using image-level supervision. In European Conference on Computer Vision; Springer: Berlin/Heidelberg, Germany, 2022; pp. 350–368.
- Junos, M.H.; Mohd Khairuddin, A.S.; Thannirmalai, S.; Dahari, M. An optimized YOLO-based object detection model for crop harvesting system. IET Image Process. 2021, 15, 2112–2125.
- Zhu, H.; Qin, S.; Su, M.; Lin, C.; Li, A.; Gao, J. Harnessing Large Vision and Language Models in Agriculture: A Review. arXiv 2024, arXiv:2407.19679.
- Quigley, M.; Conley, K.; Gerkey, B.; Faust, J.; Foote, T.; Leibs, J.; Wheeler, R.; Ng, A.Y. ROS: An open-source Robot Operating System. In Proceedings of the ICRA Workshop on Open Source Software, Kobe, Japan, 12–17 May 2009; Volume 3, p. 5.
- Muhojoki, J.; Tavi, D.; Hyyppä, E.; Lehtomäki, M.; Faitli, T.; Kaartinen, H.; Kukko, A.; Hakala, T.; Hyyppä, J. Benchmarking Under- and Above-Canopy Laser Scanning Solutions for Deriving Stem Curve and Volume in Easy and Difficult Boreal Forest Conditions. Remote Sens. 2024, 16, 1721.
- Muhojoki, J.; Hakala, T.; Kukko, A.; Kaartinen, H.; Hyyppä, J. Comparing positioning accuracy of mobile laser scanning systems under a forest canopy. Sci. Remote Sens. 2024, 9, 100121.
- Kilpeläinen, H.; Miina, J.; Store, R.; Salo, K.; Kurttila, M. Evaluation of bilberry and cowberry yield models by comparing model predictions with field measurements from North Karelia, Finland. For. Ecol. Manag. 2016, 363, 120–129.
- Rinne, J.; Laurila, T.; Hypén, H.; Kellomäki, S.; Rouvinen, I. General Description of the Climate and Vegetation at the BIPHOREP Measurement Sites; European Commission: Luxembourg, 1999.
- Labbé, M.; Michaud, F. RTAB-Map as an Open-Source LiDAR and Visual Simultaneous Localization and Mapping Library for Large-Scale and Long-Term Online Operation. J. Field Robot. 2019, 36, 416–446.
- Campos, C.; Elvira, R.; Rodríguez, J.J.G.; Montiel, J.M.; Tardós, J.D. ORB-SLAM3: An Accurate Open-Source Library for Visual, Visual–Inertial, and Multimap SLAM. IEEE Trans. Robot. 2021, 37, 1874–1890.
- Morelli, L.; Ioli, F.; Maiwald, F.; Mazzacca, G.; Menna, F.; Remondino, F. Deep-Image-Matching: A Toolbox for Multiview Image Matching of Complex Scenarios. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2024, 48, 309–316.
- Lowe, D.G. Distinctive image features from scale-invariant keypoints. Int. J. Comput. Vis. 2004, 60, 91–110.
- DeTone, D.; Malisiewicz, T.; Rabinovich, A. SuperPoint: Self-supervised interest point detection and description. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops, Salt Lake City, UT, USA, 18–22 June 2018; pp. 224–236.
- Tyszkiewicz, M.; Fua, P.; Trulls, E. DISK: Learning local features with policy gradient. Adv. Neural Inf. Process. Syst. 2020, 33, 14254–14265.
- Ronneberger, O.; Fischer, P.; Brox, T. U-Net: Convolutional networks for biomedical image segmentation. In Proceedings of the Medical Image Computing and Computer-Assisted Intervention—MICCAI 2015: 18th International Conference, Munich, Germany, 5–9 October 2015; Proceedings, Part III 18; Springer: Berlin/Heidelberg, Germany, 2015; pp. 234–241.
- Lindenberger, P.; Sarlin, P.E.; Pollefeys, M. LightGlue: Local feature matching at light speed. In Proceedings of the IEEE/CVF International Conference on Computer Vision, Paris, France, 1–6 October 2023; pp. 17627–17638.
- Riba, E.; Mishkin, D.; Ponsa, D.; Rublee, E.; Bradski, G. Kornia: An open source differentiable computer vision library for PyTorch. In Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision, Snowmass, CO, USA, 1–5 March 2020; pp. 3674–3683.
- Bellavia, F.; Morelli, L.; Menna, F.; Remondino, F. Image Orientation with a Hybrid Pipeline Robust to Rotations and Wide-Baselines. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2022, 46, 73–80.
- Grupp, M. evo: Python Package for the Evaluation of Odometry and SLAM. 2017. Available online: https://github.com/MichaelGrupp/evo (accessed on 8 October 2024).
- Han, X.F.; Jin, J.S.; Wang, M.J.; Jiang, W.; Gao, L.; Xiao, L. A review of algorithms for filtering the 3D point cloud. Signal Process. Image Commun. 2017, 57, 103–112.
- Zhang, W.; Qi, J.; Wan, P.; Wang, H.; Xie, D.; Wang, X.; Yan, G. An easy-to-use airborne LiDAR data filtering method based on cloth simulation. Remote Sens. 2016, 8, 501.
- Chew, L.P. Constrained Delaunay triangulations. In Proceedings of the Third Annual Symposium on Computational Geometry, Waterloo, ON, Canada, 8–10 June 1987; pp. 215–222.
- Pfeifer, N.; Mandlburger, G. LiDAR data filtering and DTM generation. In Topographic Laser Ranging and Scanning; CRC Press: Boca Raton, FL, USA, 2017; pp. 307–334.
- Xi, Z.; Hopkinson, C. 3D graph-based individual-tree isolation (Treeiso) from terrestrial laser scanning point clouds. Remote Sens. 2022, 14, 6116.
- Silverman, B.W. Density Estimation for Statistics and Data Analysis; Routledge: London, UK, 2018.
- Umeyama, S. Least-squares estimation of transformation parameters between two point patterns. IEEE Trans. Pattern Anal. Mach. Intell. 1991, 13, 376–380.
| UAV System | Data Type | Size | Sensor | Format | System Purpose |
|---|---|---|---|---|---|
| Hexarotor | Front-facing stereo images | 5644 images | Zed X stereo camera | Rosbag | Wild berry detection in a global reference system |
| Hexarotor | Nadir images | 632 images | Zed X stereo camera | Rosbag | Wild berry detection in a global reference system |
| Hexarotor | GNSS positions | 815 positions | Emlid Reach | Rosbag | Wild berry detection in a global reference system |
| Quadrotor | 3D point cloud | 11,443,116 points | Ouster OS0-32 | .pcd | Forest modeling |
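The hexarotor streams are stored as ROS bags and the quadrotor LiDAR survey as a single .pcd cloud, so the forest-modeling data can be inspected directly with standard point cloud tooling. Below is a minimal sketch, assuming a hypothetical file name `forest_scan.pcd` and illustrative filter parameters, of the kind of outlier removal and downsampling commonly applied before the ground filtering and tree isolation steps cited above (e.g., cloth simulation filtering and Treeiso); it is not the paper's exact pipeline.

```python
import open3d as o3d

# Load the quadrotor LiDAR survey (file name is a placeholder; the table
# above lists the cloud as ~11.4M points from an Ouster OS0-32, in .pcd).
pcd = o3d.io.read_point_cloud("forest_scan.pcd")
print(pcd)

# Statistical outlier removal, a common first step before ground/vegetation
# segmentation; nb_neighbors and std_ratio are illustrative defaults.
pcd_clean, kept_idx = pcd.remove_statistical_outlier(nb_neighbors=20,
                                                     std_ratio=2.0)

# Voxel downsampling (5 cm grid) to keep segmentation experiments tractable.
pcd_down = pcd_clean.voxel_down_sample(voxel_size=0.05)
o3d.io.write_point_cloud("forest_scan_clean.pcd", pcd_down)
```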
| Reference GNSS Positions | Feature Extractor | Mean (m) | Median (m) |
|---|---|---|---|
| All available | SIFT | 3.81 | 3.47 |
| All available | DISK | 16.10 | 13.44 |
| All available | SuperPoint | 3.09 | 2.69 |
| Accuracy < 10 m | SIFT | 3.29 | 2.72 |
| Accuracy < 10 m | DISK | 14.06 | 10.29 |
| Accuracy < 10 m | SuperPoint | 2.57 | 2.24 |
| Accuracy < 1 m | SIFT | 1.36 | 1.09 |
| Accuracy < 1 m | DISK | 1.18 | 0.89 |
| Accuracy < 1 m | SuperPoint | 1.25 | 1.01 |
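The mean and median values above summarize 3D distances between estimated camera positions and reference GNSS positions; evaluations of this kind (e.g., with the cited evo toolbox) typically first align the two trajectories with Umeyama's least-squares similarity estimation. A minimal NumPy sketch under that assumption, with hypothetical file names for time-synchronized (N, 3) position arrays:

```python
import numpy as np

def umeyama_align(src: np.ndarray, dst: np.ndarray):
    """Least-squares similarity transform (scale s, rotation R, translation t)
    mapping src onto dst (Umeyama, 1991). Inputs are (N, 3) point arrays."""
    mu_src, mu_dst = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_src, dst - mu_dst
    cov = dst_c.T @ src_c / len(src)              # 3x3 cross-covariance
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:  # guard against reflections
        S[2, 2] = -1.0
    R = U @ S @ Vt
    var_src = (src_c ** 2).sum() / len(src)       # mean squared deviation
    s = (D * np.diag(S)).sum() / var_src
    t = mu_dst - s * R @ mu_src
    return s, R, t

# Hypothetical inputs: estimated camera positions from the SfM/SLAM
# trajectory and the matching GNSS reference positions, one row each.
est = np.loadtxt("trajectory_xyz.txt")
ref = np.loadtxt("gnss_xyz.txt")

s, R, t = umeyama_align(est, ref)
aligned = est @ (s * R).T + t                     # apply p' = s*R*p + t row-wise
err = np.linalg.norm(aligned - ref, axis=1)
print(f"mean {err.mean():.2f} m, median {np.median(err):.2f} m")
```

The evo command-line tool (`evo_ape` with `--align` and, optionally, `--correct_scale`) wraps the same alignment and reports these statistics directly.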
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Trybała, P.; Morelli, L.; Remondino, F.; Farrand, L.; Couceiro, M.S. Under-Canopy Drone 3D Surveys for Wild Fruit Hotspot Mapping. Drones 2024, 8, 577. https://doi.org/10.3390/drones8100577