Camera-Aided Orientation of Mobile Lidar Point Clouds Acquired from an Uncrewed Water Vehicle
Abstract
1. Introduction
1.1. Uncrewed Water Vehicles as Multisensor Platforms
1.2. Camera-Based Orientation
1.3. Outline and Innovations of This Article
2. Platform Orientation Determination
3. Calibration of Lidar to Camera Orientation
3.1. Geometric Calibration
3.2. Time Synchronization
4. Lidar Point Transformation
5. Experiments
5.1. Reference Point Cloud
5.2. Calibration and Synchronization Results
5.3. Transformation of Mobile Lidar Point Clouds
6. Accuracy Analysis
6.1. Theoretical Accuracy
6.1.1. Platform Orientation Accuracy
6.1.2. Time Synchronization Accuracy
6.1.3. Calibration Accuracy
6.1.4. Lidar 3D Point Accuracy
6.1.5. Propagation of Errors
6.2. Experimental Results
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
[Table: theoretical accuracies of the individual error sources and the resulting lidar point accuracy at object distances of 5 m, 10 m, 25 m and 50 m. The row labels (parameter names) were lost in extraction; the propagated total point accuracy amounts to 39, 56, 126 and 250 mm at 5, 10, 25 and 50 m, respectively.]
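The distance-dependent growth of the total point error in the table above can be illustrated with a generic first-order variance propagation: angular error sources scale linearly with object distance, while range and calibration errors act as roughly constant contributions. The model and the numeric values below are illustrative assumptions, not the authors' exact formulation or parameters.

```python
import math

def propagated_point_error(distance_m, sigma_angle_deg, sigma_range_mm, sigma_cal_mm):
    """Illustrative first-order error propagation for a mobile lidar point.

    Assumes the individual error sources are independent and Gaussian:
    the angular (orientation) error produces a lateral offset growing
    linearly with distance, while range and lidar-to-camera calibration
    errors contribute constant terms. Returns the total 1-sigma point
    error in millimetres.
    """
    # Lateral footprint of the angular error at the given distance (mm).
    lateral_mm = math.radians(sigma_angle_deg) * distance_m * 1000.0
    # Independent contributions add in quadrature.
    return math.sqrt(lateral_mm**2 + sigma_range_mm**2 + sigma_cal_mm**2)

# Error growth with object distance (hypothetical parameter values):
for d in (5, 10, 25, 50):
    print(f"{d:>3} m: {propagated_point_error(d, 0.08, 30.0, 5.0):.1f} mm")
```

This reproduces the qualitative behaviour of the table: at short range the constant (range and calibration) terms dominate, while at larger distances the orientation error becomes the decisive contribution.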
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Sardemann, H.; Blaskow, R.; Maas, H.-G. Camera-Aided Orientation of Mobile Lidar Point Clouds Acquired from an Uncrewed Water Vehicle. Sensors 2023, 23, 6009. https://doi.org/10.3390/s23136009