Multi-Lidar System Localization and Mapping with Online Calibration
Abstract
1. Introduction
- (1) A registration method based on line and surface features is used for point cloud matching, and the Levenberg–Marquardt (LM) algorithm is used for the nonlinear optimization. A refined root mean square error (RMSE) metric is introduced to evaluate point cloud registration accuracy (a hedged registration sketch follows this list).
- (2) A feature fusion strategy based on feature point cloud smoothness and distribution enables frame-to-frame matching with the Ceres library and the LM nonlinear optimization method, preventing an excessive number of multi-lidar feature points from degrading the real-time performance of the system.
- (3) The GTSAM library is used to build a graph optimization model, supplemented by a loop closure detection constraint factor based on the Scan Context descriptor and a ground constraint factor based on plane feature matching. Together, these achieve global trajectory optimization and limit the z-axis error (see the pose-graph sketch after this list).
- (4) A real-vehicle experimental platform is established to validate the robustness, real-time performance, and accuracy of the online calibration algorithm presented in this paper, and to verify that the multi-lidar SLAM system achieves a more pronounced accuracy improvement than a single-lidar system.
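As a rough illustration of contributions (1) and (2), the sketch below (assuming the Ceres and Eigen libraries) sets up point-to-plane residuals for surface features, solves them with Levenberg–Marquardt, and scores the result with an RMSE over the final residuals. The names `PlaneResidual`, `EvaluateRmse`, and `RegisterFrame` are hypothetical; this is not the authors' implementation, and the paper's refined RMSE may be computed differently.

```cpp
// Hedged sketch (not the authors' code): point-to-plane registration of
// surface features with Ceres + Levenberg-Marquardt, plus an RMSE score
// over the final residuals. PlaneResidual / EvaluateRmse / RegisterFrame
// are hypothetical names introduced for illustration only.
#include <ceres/ceres.h>
#include <ceres/rotation.h>
#include <Eigen/Core>
#include <cmath>
#include <vector>

// Signed distance of a transformed source point to a target plane n^T p + d = 0.
struct PlaneResidual {
  PlaneResidual(const Eigen::Vector3d& p, const Eigen::Vector3d& n, double d)
      : p_(p), n_(n), d_(d) {}

  template <typename T>
  bool operator()(const T* const pose, T* residual) const {
    // pose[0..2]: angle-axis rotation, pose[3..5]: translation.
    const T src[3] = {T(p_.x()), T(p_.y()), T(p_.z())};
    T rot[3];
    ceres::AngleAxisRotatePoint(pose, src, rot);
    residual[0] = T(n_.x()) * (rot[0] + pose[3]) +
                  T(n_.y()) * (rot[1] + pose[4]) +
                  T(n_.z()) * (rot[2] + pose[5]) + T(d_);
    return true;
  }

  Eigen::Vector3d p_, n_;
  double d_;
};

// Root mean square of the final point-to-plane distances; one plausible way
// to score registration quality (the paper's refined RMSE may differ).
double EvaluateRmse(const std::vector<double>& residuals) {
  if (residuals.empty()) return 0.0;
  double sum = 0.0;
  for (double r : residuals) sum += r * r;
  return std::sqrt(sum / residuals.size());
}

// pose: 6-DoF frame-to-frame transform, refined in place.
double RegisterFrame(const std::vector<Eigen::Vector3d>& points,
                     const std::vector<Eigen::Vector3d>& normals,
                     const std::vector<double>& offsets, double pose[6]) {
  ceres::Problem problem;
  for (size_t i = 0; i < points.size(); ++i) {
    problem.AddResidualBlock(
        new ceres::AutoDiffCostFunction<PlaneResidual, 1, 6>(
            new PlaneResidual(points[i], normals[i], offsets[i])),
        new ceres::HuberLoss(0.1),  // robust kernel against bad matches
        pose);
  }

  ceres::Solver::Options options;
  options.trust_region_strategy_type = ceres::LEVENBERG_MARQUARDT;
  options.max_num_iterations = 20;
  ceres::Solver::Summary summary;
  ceres::Solve(options, &problem, &summary);

  // Collect the raw (un-robustified) residuals and report their RMSE.
  std::vector<double> residuals;
  ceres::Problem::EvaluateOptions eval_opts;
  eval_opts.apply_loss_function = false;
  problem.Evaluate(eval_opts, nullptr, &residuals, nullptr, nullptr);
  return EvaluateRmse(residuals);
}
```

In practice the plane normals and offsets would come from the line/surface feature association step of the odometry, and a point-to-line residual would be added in the same way for edge features.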
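For contribution (3), the following minimal GTSAM sketch shows the general shape of such a pose graph: odometry factors between consecutive keyframes, a loop-closure factor of the kind a Scan Context match followed by registration would provide, and a ground constraint modeled here as a soft prior that is tight only on roll, pitch, and z. The factor choices and noise values are illustrative assumptions, not the authors' exact formulation of the plane-feature-based ground constraint.

```cpp
// Hedged sketch: GTSAM pose graph with odometry, loop-closure, and a soft
// "ground" prior that mainly limits z-axis drift. All values are toy data.
#include <gtsam/geometry/Pose3.h>
#include <gtsam/inference/Symbol.h>
#include <gtsam/nonlinear/LevenbergMarquardtOptimizer.h>
#include <gtsam/nonlinear/NonlinearFactorGraph.h>
#include <gtsam/nonlinear/Values.h>
#include <gtsam/slam/BetweenFactor.h>
#include <gtsam/slam/PriorFactor.h>

using gtsam::Pose3;
using gtsam::Point3;
using gtsam::Rot3;
using gtsam::symbol_shorthand::X;  // X(i) keys the i-th keyframe pose

int main() {
  gtsam::NonlinearFactorGraph graph;
  gtsam::Values initial;

  // Anchor the first keyframe at the origin.
  auto prior_noise = gtsam::noiseModel::Diagonal::Sigmas(
      (gtsam::Vector(6) << 1e-3, 1e-3, 1e-3, 1e-3, 1e-3, 1e-3).finished());
  graph.add(gtsam::PriorFactor<Pose3>(X(0), Pose3(), prior_noise));
  initial.insert(X(0), Pose3());

  // Laser-odometry increments between consecutive keyframes (with z drift).
  auto odom_noise = gtsam::noiseModel::Diagonal::Sigmas(
      (gtsam::Vector(6) << 0.01, 0.01, 0.01, 0.05, 0.05, 0.05).finished());
  const Pose3 step(Rot3(), Point3(1.0, 0.0, 0.02));
  for (int i = 1; i <= 3; ++i) {
    graph.add(gtsam::BetweenFactor<Pose3>(X(i - 1), X(i), step, odom_noise));
    initial.insert(X(i), initial.at<Pose3>(X(i - 1)) * step);
  }

  // Loop closure: relative pose from a place match (e.g. Scan Context + ICP).
  auto loop_noise = gtsam::noiseModel::Diagonal::Sigmas(
      (gtsam::Vector(6) << 0.02, 0.02, 0.02, 0.1, 0.1, 0.1).finished());
  graph.add(gtsam::BetweenFactor<Pose3>(
      X(3), X(0), Pose3(Rot3(), Point3(-3.0, 0.0, 0.0)), loop_noise));

  // Ground constraint as a soft prior: sigma ordering is (roll, pitch, yaw,
  // x, y, z), so it is tight on roll, pitch, z and loose everywhere else.
  auto ground_noise = gtsam::noiseModel::Diagonal::Sigmas(
      (gtsam::Vector(6) << 0.05, 0.05, 1e3, 1e3, 1e3, 0.05).finished());
  for (int i = 1; i <= 3; ++i) {
    graph.add(gtsam::PriorFactor<Pose3>(X(i), Pose3(), ground_noise));
  }

  gtsam::Values result =
      gtsam::LevenbergMarquardtOptimizer(graph, initial).optimize();
  result.at<Pose3>(X(3)).print("optimized last pose:\n");
  return 0;
}
```

The soft prior leaves x, y, and yaw essentially unconstrained, so it only pulls the accumulated z drift (and roll/pitch) back toward the ground plane; a dedicated factor built from the detected ground plane would be the tighter-fitting alternative.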
2. Front-End Laser Odometry and Online Calibration
2.1. Point Cloud Preprocessing
2.2. Realization of Multi-Lidar Online Calibration
2.2.1. Online External Parameter Calibration
2.2.2. Convergence Assessment of External Parameters
2.3. Inter-Frame Registration
3. Back-End Optimization and Map Construction
3.1. Loop Detection Based on Scan Context
3.2. Z-Axis Error Limitation Based on Ground Constraints
3.3. Map Construction and Update
4. Real-World Experiment Validation and Result Analysis
4.1. Online Calibration Experiment Results and Analysis
4.1.1. Comparison and Analysis of Online Calibration Results for Different Lidar Layouts
4.1.2. Comparison and Analysis of Online Calibration Algorithm Results for Multiple Lidars
4.2. Experimental Results and Analysis of LiDAR SLAM on a Real Vehicle
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Peng, Y.H.; Jiang, M.; Ma, Z.Y.; Zhong, C. Research progress in key technologies of automobile autonomous vehicle driving. J. Fuzhou Univ. 2021, 49, 691–703.
- Guo, W.H. Intelligent and connected vehicle technology Roadmap 2.0 generation. Intell. Connect. Car 2020, 6, 10–13.
- Yu, Z.L. Research on Intelligent Vehicle Positioning Technology Based on Visual Inertial Navigation Fusion. Master’s Thesis, University of Electronic Science and Technology of China, Chengdu, China, 2022.
- Pi, J.H. Visual SLAM Algorithm for a Complex and Dynamic Traffic Environment. Master’s Thesis, Dalian University of Technology, Dalian, China, 2022.
- Cai, Y.F.; Lu, Z.H.; Li, Y.W.; Chen, L.; Wang, H. A coupled SLAM system based on multi-sensor fusion. Automob. Eng. 2022, 44, 350–361.
- Lezki, H.; Yetik, I.S. Localization Using Single Camera and Lidar in GPS-Denied Environments. In Proceedings of the 28th Signal Processing and Communications Applications Conference (SIU), Gaziantep, Turkey, 5–7 October 2020.
- Wang, H.; Xu, Y.S.; Cai, Y.F.; Chen, L. Summary of multi-target detection technology for intelligent vehicles based on multi-sensor fusion. J. Automot. Saf. Energy Conserv. 2021, 12, 440–455.
- Feng, M.C.; Gao, X.Q.; Wang, J.S.; Feng, H.Z. Research on the vehicle target shape position fusion algorithm based on stereo vision and lidar. Chin. J. Sci. Instrum. 2021, 42, 210–220.
- Zhao, Y.C.; Wei, T.X.; Tong, D.; Zhao, J.B. Visual SLAM algorithm for fused YOLOv5s in dynamic scenes. Radio Eng. 2023, 1–10. Available online: http://kns.cnki.net/kcms/detail/13.1097.TN.20230629.1308.002.html (accessed on 6 September 2023).
- Guo, P.; Wang, Q.; Cao, Z.; Xia, H. High Precision Odometer and Mapping Algorithm Based on Multi Lidar Fusion. In Proceedings of the International Conference on Image, Vision and Intelligent Systems, Jinan, China, 15–17 August 2022.
- Cao, M.C.; Wang, J.M. Obstacle Detection for Autonomous Driving Vehicles With Multi-LiDAR Sensor Fusion. J. Dyn. Syst. Meas. Control 2020, 142, 021007.
- Hu, Z.Y.; Dong, T.H.; Luo, Y. A vehicle front collision warning system with sensor fusion. J. Chongqing Univ. Technol. 2022, 36, 46–53.
- Liu, J.Y.; Tang, Z.M.; Wang, A.D.; Shi, Z.X. Detection of negative obstacles in an unstructured environment based on multi-lidar and combined features. Robot 2017, 39, 638–651.
- Guan, L.M.; Zhang, Q.; Chu, Q.L.; Zhu, J.Y. Algorithm for roadside dual lidar data fusion. Laser J. 2021, 42, 38–44.
- Wang, H.; Luo, T.; Lu, P.Y. Application of lidar in unmanned vehicles and its key technology analysis. Laser Infrared 2018, 48, 1458–1467.
- Jiao, J.H.; Yu, Y.; Liao, Q.H.; Ye, H.; Liu, M. Automatic Calibration of Multiple 3D LiDARs in Urban Environments. arXiv 2019, arXiv:1905.04912.
- Shi, B.; Yu, P.D.; Yang, M.; Wang, C.; Bai, Y.; Yang, F. Extrinsic Calibration of Dual LiDARs Based on Plane Features and Uncertainty Analysis. IEEE Sens. J. 2021, 21, 11117–11130.
- He, Y.B.; Ye, B.; Li, H.J.; Zhou, X.Y. Multi-solid-state lidar calibration algorithm based on point cloud registration. Semicond. Photoelectr. 2022, 43, 195–200.
- Jiao, J.H.; Liao, Q.H.; Zhu, Y.L.; Liu, T.; Yu, Y.; Fan, R.; Wang, L.; Liu, M. A Novel Dual-Lidar Calibration Algorithm Using Planar Surfaces; IEEE: Piscataway, NJ, USA, 2019.
- Jiang, F. External parameter calibration method of mobile robot dual lidar system. Inf. Syst. Eng. 2021, 7, 105–107.
- Yeul, J.K.; Eun, J.H. Extrinsic Calibration of a Camera and a 2D LiDAR Using a Dummy Camera with IR Cut Filter Removed. IEEE Access 2020, 8.
- Wang, K.L.; Guo, L.Z.; Zhu, H.; Bao, C.X.; Wang, X.G. Study on the calibration method of intelligent navigation and multiline lidar. J. Test Technol. 2023, 37, 381–385+393.
- Zhang, J.; Chen, C.; Sun, J.W. Roadside multi-lidar cooperative sensing and error self-calibration methods. Inf. Comput. 2023, 35, 88–92+99.
- Liu, X.Y.; Zhang, F. Extrinsic Calibration of Multiple LiDARs of Small FoV in Targetless Environments. IEEE Robot. Autom. Lett. 2021, 6, 2036–2043.
- Jiao, J.H.; Ye, H.Y.; Zhu, Y.L.; Liu, M. Robust Odometry and Mapping for Multi-LiDAR Systems with Online Extrinsic Calibration. IEEE Trans. Robot. 2021, 38, 351–371.
- Yang, X.K. Research on Obstacle Detection Based on Multi-Lidar Fusion. Master’s Thesis, University of Northern Technology, Beijing, China, 2022.
- Zhang, Z.Q. Research on Multi-Sensor Joint Calibration Algorithm for Intelligent Vehicles. Master’s Thesis, University of Chongqing, Chongqing, China, 2022.
- Shan, T.X.; Englot, B. LeGO-LOAM: Lightweight and ground-optimized lidar odometry and mapping on variable terrain. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Madrid, Spain, 1–5 October 2018; pp. 4758–4765.
- Zhang, J.; Singh, S. Visual-lidar odometry and mapping: Low-drift, robust, and fast. In Proceedings of the IEEE International Conference on Robotics and Automation, Seattle, WA, USA, 26–30 May 2015; pp. 2174–2181.
- Wang, H.; Wang, C.; Chen, C.L.; Xie, L. F-LOAM: Fast LiDAR Odometry and Mapping. In Proceedings of the 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Prague, Czech Republic, 27 September–1 October 2021.
- Himmelsbach, M.; Hundelshausen, F.V.; Wuensche, H.J. Fast Segmentation of 3D Point Clouds for Ground Vehicles. In Proceedings of the IEEE Intelligent Vehicles Symposium, La Jolla, CA, USA, 21–24 June 2010; pp. 560–565.
- Bogoslavskyi, I.; Stachniss, C. Fast Range Image-Based Segmentation of Sparse 3D Laser Scans for Online Operation. In Proceedings of the 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems, Daejeon, Republic of Korea, 9–14 October 2016; pp. 163–169.
- Du, P.F. Research on Key Technology of Monocular Vision SLAM Based on Graph Optimization. Master’s Thesis, Zhejiang University, Hangzhou, China, 2018.
- Chai, M.N.; Liu, Y.S.; Ren, L.J. Two-step loop detection based on laser point cloud NDT feature. Laser Infrared 2020, 50, 17–24.
- Adamshan. Automatic Calibration Algorithm and Code Examples. Available online: https://blog.csdn.net/AdamShan/article/details/105930565?spm=1001.2014.3001.5501 (accessed on 5 May 2020).
- Yan, G.H.; Liu, Z.C.; Wang, C.J.; Shi, C.; Wei, P.; Cai, X.; Ma, T.; Liu, Z.; Zhong, Z.; Liu, Y.; et al. OpenCalib: A Multi-sensor Calibration Toolbox for Autonomous Driving. arXiv 2022, arXiv:2205.14087.
| | X (m) | Y (m) | Z (m) | Roll (°) | Pitch (°) | Yaw (°) |
|---|---|---|---|---|---|---|
| The 32-line lidar is placed at the center of the roof, while the 16-line lidar is placed at the center of the front of the vehicle. | | | | | | |
| Actual measurement values | 0.82 | 0 | −0.05 | 0 | 15 | 5 |
| Algorithm calibration values | 0.848785 | −0.0669597 | −0.110832 | −1.01514 | 15.052825 | 4.54818 |
| The 32-line lidar is centered on the roof, and the 16-line lidar is centered on the front of the vehicle. | | | | | | |
| Actual measurement values | 0.82 | 0.45 | −0.05 | −2 | 15 | 5 |
| Algorithm calibration values | 0.788523 | 0.450865 | −0.0857012 | −2.90022 | 15.0670923 | 4.27292 |
| The 32-line lidar is placed at the center of the roof, and the 16-line lidar is placed at the left side of the rear end. | | | | | | |
| Actual measurement values | −0.63 | 0.45 | −0.03 | 1 | −1 | −1 |
| Algorithm calibration values | −0.69039 | 0.440672 | −0.010485 | 0.38109943 | −0.78497562 | −1.66093791 |
| The 32-line lidar is mounted on the center of the roof, and the 16-line lidar is mounted on the front right of the vehicle. | | | | | | |
| Actual measurement values | 0.82 | −0.45 | −0.05 | 8 | 15 | −10 |
| Algorithm calibration values | 0.812278 | −0.428665 | −0.0959522 | 7.30276 | 16.0154646 | −9.31442 |
| The 32-line lidar is placed on the roof in the center, while the 16-line lidar is placed on the rear of the car and to the right. | | | | | | |
| Actual measurement values | −0.63 | −0.45 | −0.03 | 10 | −3 | −10 |
| Algorithm calibration values | −0.644347 | −0.37646 | −0.078509 | 10.6063 | −3.16483 | −11.497818 |
| The 32-line lidar is mounted on the right side of the roof, and the 16-line lidar is mounted on the left side of the roof. | | | | | | |
| Actual measurement values | 0 | 0.67 | −0.02 | 0 | 0 | 0 |
| Algorithm calibration values | 0.0617705 | 0.732039 | −0.0673341 | 0.39993226 | −0.10664561 | 1.47808788 |
| | X (m) | Y (m) | Z (m) | Roll (°) | Pitch (°) | Yaw (°) |
|---|---|---|---|---|---|---|
| Maximum deviation | 0.0617705 | 0.07364 | 0.060832 | 1.01514 | 1.0154646 | 1.497818 |
| Minimum deviation | 0.007722 | 0.000865 | 0.019515 | 0.3993226 | 0.052825 | 0.45182 |
| Mean deviation | 0.0340818 | 0.03901111 | 0.0429739 | 0.7062888 | 0.27031365 | 0.9168873 |
| Algorithm Name | Improved RMSE | Original RMSE | Visual Evaluation | Running Time | Corresponding Point Number |
|---|---|---|---|---|---|
| multi_lidar_calibrator | 2.194913 | 5.33469 | Poor | 14,121 ms | 1559 |
| OpenCalib | 1.985486 | 8.4994 | Good | 5366.5 ms | 2069 |
| my_calibration | 0.914785 | 13.307 | Excellent | 98 ms | 3814 |
| Algorithm | Max (m) | Mean (m) | Median (m) | Min (m) | RMSE (m) | Std (m) |
|---|---|---|---|---|---|---|
| A-LOAM | 44.507192 | 5.494679 | 2.910864 | 0.016384 | 10.944484 | 9.465212 |
| F-LOAM | 6.335769 | 2.797792 | 2.316942 | 0.024018 | 3.206701 | 1.566937 |
| LeGO-LOAM | 5.727396 | 3.054158 | 3.192376 | 0.061186 | 3.252286 | 1.117804 |
| M-LOAM | 7.572633 | 2.675861 | 1.500130 | 0.061675 | 3.459226 | 2.192261 |
| My-SLAM | 3.250923 | 2.196747 | 2.648033 | 0.028497 | 2.433783 | 1.047665 |
| Algorithm | Max (m) | Mean (m) | Median (m) | Min (m) | RMSE (m) | Std (m) |
|---|---|---|---|---|---|---|
| A-LOAM | 2.852986 | 0.488630 | 0.446949 | 0.002057 | 0.610143 | 0.365397 |
| F-LOAM | 2.802553 | 0.455393 | 0.426244 | 0.001329 | 0.602286 | 0.394164 |
| LeGO-LOAM | 2.807766 | 0.508525 | 0.436876 | 0.004183 | 0.625385 | 0.364017 |
| M-LOAM | 2.807151 | 0.536770 | 0.477380 | 0.008337 | 0.648626 | 0.364133 |
| My-SLAM | 2.201508 | 0.829093 | 0.834060 | 0.006764 | 0.951827 | 0.467525 |