Camera–LiDAR Calibration Using Iterative Random Sampling and Intersection Line-Based Quality Evaluation
Abstract
1. Introduction
2. Related Works
2.1. Two-Dimensional Object-Based Approach
2.2. Three-Dimensional Object-Based Approach
3. Proposed Method
3.1. Proposed Calibration Object
3.2. Overview of the Proposed Method
3.3. Three-Dimensional Plane Estimation
3.3.1. Camera-Based 3D Plane Estimation
3.3.2. LiDAR-Based 3D Plane Estimation
3.4. Extrinsic Parameter Estimation
3.4.1. Initialization and Optimization
3.4.2. Quality Evaluation for Extrinsic Parameters
4. Experiment Results and Discussion
4.1. Simulation-Based Quantitative Evaluation
4.2. Real-Data-Based Qualitative Evaluation
4.3. LiDAR–LiDAR Calibration
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Pang, S.; Morris, D.; Radha, H. CLOCs: Camera-LiDAR object candidates fusion for 3D object detection. In Proceedings of the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, 24 October 2020–24 January 2021; pp. 10386–10393.
- Xiao, L.; Wang, R.; Dai, B.; Fang, Y.; Liu, D.; Wu, T. Hybrid conditional random field based camera-LIDAR fusion for road detection. Inf. Sci. 2018, 432, 543–558.
- Huang, K.; Hao, Q. Joint multi-object detection and tracking with camera-LiDAR fusion for autonomous driving. In Proceedings of the 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Prague, Czech Republic, 27 September–1 October 2021; pp. 6983–6989.
- Kocić, J.; Jovičić, N.; Drndarević, V. Sensors and sensor fusion in autonomous vehicles. In Proceedings of the 2018 26th Telecommunications Forum (TELFOR), Belgrade, Serbia, 20–21 November 2018; pp. 420–425.
- Fayyad, J.; Jaradat, M.A.; Gruyer, D.; Najjaran, H. Deep learning sensor fusion for autonomous vehicle perception and localization: A review. Sensors 2020, 20, 4220.
- Qi, C.R.; Liu, W.; Wu, C.; Su, H.; Guibas, L.J. Frustum PointNets for 3D object detection from RGB-D data. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–23 June 2018; pp. 918–927.
- Wang, Z.; Jia, K. Frustum ConvNet: Sliding frustums to aggregate local point-wise features for amodal 3D object detection. In Proceedings of the 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, 3–8 November 2019; pp. 1742–1749.
- Zhang, Q.; Pless, R. Extrinsic calibration of a camera and laser range finder (improves camera calibration). In Proceedings of the 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (IEEE Cat. No. 04CH37566), Sendai, Japan, 28 September–2 October 2004; pp. 2301–2306.
- Unnikrishnan, R.; Hebert, M. Fast Extrinsic Calibration of a Laser Rangefinder to a Camera; Robotics Institute: Pittsburgh, PA, USA, 2005.
- Fremont, V.; Bonnifait, P. Extrinsic calibration between a multi-layer lidar and a camera. In Proceedings of the 2008 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems, Seoul, Republic of Korea, 20–22 August 2008; pp. 214–219.
- Geiger, A.; Moosmann, F.; Car, Ö.; Schuster, B. Automatic camera and range sensor calibration using a single shot. In Proceedings of the 2012 IEEE International Conference on Robotics and Automation, Saint Paul, MN, USA, 14–18 May 2012; pp. 3936–3943.
- Kim, E.S.; Park, S.Y. Extrinsic calibration between camera and LiDAR sensors by matching multiple 3D planes. Sensors 2019, 20, 52.
- Sui, J.; Wang, S. Extrinsic calibration of camera and 3D laser sensor system. In Proceedings of the 2017 36th Chinese Control Conference (CCC), Dalian, China, 26–28 July 2017; pp. 6881–6886.
- Zhou, L.; Li, Z.; Kaess, M. Automatic extrinsic calibration of a camera and a 3D LiDAR using line and plane correspondences. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 5562–5569.
- Tu, S.; Wang, Y.; Hu, C.; Xu, Y. Extrinsic Parameter Co-calibration of a Monocular Camera and a LiDAR Using Only a Chessboard. In Proceedings of the 15th Chinese Intelligent Systems Conference (2019), Volume II; Springer: Singapore, 2020; pp. 446–455.
- Xie, S.; Yang, D.; Jiang, K.; Zhong, Y. Pixels and 3-D points alignment method for the fusion of camera and LiDAR data. IEEE Trans. Instrum. Meas. 2018, 68, 3661–3676.
- Beltrán, J.; Guindel, C.; de la Escalera, A.; García, F. Automatic extrinsic calibration method for LiDAR and camera sensor setups. IEEE Trans. Intell. Transp. Syst. 2022, 23, 17677–17689.
- Li, X.; He, F.; Li, S.; Zhou, Y.; Xia, C.; Wang, X. Accurate and automatic extrinsic calibration for a monocular camera and heterogenous 3D LiDARs. IEEE Sens. J. 2022, 22, 16472–16480.
- Liu, H.; Xu, Q.; Huang, Y.; Ding, Y.; Xiao, J. A Method for Synchronous Automated Extrinsic Calibration of LiDAR and Cameras Based on a Circular Calibration Board. IEEE Sens. J. 2023, 23, 25026–25035.
- Yan, G.; He, F.; Shi, C.; Wei, P.; Cai, X.; Li, Y. Joint camera intrinsic and LiDAR-camera extrinsic calibration. In Proceedings of the 2023 IEEE International Conference on Robotics and Automation (ICRA), London, UK, 29 May–2 June 2023; pp. 11446–11452.
- Cai, H.; Pang, W.; Chen, X.; Wang, Y.; Liang, H. A novel calibration board and experiments for 3D LiDAR and camera calibration. Sensors 2020, 20, 1130.
- Bu, Z.; Sun, C.; Wang, P.; Dong, H. Calibration of camera and flash LiDAR system with a triangular pyramid target. Appl. Sci. 2021, 11, 582.
- Fang, C.; Ding, S.; Dong, Z.; Li, H.; Zhu, S.; Tan, P. Single-shot is enough: Panoramic infrastructure based calibration of multiple cameras and 3D LiDARs. In Proceedings of the 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Prague, Czech Republic, 27 September–1 October 2021; pp. 8890–8897.
- Huang, J.-K.; Grizzle, J.W. Improvements to target-based 3D LiDAR to camera calibration. IEEE Access 2020, 8, 134101–134110.
- Mishra, S.; Pandey, G.; Saripalli, S. Extrinsic calibration of a 3D-LIDAR and a camera. In Proceedings of the 2020 IEEE Intelligent Vehicles Symposium (IV), Las Vegas, NV, USA, 19 October–13 November 2020; pp. 1765–1770.
- Xu, X.; Zhang, L.; Yang, J.; Liu, C.; Xiong, Y.; Luo, M.; Liu, B. LiDAR–camera calibration method based on ranging statistical characteristics and improved RANSAC algorithm. Robot. Auton. Syst. 2021, 141, 103776.
- Tóth, T.; Pusztai, Z.; Hajder, L. Automatic LiDAR-camera calibration of extrinsic parameters using a spherical target. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 31 May–31 August 2020; pp. 8580–8586.
- Gong, X.; Lin, Y.; Liu, J. 3D LIDAR-camera extrinsic calibration using an arbitrary trihedron. Sensors 2013, 13, 1902–1918.
- Pusztai, Z.; Hajder, L. Accurate calibration of LiDAR-camera systems using ordinary boxes. In Proceedings of the IEEE International Conference on Computer Vision Workshops, Venice, Italy, 22–29 October 2017; pp. 394–402.
- Kümmerle, J.; Kühner, T.; Lauer, M. Automatic calibration of multiple cameras and depth sensors with a spherical target. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 1–8.
- Erke, S.; Bin, D.; Yiming, N.; Liang, X.; Qi, Z. A fast calibration approach for onboard LiDAR-camera systems. Int. J. Adv. Robot. Syst. 2020, 17, 1729881420909606.
- Agrawal, S.; Bhanderi, S.; Doycheva, K.; Elger, G. Static multi-target-based auto-calibration of RGB cameras, 3D Radar, and 3D Lidar sensors. IEEE Sens. J. 2023, 23, 21493–21505.
- Garrido-Jurado, S.; Muñoz-Salinas, R.; Madrid-Cuevas, F.J.; Marín-Jiménez, M.J. Automatic generation and detection of highly reliable fiducial markers under occlusion. Pattern Recognit. 2014, 47, 2280–2292.
- OpenCV’s ChArUco Board Example. Available online: https://docs.opencv.org/3.4/df/d4a/tutorial_charuco_detection.html (accessed on 8 February 2022).
- Zhang, Z. A flexible new technique for camera calibration. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 1330–1334.
- Fischler, M.A.; Bolles, R.C. Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography. Commun. ACM 1981, 24, 381–395.
- Moré, J.J. The Levenberg-Marquardt algorithm: Implementation and theory. In Numerical Analysis: Proceedings of the Biennial Conference, Dundee, UK, 28 June–1 July 1977; Springer: Berlin/Heidelberg, Germany, 2006; pp. 105–116.
- Gazebo. Available online: https://gazebosim.org/ (accessed on 24 March 2022).
- MathWorks’ Lidar and Camera Calibration Example. Available online: https://kr.mathworks.com/help/lidar/ug/lidar-and-camera-calibration.html (accessed on 22 March 2023).
| Calibration Object | Method | Calibration Object Creation Difficulty | Calibration Quality Evaluation Method |
|---|---|---|---|
| 2D: Checkerboard | [8,9,11,12,14,15] | Easy | No |
| 2D: Planar board with holes | [10,13,16,17,18,20] | Medium | No |
| 2D: Planar board in special shapes | [19,24,25,26] | Medium | No |
| 3D: Spherical object | [27,30] | Hard | No |
| 3D: Polyhedron | [21,22,23,28,29,31,32] | Hard | No |
| Proposed method | | Easy | Yes |
| Method | Translation Error Mean (cm) | Translation Error Std (cm) | Rotation Error Mean (deg) | Rotation Error Std (deg) |
|---|---|---|---|---|
| Method I (whole data set) | 0.52 | 0.33 | 1.30 | 1.14 |
| Method II (sub-sampling and point-plane distance) | 0.37 | 0.17 | 0.26 | 0.10 |
| Proposed method (sub-sampling and mILD) | 0.37 | 0.14 | 0.14 | 0.07 |
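Method II ranks randomly sampled extrinsic candidates by a point-plane distance, whereas the proposed method uses the mean intersection-line distance (mILD). As a concrete reference for the point-plane criterion only, the following is a minimal sketch (not the authors' implementation; the function and variable names are illustrative assumptions) of scoring LiDAR points on the calibration plane against the camera-estimated plane under a candidate rotation R and translation t.

```python
import numpy as np

def point_to_plane_rmse(lidar_points, plane, R, t):
    """Root-mean-square point-to-plane distance under a candidate
    LiDAR-to-camera extrinsic (R, t).

    lidar_points : (N, 3) LiDAR points lying on the calibration plane
    plane        : (4,) camera-estimated plane [a, b, c, d] with
                   a*x + b*y + c*z + d = 0 and (a, b, c) unit length
    R, t         : candidate rotation (3, 3) and translation (3,)
    """
    # Transform LiDAR points into the camera frame with the candidate extrinsics.
    pts_cam = lidar_points @ R.T + t
    normal, d = plane[:3], plane[3]
    # Signed distance of each transformed point to the camera plane.
    dist = pts_cam @ normal + d
    return float(np.sqrt(np.mean(dist ** 2)))
```

Candidates with the smallest residual would be kept under such a criterion; the results above suggest that the intersection-line-based evaluation selects parameters with noticeably lower rotation error than this point-plane score.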
| Method | Translation Error Mean (cm) | Translation Error Std (cm) | Rotation Error Mean (deg) | Rotation Error Std (deg) |
|---|---|---|---|---|
| Previous method in [14] | 1.90 | 0.75 | 0.31 | 0.15 |
| Proposed method (sub-sampling and mILD) | 0.37 | 0.14 | 0.14 | 0.07 |
| Method | Translation Error Mean (cm) | Translation Error Std (cm) | Rotation Error Mean (deg) | Rotation Error Std (deg) |
|---|---|---|---|---|
| Method I (whole data set) | 1.35 | 0.91 | 1.70 | 0.98 |
| Method II (sub-sampling and point-plane distance) | 0.59 | 0.35 | 0.66 | 0.36 |
| Proposed method (sub-sampling and mILD) | 0.52 | 0.25 | 0.48 | 0.25 |
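The translation and rotation errors in these simulation tables compare estimated extrinsic parameters against the known ground truth. A common way to compute such errors, sketched below as an assumption (the paper's exact definitions may differ), is the Euclidean distance between the translation vectors and the angle of the relative rotation between the two rotation matrices.

```python
import numpy as np

def extrinsic_errors(R_gt, t_gt, R_est, t_est):
    """Translation error (same units as t) and rotation error (degrees)
    between ground-truth and estimated extrinsic parameters."""
    # Translation error: Euclidean distance between the two translation vectors.
    t_err = float(np.linalg.norm(np.asarray(t_gt) - np.asarray(t_est)))
    # Rotation error: angle of the relative rotation R_gt^T @ R_est,
    # recovered from its trace and clipped for numerical robustness.
    R_rel = np.asarray(R_gt).T @ np.asarray(R_est)
    cos_theta = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    r_err_deg = float(np.degrees(np.arccos(cos_theta)))
    return t_err, r_err_deg
```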