Extrinsic Camera Calibration with Line-Laser Projection
Abstract
1. Introduction
2. Related Work
3. Algorithm
- Step A:
- In the first step, laser lines are projected onto a plane and image sequences are captured by all cameras (Section 3.1).
- Step B:
- The calibration algorithm needs line correspondences between the different cameras. This can be a practical challenge, as it requires some form of synchronization between the cameras and the line projector. Our approach solves this problem by projecting the lines at different modulation frequencies: each line can then be isolated from the others by its specific frequency (Section 3.2) and detected in the image.
- Step C:
- A first estimate of the plane is made in an optimization with only two degrees of freedom (Section 3.4). This yields initial estimates of both the plane and the camera poses.
- Step D:
- Starting from this estimate, a refinement optimization is done (Section 3.5).
- Step E:
- The poses can only be determined up to a scale factor, so a final scaling step may be needed (Section 3.6). A structural sketch of the full pipeline is given below.
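As a reading aid, the outline below sketches how Steps A to E fit together. It is a structural sketch only: every function is a placeholder named after the subsection it stands in for, returning dummy values, and none of it is the authors' implementation.

```python
import numpy as np

def detect_lines(frames):
    """Step B (Section 3.2): isolate each modulated line by its frequency.
    Placeholder: returns per-line 2D image points; here, empty sets for 4 lines."""
    return {line_id: np.empty((0, 2)) for line_id in range(4)}

def initial_plane_and_poses(lines_per_camera):
    """Step C (Section 3.4): coarse plane estimate with two degrees of freedom,
    giving a first plane and first camera poses. Placeholder values."""
    plane = (np.array([0.0, 0.0, 1.0]), 1.0)        # unit normal n, distance d
    poses = [np.eye(4) for _ in lines_per_camera]   # 4x4 camera poses
    return plane, poses

def refine(plane, poses, lines_per_camera):
    """Step D (Section 3.5): joint refinement of plane and poses (no-op here)."""
    return plane, poses

def apply_scale(poses, scale):
    """Step E (Section 3.6): fix the global scale of all camera translations."""
    scaled = [p.copy() for p in poses]
    for p in scaled:
        p[:3, 3] *= scale
    return scaled

# Step A (Section 3.1): image sequences captured by all cameras (dummy arrays).
sequences = [np.zeros((90, 48, 64)) for _ in range(3)]  # frames x height x width
lines = [detect_lines(seq) for seq in sequences]
plane, poses = initial_plane_and_poses(lines)
plane, poses = refine(plane, poses, lines)
poses = apply_scale(poses, scale=1.0)
```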
3.1. Line Projection
- Manual:
- A simple but tedious approach would be to project and capture one line at a time. This requires a lot of manual intervention and may take quite some time. While this may be fine for a proof of concept, it is not well suited to calibrating many cameras in an active industrial environment where downtime must be kept to a minimum.
- Triggered:
- The cameras could be synchronized electronically with the projector. This means there must be a connection between the projector and the capturing system.
- Modulated:
- By modulating the different lines, the line projector can be a stand-alone device that can be moved around easily to different poses. Low modulation frequencies—in the range 1 to 3 Hz—can be used. This allows for simple projector hardware and works with all common camera frame rates. It works with all cameras—monochrome or RGB—that can record the projected wavelength.
3.2. Line Correspondence by Frequency Separation
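The body of this section is not reproduced here; the following is a minimal sketch of the idea named in the title, assuming square-wave modulation at 1, 2, and 3 Hz and a 30 fps camera. The real pipeline's thresholds, sub-pixel line extraction, and FFT implementation (cf. FFTW3 [17] in the references) are not shown.

```python
import numpy as np

rng = np.random.default_rng(0)
fps, duration = 30.0, 3.0            # camera frame rate and capture time (s)
n_frames = int(fps * duration)       # 90 frames
t = np.arange(n_frames) / fps

line_freqs = [1.0, 2.0, 3.0]         # one modulation frequency per laser line

# Synthetic per-pixel intensities: pixel i is crossed by line i, a square wave
# blinking at its frequency, plus a little sensor noise.
signals = np.stack([
    0.5 * (np.sign(np.sin(2 * np.pi * f * t)) + 1)
    + 0.05 * rng.standard_normal(n_frames)
    for f in line_freqs
])

# FFT over time; frequency f lands in bin round(f * duration) = 3, 6, 9.
spectrum = np.abs(np.fft.rfft(signals, axis=1))
bins = [round(f * duration) for f in line_freqs]

# Assign each pixel to the line whose frequency bin dominates.
labels = spectrum[:, bins].argmax(axis=1)
print(labels)                        # -> [0 1 2]: each pixel matched to its line
```

Because the frequency-bin spacing of a discrete Fourier transform is 1/duration, a 3 s capture cleanly separates frequencies 1 Hz apart; longer captures allow more, or more closely spaced, line frequencies.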
3.3. Bundle Adjustment Cost Function
3.4. Plane Optimization
- α: the angle between the normal of the plane and the yz-plane.
- β: the angle between the normal of the plane and the xz-plane.
- d: the distance from the plane to the origin. (A small sketch converting these parameters into a plane equation follows this list.)
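As a minimal sketch, and assuming the angle conventions above fix the x- and y-components of the unit normal (the paper's exact conventions may differ), the three parameters map to a plane equation n · x = d as follows.

```python
import numpy as np

def plane_from_params(alpha, beta, d):
    """Unit normal n and distance d such that n . x = d for points x on the plane.
    Assumed convention: alpha fixes the x-component, beta the y-component."""
    nx = np.sin(alpha)
    ny = np.sin(beta)
    nz = np.sqrt(max(0.0, 1.0 - nx**2 - ny**2))  # keep the normal unit length
    return np.array([nx, ny, nz]), d

# Example: a floor-like plane tilted 5 degrees about each axis, 2.5 m from the origin.
n, d = plane_from_params(np.radians(5), np.radians(5), 2.5)
print(n, d)
```

Note that the normal direction contributes only two degrees of freedom, matching the coarse optimization described in Step C, while the scale-dependent d is only meaningful up to the global scale resolved in Section 3.6.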
3.5. Pose Refinement
3.6. Pose Scaling
- When the distance between one of the camera centers and the projection plane is known. If the projection plane is the floor, this distance is the mounting height of that camera; this option is used in all simulation experiments of Section 4. The exact location of the camera center is usually only known with an uncertainty of several millimeters, but since the distance between the camera center and the plane is several orders of magnitude larger, this uncertainty does not contribute significantly to the total scale error. (A worked example follows this list.)
- If the real-world distance between two cameras is known, the scale is known. This is used in the experiment with the translation stage in Section 5.1.
- If one or more stereo cameras are involved, their stereo baselines determine the scale.
- A marker of known size, or two markers with a known distance between them, can be applied to the projection plane. Such markers can be detected either automatically or manually in a camera image. Knowing the parameters of the calculated plane, the image points can be reprojected onto it to calculate the scale. When using the floor plane, the known spacing of the seams between floor tiles, or an object of known size placed on the floor, can serve the same purpose.
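A small worked example of the first option, assuming the calibrated plane is available as a unit normal n and distance d (as in the sketch of Section 3.4) and that the physical mounting height of one camera is known. All names and numbers are illustrative.

```python
import numpy as np

def scale_from_camera_height(camera_center, plane_normal, plane_dist, true_height):
    """Scale factor mapping the up-to-scale reconstruction to metric units."""
    # Signed distance from the (up-to-scale) camera center to the plane n . x = d.
    est_height = abs(plane_normal @ camera_center - plane_dist)
    return true_height / est_height

# Up-to-scale result: camera 1.31 (arbitrary units) above the floor plane z = 0 ...
n, d = np.array([0.0, 0.0, 1.0]), 0.0
c = np.array([0.2, -0.1, 1.31])
# ... while the camera is physically mounted 2.62 m above the floor.
s = scale_from_camera_height(c, n, d, true_height=2.62)
print(s)   # -> 2.0
```

Multiplying every estimated camera translation (and the plane distance d) by this factor converts the reconstruction to metric units; rotations are unaffected.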
4. Evaluation on Simulated Data
4.1. Evaluation Metrics
4.2. Sensitivity Analysis with Simulation
4.3. Comparison between Overlapping and Non-Overlapping Fields of View
4.4. Sensitivity to Sensor Noise
4.5. Sensitivity to Errors in Camera Intrinsics
4.6. Sensitivity to Plane Curvature
4.7. Sensitivity to Line Curvature
5. Real-World Experiments
5.1. Translation Stage for Ground Truth
5.2. 360° Camera Setup
6. Comparison to State of the Art
- Van Crombrugge et al. [20] use a standard LCD or DLP projector to project Gray code. We report the median rotation errors for the simulation without overlap and for three real-world experiments, hence a range instead of a single number.
- Robinson et al. [22] use a straightforward method: the two non-overlapping cameras each have a checkerboard in view, and a third camera that sees both checkerboards is added temporarily. Only simulation results were published, no real-world experiments.
- Zhu et al. [23] use “planar structures in the scene and combine plane-based structure from motion, camera pose estimation, and task-specific bundle adjustment.” The rotation error reported here is the mean pose error of 16 cameras compared to the ground truth.
7. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
1. Nikodem, M.; Słabicki, M.; Surmacz, T.; Mrówka, P.; Dołȩga, C. Multi-Camera Vehicle Tracking Using Edge Computing and Low-Power Communication. Sensors 2020, 20, 3334.
2. Sheu, R.K.; Pardeshi, M.; Chen, L.C.; Yuan, S.M. STAM-CCF: Suspicious Tracking Across Multiple Camera Based on Correlation Filters. Sensors 2019, 19, 3016.
3. Zhang, Z. A flexible new technique for camera calibration. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 1330–1334.
4. Su, P.C.; Shen, J.; Xu, W.; Cheung, S.C.; Luo, Y. A Fast and Robust Extrinsic Calibration for RGB-D Camera Networks. Sensors 2018, 18, 235.
5. Guan, J.; Deboeverie, F.; Slembrouck, M.; Van Haerenborgh, D.; Van Cauwelaert, D.; Veelaert, P.; Philips, W. Extrinsic Calibration of Camera Networks Based on Pedestrians. Sensors 2016, 16, 654.
6. Xia, R.; Hu, M.; Zhao, J.; Chen, S.; Chen, Y.; Fu, S. Global calibration of non-overlapping cameras: State of the art. Optik 2018, 158, 951–961.
7. Sun, J.; Liu, Q.; Liu, Z.; Zhang, G. A calibration method for stereo vision sensor with large FOV based on 1D targets. Opt. Lasers Eng. 2011, 49, 1245–1250.
8. Penne, R.; Ribbens, B.; Roios, P. An Exact Robust Method to Localize a Known Sphere by Means of One Image. Int. J. Comput. Vis. 2019, 127, 1012–1024.
9. Liu, Z.; Zhang, G.; Wei, Z.; Sun, J. A global calibration method for multiple vision sensors based on multiple targets. Meas. Sci. Technol. 2011, 22, 125102–125112.
10. Miyata, S.; Saito, H.; Takahashi, K.; Mikami, D.; Isogawa, M.; Kojima, A. Extrinsic Camera Calibration Without Visible Corresponding Points Using Omnidirectional Cameras. IEEE Trans. Circuits Syst. Video Technol. 2018, 28, 2210–2219.
11. Xu, Y.; Gao, F.; Zhang, Z.; Jiang, X. A calibration method for non-overlapping cameras based on mirrored absolute phase target. Int. J. Adv. Manuf. Technol. 2019, 104, 9–15.
12. Jiang, T.; Chen, X.; Chen, Q.; Jiang, Z. Flexible and Accurate Calibration Method for Non-Overlapping Vision Sensors Based on Distance and Reprojection Constraints. Sensors 2019, 19, 4623.
13. Liu, Q.; Sun, J.; Liu, Z.; Zhang, G. Global calibration method of multi-sensor vision system using skew laser lines. Chin. J. Mech. Eng. 2012, 25, 405–410.
14. Liu, Q.; Sun, J.; Zhao, Y.; Liu, Z. Calibration method for geometry relationships of nonoverlapping cameras using light planes. Opt. Eng. 2013, 52, 074108.
15. Liu, Z.; Wei, X.; Zhang, G. External parameter calibration of widely distributed vision sensors with non-overlapping fields of view. Opt. Lasers Eng. 2013, 51, 643–650.
16. Sels, S.; Ribbens, B.; Vanlanduit, S.; Penne, R. Camera calibration using gray code. Sensors 2019, 19, 246.
17. Frigo, M.; Johnson, S.G. The Design and Implementation of FFTW3. Proc. IEEE 2005, 93, 216–231.
18. Torr, P.H.S.; Zisserman, A. MLESAC: A New Robust Estimator with Application to Estimating Image Geometry. Comput. Vis. Image Underst. 2000, 78, 138–156.
19. Hartley, R.; Zisserman, A. Multiple View Geometry in Computer Vision; Cambridge University Press: Cambridge, UK, 2004; pp. 88–93.
20. Van Crombrugge, I.; Penne, R.; Vanlanduit, S. Extrinsic camera calibration for non-overlapping cameras with Gray code projection. Opt. Lasers Eng. 2020, 134, 106305.
21. Poulin-Girard, A.S.; Thibault, S.; Laurendeau, D. Influence of camera calibration conditions on the accuracy of 3D reconstruction. Opt. Express 2016, 24, 2678.
22. Robinson, A.; Persson, M.; Felsberg, M. Robust accurate extrinsic calibration of static non-overlapping cameras. In Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2017; Volume 10425, pp. 342–353.
23. Zhu, C.; Zhou, Z.; Xing, Z.; Dong, Y.; Ma, Y.; Yu, J. Robust Plane-Based Calibration of Multiple Non-Overlapping Cameras. In Proceedings of the 2016 Fourth International Conference on 3D Vision (3DV), Stanford, CA, USA, 25–28 October 2016; pp. 658–666.