A Fast Calibration Method for Photonic Mixer Device Solid-State Array Lidars
Abstract
1. Introduction
- (1) From an analysis of the actual laser emission modulation signal, the echo demodulation error of the PMD solid-state array lidar system is obtained through a detailed study of the echo demodulation processes of a sinusoidal modulation wave and a rectangular modulation wave. This provides a theoretical basis for rapid calibration of PMD solid-state array lidar at long distances.
- (2) A black-box calibration device is specially prepared to meet the calibration requirements of ambient light rejection and control of the echo reflection route; it reduces external interference on the one hand and improves the uniformity of the received light on the other.
- (3) A dynamic distance simulation system that integrates a laser emission unit, a laser receiving unit, a delay control unit, and other units is designed. The echo demodulation of the CMOS photodetector array is calibrated using an electrical analog delay. A delay phase-locked loop sets different laser emission times to simulate different calibration distances, and the calibration curve is corrected linearly to improve the delay time accuracy. This method realizes rapid, long-range calibration of the CMOS photodetector array without changing the physical calibration distance.
- (4) A checkerboard image is captured in grayscale, and the internal parameters and distortion coefficients of the lens, obtained using the calibration method of Zhang [38], are used to correct the distortion of the external lens. To address the problems that the distortion-corrected pixels do not correspond exactly and that the distance image collected using the area lidar has no depth values for some pixels, a pixel adaptive interpolation strategy is used to reduce the distortion. This method can meet the needs of different users of the lens and achieves modular calibration.
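The electrical delay simulation in contribution (3) rests on two standard continuous-wave time-of-flight relations: an electrical delay of t seconds emulates a round trip of c·t/2, and a measured correlation phase maps back to distance via d = c·φ/(4πf). The sketch below is illustrative code, not the paper's implementation; the four-phase demodulation formula is the common CW ToF relation, and the sample values and modulation frequency are assumptions.

```python
# Illustrative sketch (not the paper's code): delay-to-distance mapping used
# when simulating range electrically, plus the standard four-phase
# demodulation of a continuous-wave ToF pixel.
import math

C = 299_792_458.0  # speed of light, m/s


def delay_to_distance(t_delay):
    """An electrical delay of t_delay seconds emulates a round trip of c*t/2."""
    return C * t_delay / 2.0


def phase_to_distance(q0, q90, q180, q270, f_mod):
    """Estimate distance from four correlation samples (4-phase algorithm).

    q0..q270 are correlation values at 0/90/180/270 degree phase offsets;
    f_mod is the modulation frequency in Hz (value assumed for illustration).
    """
    phi = math.atan2(q90 - q270, q0 - q180) % (2 * math.pi)
    return C * phi / (4 * math.pi * f_mod)


# A 10 ns emission delay emulates a target about 1.5 m away.
print(delay_to_distance(10e-9))  # → 1.49896229
```

A delay phase-locked loop that steps t_delay in fine increments can therefore sweep the full calibration range without moving any target, which is the core idea behind the fixed-distance calibration described above.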
2. System Introduction
3. Materials and Methods
3.1. Analysis of Lidar Modulation and Demodulation Errors
3.1.1. Sinusoidal-Wave Modulation and Demodulation
3.1.2. Rectangular Wave Modulation and Demodulation
3.2. Calibration Method
3.2.1. Black-Box Calibration Device
3.2.2. Calibration of CMOS Photodetector Array Based on the Electrical Delay Method
3.3. Lens Distortion Correction
- (1) Install the lens for the required angle of view at the front end of the PMD solid-state array lidar CMOS photodetector array.
- (2) Print a checkerboard grid to act as a target and fix it on a hard sheet.
- (3) Change the relative positions of the PMD solid-state array lidar and the target, and acquire multiple images of the target from different angles in the grayscale mode of the lidar system.
- (4) Extract the feature points from each image and select the corner points of the checkerboard to act as calibration points.
- (5) Find the plane projection matrix H for each image.
- (6) Solve for the internal parameters of the PMD solid-state array lidar system using the matrix H.
- (7) Optimize the calibration results via a back-projection transformation to obtain more accurate results, and calculate the distortion coefficients of the PMD solid-state array lidar system.
- (8) Use the internal parameters of the PMD solid-state array lidar system to convert between the normalized plane points and the pixel plane points.
- (9) Correct the distortion of the PMD solid-state array lidar system using the distortion coefficients.
- (10) Process the depth image pixel points using a pixel adaptive interpolation strategy.
4. Experiments and Results
4.1. Results of Array CMOS Photoelectric Sensor Calibration
4.1.1. Results of Theoretical Error Analysis
4.1.2. Actual Experimental Results
4.2. Modular Lens Distortion Calibration Results
4.3. Distance Test Results After Completion of Calibration
4.4. Performance Comparison
5. Discussion
- (1) By analyzing the actual laser emission modulation signals, the echo demodulation error of the PMD solid-state array lidar system was obtained based on a detailed study of the echo demodulation processes of a sinusoidal modulation wave and a rectangular modulation wave. This provided a theoretical basis for rapid calibration of PMD solid-state array lidar over a large distance range. Comparison of Figures 13–16 indicates that the theoretical error analysis results are basically consistent with the actual error analysis results, which further verifies the correctness of the theoretical analysis.
- (2) As shown in Figure 14, this paper presented the design of a dynamic distance simulation system that integrates a laser emission unit, a laser receiving unit, a delay control unit, and various other units. Echo demodulation of the CMOS photodetector array was performed using the black-box calibration device and the electrical analog delay method. Table 3 shows that, in terms of accuracy, our method is superior to those proposed in [28,31] but slightly poorer than those presented in [24,26]. However, the proposed method is superior to all other methods in terms of both calibration time and scene setup. The apparatus and method can calibrate a CMOS photodetector array over a large distance range without changing the calibration distance.
- (3) Different users have different requirements for the angle of view of the lens, so we proposed a modular lens distortion correction method based on a checkerboard. Figure 18 and Table 2 show that the internal parameters and distortion coefficients of the lens can be obtained using the calibration method of Zhang, and these parameters can then be used to correct the lens distortion. To address the problems that the distortion-corrected pixels do not correspond exactly and that the distance images collected via area lidar have no depth values for some pixels, this paper proposed a pixel adaptive interpolation strategy to achieve distortion optimization. Figure 19 shows that the method can correct distortion for different angles of view.
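The pixel adaptive interpolation strategy referenced in point (3) fills depth pixels that lost their value during distortion correction. The paper does not spell out the exact rule here, so the sketch below is a plausible reconstruction under stated assumptions: invalid pixels are marked as zero, and the search window grows until valid neighbors are found.

```python
# Hedged reconstruction (not the paper's exact algorithm) of a pixel-level
# adaptive interpolation: each invalid (zero) depth pixel is replaced by the
# mean of valid neighbors inside a window that grows until it finds support.

def fill_invalid_depth(depth, max_radius=2):
    """Return a copy of `depth` with zero pixels filled from valid neighbors.

    `depth` is a 2D list of depth values in millimeters; 0 marks a pixel
    with no depth. The window radius adapts from 1 up to `max_radius`.
    """
    h, w = len(depth), len(depth[0])
    out = [row[:] for row in depth]
    for i in range(h):
        for j in range(w):
            if depth[i][j] != 0:
                continue  # already valid, keep as-is
            for r in range(1, max_radius + 1):
                vals = [depth[y][x]
                        for y in range(max(0, i - r), min(h, i + r + 1))
                        for x in range(max(0, j - r), min(w, j + r + 1))
                        if depth[y][x] != 0]
                if vals:  # enough support at this radius
                    out[i][j] = sum(vals) / len(vals)
                    break
    return out


img = [[1000, 1000, 1000],
       [1000,    0, 1000],
       [1000, 1000, 1000]]
print(fill_invalid_depth(img)[1][1])  # → 1000.0
```

Growing the window only as far as needed keeps sharp depth edges mostly intact while still filling isolated holes, which matches the stated goal of correcting pixels that do not correspond exactly after undistortion.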
6. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
- Boonkwang, S.; Saiyod, S. Distance measurement using 3D stereoscopic technique for robot eyes. In Proceedings of the 7th International Conference on Information Technology and Electrical Engineering (ICITEE), Chiang Mai, Thailand, 29–30 October 2015; pp. 232–236.
- Xu, H.; Ma, Z.; Chen, Y. A modified calibration technique of camera for 3D laser measurement system. In Proceedings of the IEEE International Conference on Automation and Logistics (ICAL), Shenyang, China, 5–7 August 2009; pp. 794–798.
- Yi, H.; Yan, L.; Tsujino, K.; Lu, C. A long-distance sea wave height measurement based on 3D image measurement technique. In Proceedings of the Progress in Electromagnetic Research Symposium (PIERS), Shanghai, China, 7–10 August 2016; pp. 4774–4779.
- Zhao, H.; Diao, X.; Jiang, H.; Zhao, Z. The 3D measurement techniques for ancient architecture and historical relics. In Proceedings of the Second International Conference on Photonics and Optical Engineering (OPAL), Suzhou, China, 26–28 September 2017; p. 102562L.
- Zeng, X.; Ma, S. Flying attitude measurement of projectile using high speed photography and 3D digital image correlation technique. In Proceedings of the 2011 International Conference on Optical Instruments and Technology, Beijing, China, 6–9 November 2011; p. 819706.
- Park, Y.; Yun, S.; Won, C.S.; Cho, K.; Um, K.; Sim, S. Calibration between color camera and 3D LIDAR instruments with a polygonal planar board. Sensors 2014, 14, 5333–5353.
- Zeng, Y.; Yu, H.; Dai, H.; Song, S.; Lin, M.; Sun, B.; Jiang, W.; Meng, M.Q.-H. An improved calibration method for a rotating 2D LiDAR system. Sensors 2018, 18, 497.
- Sun, M.J.; Edgar, M.P.; Gibson, G.M.; Sun, B.; Radwell, N.; Lamb, R.; Padgett, M.J. Single-pixel three-dimensional imaging with time-based depth resolution. Nat. Commun. 2016, 7, 12010.
- Kurtti, S.; Nissinen, J.; Kostamovaara, J. A wide dynamic range CMOS laser radar receiver with a time-domain walk error compensation scheme. IEEE Trans. Circuits Syst. I 2017, 64, 550–561.
- Bjorndal, O.; Hamran, S.E.; Lande, T.S. UWB waveform generator for digital CMOS radar. In Proceedings of the IEEE International Symposium on Circuits and Systems (ISCAS), Portland, OR, USA, 24–27 May 2015; pp. 1510–1513.
- Shotton, J.; Fitzgibbon, A.; Cook, M.; Sharp, T.; Finocchio, M.; Moore, R.; Kipman, A.; Blake, A. Real-time human pose recognition in parts from single depth images. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Colorado Springs, CO, USA, 20–25 June 2011; pp. 1297–1304.
- Izadi, S.; Kim, D.; Hilliges, O.; Molyneaux, D.; Newcombe, R.; Kohli, P.; Shotton, J.; Hodges, S.; Freeman, D.; Davison, A. KinectFusion: Real-time 3D reconstruction and interaction using a moving depth camera. In Proceedings of the 24th Annual ACM Symposium on User Interface Software and Technology (UIST), Santa Barbara, CA, USA, 16–19 October 2011; pp. 559–568.
- Sturm, J.; Engelhard, N.; Endres, F.; Burgard, W.; Cremers, D. A benchmark for the evaluation of RGB-D SLAM systems. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vilamoura, Algarve, Portugal, 7–12 October 2012; pp. 573–580.
- Schiller, I.; Bartczak, B.; Kellner, F.; Koch, R. Increasing realism and supporting content planning for dynamic scenes in a mixed reality system incorporating a time-of-flight camera. J. Virtual Real. Broadcast. 2010, 7, 1–10.
- Fürsattel, P.; Placht, S.; Balda, M.; Schaller, C.; Hofmann, H.; Maier, A.; Riess, C. A comparative error analysis of current time-of-flight sensors. IEEE Trans. Comput. Imaging 2016, 2, 27–41.
- Chiabrando, F.; Chiabrando, R.; Piatti, D.; Rinaudo, F. Sensors for 3D imaging: Metric evaluation and calibration of a CCD/CMOS time-of-flight camera. Sensors 2009, 9, 10080–10096.
- Kim, Y.M.; Chan, D.; Theobalt, C.; Thrun, S. Design and calibration of a multi-view TOF sensor fusion system. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR), Anchorage, AK, USA, 24–26 June 2008; pp. 1–7.
- Rapp, H. Experimental and Theoretical Investigation of Correlating TOF-Camera Systems. Master’s Thesis, University of Heidelberg, Heidelberg, Germany, September 2007; pp. 1–85.
- Huang, T.; Qian, K.; Li, Y. All pixels calibration for ToF camera. In Proceedings of the IOP Conference Series: Earth and Environmental Science, Paris, France, 7–9 February 2018; p. 022164.
- Fuchs, S.; Hirzinger, G. Extrinsic and depth calibration of ToF-cameras. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR), Anchorage, AK, USA, 24–26 June 2008; pp. 1–6.
- Kuhnert, K.-D.; Stommel, M. Fusion of stereo-camera and PMD-camera data for real-time suited precise 3D environment reconstruction. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Beijing, China, 9–15 October 2006; pp. 4780–4785.
- Kahlmann, T.; Remondino, F.; Ingensand, H. Calibration for increased accuracy of the range imaging camera SwissRanger. Image Eng. Vis. Metrol. 2006, 36, 136–141.
- Lindner, M.; Kolb, A. Lateral and depth calibration of PMD-distance sensors. In Proceedings of the International Symposium on Visual Computing, Lake Tahoe, NV, USA, 6–8 November 2006; pp. 524–533.
- Schiller, I.; Beder, C.; Koch, R. Calibration of a PMD-camera using a planar calibration pattern together with a multi-camera setup. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2008, 21, 297–302.
- Schmidt, M. Analysis, Modeling and Dynamic Optimization of 3D Time-of-Flight Imaging Systems. Ph.D. Thesis, University of Heidelberg, Heidelberg, Germany, 20 July 2011; pp. 1–158.
- Jung, J.; Lee, J.-Y.; Jeong, Y.; Kweon, I.S. Time-of-flight sensor calibration for a color and depth camera pair. IEEE Trans. Pattern Anal. Mach. Intell. 2015, 37, 1501–1513.
- Frank, M.; Plaue, M.; Rapp, H.; Köthe, U.; Jähne, B.; Hamprecht, F.A. Theoretical and experimental error analysis of continuous-wave time-of-flight range cameras. Opt. Eng. 2009, 48, 013602.
- Lindner, M.; Kolb, A. Calibration of the intensity-related distance error of the PMD ToF-camera. In Proceedings of the Intelligent Robots and Computer Vision XXV: Algorithms, Techniques, and Active Vision, Boston, MA, USA, 9–11 September 2007; p. 67640W.
- May, S.; Werner, B.; Surmann, H.; Pervolz, K. 3D time-of-flight cameras for mobile robotics. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Beijing, China, 9–15 October 2006; pp. 790–795.
- Gil, P.; Pomares, J.; Torres, F. Analysis and adaptation of integration time in PMD camera for visual servoing. In Proceedings of the 20th International Conference on Pattern Recognition (ICPR), Istanbul, Turkey, 23–26 August 2010; pp. 311–315.
- Steiger, O.; Felder, J.; Weiss, S. Calibration of time-of-flight range imaging cameras. In Proceedings of the 15th IEEE International Conference on Image Processing (ICIP), San Diego, CA, USA, 12–15 October 2008; pp. 1968–1971.
- Swadzba, A.; Beuter, N.; Schmidt, J.; Sagerer, G. Tracking objects in 6D for reconstructing static scenes. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR), Anchorage, AK, USA, 24–26 June 2008; pp. 1–7.
- Reynolds, M.; Doboš, J.; Peel, L.; Weyrich, T.; Brostow, G.J. Capturing time-of-flight data with confidence. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Colorado Springs, CO, USA, 20–25 June 2011; pp. 945–952.
- Pathak, K.; Birk, A.; Poppinga, J. Sub-pixel depth accuracy with a time of flight sensor using multimodal Gaussian analysis. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Nice, France, 22–26 September 2008; pp. 3519–3524.
- Lindner, M.; Lambers, M.; Kolb, A. Sub-pixel data fusion and edge-enhanced distance refinement for 2D/3D images. Int. J. Intell. Syst. Technol. Appl. 2008, 5, 344–354.
- Kim, Y.S.; Kang, B.; Lim, H.; Choi, O.; Lee, K.; Kim, J.D.; Kim, C. Parametric model-based noise reduction for ToF depth sensors. In Proceedings of the Three-Dimensional Image Processing (3DIP) and Applications II, Burlingame, CA, USA, 24–26 January 2012; p. 82900A.
- Kern, F. Supplementing laserscanner geometric data with photogrammetric images for modeling. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2002, 34, 454–461.
- Zhang, Z. A flexible new technique for camera calibration. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 1330–1334.
Table 2. Internal parameters and distortion coefficients of the lens.

| Internal Parameters | fx | fy | cx | cy |
|---|---|---|---|---|
| Value | 207.767 | 209.308 | 174.585 | 129.201 |

| Distortion Coefficients | k1 | k2 | p1 | p2 |
|---|---|---|---|---|
| Value | −0.37568 | 0.15729 | 0.00304 | 0.00046 |
Table 3. Performance comparison. Distance error is given in mm at each test distance (mm); entries marked "(mean)" are averages reported over the whole range.

| Method | 900 | 1100 | 1300 | 1700 | 2100 | 2500 | 3000 | 3500 | 4000 | Calibration Time | Scene Scope |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Steiger et al. [31] | NaN | 3 (at 1207) | 25 (at 1608) | 57 (at 2250) | NaN | NaN | NaN | NaN | NaN | About dozens of minutes | Not mentioned |
| Lindner et al. [28] | 19.4 | 28.2 | 21.0 | 28.9 | 13.5 | 17.3 | 15.9 | 21.8 | 26.7 | About dozens of minutes | About 4 m × 0.6 m × 0.4 m |
| Schiller et al. [24] (automatic feature detection) | 7.45 (mean) | | | | | | | | | About dozens of minutes | About 3 m × 0.6 m × 0.4 m |
| Schiller et al. [24] (some manual feature selection) | 7.51 (mean) | | | | | | | | | About dozens of minutes | About 3 m × 0.6 m × 0.4 m |
| Jung et al. [26] | 7.18 (mean) | | | | | | | | | 137 s (calculation); about dozens of minutes (scene setup) | About 3 m × 0.6 m × 1 m |
| Our method | 3.37 | 4.82 | 6.17 | 8.30 | 9.57 | 8.88 | 12.79 | 10.58 | 14.52 | 90 s (calculation); 10 min (calculation, scene setup and initialization) | About 0.8 m × 0.4 m × 0.3 m |
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Zhai, Y.; Song, P.; Chen, X. A Fast Calibration Method for Photonic Mixer Device Solid-State Array Lidars. Sensors 2019, 19, 822. https://doi.org/10.3390/s19040822