Experimental Study of Radial Distortion Compensation for Camera Submerged Underwater Using Open SaltWaterDistortion Data Set
Abstract
1. Introduction
- How accurate is such analytical calibration?
- How does the accuracy of the initial calibration in the air affect the RD correction quality for this type of water?
- How does an inaccurate choice of the water refractive index affect the RD correction quality?
2. Reference Calibration Algorithm
- Calibration algorithms based on active vision, which compute the RD correction parameters from a series of scene images with known camera motion during calibration [23].
- Self-calibration algorithms, which compute the RD correction parameters by enforcing the epipolar constraint over a series of images of the same scene taken from different viewpoints [24].
3. Methods
3.1. Classic Algorithm for Radial Distortion Correction
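For reference, below is a minimal sketch of the classic chessboard-based calibration and undistortion pipeline (Zhang's method as implemented in OpenCV). The board geometry, file paths, and termination criteria are illustrative assumptions, not the exact settings used for the data set.

```python
import glob
import cv2
import numpy as np

# Assumed board geometry: 9x6 inner corners; square size in arbitrary units.
BOARD = (9, 6)
objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
image_size = None

for path in glob.glob("calib/*.png"):  # hypothetical image location
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    image_size = gray.shape[::-1]
    found, corners = cv2.findChessboardCorners(gray, BOARD)
    if not found:
        continue  # skip frames where the board is not detected (turbidity, glare)
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    obj_points.append(objp)
    img_points.append(corners)

# Classic calibration: intrinsics K and distortion coefficients (k1, k2, p1, p2, k3).
rms, K, dist, _, _ = cv2.calibrateCamera(obj_points, img_points, image_size, None, None)

# Undistort a new image with the estimated model.
img = cv2.imread("test.png")
undistorted = cv2.undistort(img, K, dist)
```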
3.2. Formula for Radial Distortion Correction
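The exact form of Formula (2) is given in the full text. As a rough illustration of how a refraction-based radial remapping of this kind can be applied, the sketch below undoes ideal flat-port refraction via Snell's law, assuming the in-air focal length f (in pixels), the principal point, and the water refractive index n are known; it is a simplified stand-in, not a verbatim implementation of Formula (2).

```python
import numpy as np

def refraction_correction(points, f, cx, cy, n=1.34):
    """Map pixel coordinates observed underwater behind a flat port back to
    the radii they would have in air, assuming an ideal thin flat port.

    points : (N, 2) array of pixel coordinates
    f      : in-air focal length in pixels
    cx, cy : principal point
    n      : refractive index of the water
    """
    pts = np.asarray(points, dtype=float)
    dx, dy = pts[:, 0] - cx, pts[:, 1] - cy
    r = np.hypot(dx, dy)

    # Angle of the observed ray on the air side of the port.
    theta_air = np.arctan2(r, f)
    # Snell's law at the flat port: sin(theta_air) = n * sin(theta_water).
    # Since n > 1, the arcsin argument never exceeds one.
    theta_water = np.arcsin(np.sin(theta_air) / n)
    # Radius the same scene point would have without the water refraction.
    r_corr = f * np.tan(theta_water)

    scale = np.divide(r_corr, r, out=np.ones_like(r), where=r > 0)
    return np.stack([cx + dx * scale, cy + dy * scale], axis=1)
```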
4. Calibration Data Set of Images in Salt Water (SWD)
- After the shutter of the smartphone camera was released, the image was processed programmatically from the RAW format; as a result, the final image was cropped relative to what was shown on the smartphone screen at the moment of shooting. It was therefore difficult to capture the board at the edge of the image, where the distortion is known to be maximal. An example histogram of the distribution of board cell corners in the original images is shown in Figure 3.
- Due to water movement during shutter release and subsequent image processing, many photos turned out cloudy. These photos had to be excluded because, owing to the turbidity, some cell corners of the chessboard became indistinguishable.
- With increasing salinity, the water turbidity also increased, as the salt used was not pure enough and contained impurities.
- Laminated paper with the chessboard image was used for underwater shooting. Because of this, glare appeared at some angles; in particular, it became difficult to locate the chessboard corners, and such photos were also excluded manually.
5. Experimental Results
5.1. Precision of the Correction Formula
- Metric 1. The standard deviation of the cell corners from the straight line approximating them, fitted using the ordinary least squares (OLS) method.
- Metric 2. The distance between the straight line fitted through the corners of a chessboard row and the cell corner most distant from it (a computational sketch of both metrics is given after this list).
- M1. The average value of metric 1 over the entire set of images and all lines.
- M2. The maximum value of metric 1 over the entire set of images among all lines.
- M3. The average value of metric 2 over the entire set of images and all lines.
- M4. The average value of metric 2 over the entire set of images for the most distant lines (one such line was chosen for each image).
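Below is a minimal sketch of computing metrics 1 and 2 for a single row of chessboard corners. A total-least-squares line fit via SVD is used here for numerical convenience with near-vertical rows; this is an assumption rather than the paper's exact fitting procedure. Averaging and maximizing metric 1 and metric 2 over all rows and images then yields M1-M4 as defined above.

```python
import numpy as np

def line_residuals(corners):
    """Distances of one row of chessboard corners from the straight line fitted to them.

    corners : (N, 2) array of (x, y) pixel coordinates of one row of corners.
    Returns the per-corner point-to-line distances in pixels.
    """
    pts = np.asarray(corners, dtype=float)
    centroid = pts.mean(axis=0)
    # Line fit via SVD: the first right-singular vector is the line direction.
    _, _, vt = np.linalg.svd(pts - centroid)
    direction = vt[0]
    normal = np.array([-direction[1], direction[0]])
    return np.abs((pts - centroid) @ normal)

def metric1(corners):
    # Metric 1: standard deviation of the corners around the fitted line.
    return line_residuals(corners).std()

def metric2(corners):
    # Metric 2: distance from the fitted line to the most distant corner.
    return line_residuals(corners).max()
```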
5.2. Dependence of the Correction Precision on Salinity
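The refractive indices assumed in the experiments (n = 1.33 to 1.40) grow with salinity. For choosing n for a given salinity, the empirical seawater equation of Quan and Fry can be used; the sketch below reproduces that equation, with the coefficient values quoted for illustration and worth verifying against the original paper. Salinity is in parts per thousand, temperature in degrees Celsius, and wavelength in nanometres.

```python
def seawater_refractive_index(salinity, temperature=20.0, wavelength=589.0):
    """Empirical refractive index of seawater (Quan and Fry, Appl. Opt. 1995)."""
    S, T, L = salinity, temperature, wavelength
    return (1.31405
            + (1.779e-4 - 1.05e-6 * T + 1.6e-8 * T * T) * S
            - 2.02e-6 * T * T
            + (15.868 + 0.01155 * S - 0.00423 * T) / L
            - 4382.0 / (L * L)
            + 1.1455e6 / (L * L * L))

# Fresh water at 20 C and 589 nm gives roughly 1.333,
# while typical ocean salinity (35 ppt) gives roughly 1.339.
print(seawater_refractive_index(0.0))
print(seawater_refractive_index(35.0))
```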
5.3. Quality of the Automatic Chessboard Cell Corner Detector
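One plausible way to quantify detector quality is to compare the automatically detected corners with manually marked reference corners and report the mean and maximum pixel deviation per image set. The sketch below assumes such ground-truth corners are available in the same ordering as the detector output; this is an assumption about the evaluation protocol, not a description of the paper's exact procedure.

```python
import cv2
import numpy as np

def detector_error(gray, gt_corners, board=(9, 6)):
    """Mean and maximum pixel deviation of the automatic chessboard corner
    detector from manually marked ground-truth corners (same ordering assumed)."""
    found, corners = cv2.findChessboardCorners(gray, board)
    if not found:
        return None  # board not detected at all
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    err = np.linalg.norm(corners.reshape(-1, 2) - np.asarray(gt_corners, float), axis=1)
    return err.mean(), err.max()
```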
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Conrady, A.E. Decentred Lens-Systems. Mon. Not. R. Astron. Soc. 1919, 79, 384–390.
- Shortis, M. Camera Calibration Techniques for Accurate Measurement Underwater. Sensors 2015, 15, 30810–30826.
- Ellender, B.; Becker, A.; Weyl, O.; Swartz, E. Underwater video analysis as a non-destructive alternative to electrofishing for sampling imperilled headwater stream fishes. Aquat. Conserv. Mar. Freshw. Ecosyst. 2012, 22, 58–65.
- Pavin, A.M. Identifikatsiya podvodnykh ob”ektov proizvol’noi formy na fotosnimkakh morskogo dna [Identification of underwater objects of arbitrary shape in photographs of the sea floor]. Podvodnye Issledovaniya i Robototekhnika [Underw. Res. Robot.] 2011, 2, 26–31.
- Somerton, D.; Glendhill, C. Report of the National Marine Fisheries Service Workshop on Underwater Video Analysis. In Proceedings of the National Marine Fisheries Service Workshop on Underwater Video Analysis, Seattle, WA, USA, 4–6 August 2004.
- Elibol, A.; Möller, B.; Garcia, R. Perspectives of auto-correcting lens distortions in mosaic-based underwater navigation. In Proceedings of the 2008 23rd International Symposium on Computer and Information Sciences, Istanbul, Turkey, 27–29 October 2008; pp. 1–6.
- Skarlatos, D.; Agrafiotis, P. Image-Based Underwater 3D Reconstruction for Cultural Heritage: From Image Collection to 3D. Critical Steps and Considerations. In Visual Computing for Cultural Heritage; Springer: Cham, Switzerland, 2020; pp. 141–158.
- Botelho, S.; Drews-Jr, P.; Oliveira, G.; Figueiredo, M. Visual odometry and mapping for Underwater Autonomous Vehicles. In Proceedings of the 6th Latin American Robotics Symposium (LARS 2009), Valparaiso, Chile, 29–30 October 2009; pp. 1–6.
- Berman, D.; Levy, D.; Avidan, S.; Treibitz, T. Underwater Single Image Color Restoration Using Haze-Lines and a New Quantitative Dataset. arXiv 2018, arXiv:1811.01343.
- Degrees of Protection Provided by Enclosures (IP Code). International Standard IEC 60529:1989+AMD1:1999+AMD2:2013. 2013. Available online: https://webstore.iec.ch/publication/2452 (accessed on 1 April 2022).
- Zhang, Z. A flexible new technique for camera calibration. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 1330–1334.
- Heikkila, J. Geometric Camera Calibration Using Circular Control Points. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 1066–1077.
- Quan, X.; Fry, E.S. Empirical equation for the index of refraction of seawater. Appl. Opt. 1995, 34, 3477–3480.
- Lavest, J.M.; Rives, G.; Lapresté, J.T. Underwater Camera Calibration; Springer: Berlin/Heidelberg, Germany, 2000; pp. 654–668.
- Konovalenko, I.; Sidorchuk, D.; Zenkin, G. Analysis and Compensation of Geometric Distortions, Appearing when Observing Objects under Water. Pattern Recognit. Image Anal. 2018, 28, 379–392.
- Sedlazeck, A.; Koch, R. Perspective and Non-perspective Camera Models in Underwater Imaging—Overview and Error Analysis. In Advances in Computer Communication and Computational Sciences; Springer: Berlin/Heidelberg, Germany, 2012; pp. 212–242.
- Daimon, M.; Masumura, A. Measurement of the refractive index of distilled water from the near-infrared region to the ultraviolet region. Appl. Opt. 2007, 46, 3811–3820.
- Huang, L.; Zhao, X.; Huang, X.; Liu, Y. Underwater camera model and its use in calibration. In Proceedings of the 2015 IEEE International Conference on Information and Automation, Lijiang, China, 8–10 August 2015; pp. 1519–1523.
- Yau, T.; Gong, M.; Yang, Y.H. Underwater Camera Calibration Using Wavelength Triangulation. In Proceedings of the Conference on Computer Vision and Pattern Recognition, Portland, OR, USA, 23–28 June 2013; pp. 2499–2506.
- Long, L.; Dongri, S. Review of Camera Calibration Algorithms. In Advances in Computer Communication and Computational Sciences; Springer: Singapore, 2019; pp. 723–732.
- Zhang, Y.; Zhou, F.; Deng, P. Camera calibration approach based on adaptive active target. In Proceedings of the Fourth International Conference on Machine Vision (ICMV 2011), Singapore, 9–10 December 2012; Volume 8350, p. 83501G.
- Brunken, H.; Gühmann, C. Deep learning self-calibration from planes. In Proceedings of the Twelfth International Conference on Machine Vision (ICMV 2019), Amsterdam, The Netherlands, 16–18 November 2019; Volume 11433, p. 1114333L.
- Duan, Y.; Ling, X.; Zhang, Z.; Liu, X.; Hu, K. A Simple and Efficient Method for Radial Distortion Estimation by Relative Orientation. IEEE Trans. Geosci. Remote Sens. 2017, 55, 6840–6848.
- Lehtola, V.; Kurkela, M.; Ronnholm, P. Radial Distortion from Epipolar Constraint for Rectilinear Cameras. J. Imaging 2017, 3, 8.
- Kunina, I.; Gladilin, S.; Nikolaev, D. Blind radial distortion compensation in a single image using fast Hough transform. Comput. Opt. 2016, 40, 395–403.
- Xue, Z.; Xue, N.; Xia, G.S.; Shen, W. Learning to calibrate straight lines for fisheye image rectification. In Proceedings of the Conference on Computer Vision and Pattern Recognition, Long Beach, CA, USA, 15–20 June 2019; pp. 1643–1651.
- Brown, D.C. Decentering distortion of lenses. Photogramm. Eng. Remote Sens. 1966, 32, 444–462.
- Open Source Computer Vision Library. Available online: https://opencv.org (accessed on 7 July 2020).
- OpenCV Documentation: Camera Calibration and 3D Reconstruction. Available online: https://docs.opencv.org/2.4/modules/calib3d/doc/camera_calibration_and_3d_reconstruction.html (accessed on 7 July 2020).
- CRC Handbook of Chemistry and Physics, 85th ed.; CRC Press: Boca Raton, FL, USA, 2004; pp. 8–71.
| Conditions | Method | M1 | M2 | M3 | M4 |
|---|---|---|---|---|---|
| In the air | Without correction | 0.51 | 2.74 | 1.06 | 1.66 |
| | Classic | 0.49 | 2.67 | 1.02 | 1.59 |
| Underwater (salinity ) | Without correction | 1.41 | 7.57 | 2.45 | 3.97 |
| | Classic | 0.93 | 4.89 | 2.33 | 3.68 |
| | Formula (2) (n = 1.33) | 0.65 | 3.96 | 1.79 | 2.75 |
| Underwater (salinity ) | Without correction | 1.51 | 6.80 | 2.79 | 5.28 |
| | Classic | 0.90 | 4.38 | 2.13 | 3.49 |
| | Formula (2) (n = 1.35) | 0.69 | 4.05 | 1.80 | 2.64 |
| Underwater (salinity ) | Without correction | 1.60 | 8.54 | 3.25 | 5.36 |
| | Classic | 1.03 | 4.79 | 2.66 | 3.90 |
| | Formula (2) (n = 1.38) | 0.91 | 4.55 | 2.44 | 3.54 |
| Underwater (salinity ) | Without correction | 1.50 | 7.03 | 3.54 | 6.08 |
| | Classic | 0.89 | 5.24 | 2.34 | 3.63 |
| | Formula (2) (n = 1.40) | 0.83 | 5.00 | 2.18 | 3.24 |
| The Set of Images | Assumed n | M1 | M2 | M3 | M4 |
|---|---|---|---|---|---|
| Underwater, salinity (actual refractive index ) | 1.33 | 0.6504 | 3.9616 | 1.7934 | 2.7539 |
| | 1.34 | 0.6529 | 3.9729 | 1.8062 | 2.7755 |
| | 1.35 | 0.6566 | 3.9840 | 1.8198 | 2.7968 |
| | 1.36 | 0.6615 | 3.9949 | 1.8336 | 2.8180 |
| | 1.37 | 0.6674 | 4.0055 | 1.8488 | 2.8399 |
| | 1.38 | 0.6743 | 4.0159 | 1.8645 | 2.8615 |
| | 1.39 | 0.6819 | 4.0260 | 1.8838 | 2.8838 |
| | 1.40 | 0.6901 | 4.0360 | 1.8975 | 2.9061 |
| Underwater, salinity (actual refractive index ) | 1.33 | 0.6898 | 3.9957 | 1.7931 | 2.6572 |
| | 1.34 | 0.6903 | 4.0225 | 1.7968 | 2.6445 |
| | 1.35 | 0.6918 | 4.0487 | 1.8006 | 2.6405 |
| | 1.36 | 0.6943 | 4.0743 | 1.8060 | 2.6442 |
| | 1.37 | 0.6977 | 4.0994 | 1.8140 | 2.6865 |
| | 1.38 | 0.7018 | 4.1239 | 1.8235 | 2.6865 |
| | 1.39 | 0.7066 | 4.1478 | 1.8353 | 2.7120 |
| | 1.40 | 0.7121 | 4.1712 | 1.8476 | 2.7386 |
| Underwater, salinity (actual refractive index ) | 1.33 | 0.8940 | 4.4589 | 2.3946 | 3.4778 |
| | 1.34 | 0.8950 | 4.4683 | 2.4003 | 3.4911 |
| | 1.35 | 0.8967 | 4.4901 | 2.4077 | 3.5040 |
| | 1.36 | 0.8992 | 4.5114 | 2.4168 | 3.5174 |
| | 1.37 | 0.9022 | 4.5323 | 2.4266 | 3.5308 |
| | 1.38 | 0.9058 | 4.5527 | 2.4371 | 3.5445 |
| | 1.39 | 0.9100 | 4.5727 | 2.4482 | 3.5581 |
| | 1.40 | 0.9146 | 4.5922 | 2.4604 | 3.5720 |
| Underwater, salinity (actual refractive index ) | 1.33 | 0.8038 | 4.8682 | 2.1883 | 3.2080 |
| | 1.34 | 0.8056 | 4.8881 | 2.1842 | 3.2064 |
| | 1.35 | 0.8083 | 4.9074 | 2.1812 | 3.2060 |
| | 1.36 | 0.8116 | 4.9264 | 2.1794 | 3.2086 |
| | 1.37 | 0.8156 | 4.9449 | 2.1789 | 3.2149 |
| | 1.38 | 0.8201 | 4.9630 | 2.1799 | 3.2211 |
| | 1.39 | 0.8251 | 4.9807 | 2.1814 | 3.2275 |
| | 1.40 | 0.8305 | 4.9980 | 2.1841 | 3.2369 |
| Image Set | Mean | Maximum |
|---|---|---|
| In the air | 1.7833 | 3.7317 |
| Underwater (salinity ) | 2.8408 | 22.6256 |
| Underwater (salinity ) | 2.6668 | 6.7385 |
| Underwater (salinity ) | 3.3144 | 11.9810 |
| Underwater (salinity ) | 4.1901 | 9.7023 |