Panoramic Stereo Imaging of a Bionic Compound-Eye Based on Binocular Vision
Abstract
1. Introduction
- (1) An optical design scheme for panoramic stereo imaging based on binocular stereo vision is proposed. The design requires every collected chief ray to be tangent to a circle whose diameter equals the pupil distance, and the number of cameras, the radius of the camera disc, and the field of view of each lens must jointly satisfy the equivalent-pupil-distance constraint. The imaging system then takes two exposures, the first producing the left-eye panorama and the second the right-eye panorama, from which the binocular panoramic stereo image is synthesized (a geometric sketch of the camera-array constraint follows this list).
- (2) A real-time registration method for multi-detector image mosaicking that combines hardware and software is proposed. First, an ultra-high-precision calibration platform based on theodolite angle compensation is developed to calibrate the spatial coordinates of each camera detector at the sub-pixel level and to compute the imaging overlap area of adjacent cameras. Then, a GPU-accelerated image registration algorithm completes real-time stitching of the overlapping regions (see the registration sketch after this list).
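As a rough illustration of the constraint in contribution (1), the sketch below places cameras on a disc and offsets each optical axis so that it is tangent to a viewing circle whose diameter equals the equivalent pupil distance. The function name, parameter values, and the simple coverage check are illustrative assumptions, not the paper's actual design procedure.

```python
import math

def camera_array(n_cams: int, disc_radius: float, ipd: float, fov_deg: float):
    """Place n_cams cameras on a disc of radius disc_radius so that each
    optical axis is tangent to the viewing circle of diameter ipd (the
    equivalent pupil distance).  Yaw angles are returned in radians."""
    if ipd / 2.0 >= disc_radius:
        raise ValueError("viewing circle must fit strictly inside the camera disc")
    if n_cams * fov_deg <= 360.0:
        raise ValueError("n_cams * fov_deg must exceed 360 so adjacent views overlap")
    # Offset between the outward radial direction and a ray tangent to the
    # viewing circle: sin(alpha) = (ipd / 2) / disc_radius.
    alpha = math.asin((ipd / 2.0) / disc_radius)
    cams = []
    for k in range(n_cams):
        phi = 2.0 * math.pi * k / n_cams            # camera azimuth on the disc
        cams.append({
            "position": (disc_radius * math.cos(phi), disc_radius * math.sin(phi)),
            "yaw_left_eye": phi + alpha,            # tangent ray, one winding sense
            "yaw_right_eye": phi - alpha,           # opposite sense for the other eye
        })
    return cams

if __name__ == "__main__":
    # Example values (assumed, not from the paper): 8 cameras, 12 cm disc,
    # 65 mm equivalent pupil distance, 60 deg lens field of view.
    for cam in camera_array(n_cams=8, disc_radius=0.12, ipd=0.065, fov_deg=60.0):
        print(cam)
```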
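For contribution (2), the following sketch shows one way the overlap regions reported by calibration could be registered. The paper describes a GPU-accelerated pipeline; this stand-in uses CPU ORB features and a RANSAC partial-affine fit from OpenCV purely to keep the example self-contained, so the function, the overlap width, and all parameters are assumptions rather than the authors' implementation.

```python
import cv2
import numpy as np

def register_overlap(left_img, right_img, overlap_px=300):
    """Align the overlapping strips of two adjacent camera images.
    overlap_px is assumed to come from the calibration step."""
    strip_l = left_img[:, -overlap_px:]          # right edge of the left image
    strip_r = right_img[:, :overlap_px]          # left edge of the right image
    if strip_l.ndim == 3:
        strip_l = cv2.cvtColor(strip_l, cv2.COLOR_BGR2GRAY)
    if strip_r.ndim == 3:
        strip_r = cv2.cvtColor(strip_r, cv2.COLOR_BGR2GRAY)

    orb = cv2.ORB_create(nfeatures=2000)
    kp_l, des_l = orb.detectAndCompute(strip_l, None)
    kp_r, des_r = orb.detectAndCompute(strip_r, None)
    if des_l is None or des_r is None:
        raise RuntimeError("no features detected in the overlap region")

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_l, des_r), key=lambda m: m.distance)[:200]

    src = np.float32([kp_l[m.queryIdx].pt for m in matches])
    dst = np.float32([kp_r[m.trainIdx].pt for m in matches])
    # Rotation + translation (+ uniform scale) between the two strips,
    # estimated robustly with RANSAC.
    M, _inliers = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
    return M
```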
2. Related Work
3. Optical Design of Panoramic Stereo Imaging
3.1. Binocular Stereo Vision Model
3.2. Camera Array Design of Panoramic Stereo Imaging
4. Binocular Stereo Panoramic Image Synthesis Algorithm
4.1. Ultra High Precision Camera Calibration Based on Binocular Stereo Vision
4.2. Binocular Stereo Matching and Depth Information Estimation
5. Results and Discussion
5.1. Testing Environment
5.2. Experiments for Camera Calibration
5.3. Experiments for Stereo Matching and Depth Information Estimation
5.4. Experiments for Panorama Mosaic
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Matching Method | Running Time (ms) | MAE | RMSE | Mismatch Percentage (%) |
---|---|---|---|---|
BM | 32 | 5.5432 | 12.9204 | 4.61 |
SGBM | 76 | 5.5041 | 12.9892 | 3.82 |
Proposed | 21 | 3.1827 | 10.6416 | 2.93 |
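For context on the metrics in the table above, here is a minimal sketch of how MAE, RMSE, and the mismatch percentage could be computed from an estimated disparity map and ground truth, together with a baseline SGBM disparity. The 3-pixel mismatch threshold, the SGBM parameters, and the file names are assumptions; the paper's proposed matcher is not reproduced here.

```python
import cv2
import numpy as np

def disparity_metrics(disp_est, disp_gt, bad_thresh=3.0):
    """MAE, RMSE and mismatch percentage of an estimated disparity map
    against ground truth; pixels without ground truth are ignored and a
    3-pixel threshold (an assumption) defines a mismatch."""
    valid = disp_gt > 0
    err = np.abs(disp_est[valid] - disp_gt[valid])
    mae = float(err.mean())
    rmse = float(np.sqrt((err ** 2).mean()))
    mismatch_pct = float((err > bad_thresh).mean() * 100.0)
    return mae, rmse, mismatch_pct

# Baseline SGBM disparity on a rectified pair (file names are placeholders).
left = cv2.imread("left_rectified.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right_rectified.png", cv2.IMREAD_GRAYSCALE)
sgbm = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)
disp = sgbm.compute(left, right).astype(np.float32) / 16.0   # SGBM output is 16x fixed point
```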
Actual Distance between Target Object and Camera (mm) | Depth Estimated by the Proposed Algorithm (mm) | Average Error between Estimated and Actual Values (%) |
---|---|---|
1200 | 1268.43, 1263.31, 1258.24, 1260.34 | 5.21 |
1800 | 1842.95, 1832.17, 1821.52, 1810.99 | 1.49 |
2400 | 2428.70, 2420.01, 2410.01, 2391.62 | 0.52 |
3000 | 3012.52, 3012.52, 3297.91, 3012.52 | 2.79 |
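The depth values in the table above come from binocular triangulation. The sketch below shows the standard relation Z = f·B/d and one plausible reading of the last column (the deviation of the mean estimate from the actual distance), which reproduces the tabulated 1.49% for the 1800 mm row; the focal length and baseline arguments are placeholders rather than the paper's calibrated values.

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_px, baseline_mm):
    """Standard binocular triangulation: Z = f * B / d."""
    d = np.asarray(disparity_px, dtype=np.float64)
    return focal_px * baseline_mm / d

def average_error_percent(estimates_mm, actual_mm):
    """Deviation of the mean estimate from the actual distance, in percent."""
    mean_est = float(np.mean(estimates_mm))
    return abs(mean_est - actual_mm) / actual_mm * 100.0

# e.g. the 1800 mm row of the table above:
print(round(average_error_percent([1842.95, 1832.17, 1821.52, 1810.99], 1800.0), 2))  # 1.49
```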
Algorithm | Registration Time (s) | Translation Error (Pixel) | Rotation Error (°) |
---|---|---|---|
Comparison | 0.164 | 0.032 | 0.051 |
Proposed | 0.031 | 0.017 | 0.026 |
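The table above reports registration accuracy as a translation error in pixels and a rotation error in degrees. A minimal sketch of one way such errors could be measured, by comparing an estimated 2×3 transform against a reference transform, is shown below; the exact definition used in the paper is not spelled out here, so treat this as an assumption.

```python
import numpy as np

def registration_errors(M_est, M_gt):
    """Translation error (pixels) and rotation error (degrees) between an
    estimated and a reference 2x3 rigid/affine transform."""
    M_est = np.asarray(M_est, dtype=np.float64)
    M_gt = np.asarray(M_gt, dtype=np.float64)
    t_err = float(np.linalg.norm(M_est[:, 2] - M_gt[:, 2]))      # translation column
    angle = lambda M: np.degrees(np.arctan2(M[1, 0], M[0, 0]))   # in-plane rotation
    r_err = float(abs(angle(M_est) - angle(M_gt)))
    return t_err, r_err
```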