Ant3D—A Fisheye Multi-Camera System to Survey Narrow Spaces
Abstract
1. Introduction
1.1. Background
1.2. Research and Paper Objectives
- Cost-effectiveness: To be competitive for low-budget applications and for the survey of secondary spaces for which laser scanning cannot be justified, as in geology and archaeology.
- Speed-effectiveness: Like other MMSs, it must speed up the acquisition process regardless of the complexity of the space to be surveyed (narrow and meandering spaces).
- Reliability: The time saved on site must not be lost during data processing because of unreliable procedures. This is probably the most important flaw of today's MMSs, and it was also a problem encountered in the early tests with fisheye photogrammetry.
1.3. The Beginning of the Research—The FINE Benchmark Experience
- Tunnel: a dark underground tunnel (around 80 m long) excavated in the rock, with a muddy floor and humid walls. In some areas, the ceiling is lower than 1.5 m.
- Tower: an artificial passage composed of two circular/semi-spherical rooms connected by an interior path that starts from the tower's ground floor and leads to the castle's upper part, featuring staircases, planar surfaces, sharp edges, walls of squared rock blocks, and a relatively uniform texture.
- The geometry of the multi-camera: the configuration used, six GoPro cameras oriented mainly in the frontal direction, combined with the surface roughness of the rock walls, resulted in too few tie points to connect the images acquired in the forward direction with those acquired in the backward direction.
- The rolling shutter of the sensors used: the distortions introduced by acquiring in motion with rolling-shutter sensors made it impossible to accurately estimate the cameras' internal orientation parameters or to impose constraints on the relative orientation of the cameras without high uncertainty. Nevertheless, the constraints on the distances between the cameras were effective in reducing the drift error compared with unconstrained processing. (A sketch of how such distance constraints enter the bundle adjustment is given after this list.)
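As an illustration of the distance constraints mentioned above, a minimal sketch of a constrained bundle adjustment cost follows; the notation and the weights w_ij are illustrative and not taken from the benchmark processing.

```latex
\min_{\{P_k\},\,\{X_m\}}
\underbrace{\sum_{k}\sum_{m}\bigl\lVert \mathbf{x}_{km}-\pi(P_k,X_m)\bigr\rVert^{2}}_{\text{reprojection error}}
\;+\;
\underbrace{\sum_{(i,j)} w_{ij}\,\bigl(\lVert \mathbf{C}_i-\mathbf{C}_j\rVert-d_{ij}\bigr)^{2}}_{\text{inter-camera distance constraints}}
```

Here P_k are the camera poses, X_m the tie points, π the fisheye projection model, x_km the image observations, C_i and C_j the projection centres of two cameras of the rig acquired at the same instant, and d_ij their measured distance.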
2. Materials and Methods
2.1. Materials
2.2. Multi-Camera Stability with Movements
2.3. Multi-Camera IO and RO Calibration Method
2.4. Designing the Multi-Camera Arrangement
- Square: It consists of four cameras organized in two horizontal couples, one on top of the other. The cameras can be rotated at different angles about their vertical axis. The distance between the cameras is 20 cm in both the horizontal and vertical directions.
- Cross: The cross geometry consists of a vertical pair of cameras and a horizontal pair of cameras. The cameras can converge towards or diverge from the centre at different angles. As in the previous configuration, the distance between the cameras is 20 cm in both directions.
- Vertical: This geometry consists of a first pair of cameras in the vertical direction, with a long baseline (40 cm), converging towards the centre at different angles, and a second pair in the horizontal direction, with a short baseline (10 cm), diverging and pointing towards the sides.
- Horizontal: The horizontal configuration consists of a couple of frontal cameras and a couple of rear cameras. The cameras within each couple are close together (10 cm apart), while the front and rear couples are 20 cm apart; the longest baseline is therefore oriented along the tunnel extension. (An illustrative numeric encoding of these four layouts is sketched after this list.)
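To make the four candidate layouts concrete, the sketch below encodes the camera offsets described above (in metres, from the rig centre). The convergence/divergence angles are not encoded because they were varied during testing, and the variable names and axis convention are illustrative rather than part of the actual design.

```python
import numpy as np

# Illustrative encoding of the four candidate layouts: camera offsets from the
# rig centre (x to the right, y up, z forward), in metres, following the
# baselines listed above.
RIG_LAYOUTS = {
    # two horizontal couples stacked vertically, 20 cm spacing in both directions
    "square": np.array([[-0.10,  0.10, 0.0], [0.10,  0.10, 0.0],
                        [-0.10, -0.10, 0.0], [0.10, -0.10, 0.0]]),
    # one vertical pair plus one horizontal pair, 20 cm spacing
    "cross": np.array([[0.0,  0.10, 0.0], [0.0, -0.10, 0.0],
                       [-0.10, 0.0, 0.0], [0.10, 0.0, 0.0]]),
    # long vertical baseline (40 cm) plus short horizontal baseline (10 cm)
    "vertical": np.array([[0.0,  0.20, 0.0], [0.0, -0.20, 0.0],
                          [-0.05, 0.0, 0.0], [0.05, 0.0, 0.0]]),
    # front and rear couples (10 cm within each couple), 20 cm apart along the tunnel
    "horizontal": np.array([[-0.05, 0.0,  0.10], [0.05, 0.0,  0.10],
                            [-0.05, 0.0, -0.10], [0.05, 0.0, -0.10]]),
}

def pairwise_baselines(offsets: np.ndarray) -> np.ndarray:
    """Matrix of inter-camera distances for one layout."""
    diff = offsets[:, None, :] - offsets[None, :, :]
    return np.linalg.norm(diff, axis=-1)

for name, offsets in RIG_LAYOUTS.items():
    print(name, np.round(pairwise_baselines(offsets), 3))
```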
3. Results
3.1. A Framework for Computing Displacement Error with Movements in Fisheye Cameras
3.2. Multi-Camera IO and RO Calibration Results
3.3. The Geometric Configuration of the Multi-Camera
4. Proposed Multi-Camera System—Ant3D
4.1. Description and Main Features
- Global shutter cameras: The use of global shutter sensors allows for the reliable exploitation of the calibrated internal orientations.
- Circular fisheye lenses: lenses with a 190° field of view, arranged in a semicircle, capture a hemispherical view of the scene while excluding the operator, enabling omnidirectional tie-point extraction. In addition, they provide a wide depth of field, allowing a fixed focus to cover subjects from close range to far away.
- Rig geometry: The relative arrangement of the cameras aids in determining the device's position at the moment of acquisition and provides omnidirectional constraint points that make the final reconstruction more robust.
- Calibrated RO: The constrained, rigid, and calibrated position between the cameras allows for automatic and accurate scaling, even in very large environments. The accurate hardware synchronization of the cameras guarantees consistent results.
- Reduced number of images required for 360° 3D reconstruction thanks to the wide viewing angle of the fisheye optics.
- Scaled reconstructions without additional support measures, thanks to the relative position of the calibrated cameras and their accurate synchronization (an illustrative scale computation is sketched after this list).
- Speed and reliability of the acquisition, feasible even for non-photogrammetric experts.
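As an illustration of how the calibrated relative orientation provides the scale, the following sketch computes the factor that brings an arbitrarily scaled photogrammetric block to metric units from the known baseline between two rig cameras of the same pose; all numeric values are made up for the example.

```python
import numpy as np

def scale_from_baseline(c_i: np.ndarray, c_j: np.ndarray,
                        calibrated_distance: float) -> float:
    """Scale factor that brings the reconstructed inter-camera distance
    to the calibrated (metric) distance."""
    reconstructed = np.linalg.norm(c_i - c_j)
    return calibrated_distance / reconstructed

# Made-up example: projection centres of two rig cameras of the same pose as
# estimated in an unscaled bundle adjustment, and their calibrated 0.20 m baseline.
c_i = np.array([1.234, 0.512, 7.801])
c_j = np.array([1.301, 0.498, 7.842])
s = scale_from_baseline(c_i, c_j, calibrated_distance=0.20)
print(f"scale factor: {s:.3f}")  # multiply all block coordinates by s to obtain metres
```

In practice, the scale is enforced inside the bundle adjustment as a distance constraint on every pose rather than applied a posteriori to a single pair, but the principle is the same.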
4.2. Acquisition and Processing
- Manual refinements: During the many tests performed, misalignments were encountered several times. When a misalignment occurs, the approach is manual intervention: the incorrectly oriented images are identified, reset, and re-aligned. Re-aligning a few images is usually successful, since the software overwrites the selection of valid and invalid matches, which generally improves once most images are already oriented correctly. These operations were implemented in Metashape using scripting, so that by selecting just one image of a multi-camera pose, the re-alignment procedure is applied to all images of that pose (a hedged example script is sketched after this list). With the experience gathered during the testing phase, misalignments were progressively reduced to the point of almost never needing to intervene in the initial orientation. The key improvement was increasing redundancy in extremely complex areas by slowing down the walking speed; complex areas include those characterized by complex geometry, poor texture, or extreme contrasts of illumination.
- Tie-point filtering: The removal of poor tie points is performed with the "gradual selection" tool available in Metashape, which offers a few metrics for selecting tie points to be removed. In this work, the "reprojection error" and "reconstruction uncertainty" metrics were used to remove up to roughly the worst-performing 10% of tie points (a sketch of this filtering step also follows the list).
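A minimal sketch of the kind of re-alignment script described in the manual-refinements step, assuming the Metashape Python API (method and attribute names may differ slightly between versions) and assuming that the images of one multi-camera pose share a label suffix such as cam1_000123; both assumptions should be adapted to the actual project.

```python
import Metashape

# Sketch only: re-align all images of the multi-camera poses that contain a
# selected image, keeping the rest of the block fixed.
doc = Metashape.app.document
chunk = doc.chunk

def pose_id(camera):
    # Assumed naming convention "camX_frameID": the part after the last
    # underscore identifies the multi-camera pose.
    return camera.label.split("_")[-1]

selected_poses = {pose_id(cam) for cam in chunk.cameras if cam.selected}
to_realign = [cam for cam in chunk.cameras if pose_id(cam) in selected_poses]

# Reset the (incorrect) orientation of those images only.
for cam in to_realign:
    cam.transform = None

# Try to re-align just these images against the already-oriented block,
# without resetting the alignment of the other cameras.
chunk.alignCameras(cameras=to_realign, reset_alignment=False)
```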
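Similarly, a hedged sketch of the tie-point filtering step, again assuming the Metashape Python API (the filter class is Metashape.TiePoints.Filter in recent releases and was Metashape.PointCloud.Filter in older 1.x releases) and assuming that removing roughly the worst 5% per criterion approximates the 10% total mentioned above.

```python
import Metashape

chunk = Metashape.app.document.chunk

# The two "gradual selection" criteria used in this work.
criteria = [
    Metashape.TiePoints.Filter.ReprojectionError,
    Metashape.TiePoints.Filter.ReconstructionUncertainty,
]

for criterion in criteria:
    f = Metashape.TiePoints.Filter()
    f.init(chunk, criterion=criterion)
    vals = sorted(f.values)
    if not vals:
        continue
    # Threshold at the 95th percentile: removes roughly the worst 5% of tie
    # points per criterion, i.e. about 10% in total over both criteria.
    threshold = vals[int(0.95 * len(vals))]
    f.removePoints(threshold)
```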
4.3. Case Studies and Discussion
5. Conclusions and Future Works
6. Patents
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
CPs scheme: figure showing the check point layout (not reproduced here).

| | CP | 1 fps | 2 fps | 4 fps |
|---|---|---|---|---|
| n° tie points [M] | | 3.1 | 6.8 | 15.3 |
| CPs [cm] | T07 | 1.92 | 1 | 15.99 |
| | T11 | 1.38 | 2.38 | 30.82 |
| | T14 | 2.2 | 2.78 | 55.39 |
| | T18 | 5.21 | 6.61 | 75.94 |
| | T21 | 3.55 | 4.59 | 75.65 |
| | T24 | 2.82 | 3.69 | 89.26 |
| | T26 | 4.01 | 4.28 | 89.36 |
| | T27 | 7.12 | 6.6 | 83.55 |
| | T29 | 13.34 | 10.9 | 75.34 |
| | worst | 13.34 | 10.9 | 89.36 |