Analysis of the Photogrammetric Use of 360-Degree Cameras in Complex Heritage-Related Scenes: Case of the Necropolis of Qubbet el-Hawa (Aswan, Egypt)
Abstract
1. Introduction
2. Materials and Methods
2.1. Materials: 360-Degree Camera
2.2. Materials: Images
2.3. Materials: Scenes
3. Results and Discussion
3.1. Fisheye Images vs. Spherical Images
3.1.1. Fisheye Sensor Intrinsic Calibration
3.1.2. Spherical Images
3.1.3. Orientation Using GCPs
3.1.4. Advantages and Disadvantages of Using Spherical and Fisheye Images
3.2. Using FEI without GCPs
3.2.1. Extrinsic Calibration
3.2.2. Orientation Using FEI without Using GCPs
3.2.3. Three-Dimensional Rigid Transformation
3.2.4. Modeling and Comparison of 3D Meshes
3.3. Processing Times
4. Conclusions
- Spherical images: The stitching technique selected largely conditions the geometric quality of these images and, consequently, the results. We therefore suggest stitching based on depth maps, since this study has demonstrated a clear improvement in the results with respect to the other techniques. This option is limited to sets of fisheye images with full overlaps, which require a larger number of sensors, such as the camera selected in this study; in this regard, we recommend 360-degree multi-cameras composed of more than two fisheye lenses. Although the original fisheye images showed better results, spherical images generated from depth maps showed sufficient accuracy for most heritage documentation studies, with the advantage over fisheye images of a significant reduction in the number of images and, consequently, in orientation processing time (Table 4). However, these images require stitching processing time, and their orientation requires GCPs, which is a problem in complex scenes. In our opinion, spherical images are best used in blocks composed of a large number of images, where the distribution and measurement of GCPs is not a significant problem. In any case, in this context, the use of spherical images should be subordinated to stitching techniques based on depth maps.
- Fisheye images: The results showed higher geometric quality in the orientation processes, although the larger number of images implies a longer processing time (Table 4). We also detected greater difficulty in the orientation processes when constraints between sensors were not used; in some cases, tie points had to be included manually to complete the relative orientation. On the other hand, the results in the scenes studied showed that using the extrinsic parameters to define constraints based on the distances between sensors (scale bars) facilitates the relative orientation processes and reduces the GCPs to those needed to georeference the block with a 3D rigid transformation (a minimum of three points). This was confirmed by the transformation residuals and by the comparison of the 3D meshes obtained in all scenes. In summary, the reduction in field surveying work, and consequently in costs, is evident. In our opinion, fisheye images are recommended for complex scenes similar to those studied here, using a previous calibration that defines distance constraints to facilitate the orientation and scaling processes.
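The georeferencing step mentioned above, fitting an image block to a minimum of three ground points with a 3D rigid (similarity) transformation, can be sketched with the standard SVD-based Kabsch/Umeyama solution. This is a generic illustration of the technique, not the authors' exact implementation; the optional uniform scale would normally already be fixed by the scale-bar constraints.

```python
import numpy as np

def rigid_transform(src, dst):
    """Estimate scale s, rotation R and translation t mapping src -> dst
    from >= 3 corresponding, non-collinear 3D points (Kabsch/Umeyama).

    src, dst: (N, 3) arrays of matching points.
    Returns (s, R, t) such that dst_i ~= s * R @ src_i + t.
    """
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    A, B = src - mu_s, dst - mu_d                 # centred coordinates
    H = A.T @ B                                   # cross-covariance
    U, S, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflections
    D = np.diag([1.0, 1.0, d])
    R = Vt.T @ D @ U.T
    s = np.trace(np.diag(S) @ D) / (A ** 2).sum() # Umeyama scale estimate
    t = mu_d - s * R @ mu_s
    return s, R, t
```

With scale bars already applied, s should come out close to 1 and can be checked as a sanity test; the residuals `dst - (s * src @ R.T + t)` play the role of the transformation residuals discussed above.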
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
Scene | Captures | Average Distance (m) | GCPs | CPs |
---|---|---|---|---|
QH23 | 19 | 1.9 | 14 | 4 |
QH33SB | 20 | 3.4 | 14 | 5 |
QH33SP | 14 | 1.6 | 9 | 4 |
QH35P | 28 | 2.1 | 19 | 4 |
Sensor | f (Pixels) | cx (Pixels) | cy (Pixels) | K1 | K2 | K3 | K4 |
---|---|---|---|---|---|---|---|
1 | 1110.667 | 23.296 | 19.106 | −5.9095 × 10⁻² | −1.6487 × 10⁻³ | 1.2992 × 10⁻⁴ | −8.3588 × 10⁻⁷ |
2 | 1112.981 | −25.230 | 4.259 | −6.0611 × 10⁻² | −6.6440 × 10⁻⁴ | −3.1994 × 10⁻⁴ | 6.4145 × 10⁻⁵ |
3 | 1106.498 | −25.919 | 34.177 | −5.8917 × 10⁻² | −2.0134 × 10⁻³ | 1.3292 × 10⁻⁴ | 1.7086 × 10⁻⁵ |
4 | 1106.968 | −26.215 | −39.301 | −5.9852 × 10⁻² | −6.9513 × 10⁻⁴ | −4.8518 × 10⁻⁴ | 1.1370 × 10⁻⁴ |
5 | 1118.038 | −57.224 | 0.222 | −6.0680 × 10⁻² | −1.3719 × 10⁻³ | 7.0947 × 10⁻⁵ | 7.0671 × 10⁻⁷ |
6 | 1110.727 | 24.278 | −60.529 | −5.9773 × 10⁻² | −2.6219 × 10⁻³ | 6.0588 × 10⁻⁴ | −7.5270 × 10⁻⁵ |
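The intrinsic parameters above (focal length f, principal point offsets cx/cy, and radial coefficients K1 to K4) can be applied with a generic equidistant fisheye projection plus a radial polynomial. This is a sketch of that common model, not necessarily the exact model of the software used in the study; the image size and the convention that cx/cy are offsets from the image centre are assumptions.

```python
import math

def project_fisheye(X, Y, Z, f, cx, cy, K, width, height):
    """Project a 3D point (camera frame, Z along the optical axis) with an
    equidistant fisheye model and radial polynomial distortion.

    K = (K1, K2, K3, K4); cx, cy are offsets from the image centre (pixels).
    """
    theta = math.atan2(math.hypot(X, Y), Z)   # angle off the optical axis
    phi = math.atan2(Y, X)                    # azimuth in the image plane
    # distorted radial distance; the ideal equidistant case is r = f * theta
    r = f * theta * (1 + K[0] * theta**2 + K[1] * theta**4
                       + K[2] * theta**6 + K[3] * theta**8)
    u = width / 2 + cx + r * math.cos(phi)
    v = height / 2 + cy + r * math.sin(phi)
    return (u, v)
```

For example, a point on the optical axis maps exactly to the principal point (image centre plus cx, cy), which is a quick check that the parameter conventions match.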
Sensor | Dist. 2 (m) | Dist. 3 (m) | Dist. 4 (m) | Dist. 5 (m) | Dist. 6 (m) | STD 2 (m) | STD 3 (m) | STD 4 (m) | STD 5 (m) | STD 6 (m) |
---|---|---|---|---|---|---|---|---|---|---|
1 | 0.0655 | 0.1132 | 0.1307 | 0.1134 | 0.0655 | 0.0002 | 0.0002 | 0.0002 | 0.0003 | 0.0003 |
2 | | 0.0651 | 0.1131 | 0.1308 | 0.1133 | | 0.0002 | 0.0002 | 0.0005 | 0.0001 |
3 | | | 0.0655 | 0.1134 | 0.1307 | | | 0.0002 | 0.0003 | 0.0003 |
4 | | | | 0.0653 | 0.1130 | | | | 0.0003 | 0.0001 |
5 | | | | | 0.0654 | | | | | 0.0002 |
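The inter-sensor distances above act as scale-bar constraints in the orientation. A minimal sketch of how they can be derived from calibrated sensor positions follows; the coordinates used here are hypothetical, chosen as a regular hexagonal layout of circumradius 0.0654 m, which approximately reproduces the tabulated values (adjacent ≈ 0.065 m, alternate ≈ 0.113 m, opposite ≈ 0.131 m).

```python
import itertools
import math

# Hypothetical sensor positions (metres) in the rig frame; real values come
# from the extrinsic calibration of the six-sensor 360-degree camera.
sensor_positions = {
    1: (0.0654, 0.0000, 0.0),
    2: (0.0327, 0.0566, 0.0),
    3: (-0.0327, 0.0566, 0.0),
    4: (-0.0654, 0.0000, 0.0),
    5: (-0.0327, -0.0566, 0.0),
    6: (0.0327, -0.0566, 0.0),
}

def pairwise_distances(positions):
    """Inter-sensor distances usable as scale-bar constraints in the
    bundle adjustment (one constraint per pair of sensors)."""
    return {
        (i, j): math.dist(positions[i], positions[j])
        for i, j in itertools.combinations(sorted(positions), 2)
    }
```

With six sensors this yields 15 pairwise constraints per capture, which is what allows the block to be scaled without photogrammetric GCPs.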
Cases | Number of Images | Number of GCPs | Stitching Time (s) | Orientation Time (s) | Marker Projections Time (s) | Total (s) |
---|---|---|---|---|---|---|
SI-FS | 22 | 15 | 76 | 63 | 568 | 707 |
SI-HQS | 22 | 15 | 137 | 63 | 568 | 768 |
SI-DMS | 22 | 15 | 143 | 63 | 568 | 774 |
FEI-GCPs | 134 | 15 | 0 | 1084 | 1980 | 3064 |
FEI-SBs | 134 | 0 | 0 | 1084 | 0 | 1084 |
Stage | SI | FEI-GCPs | FEI-SBs |
---|---|---|---|
Pre-processing (obtaining images) | Stitching | No | No |
Calibration | No | Intrinsic | Complete |
Orientation | Without redundancy | Higher redundancy; problems completing the relative orientation | Higher redundancy |
GCPs (photogrammetric) | Yes | Yes | No |
Transformation from a local CRS | No | No | Yes |
Share and Cite
Pérez-García, J.L.; Gómez-López, J.M.; Mozas-Calvache, A.T.; Delgado-García, J. Analysis of the Photogrammetric Use of 360-Degree Cameras in Complex Heritage-Related Scenes: Case of the Necropolis of Qubbet el-Hawa (Aswan Egypt). Sensors 2024, 24, 2268. https://doi.org/10.3390/s24072268