Article

The Imperial Cathedral in Königslutter (Germany) as an Immersive Experience in Virtual Reality with Integrated 360° Panoramic Photography

by Alexander P. Walmsley and Thomas P. Kersten *
HafenCity University Hamburg, Photogrammetry & Laser Scanning Lab, Überseeallee 16, 20457 Hamburg, Germany
* Author to whom correspondence should be addressed.
Appl. Sci. 2020, 10(4), 1517; https://doi.org/10.3390/app10041517
Submission received: 6 January 2020 / Revised: 7 February 2020 / Accepted: 7 February 2020 / Published: 23 February 2020
(This article belongs to the Special Issue Augmented Reality, Virtual Reality & Semantic 3D Reconstruction)

Abstract

As virtual reality (VR) and the corresponding 3D documentation and modelling technologies evolve into increasingly powerful and established tools for numerous applications in architecture, monument preservation, conservation/restoration and the presentation of cultural heritage, new methods for creating information-rich interactive 3D environments are increasingly in demand. In this article, we describe the development of an immersive virtual reality application for the Imperial Cathedral in Königslutter, in which 360° panoramic photographs were integrated within the virtual environment as a novel and complementary form of visualization. The Imperial Cathedral (Kaiserdom) of Königslutter is one of the most important examples of Romanesque architecture north of the Alps. The cathedral had previously been recorded by the Photogrammetry & Laser Scanning Lab of HafenCity University Hamburg in 2010 using terrestrial laser scanning and 360° panoramic photography. With the recent rapid development of consumer VR technology, it was subsequently decided to investigate how these two data sources could be combined within an immersive VR application for tourism and for architectural heritage preservation. A specialised technical workflow was developed to build the virtual environment in Unreal Engine 4 (UE4) and to integrate the panoramic photographs seamlessly into it. A simple mechanic was developed with the native UE4 node-based visual scripting system (Blueprints) to switch between these two modes of visualisation.

1. Introduction

Virtual reality has recently become a much broader field, finding applications in medicine, architecture, military training and cultural heritage, among other areas. With this growth has come some discrepancy in the definition of the medium: in some fields, VR refers to 360° immersive panoramas and videos, while in others it refers to fully realised interactive CGI environments. These two “kinds” of VR have traditionally been approached very differently, owing to their highly divergent workflows and the different data sources they require. However, there are currently no effective ways of bringing these two kinds of data (each of which has its own complementary advantages in visualisation) together into a single VR application. This is particularly important for applications in cultural heritage, where documentation often takes the form of multiple complementary kinds of data (e.g., written, photographic, 3D, video and field recordings, among other forms).
Virtual reality is defined by Merriam-Webster as “an artificial environment which is experienced through sensory stimuli (such as sights and sounds) provided by a computer and in which one’s actions partially determine what happens in the environment” [1]. This very broad definition covers most modern applications of VR. Additional definitions may be found in the literature, for example in Dörner et al. [2], Freina and Ott [3], and Portman et al. [4].
In the following, we present the development workflow for a room-scale virtual reality experience of a cultural heritage monument which integrates a high-resolution CGI environment with 360° panoramic photography, allowing the user to “toggle” between the virtual and the real environments from within the VR headset. This implementation has the advantage of combining the interactivity of a real-time game-engine environment with the high fidelity of high dynamic range image (HDRI) panoramic photography.

2. Related Work

While much credit for the generalization of VR technology and its increasing accessibility is due to the video game industry, which has invested heavily in pushing the technology forward [5], VR is now being employed in a wide range of disciplines. To date, VR has been successfully used for, among other applications, virtual surgery, virtual therapy, and flight and vehicle simulations. In the field of cultural heritage, VR has been instrumental in the development of the field of virtual heritage [6,7,8]. At HafenCity University Hamburg, several VR projects on this subject have already been realized. The town museum in Bad Segeberg, housed in a 17th-century townhouse, was digitally reconstructed for a VR experience using the HTC Vive Pro [9]. Three historical cities (as well as their surrounding environments) have been developed as VR experiences: Duisburg in 1566 [10], Segeberg in 1600 [11], and Stade in 1620 [12]. In addition, two religious and cultural monuments are also available as VR experiences: the Selimiye Mosque in Edirne, Turkey [13], and a wooden model of Solomon’s Temple [14].
The amount of work specifically regarding the real-time VR visualization of cultural heritage monuments is currently limited but growing. Recent museum exhibits using real-time VR to visualize cultural heritage include Batavia 1627 at the Westfries Museum in Hoorn, Netherlands [15], and Viking VR, developed to accompany an exhibit at the British Museum [16]. A number of recent research projects also focus on the use of VR for cultural heritage visualization [17,18,19,20], as well as on aspects beyond visualisation, including recreating the physical environmental stimuli [21]. The current paper contributes to this growing discussion by seeking to integrate 360° panorama photographs within an immersive real-time visualization of a cultural heritage monument. At this stage, only very limited work regarding panoramic photography integration in real-time VR is known to the authors [22].

3. The Imperial Cathedral (Kaiserdom) in Königslutter, Germany

The town of Königslutter, some 20 km east of Braunschweig (Lower Saxony, Germany), is dominated by the Imperial Cathedral, known in German as the Kaiserdom (Figure 1). One of the most impressive examples of Romanesque architecture north of the Alps, the cathedral was begun under the direction of Kaiser Lothar III, German emperor from 1133 onwards [23,24]. The church was built in the form of a three-aisled, cross-shaped column basilica. The cathedral is notable particularly for its repeated references to northern Italian architectural styles of the time, indicating that it might be the work of an Italian architect or of someone who was well-travelled in those regions. Among the most important features of the cathedral is an ornamental hunting frieze, which hugs the external wall of the main apse (see Figure 1, centre). Between 2002 and 2006, restoration was carried out on the exterior of the cathedral, followed by the interior between 2006 and 2010. The cathedral measures 75 m in length, 42 m in width, and 56 m in height.

4. Methodology

4.1. Project Workflow

The overall workflow for the production of the VR experience of the Kaiserdom is schematically represented in Figure 2. Special focus was given to achieving a realistic 1:1 representation of the cathedral, including the integration of panoramic photos in the VR experience (see Section 4.6). The project was divided into five major phases of development (Figure 2): (1a) data acquisition by terrestrial laser scanning with one Riegl VZ-400 scanner (outside) and two Zoller + Fröhlich IMAGER 5006 scanners (inside), (1b) registration and geo-referencing of scans using RiScan Pro and LaserControl, (1c) segmentation of point clouds into object tiles, (2a) 3D solid modelling with AutoCAD using segmented point clouds, (2b) generation of panoramic images using PTGui, (3) texture mapping of polygon models using Autodesk Maya and Substance Painter, (4a) placement of meshes and building the scene within the UE4 game engine, (4b) integration of motion control and interactions in UE4, (4c) integration of 360° panoramic imagery, and (5) immersive and interactive visualisation of the cathedral in the VR system HTC Vive Pro using Steam VR 2.0 as an interface between the game engine and the Head Mounted Display (HMD).

4.2. Data Acquisition

The data acquisition was already described in 2012 in Kersten and Lindstaedt [25] and is summarised in the following. The laser scan data for the Kaiserdom was acquired at 55 stations inside the cathedral with two Zoller + Fröhlich IMAGER 5006 (www.zf-laser.com) terrestrial laser scanners, and at 8 stations outside the cathedral with one Riegl VZ-400 (www.riegl.com), on 5 January and 23 June 2010 (Figure 3). In total, the scanning took 15 h. The scanning resolution was set to high (6 mm @ 10 m) for the IMAGER 5006 and to 5 mm in object space for the Riegl VZ-400. The precision of the geodetic network stations was 2.5 mm, while the precision of the control points for laser scanning was 5 mm. In order to later colourise the point cloud, as well as for building the virtual tour, 360° panoramic photos were taken at each IMAGER 5006 scan station and at a few supplementary stations using a Nikon DSLR camera (see Section 4.4).

4.3. 3D Modelling

The 3D modelling was also described in 2012 in Kersten and Lindstaedt [25] and is briefly summarised in the following. The generated point cloud, being too large to import directly into a CAD program, was first segmented and then transferred to AutoCAD using the plugin PointCloud. Once imported, the cathedral was blocked out manually with a 3D mesh by extruding polylines along the surfaces and edges of the point cloud structure. This method has the advantage of not generating too large a file, while retaining visual control of the built model using a superimposed point cloud. Figure 4 shows the final constructed 3D CAD model of the entire cathedral in four different perspective views.
For some smaller details of the cathedral, the automated meshing functions in Geomagic were used to quickly generate a mesh directly from the point cloud (Figure 5). Geomagic uses a simple triangulation algorithm, which is better suited to complex and irregular shapes and surfaces.

4.4. Panoramic Photography

In order to subsequently colourise the point cloud, as well as to generate a virtual online tour of the cathedral, a series of 360° panoramic photos were taken at each IMAGER 5006 scan station using a Nikon DSLR camera with a nodal point adapter. Supplementary panoramic photos were also taken at 10 additional locations outside the cathedral and at 19 further points inside it. These were taken without any laser-scanning targets or extraneous objects present in the shot. The acquisition and processing of the panoramic photography were also described in 2012 in Kersten and Lindstaedt [25]; for a better understanding of the whole workflow, the processing is briefly summarised in the following. Each set of photographs consists of 16 images: one pointing towards the sky, three towards the ground and 12 around the horizon. The software PTGui automatically generated a spherical panorama of 11,700 × 5850 pixels (ca. 43 MB) for each camera station. These panorama images were then converted into a set of six cube-face images (in total ca. 5 MB). The panorama viewing software KRpano (https://krpano.com) was initially used to generate an interactive virtual tour for web browsers (Figure 6). The tour can be viewed at https://www.koenigslutter-kaiserdom.de/virtuelleTour/tour.html (Adobe Flash 9/10 must be enabled). In this browser-based tour, all spherical panoramas are linked to each other via hotspots or via the overview map (bottom-right corner), providing a quick and convenient way of navigating through the panoramas simply by clicking on the relevant map icon.
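The spherical-to-cube conversion itself was performed by the panorama software; purely as an illustration of the underlying resampling, the following C++ sketch renders one cube face by sampling the equirectangular panorama along per-pixel view directions. The RGB struct, buffer layout and face basis vectors are our own assumptions and do not reflect the actual implementation of PTGui or KRpano.

// Illustrative stand-in for the equirectangular-to-cube-face resampling; not the
// algorithm used by the panorama software.
#include <cmath>
#include <cstddef>
#include <cstdint>
#include <vector>

struct RGB { std::uint8_t r, g, b; };

constexpr double kPi = 3.14159265358979323846;

// Nearest-neighbour lookup into an equirectangular panorama (width x height,
// row-major) for the viewing direction (x, y, z); y points up.
static RGB samplePanorama(const std::vector<RGB>& pano, int width, int height,
                          double x, double y, double z)
{
    const double lon = std::atan2(x, -z);                               // -pi .. +pi
    const double lat = std::asin(y / std::sqrt(x * x + y * y + z * z)); // -pi/2 .. +pi/2
    const int u = static_cast<int>((lon / (2.0 * kPi) + 0.5) * (width - 1));
    const int v = static_cast<int>((0.5 - lat / kPi) * (height - 1));
    return pano[static_cast<std::size_t>(v) * width + u];
}

// Render one cube face of edge length 'size'. 'faceU' and 'faceV' span the face
// and 'faceN' is its outward normal (e.g. front face: U = (1,0,0), V = (0,-1,0),
// N = (0,0,-1)); the face plane sits one unit from the panorama centre.
std::vector<RGB> renderCubeFace(const std::vector<RGB>& pano, int width, int height,
                                int size, const double faceU[3],
                                const double faceV[3], const double faceN[3])
{
    std::vector<RGB> face(static_cast<std::size_t>(size) * size);
    for (int j = 0; j < size; ++j) {
        for (int i = 0; i < size; ++i) {
            // Map pixel (i, j) to [-1, 1] on the face plane.
            const double a = 2.0 * (i + 0.5) / size - 1.0;
            const double b = 2.0 * (j + 0.5) / size - 1.0;
            const double x = faceN[0] + a * faceU[0] + b * faceV[0];
            const double y = faceN[1] + a * faceU[1] + b * faceV[1];
            const double z = faceN[2] + a * faceU[2] + b * faceV[2];
            face[static_cast<std::size_t>(j) * size + i] =
                samplePanorama(pano, width, height, x, y, z);
        }
    }
    return face;
}

Calling renderCubeFace six times with the basis vectors of the six face orientations yields the cube-map set described above.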

4.5. Game Engine Unreal and VR System HTC Vive

A game engine is a simulation environment in which 2D or 3D graphics can be manipulated through code. Developed primarily within the video game industry, game engines provide ideal platforms for the creation of VR experiences for other purposes (e.g., cultural heritage), as many of the necessary functionalities are already built in, eliminating the need to engineer these features independently. While there are dozens of suitable game engines, the most popular for small studios and production teams tend to be the Unity engine (Unity Technologies, San Francisco, California, USA), CryEngine (Crytek, Frankfurt am Main, Germany) and Unreal Engine (Epic Games, Cary, North Carolina, USA). For this project, the Unreal Engine was chosen for its built-in Blueprints visual scripting system, which allows users to build simple interactions and animations without any prior knowledge of C++, the programming language on which the engine is built [26].
The hardware required to run the VR system comprises a headset, two “lighthouse” base stations, two controllers, and a VR-ready PC. For this project, the HTC Vive Pro was chosen as the headset. The lighthouses track the user’s movement in 3D space (Figure 7), while the controllers are used for mapping interactions in the virtual world. Tracking is achieved with a gyroscope, an accelerometer and laser position sensors within the VR headset itself, and can detect movements with an accuracy of 0.1° [27]. Figure 7 shows the setup of the VR system HTC Vive Pro, including the interaction area (blue) for the user.

4.6. Implementation in Virtual Reality

To bring the model into virtual reality, some changes had to be made to the mesh and textures so that they would run more efficiently within the game engine. The strict performance requirements of VR mean that every effort must be made to optimize the models and ensure a sufficiently high frame rate (ideally 90 frames per second, though above 40 is sufficient for many applications). Much of this part of the workflow was done manually.
First, the mesh was split into different parts in order to reduce the file sizes and therefore speed up each iteration of the texturing process. Because UE4’s built-in render engine renders only those meshes and textures that are within the screen space of the viewer at any one time, a logical approach is to separate the interior from the exterior meshes, so that the exterior data can be unloaded when the user is inside the cathedral and vice versa when they are outside. The two principal parts of the Kaiserdom—the central nave and the cloisters—were also processed separately for the same reason. In a few areas of the model, such as the southern side of the cloister, additional modelling was done to supplement the scan data. A low-poly city model provided by the company CPA Software GmbH (Siegburg, Germany) was used as a basis to model the buildings in the area around the cathedral. As these buildings were not central to the experience, they were modelled only in low detail so as not to consume too many GPU resources. Buildings further away from the cathedral, which were only visible on the periphery of the virtual environment, were left in their raw form (simple grey rectangular meshes) to avoid any extraneous modelling work.
Much of the work in the VR optimization process was dedicated to the production of high-quality textures suitable for real-time VR. There is a fundamental trade-off between the texture quality needed to appear photorealistic at close range and the data streaming limit of the engine (which depends on the hardware and software configuration). As a rule, creating a photorealistic environment for VR requires high-quality textures in order to strengthen the sense of immersion. While the Unreal Engine automatically applies level-of-detail algorithms to reduce the load on the render engine, a certain amount of manual optimization is needed in addition to achieve the performance goals. Texture resolution was therefore varied depending on how far the texture would be from eye level in the virtual environment: 4K textures (4096 × 4096 px) were used for high-detail surfaces at eye level, while 2K textures (2048 × 2048 px) were used for surfaces well above eye level (for example, the ceiling and roof). While many of the textures were adapted from photos taken at the Kaiserdom, supplementary photo-textures were sourced from a Creative-Commons-licensed CGI texture database (https://texturehaven.com/). For materials with more pronounced relief, such as the building stone and roof tiles, normal maps were also added and accentuated with a parallax displacement effect built with the native UE4 material creation tools.
The 3D models with their corresponding textures were then imported into UE4 for placement and real-time visualization (Figure 8A,B); the version of UE4 used was 4.22. Additional elements such as plant meshes, clouds, fog, environmental lighting and audio were added to heighten the sense of photorealism and immersion. In addition, simple interactions were integrated to help the user navigate the environment. Firstly, a teleportation mechanic was implemented, allowing the user to jump from location to location. This mechanic uses a simple ray tracer, pre-built into UE4, that lets the user point to any location in the virtual world and checks that the location is a valid teleportation point according to a set of criteria (these criteria, including the space available and the slope angle at the location, are evaluated by UE4 with its “Navigation Mesh” feature). If the location is valid as a teleportation point, the user can teleport there with a click of the trigger button on the controller (Figure 8D). In addition, automatic door-opening animations were added to several doors in the cathedral, allowing users to move between different parts of the building as in the real world. A short trailer of the virtual environment can be viewed online (https://www.youtube.com/watch?v=hmO0JOdlLgw).
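The teleportation check was implemented entirely in UE4 Blueprints; purely as an illustration of the logic, a C++ sketch under the same assumptions might look as follows. The function and variable names, trace distance and projection extent are hypothetical and not taken from the project.

// Sketch only: trace from the motion controller into the scene and accept the hit
// point as a teleport target only if it projects onto the navigation mesh
// (i.e. enough free space, walkable slope).
#include "GameFramework/Actor.h"
#include "Engine/World.h"
#include "NavigationSystem.h"

bool GetValidTeleportTarget(AActor* MotionController, UWorld* World, FVector& OutTarget)
{
    const FVector Start = MotionController->GetActorLocation();
    const FVector End = Start + MotionController->GetActorForwardVector() * 10000.0f; // 100 m trace

    FHitResult Hit;
    if (!World->LineTraceSingleByChannel(Hit, Start, End, ECC_Visibility))
    {
        return false; // pointing at the sky or beyond the trace range
    }

    UNavigationSystemV1* NavSys = UNavigationSystemV1::GetCurrent(World);
    FNavLocation NavLocation;
    const FVector SearchExtent(50.0f, 50.0f, 100.0f); // search box around the hit point (assumption)
    if (NavSys && NavSys->ProjectPointToNavigation(Hit.ImpactPoint, NavLocation, SearchExtent))
    {
        OutTarget = NavLocation.Location;
        return true;
    }
    return false; // hit point is not on the navigation mesh (e.g. too steep or blocked)
}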
Once the real-time environment was built and the VR interactions set up, the 360° panoramas could be integrated. A simple mechanism was implemented in the UE4 engine to make each panorama viewable. This mechanism consisted of: (1) a visual cue in the virtual world indicating where the panorama was located—here a glowing ring, which stands out well from the rest of the environment (Figure 8C), although a wide variety of other visual cues would be appropriate; (2) a trigger box overlapping the ring, coupled with a function that fires when a certain button is pressed on the HTC Vive motion controller; (3) a separate, empty map (level) in the UE4 editor; and (4) a skybox in the empty level onto which the cube-map panorama is projected. Using this mechanism, the player can approach a glowing ring representing a panorama taken on that spot, press a button on the motion controller, and be transported into the 360° panorama. By pressing the button again, the player leaves the panorama and is placed back in the virtual world (Figure 9). Certain variations of this mechanic were tested (e.g., projecting the panoramic photo onto the inside of a sphere in the virtual world and using a button on the motion controller to alternately show and hide this sphere when the player was in the right area), but the method described above proved to be the simplest and most robust way of toggling between the panoramic photos in the scene while retaining the original perspective of the photographs.
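This toggle was likewise built with Blueprint nodes; the following hedged C++ sketch shows one possible equivalent, in which a trigger box around the glowing ring marks the player as in range and a motion-controller button press opens the separate panorama level. The class, property and level names (e.g., Pano_Station01) are our own placeholders.

// Illustrative C++ equivalent of the Blueprint panorama toggle; names are placeholders.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/BoxComponent.h"
#include "Kismet/GameplayStatics.h"
#include "PanoramaTrigger.generated.h"

UCLASS()
class APanoramaTrigger : public AActor
{
    GENERATED_BODY()

public:
    APanoramaTrigger()
    {
        TriggerBox = CreateDefaultSubobject<UBoxComponent>(TEXT("TriggerBox"));
        RootComponent = TriggerBox;
        TriggerBox->OnComponentBeginOverlap.AddDynamic(this, &APanoramaTrigger::OnOverlapBegin);
        TriggerBox->OnComponentEndOverlap.AddDynamic(this, &APanoramaTrigger::OnOverlapEnd);
    }

    // Name of the empty level that contains only the panorama skybox (assumption).
    UPROPERTY(EditAnywhere)
    FName PanoramaLevelName = TEXT("Pano_Station01");

    // Called from the player controller when the motion-controller button is pressed.
    void OnToggleButtonPressed()
    {
        if (bPlayerInRange)
        {
            // Switch to the panorama level; a twin trigger in that level switches back.
            UGameplayStatics::OpenLevel(this, PanoramaLevelName);
        }
    }

private:
    UPROPERTY(VisibleAnywhere)
    UBoxComponent* TriggerBox;

    bool bPlayerInRange = false;

    UFUNCTION()
    void OnOverlapBegin(UPrimitiveComponent* OverlappedComp, AActor* OtherActor,
                        UPrimitiveComponent* OtherComp, int32 OtherBodyIndex,
                        bool bFromSweep, const FHitResult& SweepResult)
    {
        bPlayerInRange = true; // in a full implementation, check OtherActor is the player pawn
    }

    UFUNCTION()
    void OnOverlapEnd(UPrimitiveComponent* OverlappedComp, AActor* OtherActor,
                      UPrimitiveComponent* OtherComp, int32 OtherBodyIndex)
    {
        bPlayerInRange = false;
    }
};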
The finished version of the VR experience was tested with the HTC Vive Pro headset running on a PC with an 8-Core Intel Xeon CPU (3.10 GHz), an NVIDIA GTX 1070i GPU, and 32.0 GB RAM. With this setup, the experience achieved an average frame rate of 40–50 frames per second.

5. Conclusions and Outlook

This paper presented the motivation for and workflow behind creating a VR visualization with integrated 360° panoramic photography of the Kaiserdom in Königslutter. The combination of these two kinds of media—real-time 3D visualization and HDRI panoramic photography—allows the interactive and immersive potential of the former to complement the high fidelity and photorealism of the latter. While these two “kinds” of VR have traditionally remained separate, it is important to investigate ways of integrating them in order to build experiences that can draw on different kinds of data. This is particularly important for fields such as heritage, where documentation can take multiple forms, such as photographs, objects, 3D data or written documents. The future development of the virtual museum, for example, depends on being able to integrate different kinds of data into a virtual space that can be navigated intuitively in virtual reality [28].
Further applications of the workflow described above can also be envisioned. In another recent project, a recreation of the town of Stade (Lower Saxony) in the year 1620 [12], panoramic photography is implemented so that users can jump between the real-time visualization of the town in 1620 and 360° photos from the modern day. This implementation allows users to directly juxtapose the historic and contemporary city, as an entry point to comparing the historical conditions of the two periods. In particular, this feature could have extra meaning for users who are already familiar with the town, by revealing the perhaps unknown history of certain well-known locations. While real-time 3D visualizations on their own may provide a certain degree of immersion, the integration of different kinds of data in these virtual worlds, such as panoramic photography, can greatly enrich the experience by inviting the user to compare different kinds of visualizations.
In addition, by taking real-time visualisations beyond static virtual worlds through the integration of different kinds of information, VR becomes much more powerful as a tool for education in museums. Cultural heritage monuments such as the Kaiserdom of Königslutter are particularly suited to VR exhibition due to a substantial existing audience that may be looking for new ways to extend their visitor experience. By extending real-time visualisations with panoramic photography and other kinds of information, VR can come closer to realising its potential as a tool for cultural heritage education.

Author Contributions

T.P.K. and A.P.W. conceived the main research idea about VR; T.P.K. generated the panorama photography; A.P.W. processed all data and developed the VR application; A.P.W. and T.P.K. wrote the manuscript and generated the illustrations. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by HafenCity University Hamburg, Lab for Photogrammetry & Laser Scanning.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. “Virtual Reality”. Merriam-Webster.com Dictionary, Merriam-Webster. Available online: https://www.merriam-webster.com/dictionary/virtual%20reality (accessed on 4 February 2020).
  2. Dörner, R.; Broll, W.; Grimm, P.; Jung, B. Virtual und Augmented Reality (VR/AR): Grundlagen und Methoden der Virtuellen und Augmentierten Realität; Springer: Berlin, Germany, 2014. [Google Scholar]
  3. Freina, L.; Ott, M. A Literature Review on Immersive Virtual Reality in Education: State of The Art and Perspectives; eLearning & Software for Education. Available online: https://ppm.itd.cnr.it/download/eLSE%202015%20Freina%20Ott%20Paper.pdf (accessed on 18 December 2019).
  4. Portman, M.E.; Natapov, A.; Fisher-Gewirtzman, D. To go where no man has gone before: Virtual reality in architecture, landscape architecture and environmental planning. Comput. Environ. Urban Syst. 2015, 54, 376–384. [Google Scholar] [CrossRef]
  5. Fuchs, P. Virtual Reality Headsets—A Theoretical and Pragmatic Approach; CRC Press: London, UK, 2017. [Google Scholar]
  6. Addison, A.C. Emerging Trends in Virtual Heritage. IEEE MultiMedia 2000, 7, 22–25. [Google Scholar] [CrossRef]
  7. Stone, R.; Ojika, T. Virtual heritage: What next? IEEE MultiMedia 2000, 7, 73–74. [Google Scholar] [CrossRef]
  8. Affleck, J.; Thomas, K. Reinterpreting Virtual Heritage. In Proceedings of the CAADRIA 2005, New Delhi, India, 28–30 April 2005; Volume 1, pp. 169–178. [Google Scholar]
  9. Kersten, T.; Tschirschwitz, F.; Deggim, S. Development of a Virtual Museum including a 4D Presentation of Building History in Virtual Reality. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, 42, 361–367. [Google Scholar] [CrossRef] [Green Version]
  10. Tschirschwitz, F.; Richerzhagen, C.; Przybilla, H.-J.; Kersten, T. Duisburg 1566–Transferring a Historic 3D City Model from Google Earth into a Virtual Reality Application. PFG J. Photogramm. Remote Sens. Geoinf. Sci. 2019, 87, 1–10. [Google Scholar] [CrossRef]
  11. Deggim, S.; Kersten, T.; Tschirschwitz, F.; Hinrichsen, N. Segeberg 1600–Reconstructing a Historic Town for Virtual Reality Visualisation as an Immersive Experience. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, 42, 87–94. [Google Scholar] [CrossRef] [Green Version]
  12. Walmsley, A.; Kersten, T. Low-cost development of an interactive, immersive virtual reality experience of the historic city model Stade 1620. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, 42, 405–411. [Google Scholar] [CrossRef] [Green Version]
  13. Kersten, T.; Büyüksalih, G.; Tschirschwitz, F.; Kan, T.; Deggim, S.; Kaya, Y.; Baskaraca, A.P. The Selimiye Mosque of Edirne, Turkey—An Immersive and Interactive Virtual Reality Experience using HTC Vive. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, 42, 403–409. [Google Scholar] [CrossRef] [Green Version]
  14. Kersten, T.; Tschirschwitz, F.; Lindstaedt, M.; Deggim, S. The historic wooden model of Solomon’s Temple: 3D recording, modelling and immersive virtual reality visualisation. J. Cult. Herit. Manag. Sustain. Dev. 2018, 8, 448–464. [Google Scholar] [CrossRef]
  15. Westfries Museum. Batavia 1627 in Virtual Reality. Hoorn, Netherlands. Available online: https://wfm.nl/batavia-1627vr (accessed on 17 December 2019).
  16. Schofield, G.; Beale, G.; Beale, N.; Fell, M.; Hadley, D.; Hook, J.; Murphy, D.; Richards, J.; Thresh, L. Viking VR: Designing a Virtual Reality Experience for a Museum. In Proceedings of the Designing Interactive Systems Conference, ACM DIS Conference on Designing Interactive Systems 2018, Hong Kong, China, 9–13 June 2018; Association for Computing Machinery (ACM): New York, NY, USA, 2018; pp. 805–816. [Google Scholar]
  17. Fassi, F.; Mandelli, A.; Teruggi, S.; Rechichi, F.; Fiorillo, F.; Achille, C. VR for Cultural Heritage. In International Conference on Augmented Reality, Virtual Reality and Computer Graphics; Springer: Cham, Switzerland, 2016; pp. 139–157. [Google Scholar]
  18. Dhanda, A.; Reina Ortiz, M.; Weigert, A.; Paladini, A.; Min, A.; Gyi, M.; Su, S.; Fai, S.; Santana Quintero, M. Recreating cultural heritage environments for VR using photogrammetry. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, 42, 305–310. [Google Scholar] [CrossRef] [Green Version]
  19. Skarlatos, D.; Agrafiotis, P.; Balogh, T.; Bruno, F.; Castro, F.; Petriaggi, B.D.; Demesticha, S.; Doulamis, A.; Drap, P.; Georgopoulos, A.; et al. Project iMARECULTURE: Advanced VR, iMmersive serious games and augmented REality as tools to raise awareness and access to European underwater CULTURal heritagE. In Euro-Mediterranean Conference; Springer: Cham, Switzerland, 2016; pp. 805–813. [Google Scholar]
  20. See, Z.S.; Santano, D.; Sansom, M.; Fong, C.H.; Thwaites, H. Tomb of a Sultan: A VR Digital Heritage Approach. In Proceedings of the 3rd Digital Heritage International Congress (Digital HERITAGE) Held Jointly with 24th International Conference on Virtual Systems & Multimedia (VSMM 2018), San Francisco, CA, USA, 26–30 October 2018; pp. 1–4. [Google Scholar]
  21. Manghisi, V.M.; Fiorentino, M.; Gattullo, M.; Boccaccio, A.; Bevilacqua, V.; Cascella, G.L.; Dassisti, M.; Uva, A.E. Experiencing the sights, smells, sounds, and climate of southern Italy in VR. IEEE Comput. Graph. Appl. 2018, 37, 19–25. [Google Scholar]
  22. Ramsey, E. Virtual Wolverhampton: Recreating the historic city in virtual reality. ArchNet Int. J. Archit. Res. 2017, 11, 42–57. [Google Scholar] [CrossRef]
  23. Bergmann, N.; Dobler, G.; Funke, N. Kaiserdom Königslutter—Geschichte und Restaurierung; Michael Imhof Verlag: Petersberg, Germany, 2008. [Google Scholar]
  24. Stiftung Braunschweigischer Kulturbesitz. Königslutter Kaiserdom—The Key Facts. Available online: https://www.koenigslutter-kaiserdom.de/images/cache/Kaiserdom%20Koenigslutter%20THE%20KEY%20FACTS.pdf (accessed on 6 January 2020).
  25. Kersten, T.; Lindstaedt, M. Virtual Architectural 3D Model of the Imperial Cathedral (Kaiserdom) of Königslutter, Germany through Terrestrial Laser Scanning. In Proceedings of the EuroMed 2012—International Conference on Cultural Heritage, Limassol, Cyprus, 29 October–3 November 2012; Lecture Notes in Computer Science (LNCS). Ioannides, M., Fritsch, D., Leissner, J., Davies, R., Remondino, F., Caffo, R., Eds.; Springer: Berlin/Heidelberg, Germany, 2012; Volume 7616, pp. 201–210. [Google Scholar]
  26. McCaffrey, M. Unreal Engine VR Cookbook; Addison-Wesley Professional: Boston, MA, USA, 2017. [Google Scholar]
  27. Painter, L. Hands on with HTC Vive Virtual Reality Headset. 2015. Available online: http://www.pcadvisor.co.uk/feature/gadget/hands-on-with-htc-vive-virtual-reality-headset-experience-2015-3631768/ (accessed on 18 December 2019).
  28. Giangreco, I.; Sauter, L.; Parian, M.A.; Gasser, R.; Heller, S.; Rossetto, L.; Schuldt, H. VIRTUE: A Virtual Reality Museum Experience. In Proceedings of the 24th International Conference on Intelligent User Interfaces: Companion IUI’ 19, Los Angeles, CA, USA, 16–20 March 2019; pp. 119–120. [Google Scholar]
Figure 1. Panoramic view of the Imperial Cathedral in Königslutter, Germany (top), the hunting frieze with story-telling figures on the external wall of the main apse (centre), and a panoramic view of the interior of the cathedral (bottom).
Figure 2. Workflow for the development of the virtual reality (VR) experience.
Figure 3. Geodetic 3D network (blue and green lines) and positions of the scan stations (IMAGER 5006 = yellow triangles, Riegl VZ-400 = red dots) at the cathedral (left), Riegl VZ-400 point cloud of the cathedral (top right) and 2D representation of an IMAGER 5006 scan (bottom right).
Figure 4. Constructed 3D model of the Imperial Cathedral in Königslutter, Germany—view of the four fronts in AutoCAD.
Figure 5. Generation of small, complex objects of the cathedral from the segmented point clouds using the meshing function in Geomagic.
Figure 6. Interactive virtual tour through the Imperial Cathedral using full spherical panorama photography at several stations inside and outside of the building, including an overview map of the stations (bottom-right corner).
Figure 7. Components and schematic setup of the VR system HTC Vive Pro with interaction area (blue).
Figure 8. Two views of the Kaiserdom, inside (A) and outside (B). A third image (C) shows an example of the teleportation mechanic in action. Image (D) shows an example of a visual cue placed in the virtual world, where a panoramic photo can be viewed.
Figure 9. View from the same position in the virtual world, with the panorama switched off (A) and on (B).
