Article

Kairos: Exploring a Virtual Botanical Garden through Point Clouds

by Maximilian Rubin 1,*, Jorge C. S. Cardoso 2 and Pedro Martins Carvalho 3

1 Department of Informatics Engineering, University of Coimbra, 3030-790 Coimbra, Portugal
2 Centre for Informatics and Systems of the University of Coimbra, Department of Informatics Engineering, University of Coimbra, 3030-790 Coimbra, Portugal
3 Department of Architecture, Faculty of Sciences and Technology, University of Coimbra, 3030-790 Coimbra, Portugal
* Author to whom correspondence should be addressed.
Electronics 2023, 12(20), 4216; https://doi.org/10.3390/electronics12204216
Submission received: 30 August 2023 / Revised: 2 October 2023 / Accepted: 7 October 2023 / Published: 11 October 2023

Abstract: This paper reports the design, implementation and evaluation of a point cloud-based virtual reality environment for a botanical garden. The objective of this work was the creation of a point cloud-based digital timestamp of the Botanical Garden of the University of Coimbra, capturing reality, and thus a moment in time, that can be run and visualised in 3D space in real-time. This environment consists of three distinct locations within the Garden, containing century-old tree specimens. These areas were digitised through photogrammetry to create point clouds, which were then used as the basis of an immersive audiovisual experience whose main goal was to convey the ambiance of the Garden, along with showing people its natural processes, usually hidden to the naked eye, with the aim of rekindling our relationship with nature. We produced multimedia content consisting of PC- and VR-based experiences and 360° videos. Both PC and VR variants were evaluated with 22 volunteers, assessing the user experience, the resulting ambiance and the gameplay. The results show a positive response from users, who described the resulting experience as “Peaceful”, “Calming”, “Relaxing”, “Dreamy”, “Colourful”, “Immersive” and “Nice”.

1. Introduction

The essence of observing, describing and remembering information and personal experiences has been with us for as long as we can remember, and has been the very basis of our survival as a species. The prehistoric paintings on walls of caves and rocks portray the importance of our human need to communicate [1]. To this day, the basis of effective communication in the form of visual prompts has not changed. We have only gotten better at it and are continuously refining and optimising the ways we share information. When we think about how far we have come in our ability to express our thoughts and ideas with the help of virtual reality (VR) technology, we have almost reached the tipping point of limitless personal expression and of making the imagination explicit.
Technologies and projects based on VR are heading in this direction. There has recently been an influx of new methods and improved accessibility when it comes to capturing reality to facilitate the emergence of this upcoming digital realm [2,3,4]. Cheaper capturing devices, improved data processing methods and advancements in hardware performance (especially when it comes to Graphics Processing Units (GPUs)) are allowing this field to thrive, making it attractive for digital artists and content creators.
Point clouds are the foundation of the process of capturing reality. At their core, point clouds are files containing a multitude of data points in 3D space, created by scanning an object or a location. One of the main concerns behind VR is the challenge of developing interactive and immersive worlds that feel as close to reality as possible. Point clouds are typically converted into 3D polygonal meshes in an effort to create a close replica of what was captured, but this sometimes results in a “soft edged” appearance which does not appeal to the eye, especially at close proximity [5].
A small number of projects have recently approached this challenge in a different way by exploring point clouds in their crude format, taking advantage of their noisy nature to discover new aesthetics and new visual languages [6,7,8,9]. By rendering point clouds in their crude and diffused state, our focus is drawn away from the visual elements’ mass, whilst maintaining a sufficient level of surface detail for recognition of what was captured, allowing our mind to focus more on the general atmosphere and tone of the locale. This opens up new possibilities for exploring alternate modes of perception and new visual languages. We can take advantage of our brain’s innate ability to actively interpret our perceived reality by “filling in the gaps”, fabricating the illusion of reality, and thus allowing for a deeper, more authentic connection between the user and the virtual world [10,11]. This approach can take us on emotive journeys through digital fragments of reality.
This project’s main goal was the creation of an immersive, point cloud-based visual representation of a memory of the Botanical Garden of the University of Coimbra [12], capturing reality, and thus a moment in time, that can be run and visualised in a 3D virtual reality environment (VRE). In addition to serving as a digital backup of the Garden and commemorating its 250th anniversary, this project allows anyone to access and experience this location remotely, even in the face of restrictions such as those imposed by pandemics or physical limitations, effectively allowing unlimited visits to this site of cultural heritage [13]. This project focused on three distinct areas within the Botanical Garden, each of which highlights memorable century-old tree specimens: the Tilia x europaea [14], the Erythrina Crista-Galli [15] and the Ficus Macrophylla [16]. Here, we also simulated some of the areas’ natural processes usually hidden to the naked eye, with the aim of evoking a sense of inquisitiveness within our users. We followed the premise that if we can observe the processes of nature in VR, we may become more conscious of its inner workings in real life, and thus gain new levels of appreciation for the natural world. These three areas are virtually interconnected, simulating snippets of a memory stroll through the Garden, allowing users to travel freely from one memory to the next.
Several factors influenced the development of this project. First, as noted above, the project serves as a digital backup of the Garden, commemorates its 250th anniversary and grants remote access to this site of cultural heritage. Second, as a technology-driven species, we have been slowly losing touch with nature and have become more distant from its life-giving benefits [17]. This project aims to trigger our innate “biophilia” [18] (a term used to describe our affiliation with mother nature and its positive impact on our mental health and overall wellbeing), by rekindling our connection to the natural world, not as a tool to substitute nature and reality itself, but rather as a tool to enhance it. Third, by using and exploring the potential of the latest reality capture and point cloud visualisation techniques, this project seeks to effortlessly transmit the feelings of a moment captured in time through contemporary technology. It is similar to a photograph, but in this case a type of photograph we can “jump into”, so to speak, allowing us to explore the intricacies of an already geometrically complex scenery. Lastly, the conservation of the Botanical Garden is a continuous endeavour of the University of Coimbra. As part of a UNESCO World Heritage Site [19], it is paramount that the Botanical Garden preserves its integrity and essence. This project can thus also function as a digital timestamp, a sort of backup, by digitally preserving the areas within the Garden, with emphasis on one of its oldest trees: the Erythrina Crista-Galli, which is currently in a structurally vulnerable state.
The result is an immersive experience of the Botanical Garden, which was made publicly available for PC and VR headsets, along with 360° videos to be viewed within a VR headset or through online video streaming platforms. This paper contributes an in-depth description of the design and implementation process of the interactive and immersive experience, along with a user-based evaluation of the user experience. The results of the user evaluation are aligned with our design goals of creating a memory-like environment that evokes a child-like wonder of (re-)visiting the Botanical Garden. An early version of this work was partially reported in a short paper published at the International Conference on Entertainment Computing [20] before the work was finished. This paper gives a full account of the design process, implementation and evaluation.

2. Related Work

This section summarises VR works that have used point cloud data in diverse ways, providing a better understanding of the degree to which this technology has currently been explored.
The use of point cloud data based on photogrammetry techniques for the cultural heritage domain is now a standard approach. This is often combined with VR or AR for visualising the final 3D model, not only to provide a more engaging experience that promotes cultural heritage [21], but also to improve accessibility [22] and to facilitate the creation of 3D digital records of existing architectural structures [23].
Memories of Tsukiji [8] and Shinjuku [9] are two artworks that influenced the ideas behind Kairos. They both deal with memories, a fragile subject, as memories tend to dissolve and become fragmented over time, and the only way to prevent this is through documentation, which allows us to “fly back” in time and glimpse a moment we once experienced. Memories of Tsukiji is an immersive art experience that explores what was formerly the world’s largest fish market. Tokyo’s Tsukiji inner market shut its doors permanently in 2018 and has been completely demolished due to its ageing buildings. Ruben Frosali and his team undertook the task of recreating the fish market digitally by exploring the fragmented nature of photogrammetry’s dense cloud generation process. The photorealism is still there, relatively speaking, but the semi-abstract spatial data does a compelling job of evoking certain feelings of a memory lost in time. Shinjuku embraces a more surreal side to volumetric capture, taking us on a walk through Tokyo’s Shinjuku district. Available as both an interactive VR experience (HTC Vive) and a video preview, Shinjuku’s concept mixes point clouds from photogrammetry with force-induced particle systems to create a heavily fragmented and dynamic environment. Shinjuku’s non-linear narrative was specifically designed to evoke the experience of the city dweller. Both projects take advantage of the fragmented nature of point clouds to emulate the way our minds recall lost memories, while also letting users interact with and move through these memories, allowing for a higher level of immersion.
Point clouds have also been used for representations of the natural world. In the Eyes of the Animal [7] takes the user on a journey through nature’s food chain by putting her/him in the eyes of four distinct animals (a mosquito, a dragonfly, a frog and an owl) which inhabit the Grizedale Forest, near the Lake District of North West England. Scientific facts about each of these species were used to construct a completely speculative visual narrative. Each one of these species perceives the world differently: the mosquito, which later gets eaten by the dragonfly, can see CO2, allowing you to view the scene’s flora “breathing” via photosynthesis. The dragonfly, on the other hand, is able to see the world at 300 frames per second, along with the full light spectrum, naturally turning everything into slow motion and allowing the user to view the forest in a completely new light. It is interesting to see the relation between the ways these animals sense and perceive their environment and the Light Detection and Ranging (LiDAR) device used to scan the forest. In a way, it seems like we (humans) are trying to compete with an already well-established ecosystem by creating our own artificial sensors. In the Eyes of the Animal clearly demonstrates the authors’ deeper appreciation of the natural world and their endeavour to create an experience that enhances what can already be considered beauty at its finest—nature. Promenade [6] takes a more synthetic approach to capturing nature in a short film that presents a high-quality bird’s eye view of the forests of the Vallée de Joux in Switzerland. Quayola used high-precision LiDAR scanners to collect these data, then translated the forest into a wave of interconnected footage, transforming the documented landscape into digital art. Furthermore, Promenade plays with the merging of the real world with technology. During the course of the video, different visual effects alternate between the real and the digital, giving us a peculiar hybrid machine-like view of the forest. To further enhance these ephemeral graphics, a carefully crafted soundtrack was added and synchronised with these visual effects.
Finally, cultural heritage has also been represented through point clouds. Braga, snapshots in virtual reality [24] takes a deeper dive into the intricacies of data representation and its relation to our perception of the real world, by challenging our notion of “self” within these fabricated virtual kingdoms. Heritage sites have recently become targets of digital reconstruction through multiple layers of images of diverse idiosyncrasies, mainly photographic and moving images, all built on the basis of the individual perception of their environment. Living today in a world of appearances, or better yet, in a world of screens saturated with images, leads to a kind of “generalised aesthetic” perception of reality, a form of perceptual desensitisation. Braga, snapshots in virtual reality highlights the synergy of reality and the imaginary and invites you to project yourself into a fabricated point cloud-based landscape of the city of Braga, Portugal. This project challenges this “generalised aesthetic” by stripping down the fabric of reality itself, allowing you to perceive the heritage site in a new, more global and half-transparent way. These deeper philosophical questions influenced the development of Kairos, notably the idea of our “generalised aesthetic” perception of reality, something most of the population has fallen victim to. The Botanical Garden of the University of Coimbra, as a site of Cultural Heritage, has innately become a target for this contemporary wave of social media-induced photographic spew. By deconstructing and exploring alternative perceptions of space, we may be able to overpower this limitation by opening new doors of sensory potential, allowing for a deeper understanding and appreciation of this site.

3. The Botanical Garden

Located in the heart of the city, the Botanical Garden is an iconic space in Coimbra and a prestigious location in Portugal for its scientific contributions to botany, consisting of a plethora of transcontinental flora that take you on a journey to regions throughout the world.
Belonging to UNESCO’s list of World Heritage properties since 2013, the Garden has been expanding its visibility and, consequently, its number of visitors. In addition to engaging with its scientific culture, visitors can equally enjoy this public space for leisure, meeting for casual encounters and strolling among rare and exotic plants from all parts of the world with other natural landscape enthusiasts. Furthermore, the Garden has also expanded its role in environmental education and as a recreational space for various activities surrounding mental health and wellbeing, tours, concerts and exhibitions. Looking back in time, the 18th century brought about a revolution of minds and significant scientific progress. The Botanical Garden was therefore created to complement and encourage the fields of Natural History and Medicinal Studies of the University of Coimbra [12].
This context led to the creation of the Botanical Garden in 1772, by the administration of the Marquis of Pombal, on land donated by Benedictine friars. Initially named Horto Botânico, the Garden took up only the area that is still known as the Quadrado Central (Central Square, Figure 1(5)).
The statutes of the University of Coimbra determined the cultivation of all sorts of medicinal flora, including plants of overseas origin. The cultivation of plants began in the Central Square in 1774 and was only completed in 1790, with the addition of a central fountain, an installation which still remains to the present day. During this time, the Faculty of Medicine studied the healing properties of plants, which led to an upgrade consisting of a series of rectangular beds for further cultivation of medicinal flora. Today, the Garden maintains this scientific component through a seedbank programme. Published for the first time in 1868, this seedbank includes many Portuguese and exotic species, including several endangered varieties, and has played a pioneering role in the conservation of nature. At present, the Botanical Garden covers around thirteen hectares of land and can be divided into different levels, stairs and avenues, each containing its own unique space, flora and atmosphere. Figure 1 shows an illustrative map of the Garden with enumerated areas depicting its most prominent locations.

4. Methodology

This section describes the phases that were accomplished to achieve the desired outcomes for this project. The work broadly consisted of four phases, which were not necessarily executed sequentially, but rather iterated as we refined the process (see Figure 2): Data Acquisition; Data Processing; Project Development; Evaluation and Optimisation.

4.1. Data Acquisition

Data Acquisition consisted of capturing all the required data from the Botanical Garden. This included photogrammetric data capture together with audio recordings. However, before any final data capture was done, some initial experimentation was carried out in order to shape and develop the most efficient data capture workflow, such as determining what hardware to use as well as configuring device settings to collect the highest-quality data as efficiently as possible.

4.2. Data Processing

During Data Processing, we carefully analysed, cleaned and refined all the captured data to ensure proper transformation into point clouds. To achieve this, Adobe Photoshop and Lightroom (photo editing software https://www.adobe.com/products/photoshop.html and https://www.adobe.com/products/photoshop-lightroom.html (accessed on 1 October 2023)), Agisoft Metashape (software that performs photogrammetric processing of digital images and generates 3D spatial data https://www.agisoft.com/ (accessed on 1 October 2023)), and CloudCompare (3D point cloud and mesh processing software https://www.danielgm.net/cc/ (accessed on 1 October 2023)) were used.
To ensure that quality point clouds were produced, all photographs were inspected to make sure they contained no visual anomalies such as lens flares or distortions, overexposed or underexposed subjects, or out-of-focus shots, as these could hamper the quality and performance of Metashape’s processing algorithm. Any photographic anomalies found were corrected with Adobe Photoshop and Lightroom. Photographs were then processed with Agisoft Metashape to be transformed into point clouds. This stage was resource-heavy, so having an initial quality photographic dataset was essential.
The point clouds naturally contained unnecessary data due to the noisy nature of photogrammetry (such as excessive noise and/or sparse data outside the area of capture). These were manually cleaned using Metashape and CloudCompare. CloudCompare was used as it allowed for custom noise removal (e.g., remove isolated points, or points that were out of bounds of a defined radius) in addition to removal of point data of specific colour ranges (e.g., useful to remove misinterpreted and undesired floating artefacts, such as clouds or the sky). Metashape produced point clouds composed of hundreds of millions of points; these naturally had to be subsampled (also within CloudCompare) to save storage space and to improve runtime performance within Unity.
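To illustrate the kind of colour-range filtering we performed in CloudCompare, the following self-contained C# sketch drops sky- and cloud-like points from a cloud. The CloudPoint type and the colour thresholds are illustrative assumptions of ours, not values from the project or from CloudCompare itself:

```csharp
using System.Collections.Generic;

// Minimal sketch of a colour-range filter: points whose colour falls in a
// "sky" or "cloud" range are discarded, analogous to the RGB filtering
// performed in CloudCompare. Thresholds below are hypothetical.
struct CloudPoint
{
    public float X, Y, Z;
    public byte R, G, B;
}

static class SkyFilter
{
    public static List<CloudPoint> RemoveSkyAndClouds(List<CloudPoint> points)
    {
        var kept = new List<CloudPoint>(points.Count);
        foreach (var p in points)
        {
            // Bright, blue-dominant points are treated as sky.
            bool skyLike = p.B > 180 && p.B > p.R + 30 && p.B > p.G + 15;
            // Near-white points are treated as misinterpreted clouds.
            bool cloudLike = p.R > 220 && p.G > 220 && p.B > 220;
            if (!skyLike && !cloudLike)
                kept.Add(p);
        }
        return kept;
    }
}
```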
With regard to the captured audio data, each audio source went through a clean-up process to filter out any unwanted sounds and/or excessive noise (within Adobe Premiere Pro’s audio editor) to ensure smooth auditory playback during the immersive experience. Further sound manipulation was done later within Unity to create the desired 3D audio effect using Unity’s audio reproduction system [25].

4.3. Project Development

The Project Development phase consisted of merging all the previously collected and refined data within Unity, with the aim of creating the desired immersive experience. First and foremost, the expressive potential of these point clouds was explored using Keijiro’s point cloud importer and renderer plugin [26], along with Unity’s VFX Graph, as these granted some flexibility in how our point clouds could be rendered.
This phase covered the visual representation of our point clouds (point size, shape, colour and animations). The choice of visual language was followed by a secondary exploratory phase, which emphasised the immersive factors of our VRE, exploring user interactivity, sound design and particle systems to bring more life into our scenes. After completing these steps, preliminary quality and performance tests were run to make sure the project performed fluidly, with minimal frame rate drops and/or visual/auditory anomalies. Lastly, the immersive experience was adjusted for compatibility with a VR headset, and 360° videos were produced.

4.4. Evaluation and Optimisation

In the Evaluation and Optimisation phase, the project was analysed, evaluated with users and optimised. Before conducting any evaluation tests, the entire experience was fine-tuned to achieve the most optimal quality-to-performance ratio for both platforms (PC and VR). Evaluation tests commenced once this balance had been set. These tests consisted of inviting participants, in person (PC/VR) and remotely (desktop executable), granting them a hands-on experience of the project, followed by an evaluation form consisting of quantitative and qualitative evaluations that assessed the project’s strengths and shortcomings, demonstrating whether it met our desired goals and objectives.

5. Kairos

The development of this project started by establishing the number of areas we were going to capture, which natural processes we were going to include within each area, how we would interconnect these locations and how the user would interact with said areas. We opted for three locations highlighting some of the oldest iconic tree specimens of the Garden: the Tilia x europaea [14], situated near the Alameda das Tílias; the Erythrina Crista-Galli [15], located within the Central Square; and the Ficus Macrophylla [16], located beside the cold greenhouse (see Figure 1).
Initially, the capture of these areas was planned within the closest possible time frame, to convey the idea of replaying a memory of a walk in the Garden. However, because each location required a large set of photographs, taking around 30 to 50 min per area, we decided to split the capture session into three separate parts: morning (for the Tilia), noon (for the Crista-Galli) and midafternoon (for the Ficus), simulating a day’s passage.
Before commencing our captures, we spent time within each area gathering and analysing information on the visible natural processes occurring there. Scientific information was then collected on the trees’ root system structures and their symbiotic relationships with their surrounding environment [27,28]. This was inspired by the hidden underground world of microbes, with emphasis on mycorrhizal networks [29], that is, the symbiotic relationships that form between fungi and plants.

5.1. Data Acquisition and Processing

Our initial plan to capture all locations on the same day failed due to the constant change of weather conditions, along with exceeding our camera’s limited battery and storage capacities. Consequently, each area was captured on different dates. Moreover, the first two areas became a priority due to their trees’ limited flowering periods, leaving a span of about two weeks to capture and process these two. Fortunately, the Ficus Macrophylla keeps its leaves and fruit (figs) all year round, making it less of a priority at the time.
Our photographic dataset was imported and analysed within Adobe Lightroom. Here, colour correction was performed in an attempt to bring out as much detail from the scenes as possible (Figure 3). Lightroom’s “Match Total Exposure” feature was also used to automatically equalise the exposure value of all photographs, with the aim of improving Metashape’s feature-tracking process. Although not recommended by Agisoft Metashape’s user manual, we had no choice but to compress our photographic dataset to the lossy JPG format to avoid running out of memory (RAM) during Metashape’s camera alignment process.
Although we had already gained experience from our preliminary tests, new issues arose concerning failed camera alignment, running out of system memory, and failed object alignment within the scenes, which resulted in multiple misaligned tree trunks and branches (Figure 4). These issues were solved by reducing the size of our photographic dataset through manual selection, removing photographs deemed redundant or problematic, such as those with many occlusions or with similar angles of capture. From an initial photographic dataset of 700 to 1100 photographs per area, our final dataset consisted of only 400 to 600 photographs per area.
The areas containing the Erythrina Crista-Galli and Tilia x europaea were the most problematic due to their high concentration of vegetation, few prominent feature-tracking points and the constant change in weather and lighting conditions during our capture sessions, all of which negatively influenced the accuracy and outcome of our point clouds. We ended up re-capturing these areas several times.

5.2. Data Refinement

An initial manual removal of unnecessary and out-of-bounds points was executed within Metashape using its free-form selection tool (Figure 5a,b). Subsequently, point cloud data such as blue sky and white cloud points were removed through Metashape’s colour selection tool, together with CloudCompare’s RGB filter tool (Figure 5c). To preserve performance within Unity, we used CloudCompare’s “Remove duplicate points” feature, where points closer than 0.0099 units (CloudCompare’s unit of distance) to one another were removed. This removed more than half the total point count, freeing up space while maintaining visual coherence.
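The sketch below illustrates one way such a duplicate filter can be implemented, hashing points into a voxel grid whose cell size equals the minimum spacing and keeping one point per cell. This is our own approximation for the reader, not CloudCompare’s actual algorithm (points in adjacent cells may still fall slightly closer than the threshold):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch of "remove duplicate points": each point is assigned to a voxel of
// side minSpacing; only the first point per voxel is kept.
public static class DuplicateFilter
{
    public static List<Vector3> Filter(IEnumerable<Vector3> points, float minSpacing = 0.0099f)
    {
        var occupied = new HashSet<Vector3Int>();
        var kept = new List<Vector3>();
        foreach (var p in points)
        {
            var cell = new Vector3Int(
                Mathf.FloorToInt(p.x / minSpacing),
                Mathf.FloorToInt(p.y / minSpacing),
                Mathf.FloorToInt(p.z / minSpacing));
            if (occupied.Add(cell))   // Add returns false if the cell is taken
                kept.Add(p);
        }
        return kept;
    }
}
```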
To further lower the point count, we employed CloudCompare’s subsampling tool using its “random” subsampling method (Figure 5d). As this tool subsamples point cloud data in a random fashion, it was decided to separate our scenes into several sections: the tree’s main trunk and leaves, the area of the scene closest to the tree, and finally, the extremities of the scene. This allowed subsampling with more control: a higher point count (and thus, level of detail) for the focal point of the area (the tree and its neighbouring area), and a reduced point count for the rest of the area (points near the extremities of the scene). The sections were individually subsampled to a manageable value for Unity, consisting of a total point count of approximately 20 to 30 million points per scene.
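A minimal sketch of this per-section random subsampling follows; the section budgets in the usage comment are illustrative choices of ours, not the exact values used in Kairos:

```csharp
using System;
using System.Collections.Generic;

// Sketch of per-section random subsampling: each scene section (tree, near
// field, extremities) is given its own target point budget.
static class Subsampler
{
    static readonly Random Rng = new Random();

    // Keep each point with probability target/current.
    public static List<T> RandomSubsample<T>(IReadOnlyList<T> points, int targetCount)
    {
        double keepProbability = Math.Min(1.0, (double)targetCount / points.Count);
        var kept = new List<T>(targetCount);
        foreach (var p in points)
            if (Rng.NextDouble() < keepProbability)
                kept.Add(p);
        return kept;
    }
}
// Usage (illustrative budgets): a high budget for the tree, lower for the
// surroundings, lowest for the extremities, e.g.
//   RandomSubsample(tree, 15_000_000);
//   RandomSubsample(nearField, 8_000_000);
//   RandomSubsample(extremities, 3_000_000);
```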

5.3. Representation of Natural Processes

The natural processes represented in the immersive experience are root systems, leaf systems, pollen dissemination, bees, butterflies, flies and fig simulations. Our intention in representing these natural processes was twofold. On the one hand, to draw the user’s attention to hidden or easy-to-miss processes and relationships between the trees and other biological systems. On the other hand, to incorporate into the interactive experience dynamic elements that could serve as awe- or curiosity-inspiring moments to captivate users. While we did go beyond the available point cloud data, our focus remained on maintaining visual consistency by keeping elements simple. We believe that by enabling users to observe and engage with nature’s processes in a virtual reality setting, we can promote a deeper awareness of the natural world’s inner workings and foster a greater appreciation for the environment we aim to represent.

5.3.1. Root Systems

We created visual root representations based on existing illustrations [30] or textual descriptions [31,32].
The trees’ trunks and visible roots were isolated from the rest of the scene in Metashape and then transformed into meshes (Figure 6a). These were then imported into Blender, where the “Sapling Tree Gen” (https://docs.blender.org/manual/en/latest/addons/add_curve/sapling.html (accessed on 1 October 2023)) add-on was used with custom parameters to simulate the characteristics of each tree specimen’s root system. As a last step, manual adjustment, scaling and orientation of the procedurally generated roots were performed to amalgamate them with the trunk structures and their visible roots (Figure 6b).
In Unity, all meshes obtained were transformed into Signed Distance Fields (SDFs) to encapsulate the particle systems used as representation of the root systems, resulting in more realistic particle simulations.

5.3.2. Leaf Systems

We isolated the tree’s leaves from the rest of the scene so these could be transformed into texture files containing position and colour data (Figure 7a). These data were then used within VFX graphs to animate the leaves. The general appearance of our leaf particles, however, was subpar, as they contained jagged edges noticeable at close distance. This was resolved by substituting Unity’s default 2D point texture with a higher-resolution custom point texture with a smoother colour-to-alpha gradient (Figure 7b,c).
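The following Unity C# sketch shows one way point positions and colours can be baked into float textures for a VFX graph to consume; the class name and layout are our assumptions, and the exact baking pipeline used in Kairos may differ:

```csharp
using UnityEngine;

// Sketch: pack per-point position and colour data into square float textures.
// A VFX graph can then sample these textures to spawn and colour particles.
public static class PointTextureBaker
{
    public static (Texture2D positions, Texture2D colours) Bake(Vector3[] points, Color[] colours)
    {
        int side = Mathf.CeilToInt(Mathf.Sqrt(points.Length));
        var posTex = new Texture2D(side, side, TextureFormat.RGBAFloat, false);
        var colTex = new Texture2D(side, side, TextureFormat.RGBAFloat, false);
        for (int i = 0; i < points.Length; i++)
        {
            int x = i % side, y = i / side;
            Vector3 p = points[i];
            posTex.SetPixel(x, y, new Color(p.x, p.y, p.z, 1f)); // xyz in RGB channels
            colTex.SetPixel(x, y, colours[i]);
        }
        posTex.Apply();
        colTex.Apply();
        return (posTex, colTex);
    }
}
```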

5.3.3. Bees, Butterflies, Pollen, Flies and Fig Simulations

Two distinguishable bee species were observed and registered within the area containing the Tilia x europaea, which were at the time gathering nectar from the Tilia’s fragrant, pale yellow flowers: the small Lasioglossum Dialictus and the larger Apis mellifera (also known as the European honey bee). These were both simulated within VFX graphs and also include user collision detection. Moreover, the Tilia’s leaf position data were used as input for the spawn positions of the smaller bees (encompassing the entire tree, as was observed in real life). Two swarms of the larger honey bees were manually placed within the area, positioned in two separate locations. Here, proximity sensors were implemented that trigger when the user gets too close; this incites the bees, which then start revolving around the user in a more agitated fashion.
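A minimal Unity C# sketch of such a proximity sensor is shown below; radius and speed values are illustrative, and the shipped behaviour was implemented within VFX graphs rather than per-bee scripts:

```csharp
using UnityEngine;

// Sketch: a bee orbits its swarm anchor calmly, and orbits the *user* faster
// once the user comes within triggerRadius. Values are illustrative.
public class BeeSwarmSensor : MonoBehaviour
{
    public Transform user;              // the player camera rig
    public float triggerRadius = 2f;    // metres
    public float calmSpeed = 30f;       // degrees per second around the anchor
    public float agitatedSpeed = 120f;  // degrees per second around the user

    void Update()
    {
        bool agitated = Vector3.Distance(user.position, transform.position) < triggerRadius;
        // Assumes the swarm anchor is this bee's parent transform.
        Transform pivot = agitated ? user : transform.parent;
        float speed = agitated ? agitatedSpeed : calmSpeed;
        transform.RotateAround(pivot.position, Vector3.up, speed * Time.deltaTime);
    }
}
```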
Several Pieris rapae butterflies were observed fluttering within the area containing the Erythrina Crista-Galli, in the rose patches adjacent to the tree. These were simulated by animating two 2D textures (emulating wings), scaling them following a sine wave (see Figure 8). As with the bees, these butterflies also react to the user’s presence and disperse when the user gets too close.
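A minimal Unity C# sketch of this sine-wave wing flap; the frequency value is an illustrative assumption:

```csharp
using UnityEngine;

// Sketch: two wing quads are scaled along a sine wave to fake flapping.
public class WingFlap : MonoBehaviour
{
    public Transform leftWing, rightWing;
    public float flapFrequency = 8f;   // flaps per second (illustrative)

    void Update()
    {
        // |sin| keeps the wings from inverting through the body.
        float s = Mathf.Abs(Mathf.Sin(Time.time * flapFrequency * Mathf.PI));
        leftWing.localScale = new Vector3(s, 1f, 1f);
        rightWing.localScale = new Vector3(s, 1f, 1f);
    }
}
```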
To accentuate the Crista-Galli’s red fragrant flowers, we decided to simulate the smell of nectar and pollen being released from them (Figure 9). Particle systems were positioned in the centre of each flower cluster, releasing small particles (representing the nectar and pollen). Furthermore, each cluster contains its own proximity sensor that bursts particles when the user collides or comes into contact with the flowers.
During the capture session of the Ficus Macrophylla, we noticed that many of its figs had fallen to the ground. This is also evident in the final audio recording of the area, where figs can be heard dropping. Hence, we decided to simulate these figs using two VFX graphs. The first randomly spawns figs on the ground on scene load; the second emits figs from the tree (using the tree’s leaf-position data as input), where a random fig is dropped (emitted) at a random interval of between four and eight seconds.
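The timing logic can be sketched as a Unity coroutine; in Kairos the emission was done inside a VFX graph driven by the baked leaf-position texture, so the prefab-based version below is only an approximation with hypothetical names:

```csharp
using System.Collections;
using UnityEngine;

// Sketch: drop one fig at a random interval between four and eight seconds,
// from a random leaf position. figPrefab and leafPositions are hypothetical.
public class FigDropper : MonoBehaviour
{
    public GameObject figPrefab;     // assumed to carry a Rigidbody for gravity
    public Vector3[] leafPositions;  // e.g. read back from the leaf texture

    IEnumerator Start()
    {
        while (true)
        {
            yield return new WaitForSeconds(Random.Range(4f, 8f));
            Vector3 spawn = leafPositions[Random.Range(0, leafPositions.Length)];
            Instantiate(figPrefab, spawn, Quaternion.identity);
        }
    }
}
```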
Small clusters of flies were also spotted within the area and implemented within the scene, following similar parameters used for the bees surrounding the Tilia, with the addition of trails.

5.4. Locomotion and Area Traversal

Figure 10 shows a top view of the Botanical Garden containing all three captured areas, including an illustration of the area traversal system where existing pathways within each area were used as exit and entry points.
We had to interconnect these areas in a subtle, yet intuitive way. This task began with the implementation of exits that allow users to travel from one area to another, initially programmed using invisible boxes as trigger colliders, where the user would trigger a scene transition upon colliding with a box. However, after a few initial run-throughs, we decided a more gradual and interactive transition was necessary. To achieve this, the trigger boxes were complemented by a script that constantly calculates the user’s distance from the exits, allowing for a gradual scene fade-out. These sensors detect the user’s proximity and start to fade the scene once the user gets closer than 10 units of distance from any one of the three sensors. The user is then transported to the next area the moment they touch the trigger box (Figure 11).
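A minimal Unity C# sketch of such an exit sensor, combining the distance-based fade with the trigger-based transition; the CanvasGroup overlay, the scene name and the single-sensor simplification are our assumptions:

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Sketch: fade the scene as the user approaches (closer than fadeDistance,
// 10 units as in the text) and load the next area on contact. One sensor is
// shown for brevity; Kairos used three exits per area.
public class ExitSensor : MonoBehaviour
{
    public Transform user;
    public CanvasGroup fadeOverlay;   // full-screen black image: alpha 0 = clear
    public string nextAreaScene;      // e.g. "Area2_CristaGalli" (hypothetical name)
    public float fadeDistance = 10f;

    void Update()
    {
        float d = Vector3.Distance(user.position, transform.position);
        // 0 at fadeDistance or farther, 1 at the exit itself.
        fadeOverlay.alpha = Mathf.Clamp01(1f - d / fadeDistance);
    }

    void OnTriggerEnter(Collider other)
    {
        if (other.transform == user)
            SceneManager.LoadScene(nextAreaScene);
    }
}
```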

5.5. Sound Design

To achieve our goal of 3D ambient sound reproduction for the immersive experience, we took an experimental approach: multiple microphones were installed within each of the captured areas, evenly distributed around each scene in a pattern resembling a square.
Due to unpredictable weather conditions, windscreens were made for these microphones in the hope of minimising unwanted wind noise. Each area was recorded (using a WAV format with a 48 kHz sample rate) for a period of around 10 to 15 min. Before starting our audio recordings, however, the microphone gain was fine-tuned for the Zoom H1, Zoom H2n and Sony a6300 to determine their optimal signal-to-noise ratio.
Audio files were imported into Premiere Pro, where the synchronisation, clean-up and enhancement processes were carried out. Synchronisation was performed by picking out the peaks in the audio tracks (representing the claps), and clean-up by cutting out all unwanted sounds, comprising human chatter, wind noise and traffic sounds. These sounds were removed due to their overly distracting nature, as they were deemed too intrusive for the delicate character of the areas, especially the areas containing the Tilia x europaea and Erythrina Crista-Galli. A cross-fade audio effect was then used as a buffer to smooth out the audio cuts between the separated (deleted) clips. Next, noise reduction, treble, bass and amplification adjustments were applied in order to unify our audio as a whole.
Ultimately, all tracks were imported into Unity. Here, speakers were placed in the exact same positions as the microphones had been placed in the real world. Each speaker was then transformed into a 3D sound emitter, where its “spatial blend” parameter, along with its minimum and maximum emission distances, was adjusted to create our desired 3D audio effect.
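A sketch of this speaker setup in Unity C#; the attenuation distances and rolloff mode below are illustrative, as the actual values were tuned per scene:

```csharp
using UnityEngine;

// Sketch: make an AudioSource a fully 3D "speaker", placed where a microphone
// stood in the real Garden, with distance-based attenuation.
public static class SpeakerSetup
{
    public static void MakeSpatial(AudioSource speaker)
    {
        speaker.spatialBlend = 1f;                       // 0 = 2D, 1 = fully 3D
        speaker.minDistance = 2f;                        // full volume inside this radius
        speaker.maxDistance = 25f;                       // inaudible beyond this radius
        speaker.rolloffMode = AudioRolloffMode.Linear;   // simple, predictable falloff
        speaker.loop = true;
        speaker.Play();
    }
}
```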

5.6. Virtual Reality Implementation

Initial tests revealed that the Meta Quest 2 VR headset running our VR application could only handle about 200,000 points before running into performance issues. Therefore, we opted to connect the Quest 2 directly to a PCVR-ready computer via a Link cable, taking advantage of our higher-performance PC hardware. Several benchmarks were performed using the Link cable to determine the maximum quantity of points we could render at a sustainable frame rate. To stabilise the frame rate, we reduced the scene’s original point count of 20 million points to about 2.8 million points. We also increased the size of our point particles to add more volume to the scene, in an attempt to counteract the lower point count.

Interaction

Two options arose when it came to user interaction: to work with the Quest’s controllers or to experiment with its recently released hand tracking capability, which detects the user’s hands and fingers in real-time. Our goal was to allow the user to roam around and interact with the scene solely through this hand tracking feature, using gestures for locomotion and sphere colliders and attractors attached to either hand for interaction. However, there are currently no out-of-the-box solutions for hand-based locomotion, so we were only able to successfully implement the interaction component. We therefore decided to activate the Quest’s controllers as well, modifying the official OVRPlayerController script to allow the user to “fly” within the scene by pointing in the desired direction of flight via the headset’s rotation/tilt and accelerating via the left trigger button.
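A sketch of this tilt-and-trigger flight in Unity C#, written against Unity’s generic XR input API rather than the OVRPlayerController script that was actually modified; the speed value and mapping are illustrative:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch: fly in the direction the headset is pointing, with speed scaled by
// the left trigger. Attach to the player rig; "head" is the HMD camera.
public class HeadsetFlight : MonoBehaviour
{
    public Transform head;        // HMD camera transform
    public float maxSpeed = 3f;   // metres per second (illustrative)

    void Update()
    {
        InputDevice leftHand = InputDevices.GetDeviceAtXRNode(XRNode.LeftHand);
        if (leftHand.TryGetFeatureValue(CommonUsages.trigger, out float trigger))
        {
            // Move the rig along the headset's gaze direction.
            transform.position += head.forward * (trigger * maxSpeed * Time.deltaTime);
        }
    }
}
```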
The Quest 2 does not support the use of both input methods simultaneously (hand tracking, controllers), so we decided to keep both variants, as separate modes, where one can fly around the scene using the controllers; then, by placing down the controllers, Unity automatically switches to hand tracking where the user can then interact with the scene.
Compared with the interactive elements of the Desktop (PC) variant, vegetation interaction was added to the VR variant, allowing the user to interact with the scene’s shrubs, along with the leaves of neighbouring trees.
As a final point, all three locations were originally planned to be implemented for our VR variant; however, due to performance limitations, it was decided to include only the scene containing the Erythrina Crista-Galli.

6. Evaluation

Evaluation started with the creation of a short teaser video that was advertised on social media platforms in an effort to attract volunteers. Intending to expand public reach, a Desktop executable was also created and shared with those interested (see Appendix A).

6.1. Procedure

Part of our evaluation consisted of inviting volunteers to explore our immersive experience in individual sessions at our laboratory at the Department of Informatics Engineering of the University of Coimbra. Here, both PC and VR variants were explored in an alternating fashion. We also conducted remote testing sessions, although in this case participants were limited to their own hardware and none were able to experience the VR variant.
Volunteers were asked to play with and explore the environments in both PC and VR variants (we alternated the order between participants). We provided a brief explanation of the interaction modes and allowed the participants to select their preferred options. While we did offer in-ear earphones, we also permitted our volunteers to use their own audio equipment if they preferred.
After having played the chosen variants and interaction modalities, volunteers were then asked to fill in a web-based form designed to collect quantitative evaluations (as multiple choice linear scales) and qualitative evaluations (as open questions), in an effort to acquire a better understanding of our project’s strengths and shortcomings.
A set of questions addressed demographic information and volunteers’ perspectives on nature. These consisted of asking about the age group; whether they consider themselves game enthusiasts; previous experience with VR or augmented reality (AR) prior to Kairos; if they had ever visited the Botanical Garden of the University of Coimbra; if they think that society is losing touch with nature; how important they think nature is for our overall wellbeing on a scale of 1 to 10; and finally, on a scale of 1 to 10, how much knowledge and understanding they have of the natural world.
Another set of questions was about the experience of Kairos: asking our volunteers to describe their experience with Kairos in two words; how they would rate their experience on a scale of 1 to 10; how it made them feel on a scale of 1 to 10; and an open question asking if Kairos evoked a specific emotion(s), if any, within our volunteers.
Another set of questions regarded gameplay: which Kairos variant they preferred; what modes of locomotion they used and which one they preferred; how intuitive the controls were for the PC variant on a scale of 1 to 10; how intuitive the controls were for the VR variant on a scale of 1 to 10; and an open question for comments regarding user control and locomotion.
Another set of questions regarded the resulting ambiance and appreciation for nature generated by the Kairos experience: how well they thought Kairos captured the ambiance of the Botanical Garden in a scale of 1 to 10; what area(s) they enjoyed the most; if they would have liked to explore more areas during the experience; how much the experience had improved their appreciation for the natural world on a scale of 1 to 10; if it made them feel any closer to nature on a scale of 1 to 10; and if the experience had increased their desire to explore nature in the real world on a scale of 1 to 10.
A final set of questions was of a more miscellaneous nature: what audio output was used during the experience; an open question regarding their auditory experience; whether any serious performance issues such as stutters or a low frame rate were experienced; whether any other issues were experienced during runtime; what they would add to the experience; and general comments and/or suggestions.

6.2. Results

A total of 22 volunteers (12 in our laboratory, 10 remotely) spent about 4 to 6 min exploring the environments in each variant. Because the remote participants did not have access to VR equipment, the PC variant was experienced by all (22) volunteers, while the VR variant was experienced by 12 volunteers.

6.2.1. Demographics and Perspectives on Nature

Volunteers’ ages were distributed among three age groups: youth [15–24] (59.1%), adults [25–64] (31.8%) and seniors [65+] (9.1%). Fourteen (63.6%) had visited the Botanical Garden previously at least once.
Sixty-three point six percent (63.6%) of the volunteers described themselves as video game enthusiasts, but only 40.9% reported previous experience with VR or AR (Figure 12a).
Seventy-two point seven percent (72.7%) of our volunteers think that society is losing touch with nature, while 27.3% were unsure (Figure 12c). Moreover, 95.5% of our volunteers rated nature as being “Very Important” to our overall wellbeing (Figure 12d). In regard to our volunteers’ knowledge and understanding of the natural world, we obtained a more varied response, but most considered themselves to have a high level of knowledge (Figure 12d).

6.2.2. User Experience

The most popular words used to describe Kairos were: “Peaceful/Calming/Relaxing”, followed by “Immersive”, “Colourful”, “Nice”, “Dreamy” (Figure 13a). Volunteers expressed emotions such as “Relaxed/Calm/Tranquil”, followed by “Joy”, “Wonderment/Curiosity”, “Nostalgic” and “Childlike wonder” (Figure 13b). The average rating for both the immersive experience and for how the experience made volunteers feel was 9.5 (Figure 13c).

6.2.3. Gameplay

Although our VR variant consisted of only a single area running at lower visual fidelity, 50.0% of the 12 volunteers who experienced both PC and VR preferred the VR variant, while the other 50% said they liked both variants equally. Of these, 50% preferred the Quest 2’s controllers, followed by the combination of the Quest 2’s hand tracking system and the keyboard (41.7%, see Figure 14a).
When it comes to the intuitiveness rating of our controls, the majority of volunteers rated the controls as highly intuitive (see Figure 14b). We observed, however, that some volunteers had some initial struggles before getting used to the controls for both the PC and VR variants.
Comments left regarding user locomotion highlighted two main areas: the Quest’s controllers and locomotion speed (within the Desktop variant). Some users pointed out the inability to use both hand tracking and controllers simultaneously; moreover, another comment suggested adding the ability to pan up and down using the Quest’s controllers. Several comments also pointed out the speed of locomotion, stating that it was too slow, even while holding “shift” (boost).

6.2.4. Ambiance and Appreciation for Nature

Regarding how well Kairos captured the ambiance of the three areas of the Botanical Garden, it scored an average rating of 9.36 (Figure 15a). Half of the volunteers said they preferred all three areas. Among the remaining volunteers, the greatest preference was for Area #2 (Erythrina Crista-Galli) (22.7%; see Figure 15b). Regarding the number of captured areas, 72.7% of our volunteers would have liked to explore more than three areas, whilst 27.3% stated that three were sufficient.
The majority of our volunteers rated Kairos as having improved their appreciation for the natural world (average of 8.00), made them feel closer to nature (average of 8.32), and given them an increase in desire to explore nature in the real world (average of 8.45) (Figure 15a).

6.2.5. Miscellaneous

Regarding our volunteers’ chosen audio output and auditory experience, 95.5% of our users used headphones/earphones (as recommended by Kairos’s splash-screen), allowing them to fully immerse themselves in Kairos’s ambient sound. Comments regarding audio were all positive, and most volunteers stated that it improved Kairos’s immersion factor to a large degree. While 86.4% of our volunteers experienced no serious performance issues during their experience, 9.1% reported that they had.
Some volunteers expressed their desire for more interactive elements within the scenes, along with a guide (or a minimap) to show the user where to go, as some would become lost within the areas. Moreover, smaller suggestions were also left, such as the addition of a blue sky and stars.

6.3. Discussion

Regarding demographics, although the development of the Kairos experience did not have a target group in mind, the age distribution of the volunteers was notably skewed towards younger participants, as these groups were more readily available to experiment with the gameplay experience. This was also reflected in the fact that more than half of all volunteers described themselves as video game enthusiasts. Considering the transversal themes of nature and memory, it would be interesting to study a group with a more balanced distribution of ages. We also acknowledge the limited number of participants in our study: twenty-two volunteers may not be enough for a representative account of the user experience. Expanding the number of volunteers would be an interesting direction for future work, to better understand the possible nuances of the Kairos experience. Nonetheless, some observations could still be drawn.
Although no significant variations were observed among different age groups, it was noticed that during the tests, some of the youngest volunteers made attempts to “exhaust” all potential controls and movements in an impulsive manner. Consequently, their focus was not primarily on the ambiance of Kairos itself, but on their performance within the scene. Given Kairos’ inherent nature, they managed to accomplish this quite rapidly and subsequently became disinterested in the remaining aspects of the experience. In simpler words, they approached Kairos as if it were goal-oriented, rather than engaging with it as an immersive experience in its own capacity. This behaviour is presumed to stem from the overstimulation frequently associated with the modern digital world. On the other hand, it was observed that the older generation generally had a more natural approach to Kairos, taking their time to engage with the experience thoughtfully and carefully.
The fact that the large majority of the volunteers had already visited the Botanical Garden was expected, considering its urban relevance and proximity to the university grounds. It could also be stated that, based on the results, the test group generally exhibited a positive inclination toward recognising the significance of the natural world in our lives and the necessity of enhancing our connection with it. These data validate the initial premise and overall objectives of the Kairos experience. Moreover, we can consider that the more dispersed responses towards the user’s knowledge of the natural world can also validate the relevance of the educational and exploratory aspect of the experience.
In terms of user experience, the results were very positive, especially in light of the goal to craft an immersive environment that captures the essence of the Botanical Garden. This was evident from the emergence of keywords used for classification and the elicited emotions, such as “peaceful”, “calming”, “relaxing”, and “immersive”. These emotions align with both the objectives of the Kairos environment and the authentic character of the original physical space, the Botanical Garden. Therefore, we believe that this natural ambiance transitioned effectively from reality into the point cloud-based environment.
On the aspect of the specific areas recreated, although half of the users showed no preference between the three, the rest showed a clear inclination towards Area #2 (Erythrina Crista-Galli). This can be explained by it being the only area modelled in the VR environment and by the already mentioned impact that this version’s interactivity and immersion had on the users’ experience.
One interesting conclusion regarding the overall gameplay aspects of the experience was that although the desktop variant featured all three locations and a much higher visual fidelity and point count, 50% of users still preferred the VR experience. This could be related to the increased immersiveness of VR and the novelty of the Quest’s hand tracking system, as well as the ability to directly interact with the scene’s point cloud. In our opinion, this revealed that immersion and interactiveness were at least as relevant as visual fidelity. In this regard, later tests of the desktop version of Kairos with a more powerful system than the one used in development suggested that the higher point count and increased visual fidelity of the desktop variant could also be brought into VR.
Regarding locomotion and control methods, the keyboard and mouse option presented the fewest difficulties for all. This was expected, as it was the most ubiquitous control scheme tested, similar to any first-person video game. Despite this, most users who experienced both the VR and desktop versions preferred the Quest 2 controllers for locomotion, followed by the Quest 2 hand tracking + keyboard option. This disparity could be related to the perceived slow movement in the desktop version, but may also reveal an increased relevance of the locomotion aspect and the hand interaction in the VR experience.
Considering the users’ final reported comments, the desire for more interactive elements as well as additional areas was expected. Both aspects relate to the exploratory and playful nature of the experience and would certainly increase its value. The requested map feature could be related to the disorientation caused by the multiple entry and exit points, which ultimately stemmed from the inability to run a single large environment. While such a feature would undoubtedly facilitate the exploration of the different areas, it could also act as a distraction from the originally intended, more immersive experience.

7. Conclusions

The main objective of this work was the creation of a digital timestamp of the Botanical Garden of the University of Coimbra, commemorating its 250th anniversary. This was achieved by capturing reality, and thus a moment in time, that can be run and visualised in 3D space in real-time. Three areas containing century-old tree specimens were captured and digitised, effectively preserving them in digital form. Our goal consisted of grasping the ambience within each area, together with simulating some of the natural processes usually hidden to the naked eye, in an effort to evoke feelings of childlike wonder and inquisitiveness within our users.
To achieve this, we used point clouds as the fabric of our immersive experience. By representing reality through point cloud data, our focus is drawn away from the visual elements’ mass, whilst maintaining a high level of surface detail of what was captured, allowing our mind to focus more on the general atmosphere and tone of the locale.
The long development process of our project was described in detail, as well as the evaluation process with volunteers external to our team. Kairos was then transformed into several multimedia formats in an effort to expand audience reach.
Future work on Kairos would consist of several improvements and additions suggested by volunteer evaluators that were not implemented due to the limited time-span of this project. More areas of the Botanical Garden and more interactive elements would be added, along with improvements to user interaction within the VR variant. Moreover, the unaccomplished goal of creating an in situ mixed reality (MR) variant of Kairos remains an option and could eventually become an additional variant. Furthermore, a user-position-based interactive installation variant of Kairos is in development and may come to fruition in the near future, allowing multiple users (the public) to interact with the elements of our scenes using their physical position as input.
The eventual development of higher-quality material (e.g., by increasing the global point count) is not as far-fetched as we had initially thought, meaning that higher levels of visual fidelity and thus, immersion, could well be within reach in the near future. The same can be said for the VR variant, as new, more powerful headsets with improved displays are expected to be released soon (e.g., Oculus Quest 3), further expanding the possibilities of representing real-time immersive natural environments and all of their processes.

Author Contributions

Conceptualisation, M.R., J.C.S.C. and P.M.C.; methodology, M.R., J.C.S.C. and P.M.C.; software, M.R.; validation, M.R., J.C.S.C. and P.M.C.; investigation, M.R., J.C.S.C. and P.M.C.; resources, M.R., J.C.S.C. and P.M.C.; data curation, M.R.; writing—original draft preparation, M.R.; writing—review and editing, M.R., J.C.S.C. and P.M.C.; supervision, J.C.S.C. and P.M.C. All authors have read and agreed to the published version of the manuscript.

Funding

This work was funded by national funds through the FCT—Foundation for Science and Technology, I.P., within the scope of the project CISUC—UID/CEC/00326/2020 and by European Social Fund, through the Regional Operational Program Centro 2020.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data available upon request.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A. Kairos Links and Downloads

Table A1. Links for material related to Kairos.
Kairos webpage (with links for application download): https://kairos.dei.uc.pt (accessed on 1 October 2023)
Kairos Teaser Trailer: https://youtu.be/GMu2i9w6tgI (accessed on 1 October 2023)
360° Stereoscopic 5.7 K Videos (YouTube):
Kairos Area #1 (Tilia x europaea): https://youtu.be/o7z4ty5eYHg (accessed on 1 October 2023)
Kairos Area #2 (Erythrina Crista-Galli): https://youtu.be/MgA-iq3xpDA (accessed on 1 October 2023)
Kairos Area #3 (Ficus Macrophylla): https://youtu.be/rAxDt6zYzaQ (accessed on 1 October 2023)

Figure 1. Bird’s-eye view illustration of the Botanical Garden depicting its most prominent locations. Adapted from https://www.facebook.com/media/set/?set=a.424667424214088&type=3 (accessed on 1 October 2023).
Figure 2. Diagram of the main workflow phases for the development of Kairos.
Figure 3. Colour correction, within Adobe Lightroom, of the area containing the Erythrina Crista-Galli, in an effort to bring out as much detail from the scene as possible.
Figure 4. Failed camera and object alignments in Agisoft Metashape. (a) Note the strange camera positioning at the bottom right corner; (b) Note the duplicate branches and wooden tree support poles.
Figure 5. Data refinement. (a) Raw dense cloud of the area containing the Ficus tree (344 million points); (b) Dense cloud after manual clean-up (261 million points); (c) Removal of blue sky points through Metashape’s colour selection tool; (d) Subsampling the point cloud with CloudCompare.
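For context, the subsampling step shown in (d) can be thought of as a voxel-grid reduction: bin the points into a regular grid and keep one point per cell. The sketch below illustrates the concept only; it is not CloudCompare's actual algorithm.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Conceptual sketch of spatial subsampling (cf. Figure 5d): points are binned
// into a regular voxel grid and only the first point per voxel is kept,
// trading point count for a bounded loss of surface detail.
public static class PointCloudSubsampler
{
    public static List<Vector3> VoxelSubsample(IReadOnlyList<Vector3> points, float voxelSize)
    {
        var occupied = new HashSet<Vector3Int>();
        var kept = new List<Vector3>();

        foreach (var p in points)
        {
            var cell = new Vector3Int(
                Mathf.FloorToInt(p.x / voxelSize),
                Mathf.FloorToInt(p.y / voxelSize),
                Mathf.FloorToInt(p.z / voxelSize));

            if (occupied.Add(cell)) // true only for the first point in this voxel
                kept.Add(p);
        }
        return kept;
    }
}
```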
Figure 6. Root system creation process. Left: Crista-Galli. Centre: Tilia. Right: Ficus. (a) Isolating the trees' trunks from their scenes and transforming them into meshes. Top: Tree trunk point clouds. Bottom: Tree trunk meshes. (b) Resulting root systems with their respective trunks. Top: Bottom view. Bottom: Front view.
Figure 7. Leaf system creation process. (a) Isolating the tree's leaves through manual selection and colour filtering; the pink points highlight the tree's branches before deletion. (b) Unity's default point texture; note the jagged edges. (c) Improvement of the tree's leaf point texture with a custom texture featuring a smoother colour-to-alpha transition.
Figure 8. Butterflies. (a) Pieris rapae butterflies (Muséum de Toulouse, CC BY-SA 4.0 https://creativecommons.org/licenses/by-sa/4.0 (accessed on 1 October 2023), via Wikimedia Commons). (b) Illustrative representation of the wing animation. (c) Representation of the Pieris rapae butterflies within Unity.
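The wing animation depicted in (b) can be approximated with a simple sine-wave oscillation, as in the hypothetical sketch below; the field names and values are illustrative, not the project's actual animation code.

```csharp
using UnityEngine;

// Hypothetical sketch of a simple wing-flap animation like the one depicted in
// Figure 8b: both wings oscillate in opposite phase, driven by a sine wave.
public class ButterflyWingFlap : MonoBehaviour
{
    public Transform leftWing;
    public Transform rightWing;
    public float flapSpeed = 12f;      // oscillation rate fed into the sine wave
    public float flapAmplitude = 60f;  // maximum wing angle in degrees

    void Update()
    {
        float angle = Mathf.Sin(Time.time * flapSpeed) * flapAmplitude;
        leftWing.localRotation = Quaternion.Euler(0f, 0f, angle);
        rightWing.localRotation = Quaternion.Euler(0f, 0f, -angle);
    }
}
```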
Figure 9. Crista-Galli's flowers. (a) Flower cluster of the Crista-Galli, rich in nectar. (b) Screen capture of the Crista-Galli's flower particle systems, a speculative representation of the release of pollen and nectar.
Figure 10. Top view of the Garden illustrating all three areas, together with their entry and exit points.
Figure 11. Trigger boxes for area traversal. Representation of the proximity system that controls the fade-out effect. The red points represent the proximity sensors.
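A proximity-controlled fade of the kind represented here could be driven by a distance check each frame, as in the following sketch; the shader property name "_Tint" and the distance values are assumptions rather than the project's actual parameters.

```csharp
using UnityEngine;

// Illustrative sketch of a proximity-driven fade-out like the one in Figure 11:
// the closer the player gets to an exit sensor, the more the area's points fade.
// The "_Tint" property name is an assumption and depends on the point shader.
public class ProximityFade : MonoBehaviour
{
    public Transform player;
    public Renderer areaRenderer;
    public float fadeStartDistance = 6f; // fading begins at this distance
    public float fadeEndDistance = 1f;   // fully faded at this distance

    void Update()
    {
        float d = Vector3.Distance(player.position, transform.position);

        // 1 while far away, 0 when the player reaches the sensor (clamped).
        float alpha = Mathf.InverseLerp(fadeEndDistance, fadeStartDistance, d);

        Color tint = areaRenderer.material.GetColor("_Tint");
        tint.a = alpha;
        areaRenderer.material.SetColor("_Tint", tint);
    }
}
```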
Figure 12. Results regarding demographics and perspectives on Nature. (a) Video game enthusiast? (b) Previous experience with VR or AR? (c) Society is losing touch with nature? (d) How important do you think nature is for our overall wellbeing (Avg: 9.9)? How much knowledge and understanding do you have of the natural world (Avg: 7.1)?
Figure 13. Results for the questions regarding the experience of Kairos. (a) Bubble graph of the description of their experience of Kairos, using two words. (b) Bubble graph of which emotions were evoked during gameplay. (c) How would you rate this immersive experience (Avg: 9.5)? How did this experience make you feel (Avg: 9.5)?
Figure 14. Locomotion preference and control intuitiveness rating for the two variants (regarding the 12 laboratory volunteers). (a) Locomotion preference results (volunteers could select more than one option). (b) How intuitive were the controls of the Desktop (PC) variant (Avg: 8.75)? How intuitive were the controls of the Oculus Quest 2 (VR) variant (Avg: 8.58)?
Figure 15. Results regarding ambiance and appreciation for Nature. (a) How well did this project capture the ambience of the three areas of the Botanical Garden (Avg: 9.36)? How much has this experience improved your appreciation for the natural world (Avg: 8.00)? Has this experience made you feel any closer to nature (Avg: 8.32)? Has this experience increased your desire to explore nature in the real world (Avg: 8.45)? (b) Preferred area.