1. Introduction
Using realia and involving participants in real experiences has proven to be one of the most effective methods of transferring knowledge and maintaining an audience’s engagement. Staged events, such as escape rooms [1], military maneuvers [2], physical and chemical experiments [3], behavioral games [4], etc., may profoundly affect the user. The more involved the user becomes in the experience, the stronger the intended cognitive effects are [5]. Of course, natural environments have many physical and technological limitations, one of which is the need to adhere to safety conditions. That is why VR technologies are widely used in broadly understood education [6]. The level of user involvement in the experience depends on how much the user’s mind can be deceived [7].
Natural human interaction with the real environment is multi-sensory. Therefore, the more consistent signals a person perceives, the more credible the message is. Since it is difficult to reproduce the full range of sensory experiences under VR conditions, the main principle is to stimulate the maximum number of senses with a coherent information message while minimizing the delivery of stimuli contrary to the created experience [7]. Fortunately, the human brain has developed a robust mechanism of masking stimuli that contradict or disrupt the object of concentration. Thus, reducing the contradicting stimuli along with the brain’s masking ability brings excellent results [8].
Additionally, it should be emphasized that the narrative of experience plays a huge role in cognition. Therefore, building the right mood and the context of the environment means that the power of the cognitive message is increased. Since VR is developing dynamically, the visual and acoustic message is becoming increasingly realistic. However, VR experiences are not free from shortcomings. One of the most significant issues is the collision dissonance with virtual objects.
The user can see, hear, and even feel (thanks to haptics) virtual objects. However, the whole experience loses its logic when the user, seeing themselves as an avatar, is denied collision with the shells of surrounding virtual objects [9]. This type of issue significantly weakens the commitment to the experience, as the information provided contradicts the real world [10].
Another issue is connected with free movement in VR. The user, immersed in the VR environment, is physically present in a limited real space. It is difficult to imagine that a user experiencing the pitch of a football field could freely use such a space while being in a small room. Manufacturers of VR peripheral devices provide various types of treadmills that try to mitigate this issue. Generally, VR developers strive to make the applications they build as intuitive as possible [11]. The best way to achieve this goal is to base these applications on analogies of real-world experiences.
Therefore, the natural world is often the inspiration for the design of VR environments. For example, if the VR user sees a switch placed on the wall, based on prior experience, they can easily guess that it may be a light switch. The more intuitive the VR world is, the greater the commitment. One quickly forgets that the environment is virtual when the experience is consistent with the expectations built on the real world [12]. Interestingly, mechanisms that resemble magic or supernatural forces are generally well received by users in VR. Users perceive missing collisions poorly but readily accept teleportation as a way to move around, and a magic laser pointer replaces direct collision with virtual objects.
Increasingly, VR experiences are enriched with complementary external tools, such as overload installations, low-frequency vibration generators, and installations imitating rainfall, snowfall, gusts of wind, humidity, cold, and heat [13], to enhance immersion. Recently, much has been said about the creation of aroma and taste sensations [14]. All these procedures are intended to supplement the basic audiovisual transmission of standard headsets.
This work presents the following contributions:
We introduce a VR experience imitating earthquakes, enhanced by a tilt platform. The experience consists of three events imitating a crack and a partial collapse of the ground. The events differ in the provided experiences, overload, and haptics.
The introduced application was tested on 22 subjects. The procedure includes an objective (EEG) and subjective (questionnaire) evaluation of the user’s affective state with and without haptic feedback under different realism conditions [15]. The results are compared across the specific events.
The EEG device used for collecting objective data is integrated with the headset. Thus, there is no need for additional external measuring or data-collecting equipment, which could disturb the VR experience.
The rest of the paper is organized as follows: Section 2 reviews the related work in the field of deepening VR immersion. In Section 3, the details of the proposed solution are described, followed by the experimental results and discussion (Section 4). Finally, our conclusions are presented in Section 5.
2. Related Works
Haptic technology is gaining widespread acceptance as a crucial part of VR systems, enhancing visual-only interfaces with the sense of touch. Systems are being developed that use haptic interfaces to deepen immersion in VR serious games or training applications and thereby improve their efficacy. Several companies are working on full-body or torso haptic vests as well as haptic suits designed to let the user feel external stimuli with their whole body. In this section, we discuss the essential haptic systems, with particular attention to vibration and motion-simulating platforms.
2.1. Haptics for Virtual Reality
According to [16], haptics in VR can be classified as passive or active, tactile or proprioceptive force, and self-grounded or world-grounded. Active haptics is the most popular form, as it is dynamically controlled by a computer and can provide various sensations of simulated virtual objects.
In contrast, passive haptics is based on matching a real physical object with the shape of a virtual one, so the possibilities are limited to just one option. Tactile haptics simulates a sense of touch through the skin, whereas proprioceptive force haptics gives a sense of limb movement and muscular resistance. Within proprioceptive force haptics, two types are distinguished: self-grounded (worn and moving with the user) and world-grounded (attached to the real world and stationary).
VR haptics technology has been a subject of rapid development. Initially, haptics technology was only applied to controllers; however, currently the market and opportunities have significantly expanded. Virtual reality may be experienced with diverse haptics devices, including gloves, shoes, whole suits or motion platforms, and treadmills.
Some haptics systems have been successfully implemented in medical and dental training. Rhienmora et al. [17] used Phantom devices (Sensable Technologies Inc., Woburn, MA, USA) to simulate operations such as the pushing, pulling, and cutting of soft or hard tissue with realistic force feedback. In [18], the controllers provided vibrations of different intensities, creating the feeling of real tools. After the testing phase of the VR simulator, the authors stated that the system could be used in the educational process at the early stage of acquiring skills.
Petrenko et al. [19] presented a review of several haptic gloves, among them ExoTen-Glove, RML Glove, CyberGrasp, DEXMO, The Rutgers Master II, and Wolverine. Another study, conducted by Almeida et al. [20], compared haptic gloves to standard (typically button-based) VR hand controllers. Interestingly, the cyber-glove gave a higher sense of immersion and embodiment in the VR environment. Wrist-based haptic feedback has also been shown to substantially improve virtual hand-based interactions in VR: in [21], Tasbi, a compact bracelet device, was used to render complex multisensory squeeze and vibrotactile feedback. Additionally, haptic shoes are available on the market; these provide the sense of walking on diverse types of ground, for instance Taclim [22] and DropLabs VR shoes [23].
Haptic garments are among the most important devices supporting deep feeling in VR environments. One example is TactaVest, a customized vest that introduced vibration only on the upper body [24]. Research carried out in [25] showed that haptic vests (such as the bHaptics TactSuit) may improve users’ emotional engagement in the VR experience. Moreover, the sense of touch is not the only modality included in haptic devices: TEGway ThermoReal allows users to feel temperature while wearing gloves and sleeves [26]. Full-body haptic feedback systems are represented by, for example, TESLASUIT [27].
2.2. Haptic Platforms
VR is an ideal solution for entertainment. Thus, most haptic platforms (e.g., the Funin standing VR simulator or the VRUMBLER VR vibration platform) serve as peripheral devices enhancing the VR experience by making it more realistic. For example, in [28] the authors present a six-DoF motion platform that reproduces a natural sliding movement in a virtual environment, making it possible to simulate ski sports regardless of location and weather conditions. However, the possible applications are not limited to sheer entertainment; they often include training or educational purposes.
For example, WoaH [29] is a virtual reality work-at-height simulator that aims to mitigate the stress experienced at high altitudes. The simulator comprises a real ladder synchronized in position with a virtual one placed 11 m above the ground in a virtual environment. The creators conducted a user study evaluating cybersickness, perceived realism, and anxiety through subjective (questionnaires) and objective (electrodermal activity) measurements. The results indicated that WoaH generated anxiety as expected and was perceived as realistic. Adding vibrations had a significant impact on the perceived realism but not on the electrodermal activity.
Interesting research is presented in [30], where the authors investigated the effects of integrated visual, auditory, and vibrotactile stimuli. The participants were presented with three types of stimuli: visual with audio (the footsteps of an approaching virtual human), visual with vibration of the footsteps, and visual with both the vibration and the sound of the footsteps.
Their research, based on head trajectory and skin conductance response, showed that additional stimuli in the form of vibration enhanced the perception of social presence. Similarly, in [31], the authors investigated the potential of haptics, testing users performing cognitive tasks in VR. They examined the effects of different multi-sensory feedback combinations: visual, audio, tactile (floor vibration), and smell. The results showed that a multi-sensory VR system was superior to typical VR systems (vision and audio) in terms of the sense of presence and user preference.
Only a few VR applications allow users to walk and simultaneously receive haptic feedback from the terrain. For example, in [32], the authors presented a pair of haptic shoes designed to create realistic sensations of ground surface deformation through magnetorheological (MR) fluid actuators. Their solution operates on the basis of physical interaction between the shoes and the ground surfaces while walking in a VR environment. The technology simulates a variety of surface conditions, such as snow, mud, and dry sand, as the fluid changes its viscosity when the actuators are pressed by the user’s foot. A similar solution was presented in [33].
Li et al. [34] proposed a VR scenario with floor vibrations for walking on sand, gravel, wood, and concrete. They tested three conditions: (1) walking sound and matched vibrations, (2) walking sound with mismatched vibrations, and (3) sound without any vibrations. In addition, the authors investigated whole-body tactile feedback and developed a relatively multi-purpose platform for producing tactile stimuli.
Often, haptic platforms are added to systems simulating geological phenomena or emergency situations. For example, in [35], the authors provided a serious VR game that aims to prepare users for an emergency earthquake situation in a public building in New Zealand. The realism of the proposed solution was enhanced by the physical shaking of a vibrating platform on which users were seated during the training. The prototype was tested by 170 volunteers working at or visiting the Auckland City Hospital. The analysis indicates the importance of vibrating peripherals in designing VR for earthquake emergencies.
In [36], the authors presented a similar VR system serving as construction (bridge, road, and tunnel works) safety training. The architecture of this system includes the VR application and the vibration platform. Another VR study aimed at providing the effect of exposure to stressful events [37]. The participants were immersed in a 3D multisensory environment using a Virtually Better VR system. While sitting on the vibration platform, they could feel continuous vibrations connected with the sound of a car engine and sudden vibrations linked to the sound of an explosion. The authors investigated how the VR challenge with stressful elements influenced the cortisol levels and heart rates of the testers. Interestingly, they found that participants’ previous real experiences could significantly affect their level of immersion in virtual reality.
An interesting project was presented in [38], where the authors examined the use of a white cane controller that enabled navigation of large virtual environments with complex architecture while devoid of the sense of sight, simulating blindness. The cane controller employs a lightweight three-axis brake mechanism to convey the large-scale shape of virtual objects. Additionally, surface textures are rendered with a voice coil actuator based on contact vibrations. Specialized audio is determined based on the progression of sound through the geometry around the user. Validation tests showed that seven out of eight users could successfully navigate the virtual room and locate targets while avoiding collisions.
In [39], the authors presented haptic floor plates providing postural perturbations according to collisions with obstacles in the virtual environment. Their VR-supported balance training with the haptic floor was carried out with healthy individuals and tested with a single subject who had suffered a stroke. The preliminary results demonstrated good dynamics of the haptic floor and great acceptability among the participating subjects.
Recently, Microsoft indicated a new direction for the haptic VR ecosystem by filing a patent for a virtual reality floor mat activity region [40]. This corporate giant plans to create a mat with built-in tags that the VR headset will scan to identify and configure a safe zone, avoiding collisions with surrounding objects. Additionally, the company claims that the mat surface will be able to provide tactile feedback through vibration.
A similar approach, but for a large-scale immersive space, was presented in [41]. Such solutions, based on peripheral devices that complement traditional VR experiences with a concurrent impact on several other senses, will undoubtedly become the norm in the landscape of VR applications, because they significantly increase the impressions and immersion, as confirmed by numerous studies.
3. Methods
This project aims to provide the SIM-Motion platform simulator to put the user in a natural situation within a VR environment. Thus, the purpose is to study human behavior in an immersive context, particularly during an earthquake scenario. This section presents the platform specifications in detail and the related VR application as well as its usability validation.
3.1. SIM-Motion Platform Design
The vibrating tilt platform (see Figure 1) was designed for multi-sensory stimulation of the VR experience user. By design, the platform assumes passive and static user participation, which means the user is expected to stand still. Therefore, the plate has marked areas for the user to place their right and left foot. Additionally, the research procedure includes providing all information and guidance on the use of the platform. The device offers two basic sensory functionalities.
The first functionality is a two-axis ±10° tilt combined with the possibility of shifting the plate by ±50 mm along the vertical axis, controlled by a set of three electric actuators. Supporting the plate at three points ensures that its position is uniquely determined, which eliminates the need for software control over the mutual positioning of individual actuators.
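For illustration, the mapping from a commanded tilt and heave to the three actuator extensions can be sketched as follows. This is a minimal sketch: the actuator coordinates, names, and sign conventions are illustrative assumptions, not the platform’s actual geometry; only the ±10° and ±50 mm limits come from the text.

```python
import math

# Hypothetical actuator mounting points (x, y) in metres under the plate;
# an equilateral layout is assumed here, not taken from the paper.
ACTUATORS = {
    "front": (0.00, 0.35),
    "rear_left": (-0.30, -0.20),
    "rear_right": (0.30, -0.20),
}

MAX_TILT_DEG = 10.0   # +/-10 degrees, as specified for the platform
MAX_HEAVE_MM = 50.0   # +/-50 mm vertical travel


def actuator_extensions(tilt_x_deg: float, tilt_y_deg: float, heave_mm: float) -> dict:
    """Convert a commanded (tilt_x, tilt_y, heave) pose into per-actuator extensions in mm.

    Because the plate rests on exactly three supports, the plane through the three
    actuator tips is fully determined; each extension is simply the height of that
    plane above the actuator's (x, y) position.
    """
    if abs(tilt_x_deg) > MAX_TILT_DEG or abs(tilt_y_deg) > MAX_TILT_DEG:
        raise ValueError("tilt command exceeds the +/-10 degree envelope")
    if abs(heave_mm) > MAX_HEAVE_MM:
        raise ValueError("heave command exceeds the +/-50 mm envelope")

    # Plane z = heave + x*tan(tilt_y) + y*tan(tilt_x), with z in mm and x, y in m.
    tx = math.tan(math.radians(tilt_x_deg))
    ty = math.tan(math.radians(tilt_y_deg))
    return {
        name: heave_mm + 1000.0 * (x * ty + y * tx)
        for name, (x, y) in ACTUATORS.items()
    }


if __name__ == "__main__":
    # Example: 5 degrees of tilt about the X axis, no roll or heave.
    print(actuator_extensions(tilt_x_deg=5.0, tilt_y_deg=0.0, heave_mm=0.0))
```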
Actuators with a static linear overload capacity of 1.2 kN were used to build the platform; therefore, the upper weight limit for a test user was set at 100 kg. The mechanical systems were protected against overloading by three tensometers mounted at the base of the actuators. Exceeding a load of 1.1 kN on any of the sensors activates the emergency mode and immediately terminates the test. Additionally, simple detection of user “imbalance” was implemented, which also forces the test to stop.
The detection logic triggers when the load on any of the actuators changes by more than 10% in less than 200 ms. These parameters were adjusted experimentally with two subjects weighing 48 and 92 kg. To eliminate troublesome linear position detectors, we implemented a system RESET procedure. This procedure, using the signals of the actuators’ built-in limit switches, sets the platform to a standardized initial position.
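A minimal sketch of this overload and imbalance detection logic is given below. The polling interface (`read_loads`, `trigger_breakdown`) is hypothetical; only the 1.1 kN, 10%, and 200 ms thresholds come from the text, and the monitor is assumed to be polled faster than the 200 ms window.

```python
import time

EMERGENCY_LOAD_N = 1100.0   # 1.1 kN per-sensor limit from the platform specification
IMBALANCE_RATIO = 0.10      # a load change of more than 10% ...
IMBALANCE_WINDOW_S = 0.200  # ... within 200 ms is treated as loss of balance


class SafetyMonitor:
    """Polls the three tensometers and decides when to raise the emergency stop.

    `read_loads()` is assumed to return a dict of actuator name -> load in newtons;
    `trigger_breakdown(reason)` stands in for the platform's BREAKDOWN procedure.
    """

    def __init__(self, read_loads, trigger_breakdown):
        self.read_loads = read_loads
        self.trigger_breakdown = trigger_breakdown
        self.history = {}  # actuator name -> (timestamp, load) from the previous poll

    def check_once(self) -> None:
        now = time.monotonic()
        for name, load in self.read_loads().items():
            # Hard overload: any sensor above 1.1 kN stops the test immediately.
            if load > EMERGENCY_LOAD_N:
                self.trigger_breakdown(f"overload on {name}: {load:.0f} N")
                return
            last = self.history.get(name)
            if last is not None:
                t0, l0 = last
                # Imbalance: >10% relative load change within the 200 ms window.
                if now - t0 <= IMBALANCE_WINDOW_S and l0 > 0:
                    if abs(load - l0) / l0 > IMBALANCE_RATIO:
                        self.trigger_breakdown(f"imbalance detected on {name}")
                        return
            self.history[name] = (now, load)
```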
The second functionality of the vibrating tilt platform is to induce vibrations and shocks. For this purpose, we used a set of four small vibration motors to create the sensation of “tingling” of the ground. The motors were placed in the extreme corners of the plate and can be controlled individually, which allows the implementation of spatial effects; for example, it is possible to create the effect of vibrations moving from the left to the right side of the plate. The “shaking” effect was realized using a centrally mounted 40 W industrial electric external vibrator powered by an electronically controlled inverter.
A Raspberry Pi 4 acts as the central controller of all platform components. It manages the system using a set of predefined scenarios. To define the scenarios, we used a list of records, in standard JSON format, representing the logical states of all the control relays and the duration of each state. Remote control of the platform was implemented using web services.
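The scenario format can be illustrated as follows. The relay names, JSON keys, and durations are assumptions (the paper only specifies a JSON list of relay states with durations); the first two records also sketch how the corner motors could be sequenced to produce the left-to-right vibration effect mentioned above.

```python
import json
import time

# Illustrative scenario: each record holds the logical state of every control
# relay and how long (in ms) that state is maintained. Names are hypothetical.
EARTHQUAKE_SCENARIO = [
    {"duration_ms": 500,  "relays": {"vib_fl": 1, "vib_rl": 1, "vib_fr": 0, "vib_rr": 0, "shaker": 0}},
    {"duration_ms": 500,  "relays": {"vib_fl": 0, "vib_rl": 0, "vib_fr": 1, "vib_rr": 1, "shaker": 0}},
    {"duration_ms": 3000, "relays": {"vib_fl": 1, "vib_rl": 1, "vib_fr": 1, "vib_rr": 1, "shaker": 1}},
    {"duration_ms": 0,    "relays": {"vib_fl": 0, "vib_rl": 0, "vib_fr": 0, "vib_rr": 0, "shaker": 0}},
]


def run_scenario(scenario, set_relay):
    """Step through the scenario, applying each relay state for its duration.

    `set_relay(name, state)` is a placeholder for the controller's GPIO/relay layer.
    """
    for record in scenario:
        for name, state in record["relays"].items():
            set_relay(name, state)
        time.sleep(record["duration_ms"] / 1000.0)


if __name__ == "__main__":
    print(json.dumps(EARTHQUAKE_SCENARIO, indent=2))
```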
The prepared API provides several procedures: RESET(), which sets the platform actuators to a defined starting position and disables all haptic effects, and is executed at system startup; BREAKDOWN(), which enables the emergency state of absolute immobilization of the platform and complete exit from the VR experiment, and is performed when emergency conditions are detected (loss of balance, loss of communication with the platform, or manual enforcement by the system supervisor); TASK(taskName), which adds the predefined task to the queue; and STOP(), which immediately interrupts the execution of the current task and resets the queue of scheduled tasks, and is always invoked after the end of the earthquake effect simulation cycle with an absolute stop of the actuators and haptic devices.
The data exchange server uses the Flask micro-framework, which provides all information from the platform via the POST and GET data transfer methods. The functional diagram of the mechatronic tilt platform is shown in Figure 2.
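A Flask-based sketch of such a service is shown below. The endpoint paths, payload shapes, and the `/state` query are illustrative assumptions; the paper only names the RESET, BREAKDOWN, TASK, and STOP procedures and the use of GET/POST, and the actuator and relay control itself is stubbed out here.

```python
from queue import Queue

from flask import Flask, jsonify, request

app = Flask(__name__)
task_queue: "Queue[str]" = Queue()


@app.route("/reset", methods=["POST"])
def reset():
    # Drive the actuators to the standardized initial position using the
    # built-in limit switches and disable all haptic effects (stubbed here).
    return jsonify(status="reset started")


@app.route("/breakdown", methods=["POST"])
def breakdown():
    # Emergency state: immobilize the platform and abort the VR experiment.
    with task_queue.mutex:
        task_queue.queue.clear()
    return jsonify(status="emergency stop")


@app.route("/task", methods=["POST"])
def task():
    # Queue a predefined scenario by name, e.g. {"taskName": "earthquake_C"}.
    name = request.get_json(force=True)["taskName"]
    task_queue.put(name)
    return jsonify(status="queued", taskName=name)


@app.route("/stop", methods=["POST"])
def stop():
    # Interrupt the current task and clear the queue of scheduled tasks.
    with task_queue.mutex:
        task_queue.queue.clear()
    return jsonify(status="stopped")


@app.route("/state", methods=["GET"])
def state():
    # Report basic platform state to the VR application over GET.
    return jsonify(pending_tasks=task_queue.qsize())


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```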
Additionally, the safety module includes a safety button that the user can press at any time to stop the experiment. For safety, we used a desktop power supply with short-circuit protection.
3.2. VR Application
The virtual environment was developed using the Unity game engine and the C# programming language in the form of three events imitating the fracture and partial collapse of the visualized ground during an earthquake (see Figure 3). The presented events differ in the delivered overload and haptic experiences. The first experience consists of audiovisual impressions only. The second is enriched with the overload effects of the moving platform. The third additionally provides haptic effects imitating ground vibrations. The order of the presented events is randomized. The target platform for the tool is HTC Vive; however, it can easily be ported to other VR platforms.
The earthquake effect is visualized by the ground cracking and a local sinkhole forming. The visual effect lasts about 4 s. The tested VR event cycles are initiated from an external control console operated by the personnel. The decision to initiate the test processes is based on observations of the user’s behavior and objective stabilization of the monitored EEG signals. Such a procedure was necessary due to the users’ highly variable adaptation time to the appropriate measurement conditions.
Individuals who had not experienced VR before took up to 3 min longer to stabilize their behavior (by looking all around the area) than participants with previous VR experience. The measurement cycle contains five steps. The first step, recording reference signals in the VR environment, lasts 10 s. Then, three measurement cycles with earthquake visualization, enhanced by optional haptic effects, are generated.
The measurement ends with a 10-second recording of the subject’s suppressed emotions. The application caches all monitored data in RAM during the measurement and saves the raw data in CSV format. Data from the Looxid EEG device are recorded using the manufacturer’s standard SDK. The VR application is a proprietary implementation that controls the external platform using a network interface.
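The session flow described above could be driven by a client along the following lines. This is only a sketch: the platform address, task names, per-event recording duration, and the `read_eeg_sample` stub (standing in for the Looxid SDK stream) are assumptions, while the 10-second reference and closing recordings, the three randomized events, and the RAM-cached CSV output follow the description.

```python
import csv
import time

import requests

PLATFORM_URL = "http://platform.local:5000"   # hypothetical address of the platform API

# Scenario labels follow the paper: B = audiovisual only, C = + vibrations,
# D = + vibrations and platform tilt; the mapping to platform task names is assumed.
SCENARIOS = {"B": None, "C": "vibration", "D": "vibration_tilt"}


def run_measurement_cycle(read_eeg_sample, order, outfile="session.csv"):
    """Drive one session: 10 s reference, three earthquake events, 10 s closing recording.

    `read_eeg_sample()` is assumed to return (timestamp, attention, relaxation);
    samples are cached in memory and written to CSV at the end.
    """
    cache = []

    def record(label, seconds):
        t_end = time.monotonic() + seconds
        while time.monotonic() < t_end:
            cache.append((label, *read_eeg_sample()))

    record("reference", 10)
    for scenario in order:                      # `order` is randomized per user
        task_name = SCENARIOS[scenario]
        if task_name is not None:
            requests.post(f"{PLATFORM_URL}/task", json={"taskName": task_name}, timeout=1)
        record(scenario, 4)                     # the visual earthquake effect lasts about 4 s
    record("closing", 10)

    with open(outfile, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["label", "timestamp", "attention", "relaxation"])
        writer.writerows(cache)
```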
There is no reason to suspect that the recorded EEG signals contain any hidden bias. To be sure, additional control tests were conducted, consisting of recording EEG signals from a sensor mounted on a person not participating in the VR experiment. No correlations between these signals and the phases of the VR experiment were found.
3.3. Compliance with Ethical Standards
All objectives of the experiment were achieved in accordance with the ethical standards of the Research Ethics Committee of the Lodz University of Technology. Participants were informed (1) about the purpose of the study, (2) that they had the right to stop the experiment at any time without providing any reason, and (3) that they could stop the experiment if they felt sick or experienced any discomfort. All training sessions were performed under the supervision of a researcher in case of an emergency. All participants signed participant information sheets and consent forms before undertaking the VR or video training.
3.4. Assessment Procedure
The platform was evaluated using objective (in the form of brain activity acquired with an EEG sensor) and subjective (questionnaire) features. The evaluation procedure is described in the following section.
3.4.1. Electroencephalography
EEG is a method of monitoring the electrical activity of the brain. The relationship between brain activity and emotions has been repeatedly demonstrated. Two main areas of the brain are correlated with emotional states, namely the amygdala and the frontal lobe. For example, research has found that the amygdala is a biological basis of emotions involved in storing fear and anxiety. Additionally, studies have shown that the frontal scalp region exhibits more emotional activation than other brain regions, such as the temporal, parietal, and occipital lobes [42]. However, an EEG headset may be inconvenient, since electrodes are placed along the scalp, especially when wearing a VR headset. Thus, for the purpose of this project, we used the Looxid Link device to perceive, evaluate, and monitor the affective state of the user while testing the VR app [43].
Looxid Link is an EEG system (see Figure 4) equipped with gold-plated sensors to detect brainwave signals from the prefrontal area. Signals arising from brain activity are streamed to the computer at 500 samples per second. Using Looxid Link, it is possible to monitor electroencephalographic signals and consequently identify the user’s affective indices, such as attention, relaxation, and brain balance, as well as the fundamental brainwave bands, including delta, theta, alpha, beta, and gamma, every 100 ms. The signals and the VR content that the user is experiencing can be synchronized on a time basis. Thus, it is easy to connect a specific emotional state with a particular event in the VR experience.
3.4.2. Evaluation Questionnaire
We applied two types of questionnaires in the study. The first one concerned the user’s mood before and after the VR session (pre-test and post-test). The second questionnaire was aimed at evaluating usability.
The pre- and post-tests contained the same questions. Each user evaluated their mood and feelings in six categories: subjectively positive or negative, tense or relaxed, and sedated or stimulated.
The questionnaire provided after the VR experience consisted of primary data about the user (age, sex, and previous experience with VR) and several questions about the feelings and level of immersion during each VR scenario. The users evaluated the three scenarios separately using a 5-level rating scale. The questions were as follows:
Question 1: How deep was your immersion in each scenario?
Question 2: How realistic was the earthquake?
Question 3: How stable did you feel on the platform?
Additionally, we evaluated the impact of four elements (graphics, sound, vibrations, and platform tilt) on the user’s immersion using a five-point rating scale—from no impact to very high impact.
4. Experimental Procedure and Results
A total of 22 volunteers participated in the study. First, the testers were informed about the experimental procedure. Next, they were familiarized with the VR equipment (headset), the additional sensors (EEG), and the platform. Prior to commencing the VR experience, users filled in the pre-test containing the subjective assessment of their mood and psycho-physical state. The VR session consisted of the three scenarios described in Section 3.2. Immediately after the end of the VR session, the users were asked to complete the post-test, which allowed them to report their subjective psycho-physical state and assess the usability aspects of the conducted VR experience.
The characteristics of the participants were as follows: the participants were, on average, 28.4 years old (median = 23, std = 8.8, range = 21–48), and the group comprised twelve men (mean = 25.0, median = 23.0, std = 5.92, range = 21–38) and ten women (mean = 32.4, median = 23.0, std = 9.83, range = 21–48). The majority of participants were familiar with VR technology and had tried it at least a few times before the session. For the remaining users, the described VR experience was their first contact with virtual reality.
The results of the mood assessment showed that the psycho-physical state (evaluated subjectively by the experiment participants) changed only slightly after immersion in the VR earthquake. In general, the changes were small, around 1 point on the scale. Some volunteers declared being more energetic and stimulated after the VR experience. Interestingly, some evaluated their mood as more positive in the post-test, even though a real earthquake experience is rather unpleasant. The participants usually did not feel more tense or relaxed.
The usability questionnaire revealed that the deepest presence, defined as being “pulled in” to the virtual world, was achieved in the scenario with the additional platform tilt (average = 3.5). The scenario with vibrations (scenario C) was evaluated similarly to the one without them (scenario B); the average answers to Question 1 were 2.6 and 2.8, respectively. The virtual earthquake was assessed as moderately realistic; however, the earthquake with platform tilt was the most realistic of all the tested experiences (average = 3.5). As assumed, the users’ feeling of stability changed with the introduction of platform vibrations and tilt.
In the basic scenario (scenario A), the participants perceived the platform as stable (average = 3.9). The additional effects led to a decrease in stability. A summary of the three questions is presented in Figure 5. According to the users, platform tilt may influence the immersion to the greatest extent; the impact of this element was evaluated as high (average = 4.5). The other components, such as graphics, sound, and vibrations, had a lower effect on the immersion and were rated as having moderate impact: graphics, average = 3.6; sound, average = 3.1; vibrations, average = 3.5.
Electroencephalography Analysis
Figure 6 presents an example of the brainwave recordings interpreted by Looxid Link as two states (attention and relaxation) for a 33-year-old woman during the session, divided into separate scenarios.
The beginning of the experience (the first ten seconds) shows a sudden increase in attention (to the maximum value). This is likely related to heightened curiosity about the VR world and deep immersion (interest in the new surroundings and natural curiosity; the user was looking around, as observed from the Y-axis head rotation). The first scenario (B, audiovisual impressions only) is characterized by a high state of attention, slowly descending to a level of approximately 0.2, while relaxation increases to almost 0.8. An equalized, fairly low level of both indices occurs at the beginning of the C scenario (vibration-enhanced).
The change happens suddenly in the middle of the scenario, when the level of attention rises to its maximum. Then, it drops to a minimum with a concomitant increase in the level of relaxation. A similar situation is observed for the D scenario (vibration with platform tilt); however, the increase in the level of attention is not as significant as in the previous one. A decrease in attention and an increase in relaxation are observed when the haptic stimuli are turned off. The higher level of attention during the C scenario compared to D (despite the increased haptics) may be caused by the user becoming accustomed to the haptics itself, which at first is highly surprising and unexpected.
Figure 7 presents the level of attention averaged over all participants during the whole session divided into three (B, C, and D) separate scenarios.
The highest level of attention (0.67) and the highest standard deviation (0.34) were observed for the C-scenario (vibration-enhanced). Thus, the results are inconsistent with the subjective assessment, in which scenario D had the most significant impact on the users. On the other hand, the lowest level of attention (0.29) was observed for the B-scenario (audiovisual impressions only).
Figure 8 presents the level of relaxation averaged over all participants during the whole session, divided into three (B, C, and D) separate scenarios.
The highest level of relaxation (0.64), with a low standard deviation (0.19), was observed for the B-scenario (audiovisual impressions only). On the other hand, the lowest level of relaxation (0.25), with the lowest standard deviation (0.17), was observed for the D-scenario (vibration with platform tilt).
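The per-scenario aggregation behind Figures 7 and 8 can be reproduced along the following lines; the directory layout and column names are assumptions, and only the B/C/D scenario labels come from the text.

```python
from pathlib import Path

import pandas as pd

# Collect the per-participant session logs (assumed location and columns).
frames = []
for path in Path("sessions").glob("participant_*.csv"):
    df = pd.read_csv(path)                  # assumed columns: label, timestamp, attention, relaxation
    df["participant"] = path.stem
    frames.append(df)

data = pd.concat(frames, ignore_index=True)
data = data[data["label"].isin(["B", "C", "D"])]

# Average within each participant and scenario first, then across participants,
# so that longer recordings do not dominate the group statistics.
per_participant = data.groupby(["participant", "label"])[["attention", "relaxation"]].mean()
summary = per_participant.groupby("label").agg(["mean", "std"])
print(summary.round(2))
```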
As one can see in Figure 9, the average attention does not show any correlation with the enhancement of the immersion level. However, the remaining analyzed parameters show a clear correlation. Generally, in the quantitative (EEG) results, relaxation decreases as the number of stimuli increases. Clear correlations can also be observed in the qualitative (questionnaire) results.
5. Conclusions
In this paper, we presented a vibrating tilt platform that enhanced the realism of a VR experience simulating earthquakes. This platform can be easily adapted to many other applications, such as rehabilitation for motor disabilities or phobias, rescue environments, and educational issues. Both objective (EEG) and subjective (questionnaire) data indicate that the haptic sensations induced by the SIM-Motion platform increase the user’s level of immersion in the proposed VR environment.
It is worth noting that both the objective and subjective levels of immersion were significantly increased, even though the full spectrum of available haptics was applied for only a few seconds. This demonstrates the considerable potential of the presented method. The conclusions drawn from the tests encourage us to implement multisensory interactions in the VR applications we develop. We expect that this will result in a significant increase in the quality of experience in virtual and augmented reality. Additionally, our research, like many previous studies, confirms that participating in a VR experience results in a subjective mood increase in the user, and we contend that VR immersion can be used as a tool for alleviating the effects of mood (affective) disorders.
Our future work will focus on improving stability, which was lower in the third scenario. We will also concentrate on algorithms to validate the platform movement and integrate visual feedback for the haptic sensations by adding appropriate sensors to the platform. Additionally, we noticed that, to improve the sense of security, we should equip the platform with a barrier that will enable the experiment participant to steady themselves if they feel a loss of balance. We will also consider changing the vibration motors to more efficient ones to obtain a more pronounced ground vibration effect.
Author Contributions
Conceptualization, D.K. and G.Z.; methodology, D.K. and G.Z.; software, G.Z. and Ł.A.; validation, D.K., G.Z. and A.L.-L.; formal analysis, D.K., G.Z. and A.L.-L.; investigation, D.K., G.Z. and A.L.-L.; resources, D.K., G.Z. and A.L.-L.; data curation, D.K., G.Z. and A.L.-L.; writing—original draft preparation, D.K., G.Z. and A.L.-L.; writing—review and editing, D.K.; visualization, G.Z.; supervision, D.K.; project administration, D.K.; funding acquisition, D.K. All authors have read and agreed to the published version of the manuscript.
Funding
This research received no external funding.
Institutional Review Board Statement
All objectives of the experiment were achieved in accordance with the ethical standards of the Polish National Science Center. Participants were informed (1) about the purpose of the study, (2) that they had the right to stop the experiment at any time without providing any reason, and (3) that they could stop the experiment if they felt sick or experienced any discomfort. All training sessions were performed under the supervision of a researcher in case of an emergency. All participants signed an informed consent form and a participant information sheet before undertaking the VR training. Additionally, we applied to the Commission on the Ethics of Scientific Research of the Lodz University of Technology and received a positive decision (#2/2021).
Informed Consent Statement
Informed consent was obtained from all subjects involved in the study.
Acknowledgments
This work was supported in part by a grant from The National Centre for Research and Development (Accessible Lodz University of Technology/POWR.03.05.00-00-A039/20).
Conflicts of Interest
The authors declare no conflict of interest.
References
- Kinio, A.E.; Dufresne, L.; Brandys, T.; Jetty, P. Break out of the classroom: The use of escape rooms as an alternative teaching strategy in surgical education. J. Surg. Educ. 2019, 76, 134–139.
- Aghion, P.; Jaravel, X.; Persson, T.; Rouzet, D. Education and military rivalry. J. Eur. Econ. Assoc. 2019, 17, 376–412.
- De Jong, T.; Linn, M.C.; Zacharia, Z.C. Physical and virtual laboratories in science and engineering education. Science 2013, 340, 305–308.
- Keys, B.; Wolfe, J. The role of management games and simulations in education and research. J. Manag. 1990, 16, 307–336.
- Mair, A.; Poirier, M.; Conway, M.A. Memory for staged events: Supporting older and younger adults’ memory with SenseCam. Q. J. Exp. Psychol. 2019, 72, 717–728.
- Hussein, M.; Nätterdal, C. The Benefits of Virtual Reality in Education—A Comparison Study; University of Goteborg: Goteborg, Sweden, 2015.
- Brown, E.; Cairns, P. A grounded investigation of game immersion. In Proceedings of the CHI’04 Extended Abstracts on Human Factors in Computing Systems, Yokohama, Japan, 8–13 May 2004; pp. 1297–1300.
- Baudrillard, J. The Ecstasy of Communication; The MIT Press: Cambridge, MA, USA, 1983.
- Sohre, N.; Mackin, C.; Interrante, V.; Guy, S.J. Evaluating collision avoidance effects on discomfort in virtual environments. In Proceedings of the 2017 IEEE Virtual Humans and Crowds for Immersive Environments (VHCIE), Los Angeles, CA, USA, 19 March 2017; pp. 1–5.
- Cooper, N.; Milella, F.; Pinto, C.; Cant, I.; White, M.; Meyer, G. The effects of substitute multisensory feedback on task performance and the sense of presence in a virtual reality environment. PLoS ONE 2018, 13, e0191846.
- Cherni, H.; Nicolas, S.; Métayer, N. Using virtual reality treadmill as a locomotion technique in a navigation task: Impact on user experience—case of the KatWalk. Int. J. Virtual Real. 2021, 21, 1–14.
- Pan, X.; Hamilton, A.F.d.C. Why and how to use virtual reality to study human social interaction: The challenges of exploring a new research landscape. Br. J. Psychol. 2018, 109, 395–417.
- Rietzler, M.; Plaumann, K.; Kränzle, T.; Erath, M.; Stahl, A.; Rukzio, E. VaiR: Simulating 3D airflows in virtual reality. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, Denver, CO, USA, 6–11 May 2017; pp. 5669–5677.
- Flavián, C.; Ibáñez-Sánchez, S.; Orús, C. The influence of scent on virtual reality experiences: The role of aroma-content congruence. J. Bus. Res. 2021, 123, 289–301.
- Batmaz, A.U.; de Mathelin, M.; Dresp-Langley, B. Seeing virtual while acting real: Visual display and strategy effects on the time and precision of eye-hand coordination. PLoS ONE 2017, 12, e0183789.
- Jerald, J. The VR Book: Human-Centered Design for Virtual Reality; Morgan & Claypool: Williston, ND, USA, 2015.
- Rhienmora, P.; Haddawy, P.; Dailey, M.; Khanal, P.; Suebnukarn, S. Development of a dental skills training simulator using virtual reality and haptic device. Nectec Tech. J. 2008, 8, 140–147.
- Dyulicheva, Y.; Gaponov, D.; Mladenović, R.; Kosova, Y. The virtual reality simulator development for dental students training: A pilot study. In Proceedings of the CEUR Workshop Proceedings, Braunschweig, Germany, 22–26 February 2021.
- Petrenko, V.; Tebueva, F.; Antonov, V.; Apurin, A.; Zavolokina, U. Development of haptic gloves with vibration feedback as a tool for manipulation in virtual reality based on bend sensors and absolute orientation sensors. IOP Conf. Ser. Mater. Sci. Eng. 2020, 873, 012025.
- Almeida, L.; Lopes, E.; Yalçinkaya, B.; Martins, R.; Lopes, A.; Menezes, P.; Pires, G. Towards natural interaction in immersive reality with a cyber-glove. In Proceedings of the 2019 IEEE International Conference on Systems, Man and Cybernetics (SMC), Bari, Italy, 6–9 October 2019; pp. 2653–2658.
- Pezent, E.; O’Malley, M.K.; Israr, A.; Samad, M.; Robinson, S.; Agarwal, P.; Benko, H.; Colonnese, N. Explorations of wrist haptic feedback for AR/VR interactions with Tasbi. In Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems; Association for Computing Machinery: New York, NY, USA, 2020; pp. 1–4.
- Taclim Website. Available online: https://taclim.cerevo.com/ (accessed on 30 October 2021).
- DropLabs Website. Available online: https://droplabs.com/ (accessed on 30 October 2021).
- Lindeman, R.W.; Page, R.; Yanagida, Y.; Sibert, J.L. Towards full-body haptic feedback: The design and deployment of a spatialized vibrotactile feedback system. In Proceedings of the ACM Symposium on Virtual Reality Software and Technology, Hong Kong, China, 10–12 November 2004; Association for Computing Machinery: New York, NY, USA, 2004; pp. 146–149.
- Elor, A.; Song, A.; Kurniawan, S. Understanding emotional expression with haptic feedback vest patterns and immersive virtual reality. In Proceedings of the 2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Lisbon, Portugal, 27 March–1 April 2021.
- Tegway Website. Available online: http://tegway.co/tegway/ (accessed on 30 October 2021).
- Teslasuit Website. Available online: https://teslasuit.io/ (accessed on 30 October 2021).
- Amouri, A.; Ababsa, F. Sliding movement platform for mixed reality application. IFAC-PapersOnLine 2016, 49, 662–667.
- Di Loreto, C.; Chardonnet, J.R.; Ryard, J.; Rousseau, A. WoaH: A virtual reality work-at-height simulator. In Proceedings of the 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Reutlingen, Germany, 18–22 March 2018; pp. 281–288.
- Lee, M.; Bruder, G.; Welch, G.F. Exploring the effect of vibrotactile feedback through the floor on social presence in an immersive virtual environment. In Proceedings of the 2017 IEEE Virtual Reality (VR), Los Angeles, CA, USA, 18–22 March 2017; pp. 105–111.
- Jung, S.; Wood, A.L.; Hoermann, S.; Abhayawardhana, P.L.; Lindeman, R.W. The impact of multi-sensory stimuli on confidence levels for perceptual-cognitive tasks in VR. In Proceedings of the 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Atlanta, GA, USA, 22–26 March 2020; pp. 463–472.
- Son, H.; Gil, H.; Byeon, S.; Kim, S.Y.; Kim, J.R. RealWalk: Feeling ground surfaces while walking in virtual reality. In Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems; Association for Computing Machinery: New York, NY, USA, 2018; pp. 1–4.
- Visell, Y.; Cooperstock, J.R.; Giordano, B.L.; Franinovic, K.; Law, A.; McAdams, S.; Jathal, K.; Fontana, F. A vibrotactile device for display of virtual ground materials in walking. In Haptics: Perception, Devices and Scenarios; Ferre, M., Ed.; Springer: Berlin/Heidelberg, Germany, 2008; pp. 420–426.
- Li, R.C.; Jung, S.; McKee, R.D.; Whitton, M.C.; Lindeman, R.W. Introduce floor vibration to virtual reality. In Symposium on Spatial User Interaction; Association for Computing Machinery: New York, NY, USA, 2021.
- Lovreglio, R.; Gonzalez, V.; Feng, Z.; Amor, R.; Spearpoint, M.; Thomas, J.; Trotter, M.; Sacks, R. Prototyping virtual reality serious games for building earthquake preparedness: The Auckland City Hospital case study. Adv. Eng. Inform. 2018, 38, 670–682.
- Bin, F.; Xi, Z.; Yi, C.; Ping, W.G. Construction safety education system based on virtual reality. In IOP Conference Series: Materials Science and Engineering; IOP Publishing: Bristol, UK, 2019; Volume 563, p. 042011.
- Malta, L.S.; Giosan, C.; Szkodny, L.E.; Altemus, M.M.; Rizzo, A.A.; Silbersweig, D.A.; Difede, J. Development of a virtual reality laboratory stressor. Virtual Real. 2021, 25, 293–302.
- Siu, A.F.; Sinclair, M.; Kovacs, R.; Ofek, E.; Holz, C.; Cutrell, E. Virtual reality without vision: A haptic and auditory white cane to navigate complex virtual worlds. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, New York, NY, USA, 25–30 April 2020; pp. 1–13.
- Cikajlo, I.; Oblak, J.; Matjačić, Z. Haptic floor for virtual balance training. In Proceedings of the 2011 IEEE World Haptics Conference, Istanbul, Turkey, 21–24 June 2011; pp. 179–184.
- Schwarz, J.; Ray, J.M. Virtual Reality Floor Mat Activity Region. U.S. Patent 10,496,155, 3 October 2019.
- Bouillot, N.; Seta, M. A scalable haptic floor dedicated to large immersive spaces. In Proceedings of the 17th Linux Audio Conference (LAC-19), Stanford, CA, USA, 23–26 March 2019; Volume 5.
- Blackford, J.U.; Pine, D.S. Neural substrates of childhood anxiety disorders: A review of neuroimaging findings. Child Adolesc. Psychiatr. Clin. 2012, 21, 501–525.
- Jo, A.; Chae, B.Y. Introduction to real time user interaction in virtual reality powered by brain computer interface technology. In Proceedings of the ACM SIGGRAPH 2020 Real-Time Live! Online, 17 August 2020; p. 1.
- Looxid Link. Available online: https://looxidlink.looxidlabs.com (accessed on 16 August 2021).
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).