Review

A Perspective Review on Integrating VR/AR with Haptics into STEM Education for Multi-Sensory Learning †

1 Department of Engineering Sciences, University of Agder (UiA), 4879 Grimstad, Norway
2 Faculty of Informatics, Kaunas University of Technology, Studentu Str. 50, 51368 Kaunas, Lithuania
3 Department of Information Engineering and Mathematics, University of Siena, 53100 Siena, Italy
4 Department of Information Systems, University of Minho, R. da Universidade, 4710-057 Braga, Portugal
5 Multimedia Research Centre, Politehnica University of Timisoara, 300006 Timisoara, Romania
6 Department of Information Systems, University of Agder (UiA), 4630 Kristiansand, Norway
* Author to whom correspondence should be addressed.
This paper is an extended version of our paper published in Sanfilippo, F.; Blazauskas, T.; Salvietti, G.; Ramos, I.; Vert, S.; Radianti, J.; Majchrzak, T.A. Integrating VR/AR with Haptics into STEM Education. In Proceedings of the 4th International Conference on Intelligent Technologies and Applications (INTAP 2021), Grimstad, Norway, 11–13 October 2021; Springer, 2021; accepted for publication.
Robotics 2022, 11(2), 41; https://doi.org/10.3390/robotics11020041
Submission received: 25 February 2022 / Revised: 21 March 2022 / Accepted: 28 March 2022 / Published: 31 March 2022
(This article belongs to the Special Issue Intelligent Technologies and Robotics)

Abstract:
As a result of several governments closing educational facilities in reaction to the COVID-19 pandemic in 2020, almost 80% of the world’s students were out of school for several weeks. Schools and universities are thus increasing their efforts to leverage educational resources and provide possibilities for remote learning. A variety of educational programs, platforms, and technologies are now accessible to support student learning; while these tools are important for society, they are primarily concerned with the dissemination of theoretical material. There is a lack of support for hands-on laboratory work and practical experience. This is particularly important for all disciplines related to science, technology, engineering, and mathematics (STEM), where labs and pedagogical assets must be continuously enhanced in order to provide effective study programs. In this study, we describe a unique perspective on achieving multi-sensory learning through the integration of virtual and augmented reality (VR/AR) with haptic wearables in STEM education. We address the implications of this novel viewpoint for established pedagogical notions. We aim to encourage worldwide efforts to make fully immersive, open, and remote laboratory learning a reality.
Keywords:
VR; AR; haptics; STEM; education

1. Introduction

In response to the COVID-19 outbreak, educational institutions have implemented restrictions on on-site meetings and learning. They are stepping up their efforts to use a variety of educational tools to give students remote learning possibilities while schools are closed. To assist parents, teachers, schools and school administrators in facilitating student learning and providing social care and interaction during periods of school closure, the United Nations Educational, Scientific and Cultural Organisation (UNESCO) elaborated a list of educational applications, platforms and resources [1]. These solutions include resources for psycho-social support, digital learning management systems, systems built for use on basic mobile phones, systems with strong offline functionality, Massive Open Online Course (MOOC) platforms, self-directed learning content, mobile reading apps, collaboration platforms that support live-video communication, tools for teachers to create digital learning content, and external repositories of distance learning solutions [2]. Although these solutions provide critical support to society in these exceptional times, they are primarily focused on facilitating the transfer of theoretical knowledge. One of the most significant shortcomings of the multitude of existing solutions is the lack of support for hands-on laboratory work and practical experiences [3].
At all levels of scientific education and throughout all disciplines, hands-on activities are critical for significantly advancing learning. This is particularly necessary for departments of science, technology, engineering, and mathematics (STEM), which must constantly improve their laboratories and pedagogical tools to secure successful study programs for their students. The development of online material and the ability to provide immersive experiences for lab activities may be valuable in the near term to address COVID-19-related difficulties, but it may also pave the way for the introduction of new e-Learning resources in the long run.
A promising teaching strategy for STEM modules involves the ideas of Learning by Doing (LBD) [4], the approaches of Problem Based Learning (PBL) [5] and the concepts of Active Learning (AL) [6]. In reality, actively integrating students and allowing them to complete the activity in the lab is one of the most successful ways of teaching them how to conduct a practical engineering task. The LBD method is not a new instructional theory; it is exactly what it sounds like. Aristotle stated: “One must learn by doing the thing, for though you think you know it, you have no certainty until you try”. Similarly, Confucius declared: “I hear and I forget. I see and I remember. I do and I understand” [7]. On top of the necessary methodology for teaching efficiently and effectively, it is also necessary to contextualise and orient teaching activities from a socio-economic perspective. In the last several decades, many of the world’s most developed countries have shifted from an industrial economy to a knowledge economy, which is based on the creation of knowledge, information, and innovation [8,9].
More recently, John Dewey became one of the strongest proponents of the LBD approach. Dewey argued: “Education is not preparation for life, it is life itself”. This educational methodology is perfectly in line with the concept of multi-sensory learning [10], which assumes that individuals learn better if they are taught using more than one sense (modality). The senses usually employed in multi-sensory learning are visual, auditory, kinesthetic, and tactile—VAKT (i.e., seeing, hearing, doing, and touching), as shown in Figure 1. The recent advances in virtual reality (VR), augmented reality (AR) and haptic technology may enable the extension of the multi-sensory learning approach to e-Learning and the re-design of study modules and engineering remote laboratories. Through this strategy, the community of educators and students may be enabled to explore the new frontiers of education.
In this work, we describe a unique perspective for incorporating virtual and augmented reality (VR/AR) with haptic wearables into engineering education to accomplish multi-sensory learning [11]. These immersive technologies are collectively known as extended reality (XR), a broad term that encompasses the three main technologies in use today (i.e., virtual reality—VR, augmented reality—AR, and mixed reality—MR) and others that will follow in the future. While virtual reality completely immerses the user in a computer-generated world, augmented reality allows the user to continue to see the real world, over which this technology overlays computer-generated objects. Mixed reality is a form of augmented reality in which the computer-generated objects blend more realistically with the real objects and the user interacts more profoundly with them [12]. For this reason, we do not dedicate a separate section to mixed reality in the current paper, but mention it in the virtual/augmented reality sections.
The purpose of this study is to provide a new viewpoint on existing educational ideas, as well as to investigate the implications of this approach. We want to support initiatives throughout the world to make fully immersive, open, and remote laboratory learning a reality. First, we do a thorough review of the literature. Our study, however, is not a systematic literature review, but rather a viewpoint piece (with a solid foundation on the literature). This decision is deliberate because we want to encourage research—besides, the fragmented field and the combination of different aspects likely indicates that it is too early for a literature review that could synthesise the existing knowledge.
This article is organised as follows. A review of the theoretical background from a learning perspective is given in Section 2. The strategy adopted in our perspective study is provided in Section 3. Overviews of related VR, AR and haptic technologies are presented in Section 4, Section 5 and Section 6, respectively. In Section 7, we review technologies that can potentially be used for the evaluation and assessment of learning processes. Finally, we discuss our findings in Section 8 before concluding in Section 9.

2. Learning Theories

In this section, we provide an overview of immersive learning and present the motivation for the research described in the following sections. Some concepts in STEM fields may be too difficult to understand using traditional pedagogies, namely addressing the relevant topics in lectures and tutorials. To increase the attraction and retention of students, many universities have adopted active learning (AL) pedagogy for over 20 years [13]. AL can be generally defined as “any instructional method that engages students in the learning process. AL requires students to do meaningful learning activities and think about what they are doing” [14]. In this student-centered approach, the learning responsibility is shared between the learner, the group, and the instructor. The instructor is responsible for organising the conditions on which effective learning depends [15]. The instructor might also oversee changing learners’ attitude towards learning to increase the learning value. As depicted in Figure 2, the aim is to progress from instructor-led activities (progressive learning) to self-directed learning, in which the student defines the learning goals and uses available processes and resources to achieve them.
There are several strategies and techniques to implement AL into STEM courses. Some of the more common are project-based learning, problem-based learning, cooperative-based learning, and competency-based learning. Project-based and problem-based learning are often perceived as complementary and used together, since they have similar objectives and methodologies [15]. Students are required to develop a solution to a problem presented by the instructor, individually or in groups, during multiple learning activities [16]. Cooperative-based learning is centered on the human capability to learn socially; students are divided into groups and work together to achieve a goal, such as conducting a research project or multiple-step exercises [13]. This pedagogy promotes peer teaching and collaborative skills. Competency-based learning uses systems of instruction, assessment, grading, and academic reporting to ensure that students learn the knowledge and skills deemed to be essential to succeed in the profession.
When in-person teaching is not possible or laboratories are not physically accessible, immersive learning may be included in AL pedagogy. The opportunities offered by immersive learning are summarised in Figure 3. This considers options ranging from interactive physical environments to avatars in virtual worlds [17]. Interactive physical environments enable collaborative experiences among multiple learners; engaging in virtual worlds as an avatar provides a cost-effective, safe, and expansive learning experience [18]. In immersive environments, students can securely participate in controlled experiments that would be highly risky in the real world, allowing them to safely make mistakes and learn from them. Moreover, some phenomena that would otherwise not be observable can be visualised and presented in a clear way, resulting in a deep understanding of core concepts. To achieve high-fidelity immersive environments, instructors can integrate VR and AR technology into their classes [19]. The use of such technology during immersive learning experiences can optimise students’ knowledge acquisition and motivate them to be active participants in their own learning [18]. Furthermore, multi-sensory technologies enable learning with multisensorial feedback. This makes it possible to achieve higher levels of affinity between the student and the simulated learning environment. Tactile feedback, e.g., force and texture, can be rendered with the use of haptic technologies, while other multi-sensory technologies can enable olfactory and audio feedback. These technologies help develop more accurate mental models and representations of different concepts, thus enhancing learning [17].
A recent study reviewed the usage of immersive VR technology in higher education, learning theories for VR application design, as well as evaluation methods and learning outcomes [20]. The study shows that VR applications engage scholars, with the article identifying 18 application domains in higher education. The article points out the lack of learning theories that help with developing learning-oriented VR applications. However, this study neither examines the adoption of haptic technologies, nor how they can be relevant for STEM in higher education. A market study of available apps [21] brings complementary results but does not include haptic technologies, either.
As shown in Figure 4, to ensure successful learning based on immersive technologies, it is necessary to, firstly, plan the learning activities that the students will participate in. For this, the instructor should identify the students’ learning needs, i.e., the knowledge and skills that the students lack and that are necessary for their professional success, based on the real-life work environment context. Next, based on the learning needs, the instructor should define the learning goals, i.e., the knowledge and competencies that the students are expected to acquire through successful participation in the learning process. The instructor should also define the pedagogical concepts and technologies that are going to be used during the learning activities. Pedagogy is a very important factor in the success of the learning process [22], and describes how the learning activities will occur. Once the learning process is defined, the instructor will choose the immersive tools that best fit the content to be learned. Then, the instructor can create the immersive learning environments and experiences or co-create them with the students to allow them to constantly interact with the virtual world and adjust the learning to their own characteristics and specific needs [23].
The effectiveness of the implemented learning process must be assessed to provide feedback to the learners and adjust their learning journey if needed. This can be done through two different perspectives: the outcome and the student experience. The former can be measured by a theory test or a practical conversation, where knowledge and skill acquisition and retention can be determined. The latter can be measured through a questionnaire, where the students answer various questions in order to determine student motivation, engagement, and immersion.
Although there is an increasing number of studies on the effectiveness of immersive learning, there is still a lot to be done to understand how to design effective learning experiences. The industry has been using this approach to train operators and some parameters of efficiency and effectiveness of the immersive learning process have been systematized, namely the speed in acquiring skills, error reduction rate in the execution of tasks after training, and accuracy in the anticipation and identification of events [24]. Moreover, factors like interest, motivation, perceived self-efficacy, embodiment, cognitive load, and self-regulation have been pointed out as crucial to achieve the defined learning outcomes [25]. The literature seems to point to a negative effect of groups on learning performance, suggesting that immersive learning is more suitable for small groups or even individuals [26]. In sum, it is necessary to continue to study the most appropriate way to use immersive technologies to ensure the effectiveness of the learning process.
The use of immersive technologies to design a learning process to achieve the goals defined by the instructor, possibly in collaboration with the learners, requires mastering of the technology and knowledge of which contents can benefit most from applying the tools. As shown in Figure 5, at the beginning of a new learning journey, the instructor starts by experimenting with the technology tools available to improve the understanding of opportunities and limitations (initiate level). At a more advanced stage, the instructor acquires a greater sensitivity to the available tools, becoming able to use them to promote the learning of a wider range of topics and define a greater variety of learning experiences. The focus is now on facilitating the communication of content and encouraging student engagement in activities that benefit from the use of immersive technologies (convert level). Finally, when the instructor masters the use of immersive technologies in learning, an optimisation process can be applied to define customised learning experiences. The focus is on learners’ expectations, learning styles and preferences. The instructor co-builds the learning journeys with the learners. In a self-directed way, the learner uses the immersive technologies to explore concepts and practices; the instructor becomes a mentor for this exploration (augment level).

3. Perspective Strategy

In this section, we highlight our strategy in surveying the literature for technology that may enable multi-sensory learning with VAKT feedback. We assert that the way these technologies have been used in education, especially in STEM for higher education, is understudied. Most literature focuses on technicalities, design, and models of interaction; it rarely examines technologies in the higher education context for STEM. Moreover, most of the existing rendering devices are still relatively costly and therefore not yet available to the vast majority of students [27]. This represents a substantial gap in the systematic review of VR/AR and haptics for e-Learning. Among the existing technological solutions, the focus of this paper is on the possibility of adopting low-cost commercial off-the-shelf (COTS) components for STEM learning purposes to enable multi-sensory learning with VAKT feedback.
A possible setup for the novel e-learning approach is pictorially represented in Figure 6.
A student is at home attending an online course, wearing three haptic interfaces embedded in rings on their right hand. They are exploring an object, perceiving its stiffness, surface roughness, and temperature through the wearable interfaces. The immersiveness of the experience is completed by a head-mounted display and headphones, which serve for video and audio streaming, respectively. This novel paradigm of e-Learning requires a technological update of facilities both at schools and in students’ homes.

3.1. Required School Tools

Besides teacher training and the design of dedicated study content, the production of e-Learning courses will require an investment in the technology available in the classrooms. The creation of VR and AR content will rely on novel programming environments such as Unity [28] and on the adoption of 3D cameras and 3D scanners to acquire the needed content in the right format. Efforts to design mixed reality classes are ongoing, leading to the possibility of accessing the same content on-site or online [29]. Finally, the addition of haptic content would require both the use of databases where data-driven haptic texture and friction models are available [30] and the possibility to register tactile information using specific sensors, such as force sensors, accelerometers, temperature sensors, etc. [31]. Novel haptic content could also be shared in an open-source way to foster the diffusion of multi-sensory e-Learning platforms.
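As an illustration of how such tactile information might be registered, the following minimal Python sketch polls a hypothetical serial-attached sensor rig (force sensor, accelerometer, temperature sensor) and stores the raw samples, to which a data-driven texture or friction model could later be fitted. The port name, line format and recording window are illustrative assumptions, not a reference to any specific device from [30,31].

```python
# Minimal sketch: recording tactile samples for data-driven haptic content.
# Assumption: a microcontroller streams lines "force_N,accel_ms2,temp_C"
# over serial at 115200 baud; the port name and protocol are hypothetical.
import json
import time

import serial  # pyserial

def record_texture(port="/dev/ttyUSB0", duration_s=5.0):
    samples = []
    with serial.Serial(port, 115200, timeout=1) as link:
        t0 = time.time()
        while time.time() - t0 < duration_s:
            line = link.readline().decode(errors="ignore").strip()
            try:
                force, accel, temp = (float(v) for v in line.split(","))
            except ValueError:
                continue  # skip malformed or empty lines
            samples.append({"t": time.time() - t0, "force": force,
                            "accel": accel, "temp": temp})
    return samples

if __name__ == "__main__":
    # Store the raw samples; a texture model can later be fitted to them
    # and shared through an open repository.
    with open("haptic_texture.json", "w") as f:
        json.dump(record_texture(), f)
```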

3.2. Potential Home Solutions

To achieve a broad distribution of the devices needed for the VAKT model of e-Learning, there are two main possibilities. The first is that commercial products reach a price that is affordable for most students. This is what happened, for instance, with smartphones, and it is very likely to happen for VR/AR displays as well. For haptic interfaces, this road appears to be longer, since the technology is relatively younger, and only recently have big tech companies started investigating the tactile communication channel [32]. A possible way to speed up this process is the diffusion of open-source repositories where it is possible to find detailed instructions to easily build wearable haptic interfaces using off-the-shelf components. Similarly to what happened, for instance, with soft robotic hands [33], source files for 3D printable designs could be made available to users, together with all the instructions to assemble motors and controllers and to connect the devices with PCs. Several models of wearable haptic interfaces, which will be introduced in Section 6, would perfectly fit these initiatives. In this perspective, our research group is currently pursuing this possibility with a research project funded by the European Union through the Erasmus+ Program under Grant 2020-1-NO01-KA203-076540, project title Integrating virtual and AUGMENTED reality with WEARable technology into engineering EDUcation (AugmentedWearEdu), https://augmentedwearedu.uia.no/ (accessed on 27 March 2022) [34].
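To give an idea of how simple the PC side of such a do-it-yourself device could be, the following sketch commands a hypothetical 3D-printed vibrotactile ring over a serial link. The single-byte command protocol and port name are illustrative assumptions about the firmware, not a documented interface; the same pattern could be replicated across several rings to obtain multi-contact feedback.

```python
# Minimal sketch: driving a DIY wearable vibrotactile ring from a PC.
# Assumption (hypothetical protocol): the ring's microcontroller reads one
# byte (0-255) and uses it as the PWM duty cycle of its vibromotor.
import time

import serial  # pyserial

def set_vibration(link, intensity):
    """Send a vibration intensity in [0, 1] as a single PWM byte."""
    duty = max(0, min(255, int(round(intensity * 255))))
    link.write(bytes([duty]))

if __name__ == "__main__":
    with serial.Serial("/dev/ttyACM0", 115200, timeout=1) as ring:
        # Ramp the vibration up and back down once, then switch it off.
        for step in list(range(11)) + list(range(10, -1, -1)):
            set_vibration(ring, step / 10)
            time.sleep(0.1)
        set_vibration(ring, 0.0)
```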

3.3. Requirements for Facilities

When it comes to physical space, the idea is to make use of the existing facilities, but these need to be altered or updated to comply with the particular requirements and experiences of VR/AR users. The literature has highlighted both physiological discomfort [35,36] and psychological discomfort [19] when using VR. Physiological discomfort can appear as motion sickness symptoms such as nausea and oculomotor strain, dizziness, blurred vision and vertigo, postural instability, drowsiness, eye fatigue, and degraded eye-hand coordination (especially in children). Other physiological signals, such as blood pressure, heart rate and blood sugar level, can also be affected, especially during unpleasant experiences in synthetic environments (horrific venues, realistic unpleasant virtual venues). Moreover, psychological factors can stem from the uncomfortable feeling of wearing a VR/AR headset in public. Thus, in certain learning settings, one needs a private space when using VR for learning [19].
To prevent such potential discomfort, which can have direct impacts on health, VR/AR haptic multisensory learning should pay attention to the following issues:
  • Spaciousness: no obstructions, i.e., enough space to act without worrying about hitting solid objects, such as walls or tables, while using VR/AR and disconnecting from “reality”.
  • Flexible use: the space should serve both private and shared purposes. This matters for those who want to learn in a small group, as VR/AR apps may require one to perform certain movements or even repeat instructions from the auditory feedback to complete the VR tasks. A private option minimises the psychological discomfort for those who need it, while the same space could also be used by a larger group as a means to learn together and get peer feedback.
  • Health and safety: as physiological discomfort has been reported in many studies [35,36], our facility requirement guideline recommends having more than one person in a room, so that if an unwanted event such as motion sickness symptoms occurs to a learner, others can help.
These facility requirements are especially important for a strategy that includes not only procuring technology enablers for VR/AR development, but also conducting learning sessions with VR/AR devices.

4. VR Technology

There are many definitions of virtual reality (VR). VR is investigated from different perspectives, such as technology, interaction, immersion, semantics and philosophy [37]. All of these perspectives are important when talking about applications of VR in learning. For example, Mütterlein [38] discusses the three pillars of VR, i.e., immersion, presence and interactivity, and investigates how they are interrelated. When considering the application of VR solutions in learning, VR technology should support these three pillars as well as address the sensory perception channels needed for multi-sensory learning [39].
Immersive VR devices seek to place a user in a virtual environment. The best-studied solutions are VR Cave Automatic Virtual Environment (CAVE) systems and head-mounted displays (HMDs). Early HMD systems sought accurate and fast tracking of head rotation. It was necessary to solve the Motion-to-Photon (End-to-End) latency problem [40], because high latency does not allow full immersion: people can sense the delay and the artificial nature of the environment. Nowadays this problem is largely solved, but some low-cost solutions, such as the Google Cardboard, still cannot provide full immersion; even worse, response lags and jittery movements may lead to motion sickness [41]. The latter can also affect users of high-end HMDs if these do not cater for individual needs [42].
Another step towards enhancing immersiveness is allowing six degrees of freedom of movement, which requires tracking not only rotation but also translation. The two approaches in use are outside-in and inside-out tracking. Outside-in tracking uses external devices (e.g., the Oculus Rift external cameras [42]) to track head motion within a delimited area. Inside-out tracking adopts cameras inside the HMD to track movements.
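The difference between rotation-only and six-degrees-of-freedom tracking can be made concrete with a small pose sketch: a tracked head pose combines a rotation with a translation, and expressing a world-space point in the user's view requires both. The snippet below is a minimal illustration in plain numpy; it assumes no particular VR SDK.

```python
# Minimal sketch: a 6-DoF head pose = rotation (quaternion) + translation.
# 3-DoF HMDs track only the rotation; 6-DoF tracking adds the translation.
import numpy as np

def quat_to_matrix(q):
    """Convert a unit quaternion (w, x, y, z) to a 3x3 rotation matrix."""
    w, x, y, z = q / np.linalg.norm(q)
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def world_to_view(point, head_quat, head_pos):
    """Express a world-space point in the head (view) frame."""
    R = quat_to_matrix(head_quat)
    return R.T @ (np.asarray(point) - np.asarray(head_pos))

if __name__ == "__main__":
    identity = np.array([1.0, 0.0, 0.0, 0.0])  # no head rotation
    head_pos = [0.0, 1.7, 0.0]                 # standing height, metres
    # An object 2 m in front of the user stays 2 m ahead in view space.
    print(world_to_view([0.0, 1.7, -2.0], identity, head_pos))  # [0. 0. -2.]
```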
Immersiveness is further enhanced by tracking body parts and even external devices (e.g., pens, guns, sports equipment). This is relevant for learning applications that require the learning by doing approach. Such tracking is done using additional devices and is mostly supported by outside-in systems. Inside-out systems can track palms and fingers using infrared cameras, but the use cases are limited in comparison to outside-in tracking systems. The additional trackers used in outside-in systems come in various dimensions and forms. For example, the VR Ink device [43] from Logitech has a form similar to a normal pen. This device or similar ones could be used for activities that require precision. Usage may span a wide range; a pen-like instrument could be used for such virtual activities as cutting tissue in a surgical procedure or learning to solder.
VR technology enables educators to provide comprehensive assistance, because the tracked students’ activities can be used to give feedback in real-time or for briefing/debriefing purposes. Some recently introduced VR systems (e.g., HTC VIVE Pro Eye [44], HoloLens [45], …) also include eye tracking capabilities. Eye tracking makes it possible to automatically adjust the interpupillary distance (IPD) and track the gaze of the user [46]. Gaze tracking is used in many areas and it might be important for learning applications as well [47].
Providing haptic feedback in VR to make the experience more realistic has become a strong focus of research in recent years [48]. Haptic feedback has been shown to add value by extending immersiveness and adding a further dimension related to the senses [49]. Haptic experiences with VR, however, remain a challenge [50]. Currently available VR systems are mostly commercial applications and games that rely mainly on hardware input. For example, the HaptX Gloves provide true-contact haptics [51], with 133 points of tactile feedback per hand. Dexmo, a hand haptic device for VR medical education by Dexta Robotics [52], is one of the few existing haptic VR learning systems. It uses force feedback to let users feel size and shape, and captures 11 degrees of freedom (DoF) of the user’s hand motion. However, most of the existing high-fidelity haptic rendering devices are still relatively costly. For this reason, a few frameworks exist to facilitate the integration of haptics with different applications. For instance, Interhaptics provides hand interactions and haptic feedback integration for VR, mobile devices, and console applications [53].

5. AR Technology

AR superimposes virtual information over a user’s view of the surrounding environment, in such a way that this information seems a natural part of the real environment [54]. The main advantage over VR is that “AR connects users to the people, locations and objects around them, rather than cutting them off from the surrounding environment” [55]. This effect has big potential in education, as demonstrated by an increasing number of research papers [56,57,58].
While the performance of AR technology has increased steadily over time, the main components of the hardware have stayed the same: sensors, processors, and displays. The role of the sensors is to provide information for tracking and registration. This is mainly achieved through an optical camera, with or without the help of sensors such as Global Positioning System (GPS), accelerometers, and gyroscopes. Optical tracking is categorised in the literature as marker-based, in which a static image is recognised (such as a quick response (QR) code), or marker-less, in which natural features of the surrounding environment are recognised. Other forms of tracking exist, but are much less common, such as acoustic, electromagnetic or mechanical [59].
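As a concrete example of marker-based optical tracking, the sketch below detects fiducial markers in a single camera frame using OpenCV's ArUco module. It assumes the opencv-contrib-python package in version 4.7 or later (which introduced the ArucoDetector class); in a full AR pipeline, the detected corners would then be combined with the camera intrinsics to estimate each marker's pose and register virtual content on it.

```python
# Minimal sketch: marker-based optical tracking with OpenCV's ArUco module.
# Assumes opencv-contrib-python >= 4.7 (the ArucoDetector class API).
import cv2

def detect_markers(frame):
    """Return corner coordinates and IDs of any ArUco markers in the frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    detector = cv2.aruco.ArucoDetector(dictionary,
                                       cv2.aruco.DetectorParameters())
    corners, ids, _rejected = detector.detectMarkers(gray)
    return corners, ids

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)  # default webcam
    ok, frame = cap.read()
    cap.release()
    if ok:
        _corners, ids = detect_markers(frame)
        print("marker IDs:", None if ids is None else ids.ravel().tolist())
```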
Displays are the most prominent part of the hardware and the most impactful for the end user. They are usually visual, in the form of head-mounted displays (HMD), handheld displays (HHD) or spatial AR (SAR) [60]. Less common forms of display are tactile, audio and olfactory ones. Notably, audio displays are still much more common than tactile and olfactory displays, as there are already consumer devices specifically aimed at audio AR [61]. Input devices are often considered as a separate category, ranging from keyboards to voice inputs.
Nowadays, due to their ubiquity and versatility, handheld devices, in the form of smartphones, have become the main vehicle for AR experiences in many fields, including education [62]. While hardware in smartphones has not changed fundamentally in recent years (yet has continued to advance gradually), software evolution has been much more prominent (e.g., see the work presented in [63]).
Regarding software in AR, besides the low-level software that powers sensors, processors and displays to perform their tasks, higher-level software is used to enable creators to design different AR experiences. Depending on the technical skills of the creator, and the business or educational needs, one can use AR software development kits (SDKs) such as Vuforia, Wikitude, ARKit or ARCore, or all-in-one platforms, such as Cospaces Edu or EON Reality (which do not require programming skills).
A search in the scientific literature revealed no consistent existing research on exploiting AR in STEM education for fully-immersive remote laboratory learning. Most literature surveys cover the whole spectrum of target groups, from early childhood education to doctoral education. In general, AR is known to increase the understanding of the learning content, especially of spatial structure and function, compared to other forms of media, such as books or video; to aid long-term memory retention, compared to non-AR experiences; to improve physical task performance, but also collaboration; and to increase student motivation, by adding satisfaction and fun to the activities [64]. Use cases for education might be transferable from non-educational, professional AR usage, such as applications that counter information overload [65].
A recent review of the literature on how AR is supporting STEM education [62] showed that the majority of the developed applications were exploration apps and simulation tools. At the same time, most were self-developed native applications, while the others used AR development tools. Furthermore, the vast majority were marker-based and only a few were location-based. These existing applications almost exclusively stimulate sight, leaving other senses unexplored. The study also surveyed what learning outcomes were measured and how, concluding that we are missing a deeper understanding of how AR learning experiences take place in STEM environments [62].
Another recent review on AR in STEM recognises intensive research in this area in recent years [66], although this work still mainly addresses early childhood education. The authors categorised the advantages of applying AR in STEM: contribution to the learner, educational outcomes, student interaction and others. However, they also identified challenges. Most of these are due to technical problems (i.e., weak detection of markers or GPS position). Other challenges include teachers’ resistance to adopting the AR technology, in which the prolonged time required to develop high-quality content plays an important role [66].
In contrast to VR, where we have somewhat established devices, both low-cost, low-performance ones (Google Cardboard and similar) and high-cost, high-performance ones (e.g., Oculus Quest 2 or HTC Vive Pro 2), in AR we can rely only on the first type. These are the smartphones, which can actually be considered zero-cost, since the vast majority of users already own one, and medium-performance, since a lot of effort has been put into developing high-performing AR on these devices by big players (Google, Apple). Their disadvantage, however, in virtual labs in STEM education for example, is that (at least) one of the student’s hands is busy holding the smartphone, so haptic interaction is limited.
More appropriate and powerful AR devices, namely AR glasses, have yet to become mainstream. Google’s AR glasses, launched for the general public in 2013, were quickly retired in 2015 and are now produced only for the enterprise domain. Apple’s AR glasses have been rumoured to be released for several years now, so they are still to come. Other brands of glasses, such as Moverio or Magic Leap One, have failed to become mainstream (at least in the sense that Oculus or HTC Vive are in the VR world).
In the area of mixed reality (MR), a form of AR in which the interaction of the user with the virtual objects is more profound [12], by far the best-known devices are Microsoft’s Hololens. These high-cost, high-performance devices have set a standard for the mixed reality technology up until now.

6. Haptic Technology

Touch is one of the most reliable and robust senses, and is fundamental to human memory and to discerning the surrounding environment. In fact, touch provides more certainty than other senses, especially vision. To provide the user with tactile information, haptic technology can be employed. Haptic feedback, also known as haptics, is the use of the sense of touch in a human–computer interface. A variety of applications are made possible by the use of haptics, including the possibility of expanding the abilities of humans [67]: increasing physical strength, improving manual dexterity, augmenting the senses, and, most fascinating, projecting human users into remote or virtual environments. Haptic technology is the key to achieving the tactile feedback experience of the VAKT model.
Early examples of haptic technology applied for gaining “touch” experience of the users through the sensation of forces, vibration or motion can be found in [68,69].
Most of the haptic devices available on the market, like the sigma.x, omega.x and delta.x series (Force Dimension, Switzerland) or the Phantom Premium (3D Systems Inc., USA) [70], are usually very accurate from a rendering perspective and able to provide a wide range of forces. However, such devices present a limited workspace and a high cost of production. The pursuit of bigger workspaces and the possibility to achieve multi-contact interaction [71] led researchers to the development and design of exoskeletons, a type of haptic interface grounded to the body [72]. Exoskeletons can be seen as wearable haptic systems; however, they are rather cumbersome and usually heavy to carry, reducing their applicability and effectiveness.
To deal with these limitations, a new generation of wearable haptic interfaces has been investigated [31]. Haptic thimbles [73,74,75], haptic rings [76,77] and haptic armbands [78] have been designed for several applications, ranging from tele-operation to VR or AR interaction. Most of the available wearable haptic interfaces are only capable of providing cutaneous cues that indent and stretch the skin [79], and not kinaesthetic cues, i.e., stimuli that act on the skeleton, muscles and joints [80]. Wearable haptic interfaces providing only cutaneous stimuli do not exhibit any unstable behaviour due, for instance, to the presence of communication delay in the closed haptic loop [81]; closed-loop control of haptic feedback would require a cohesively integrated system. As a consequence, the haptic loop with wearable tactile interfaces is intrinsically stable. Wearable haptic devices are light, portable and can be used in combination to achieve multi-contact interaction [71]. Moreover, recent results demonstrated that wearable haptics can also be used in virtual and mixed reality to alter the perception of physical properties of tangible objects, including stiffness, friction and shape [82].
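One way to see why purely cutaneous rendering is intrinsically stable is that it can be computed open-loop: the sketch below maps the penetration depth of a virtual fingertip into an object onto a vibrotactile amplitude, with nothing fed back into the physics simulation. The linear mapping and gain are illustrative assumptions rather than values from any specific device.

```python
# Minimal sketch: open-loop cutaneous rendering of contact with a virtual
# surface. Penetration depth -> vibrotactile amplitude; no force is fed back
# into the simulation loop, so communication delays cannot destabilise it.
K_GAIN = 8.0         # amplitude growth per metre of penetration (assumed)
MAX_AMPLITUDE = 1.0  # normalised actuator command

def cutaneous_amplitude(penetration_m):
    """Map penetration depth (m) to a normalised vibration amplitude."""
    if penetration_m <= 0.0:  # no contact
        return 0.0
    return min(MAX_AMPLITUDE, K_GAIN * penetration_m)

if __name__ == "__main__":
    for depth_mm in (0, 2, 5, 10, 200):
        amp = cutaneous_amplitude(depth_mm / 1000.0)
        print(f"penetration {depth_mm:4d} mm -> amplitude {amp:.2f}")
```

A stiffer virtual material simply corresponds to a larger gain, hinting at how wearable cutaneous devices can suggest stiffness differences without kinaesthetic feedback.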
Most of the proposed devices are built by combining rapid prototyping techniques with off-the-shelf components, including servomotors, vibromotors, programmable boards, etc. This aspect can dramatically foster the diffusion of these devices in “at home” scenarios. We can imagine a future scenario where students could easily download, print and build their own devices and access haptic content available for the novel VAKT model of e-Learning.

7. Evaluation, Assessment and Eye-Tracking Technology

Haptic feedback can be used to increase the degree of presence in a virtual environment, allowing one to touch and feel virtual objects [83], which is very important for learning. Most scholars in educational research concur that there is a strong connection between assessment and student learning. Traditionally, evaluation and assessment can be done using various methods, such as knowledge tests (i.e., written multiple choice, open-ended questions, oral examinations), practical knowledge evaluation, and reports with narrative feedback or peer feedback and portfolios containing reflections [84]. However, we concur with Kreimeier et al. [83] that how to assess and evaluate haptic feedback on its task-based presence and performance in virtual reality for STEM education is still rarely discussed in the literature.
Many evaluation efforts focus on the usability of VR [85]. In particular areas, such as medical education, haptic-based VR for learning has increasingly been adopted, e.g., for simulating surgeries. However, the evaluation part often emphasises the overall impression of realism of the VR simulator, the realism of tactile sensation, and other simulator elements, e.g., [86]. While usability, acceptability and user experience assessment can be considered very relevant, there is also a need for exploring what evaluation and assessment are possible for multi-sensory learning and the use of VR. In Section 2, we argue for the need to integrate immersive learning into the learning process. In this perspective, we show a model proposing two evaluation measurements, i.e., measuring outcome (knowledge and skill acquisition and retention) and measuring experience (learner motivation, engagement, and immersion) to support multisensory learning.
Moreover, the shift in education theories from a behaviourist toward a constructivist perspective [87] assumes that students should be regarded as active learners of their own knowledge, skills, and competencies. The authors suggest assessment variations such as using social interaction, reflection and feedback involving peers and teachers, with both narrative/oral feedback and multi-source feedback, in addition to other assessment methods such as portfolios or collections of student products that reveal the achievements and efforts in specific areas. In fact, moving beyond “simple” pass/fail decisions for learning assessment is gradually changing the assessment environment, encouraging students to take more responsibility for enhancing their own learning [88]. This shift further supports the idea of incorporating immersive learning into the learning process.
Recently, there have been new developments and possibilities in combining eye tracking with VR for usability and evaluation studies. Previously, eye-tracking products have appeared as screen-based devices, eye-tracking webcams, or wearable glasses. These devices have been applied in research and business settings to understand how humans interact with systems, machines, and processes. Eye tracking has also been used for understanding media habits, including preferences and visual perception on various digital media devices [89], including in educational settings. By tracking gaze behaviour, researchers can measure visual attention to specific elements [90].
Thus, not only are VR and haptic technologies advancing, but eye-tracking capability has also been combined with VR technology. There are two variants in the market so far, i.e., VR devices with built-in eye-tracking capabilities, for example the HTC Vive Pro Eye, and independent eye-tracking devices that can be mounted on an existing VR device. Through such eye-tracking-enhanced VR, it is possible to obtain user tracking data that allow researchers to learn about various behaviours with respect to the distribution of visual attention within the virtual environment.
When taking eye-tracking-enhanced VR into a learning context, different evaluation possibilities can be considered. Referring to Section 2, and especially considering Figure 5, the model suggests two evaluation measurements, i.e., measuring outcome (knowledge and skill acquisition and retention) and measuring experience (learner motivation, engagement, and immersion). Instructors can verify whether the expected behaviours are achieved, whether the right objects are seen, touched or moved, and whether unnecessary distractions occur. There are many more criteria that can be used to evaluate the success of expected behaviour in the virtual environment. Visual attention data can be presented with advanced visualisation techniques in many ways, such as heat maps, as shown in Figure 7. Different eye-tracking metrics can be calculated from the raw data and have been proposed, such as pupil diameter (mean of left and right), gaze entropy, fixation duration, and percentage of eyelid closure, to determine the importance of the objects seen by the eye-tracking users [91,92]. The power of incorporating eye tracking into the multisensory learning case lies especially in the capability to link the kinesthetic/tactile, visual, and auditory feedback of the learners with the virtual environment, and to generate various data that can be evaluated after the learning session.
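To illustrate how such metrics can be derived from raw gaze data, the sketch below computes the mean pupil diameter, the dwell time per gazed object, the gaze entropy over those objects, and a PERCLOS-style percentage of eyelid closure from a toy list of samples. The record layout is an assumption, since exports differ between eye-tracking vendors.

```python
# Minimal sketch: eye-tracking metrics from raw gaze samples.
# Assumed record layout (vendor exports differ): timestamp (s), gazed
# object label, mean pupil diameter (mm), eye openness in [0, 1].
import math
from collections import Counter

samples = [  # illustrative data for one short session
    (0.00, "beaker", 3.1, 1.0), (0.02, "beaker", 3.2, 1.0),
    (0.04, "burner", 3.4, 0.9), (0.06, "burner", 3.5, 0.1),
    (0.08, "beaker", 3.2, 1.0), (0.10, "notes",  3.0, 1.0),
]

def mean_pupil_diameter(data):
    return sum(s[2] for s in data) / len(data)

def dwell_time_per_object(data, dt=0.02):
    """Approximate fixation (dwell) time per labelled object."""
    return {obj: n * dt for obj, n in Counter(s[1] for s in data).items()}

def gaze_entropy(data):
    """Shannon entropy (bits) of the gaze distribution over objects."""
    counts = Counter(s[1] for s in data)
    total = sum(counts.values())
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

def perclos(data, closed_below=0.2):
    """Percentage of samples in which the eyelids are mostly closed."""
    return 100 * sum(s[3] < closed_below for s in data) / len(data)

if __name__ == "__main__":
    print("mean pupil diameter (mm):", round(mean_pupil_diameter(samples), 2))
    print("dwell time per object (s):", dwell_time_per_object(samples))
    print("gaze entropy (bits):", round(gaze_entropy(samples), 2))
    print("PERCLOS (%):", round(perclos(samples), 1))
```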
Instructors can predetermine some metrics to show successful sensory-based learning outcomes, e.g., by linking haptic feedback to a specific object in the virtual environment, such as objects that relate to eye-hand coordination [93]. Some scholars have used metrics such as task time, economy of motion, drops, instrument collisions, excessive instrument force, instrument out of view, and master workspace range to assess perceived workload. Higher intensity on a particular object can be interpreted as a higher perceived workload [91]. One can even determine distracting elements that pull user attention away from the actual mission/learning points. Similar principles can be reused for integrating VR/AR with haptics into STEM education.
The use of VR in combination with an eye tracker can actually enrich the variety of learning assessment methods and can engage peers and teachers. For example, the virtual environment can be projected so that peers and teachers can observe together and provide feedback on how to improve the learning acquisition, and the session can be documented for the student’s own learning.
In [94,95], eye tracking was combined with different information, such as audio, video, bio-metric data, and annotations, to improve the planning, execution and assessment of demanding training operations by adopting newly designed risk-evaluation tools. This integration is the basis for research on novel situational awareness (SA) assessment methodologies. This can serve the industry for the purpose of improving operational effectiveness and safety through the use of simulators. Such capability has the potential to be adopted for the evaluation and assessment of the multi-sensory learning process using VR and haptic technologies, both for measuring the outcome (knowledge and skill acquisition and retention) and the experience (learner motivation, engagement, and immersion).
VR eye-tracking assessment can also be employed when seeking more understanding of visual attention [96], decision-making and judgement capability [97], and visually driven emotional attention [98]. The students can reflect on their own learning, perspective and other contextual factors that influence the learning stage. However, to be beneficial and to maximise the learning impact, it is the task of the teacher and facilitator to formulate the assignments, the timing of assessment, the design of the assessment system and the cues. In other words, there is no single solution for what the best assessment system would be, as assessment itself needs innovation and creativity and must be anchored to the overall course goal. Our role here is to show that VR-supported eye tracking can encourage reflective learning in STEM education and expand the possibilities for learning assessment and evaluation.

8. Discussion

In the following, we first discuss implications of our work. We then explain its limitations before analysing the potential for future research.

8.1. Implications

It is possible to analyse the implications of virtual/augmented reality (VR/AR) technology and haptics when they are studied in the context of STEM education. The whole field of our study can be characterised as heterogeneous and highly dynamic when it comes to its maturity to be brought into STEM education supported by multi-sensory learning.
In alignment with other works (e.g., [20]), we conclude that approaches are often experimental and exploratory. Even if learning and teaching are explicit goals, the theoretical foundations in pedagogy tend to be shallow. Educators who want to use VR, AR and haptics typically need to start from scratch and try out what might work. Arguably, the state of the art of AR and haptics in education is even less mature than that of VR, where educators can find more and more practical advice [19].
At the same time, the technological progress is rapid. Hardware and, arguably to an even higher degree, software support for VR, AR and haptics is continuously becoming more powerful, and the application in other areas—such as entertainment—will likely also pave the way for easier usage in education. Existing tools are still relatively expensive and not commonly available to all students. Even for educational institutions, buying hardware for large courses seems unfeasible. However, it can be expected that this problem will become less important with a wider spread of VR, AR and haptics applications.
The implications of our findings vary by perspective and role:
  • Practitioners will find much need to advance the current technology as well as to find ways in which it can be applied effectively and efficiently.
  • For theory, much additional work is needed. Our work has shown that the gaps in understanding VR, AR and haptics in education are still large, and they are rather widening than being closed. We expect that more targeted research will be needed, and that research will need to be multi-disciplinary to keep pace with the technological development yet grasp the consequences for teaching.
  • Educators will be able to use rich tools in the near future, but they will need much help in doing so. Selecting appropriate courses, fitting didactics, setting up equipment, and evaluating its use will need support. Moreover, close collaboration with those who develop VR/AR environments with haptics will be required.
  • Developers and designers of virtual worlds and augmented reality applications will need better tools to create multi-sensory learning environments that have high educational value. Work is also needed to improve their interface to educators, making exchange easy and fruitful.

8.2. Limitations

Our work has limitations that need to be mentioned. First, we worked rigorously with the literature. However, our paper is not a systematic literature review but rather a perspective paper (with a solid foundation on the literature). This choice was deliberate, as we want to stimulate research—besides, the fragmented field and the combination of different aspects likely means that it is too early for a literature review that could synthesise the existing knowledge.
Second, there are few works yet that combine VR or AR (let alone VR and AR) with haptics in an educational context, and in STEM subjects specifically. As a consequence of this immaturity, lessons need to be taken from the fields of VR, AR and STEM education individually. As an extension to the first limitation, it is impossible to get a full overview of any combination of VR, AR, haptics, and education. Thus, we might have missed works that do not squarely fall into our topics but would still help to get a better understanding of innovative pedagogic models for learning in immersive environments.
Third, the development in the field is rapid, which makes it hard to project future progress. Additionally, advances from non-scientific research could offer new possibilities or render current approaches obsolete.
The limitations of our work do not diminish its value—in fact, they align with the implications we identified.

8.3. Future Research

The need to mature the multi-sensory learning concept in combination with the technological opportunities allows us to set out a list of research questions. In particular, we propose which steps research should take to leverage the current possibilities, aiming at improving STEM education. We set up the following research and implementation agenda:
  • to further operationalise the multisensory learning concept for STEM education by taking advantage of current developments in VR, AR and haptic technologies, focusing on the more affordable technologies;
  • to focus on hands-on laboratory work and pedagogical tools that provide practical experiences for the students;
  • to provide assessment tools for educators, including the evaluation of competencies in using VR, AR and wearable haptics in STEM education;
  • to explore open source libraries of VR, AR and wearable haptics to be used for re-designed study modules and engineering laboratories;
  • to develop teaching modules and to test our concept with engineering students in an experimental setting, to evaluate the applicability of the concept.
Based on this agenda, our research group recently presented a novel haptic-enabled framework for hands-on e-Learning [99]. The framework enables a fully-immersive tactile, auditory, and visual experience. This is achieved by combining virtual reality (VR) tools with a novel wearable haptic device, which is designed by augmenting a low-cost commercial off-the-shelf (COTS) controller with vibrotactile actuators. Results suggest that the proposed haptic-enabled framework improves the student engagement and illusion of presence. As a future work, our research group will continue the development of this framework.

9. Concluding Remarks

In this paper, we presented a perspective review on VR and AR with haptics integrated into STEM education. For this purpose, we first reviewed learning theories before discussing VR and AR technology as well as haptics. We then reviewed technologies that can potentially be adopted for evaluation and assessment of learning processes. Finally, we discussed our findings.
Our work has implications for research and practice, as well as for educators and for developers. Neither technological development nor scientific assessment of the field are close to being finished. This leaves plenty of room for future activities. Our work on this topic will continue.

Author Contributions

Conceptualisation, F.S., T.B., G.S., I.R., S.V., J.R., T.A.M. and D.O.; methodology, F.S., T.B., G.S., I.R., S.V., J.R., T.A.M. and D.O.; investigation, F.S., T.B., G.S., I.R., S.V., J.R., T.A.M. and D.O.; resources, F.S., T.B., G.S., I.R., S.V., J.R., T.A.M. and D.O.; writing—original draft preparation, F.S., T.B., G.S., I.R., S.V., J.R., T.A.M. and D.O.; writing—review and editing, F.S., T.B., G.S., I.R., S.V., J.R., T.A.M. and D.O.; project administration, F.S.; funding acquisition, F.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the European Union through the Erasmus+ Program under Grant 2020-1-NO01-KA203-076540, project title Integrating virtual and AUGMENTED reality with WEARable technology into engineering EDUcation (AugmentedWearEdu), https://augmentedwearedu.uia.no/ [34] (accessed on 27 March 2022). This work was also supported by the Top Research Centre Mechatronics (TRCM), University of Agder (UiA), Norway.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.

Abbreviations

The following abbreviations are used in this manuscript:
STEM   Science, technology, engineering, and mathematics
COTS   Commercially available off-the-shelf
VR     Virtual reality
AR     Augmented reality
VAKT   Visual, auditory, kinesthetic, and tactile

References

  1. United Nations Educational, Scientific and Cultural Organization (UNESCO). National Learning Platforms and Tools. 2021. Available online: https://en.unesco.org/covid19/educationresponse/nationalresponses (accessed on 6 May 2021).
  2. United Nations Educational, Scientific and Cultural Organization (UNESCO). Distance Learning Solutions. 2021. Available online: https://en.unesco.org/covid19/educationresponse/solutions (accessed on 6 May 2021).
  3. Colthorpe, K.; Ainscough, L. Do-it-yourself physiology labs: Can hands-on laboratory classes be effectively replicated online? Adv. Physiol. Educ. 2021, 45, 95–102. [Google Scholar] [CrossRef] [PubMed]
  4. Thompson, P. Learning by doing. In Handbook of the Economics of Innovation; Elsevier: Amsterdam, The Netherlands, 2010; Volume 1, pp. 429–476. [Google Scholar]
  5. Wood, D.F. Problem based learning. BMJ 2003, 326, 328–330. [Google Scholar] [CrossRef] [PubMed]
  6. Settles, B. Active Learning Literature Survey; Technical Report 1648; University of Wisconsin—Madison Department of Computer Sciences: Madison, WI, USA, 2009. [Google Scholar]
  7. Sanfilippo, F.; Osen, O.L.; Alaliyat, S. Recycling A Discarded Robotic Arm For Automation Engineering Education. In Proceedings of the 28th European Conference on Modelling and Simulation (ECMS), Brescia, Italy, 27–30 May 2014; pp. 81–86. [Google Scholar]
  8. Sanfilippo, F.; Austreng, K. Enhancing teaching methods on embedded systems with project-based learning. In Proceedings of the IEEE International Conference on Teaching, Assessment and Learning for Engineering (TALE), Wollongong, Australia, 4–7 December 2018; pp. 169–176. [Google Scholar]
  9. Sanfilippo, F.; Austreng, K. Sustainable Approach to Teaching Embedded Systems with Hands-On Project-Based Visible Learning. Int. J. Eng. Educ. 2021, 37, 814–829. [Google Scholar]
  10. Shams, L.; Seitz, A.R. Benefits of multisensory learning. Trends Cogn. Sci. 2008, 12, 411–417. [Google Scholar] [CrossRef]
  11. Sanfilippo, F.; Blazauskas, T.; Salvietti, G.; Ramos, I.; Vert, S.; Radianti, J.; Majchrzak, T.A. Integrating VR/AR with Haptics into STEM Education. In Proceedings of the 4th International Conference on Intelligent Technologies and Applications (INTAP 2021), Grimstad, Norway, 11–13 October 2021; accepted for publication. [Google Scholar]
  12. Alizadehsalehi, S.; Hadavi, A.; Huang, J.C. From BIM to extended reality in AEC industry. Autom. Constr. 2020, 116, 103254. [Google Scholar] [CrossRef]
  13. Hernández-de Menéndez, M.; Guevara, A.V.; Martínez, J.C.T.; Alcántara, D.H.; Morales-Menendez, R. Active learning in engineering education. A review of fundamentals, best practices and experiences. Int. J. Interact. Des. Manuf. 2019, 13, 909–922. [Google Scholar] [CrossRef]
  14. Bonwell, C.C.; Eison, J.A. Active Learning: Creating Excitement in the Classroom. 1991 ASHE-ERIC Higher Education Reports; ERIC: Washington, DC, USA, 1991. [Google Scholar]
  15. Christie, M.; De Graaff, E. The philosophical and pedagogical underpinnings of Active Learning in Engineering Education. Eur. J. Eng. Educ. 2017, 42, 5–16. [Google Scholar] [CrossRef]
  16. Lucas, B.; Hanson, J. Thinking like an engineer: Using engineering habits of mind and signature pedagogies to redesign engineering education. Int. J. Eng. Pedagog. 2016, 6, 4–13. [Google Scholar] [CrossRef]
  17. Roberts, D.; Roberts, N.J. Maximising sensory learning through immersive education. J. Nurs. Educ. Pract. 2014, 4, 74–79. [Google Scholar] [CrossRef] [Green Version]
  18. Holly, M.; Pirker, J.; Resch, S.; Brettschuh, S.; Gütl, C. Designing VR Experiences–Expectations for Teaching and Learning in VR. Educ. Technol. Soc. 2021, 24, 107–119. [Google Scholar]
  19. Fromm, J.; Radianti, J.; Wehking, C.; Stieglitz, S.; Majchrzak, T.A.; vom Brocke, J. More than Experience?—On the Unique Opportunities of Virtual Reality to Afford an Holistic Experiential Learning Cycle. Internet High. Educ. 2021, 50, 100804. [Google Scholar] [CrossRef]
  20. Radianti, J.; Majchrzak, T.A.; Fromm, J.; Wohlgenannt, I. A systematic review of immersive virtual reality applications for higher education: Design elements, lessons learned, and research agenda. Comput. Educ. 2020, 147, 103778. [Google Scholar] [CrossRef]
  21. Radianti, J.; Majchrzak, T.A.; Fromm, J.; Stieglitz, S.; vom Brocke, J. Virtual Reality Applications for Higher Educations: A Market Analysis. In Proceedings of the 54th Hawaii International Conference on Systems Science (HICSS-54), Maui, HI, USA, 4–9 January 2021. [Google Scholar]
  22. Ip, H.H.S.; Li, C.; Leoni, S.; Chen, Y.; Ma, K.F.; Wong, C.H.T.; Li, Q. Design and evaluate immersive learning experience for massive open online courses (MOOCs). IEEE Trans. Learn. Technol. 2018, 12, 503–515. [Google Scholar] [CrossRef]
  23. Bhattacharjee, D.; Paul, A.; Kim, J.H.; Karthigaikumar, P. An immersive learning model using evolutionary learning. Comput. Electr. Eng. 2018, 65, 236–249. [Google Scholar] [CrossRef]
  24. Fracaro, S.G.; Glassey, J.; Bernaerts, K.; Wilk, M. Immersive technologies for the training of operators in the process industry: A Systematic Literature Review. Comput. Chem. Eng. 2022, 160, 107691. [Google Scholar]
  25. Makransky, G.; Petersen, G.B. The cognitive affective model of immersive learning (CAMIL): A theoretical research-based model of learning in immersive virtual reality. Educ. Psychol. Rev. 2021, 33, 937–958. [Google Scholar] [CrossRef]
  26. De Back, T.T.; Tinga, A.M.; Louwerse, M.M. CAVE-based immersive learning in undergraduate courses: Examining the effect of group size and time of application. Int. J. Educ. Technol. High. Educ. 2021, 18, 56. [Google Scholar] [CrossRef]
  27. Swensen, H. Potential of augmented reality in sciences education. A literature review. In Proceedings of the 9th International Conference of Education, Research and Innovation (ICERI), Seville, Spain, 14–16 November 2016; pp. 2540–2547. [Google Scholar]
  28. Unity Real-Time Development Platform. 2021. Available online: https://unity.com/ (accessed on 6 May 2021).
  29. Walkington, C. Exploring Collaborative Embodiment for Learning (EXCEL): Understanding Geometry Through Multiple Modalities. 2022. Available online: https://ies.ed.gov/funding/grantsearch/details.asp?ID=4484 (accessed on 23 February 2022).
  30. Culbertson, H.; López Delgado, J.J.; Kuchenbecker, K.J. One hundred data-driven haptic texture models and open-source methods for rendering on 3D objects. In Proceedings of the 2014 IEEE Haptics Symposium (HAPTICS), Houston, TX, USA, 23–26 February 2014; pp. 319–325. [Google Scholar] [CrossRef]
  31. Pacchierotti, C.; Sinclair, S.; Solazzi, M.; Frisoli, A.; Hayward, V.; Prattichizzo, D. Wearable haptic systems for the fingertip and the hand: Taxonomy, review, and perspectives. IEEE Trans. Haptics 2017, 10, 580–600. [Google Scholar] [CrossRef] [Green Version]
  32. Haptics in Apple User Interaction. 2022. Available online: https://developer.apple.com/design/human-interface-guidelines/ios/user-interaction/haptics/ (accessed on 18 March 2022).
  33. Ma, R.; Dollar, A. Yale openhand project: Optimizing open-source hand designs for ease of fabrication and adoption. IEEE Robot. Autom. Mag. 2017, 24, 32–40. [Google Scholar] [CrossRef]
  34. AugmentedWearEdu. Available online: https://augmentedwearedu.uia.no/ (accessed on 27 March 2022).
  35. Chattha, U.A.; Janjua, U.I.; Anwar, F.; Madni, T.M.; Cheema, M.F.; Janjua, S.I. Motion sickness in virtual reality: An empirical evaluation. IEEE Access 2020, 8, 130486–130499. [Google Scholar] [CrossRef]
  36. Tychsen, L.; Foeller, P. Effects of immersive virtual reality headset viewing on young children: Visuomotor function, postural stability, and motion sickness. Am. J. Ophthalmol. 2020, 209, 151–159. [Google Scholar] [CrossRef] [PubMed]
  37. Zhou, N.N.; Deng, Y.L. Virtual reality: A state-of-the-art survey. Int. J. Autom. Comput. 2009, 6, 319–325. [Google Scholar] [CrossRef]
  38. Mütterlein, J. The three pillars of virtual reality? Investigating the roles of immersion, presence, and interactivity. In Proceedings of the 51st Hawaii International Conference on System Sciences, Waikoloa Village, HI, USA, 2–6 January 2018. [Google Scholar]
  39. Freina, L.; Ott, M. A literature review on immersive virtual reality in education: State of the art and perspectives. Int. Sci. Conf. Elearning Softw. Educ. 2015, 1, 10–1007. [Google Scholar]
  40. Zhao, J.; Allison, R.S.; Vinnikov, M.; Jennings, S. Estimating the motion-to-photon latency in head mounted displays. In Proceedings of the IEEE Virtual Reality (VR), Los Angeles, CA, USA, 18–22 March 2017; pp. 313–314. [Google Scholar]
  41. Clay, V.; König, P.; Koenig, S. Eye tracking in virtual reality. J. Eye Mov. Res. 2019, 12. [Google Scholar] [CrossRef] [PubMed]
  42. Munafo, J.; Diedrick, M.; Stoffregen, T.A. The virtual reality head-mounted display Oculus Rift induces motion sickness and is sexist in its effects. Exp. Brain Res. 2017, 235, 889–901. [Google Scholar] [CrossRef]
  43. Logitech. VR Ink Stylus. 2021. Available online: https://www.logitech.com/en-roeu/promo/vr-ink.html (accessed on 6 May 2021).
  44. Sipatchin, A.; Wahl, S.; Rifai, K. Eye-tracking for low vision with virtual reality (VR): Testing status quo usability of the HTC Vive Pro Eye. bioRxiv 2020. [Google Scholar] [CrossRef]
  45. Ogdon, D.C. HoloLens and VIVE pro: Virtual reality headsets. J. Med. Libr. Assoc. 2019, 107, 118. [Google Scholar] [CrossRef] [Green Version]
  46. Stengel, M.; Grogorick, S.; Eisemann, M.; Eisemann, E.; Magnor, M.A. An affordable solution for binocular eye tracking and calibration in head-mounted displays. In Proceedings of the 23rd ACM international conference on Multimedia, Brisbane, Australia, 26–30 October 2015; pp. 15–24. [Google Scholar]
  47. Syed, R.; Collins-Thompson, K.; Bennett, P.N.; Teng, M.; Williams, S.; Tay, D.W.W.; Iqbal, S. Improving Learning Outcomes with Gaze Tracking and Automatic Question Generation. In Proceedings of the Web Conference, Taipei, Taiwan, 20–25 April 2020; pp. 1693–1703. [Google Scholar]
  48. Muender, T.; Bonfert, M.; Reinschluessel, A.V.; Malaka, R.; Döring, T. Haptic Fidelity Framework: Defining the Factors of Realistic Haptic Feedback for Virtual Reality. 2022; preprint. [Google Scholar]
  49. Kang, N.; Lee, S. A meta-analysis of recent studies on haptic feedback enhancement in immersive-augmented reality. In Proceedings of the 4th International Conference on Virtual Reality, Hong Kong, China, 24–26 February 2018; pp. 3–9. [Google Scholar]
  50. Edwards, B.I.; Bielawski, K.S.; Prada, R.; Cheok, A.D. Haptic virtual reality and immersive learning for enhanced organic chemistry instruction. Virtual Real. 2019, 23, 363–373. [Google Scholar] [CrossRef]
  51. HaptX. HaptX Gloves. 2022. Available online: https://haptx.com/ (accessed on 23 February 2022).
  52. Gu, X.; Zhang, Y.; Sun, W.; Bian, Y.; Zhou, D.; Kristensson, P.O. Dexmo: An inexpensive and lightweight mechanical exoskeleton for motion capture and force feedback in VR. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, San Jose, CA, USA, 7–12 May 2016; pp. 1991–1995. [Google Scholar]
  53. Interhaptics. Haptics for Virtual Reality (VR) and Mixed Reality (MR). 2022. Available online: https://www.interhaptics.com/ (accessed on 23 February 2022).
  54. Azuma, R.T. A survey of augmented reality. Presence Teleoperators Virtual Environ. 1997, 6, 355–385. [Google Scholar] [CrossRef]
  55. Azuma, R.T. Making augmented reality a reality. In Applied Industrial Optics: Spectroscopy, Imaging and Metrology; Optical Society of America: Washington, DC, USA, 2017; p. JTu1F-1. [Google Scholar] [CrossRef] [Green Version]
  56. Bacca Acosta, J.L.; Baldiris Navarro, S.M.; Fabregat Gesa, R.; Graf, S.; Kinshuk, D. Augmented reality trends in education: A systematic review of research and applications. J. Educ. Technol. Soc. 2014, 17, 133–149. [Google Scholar]
  57. Chen, P.; Liu, X.; Cheng, W.; Huang, R. A review of using Augmented Reality in Education from 2011 to 2016. In Innovations in Smart Learning; Springer: Berlin/Heidelberg, Germany, 2017; pp. 13–18. [Google Scholar]
  58. Garzón, J.; Pavón, J.; Baldiris, S. Systematic review and meta-analysis of augmented reality in educational settings. Virtual Real. 2019, 23, 447–459. [Google Scholar] [CrossRef]
  59. Craig, A.B. Understanding Augmented Reality: Concepts and Applications Newnes; Morgan Kaufmann: Burlington, MA, USA, 2013. [Google Scholar]
  60. Wang, J.; Zhu, M.; Fan, X.; Yin, X.; Zhou, Z. Multi-Channel Augmented Reality Interactive Framework Design for Ship Outfitting Guidance. IFAC-PapersOnLine 2020, 53, 189–196. [Google Scholar] [CrossRef]
  61. Ren, G.; Wei, S.; O’Neill, E.; Chen, F. Towards the design of effective haptic and audio displays for augmented reality and mixed reality applications. Adv. Multimed. 2018, 2018, 4517150. [Google Scholar] [CrossRef] [Green Version]
  62. Ibáñez, M.B.; Delgado-Kloos, C. Augmented reality for STEM learning: A systematic review. Comput. Educ. 2018, 123, 109–123. [Google Scholar] [CrossRef]
  63. Rieger, C.; Majchrzak, T.A. Towards the Definitive Evaluation Framework for Cross-Platform App Development Approaches. J. Syst. Softw. 2019, 153, 175–199. [Google Scholar] [CrossRef]
  64. Radu, I. Augmented reality in education: A meta-review and cross-media analysis. Pers. Ubiquitous Comput. 2014, 18, 1533–1543. [Google Scholar] [CrossRef]
  65. Fromm, J.; Eyilmez, K.; Baßfeld, M.; Majchrzak, T.A.; Stieglitz, S. Social Media Data in an Augmented Reality System for Situation Awareness Support in Emergency Control Rooms. Inf. Syst. Front. 2021, 1–24. [Google Scholar] [CrossRef]
  66. Sırakaya, M.; Alsancak Sırakaya, D. Augmented reality in STEM education: A systematic review. Interact. Learn. Environ. 2020, 1–14. [Google Scholar] [CrossRef]
  67. Sanfilippo, F.; Weustink, P.B.; Pettersen, K.Y. A coupling library for the force dimension haptic devices and the 20-sim modelling and simulation environment. In Proceedings of the 41st Annual Conference (IECON) of the IEEE Industrial Electronics Society, Yokohama, Japan, 9–12 November 2015; pp. 168–173. [Google Scholar]
  68. Williams, R.L., II; Chen, M.Y.; Seaton, J.M. Haptics-augmented high school physics tutorials. Int. J. Virtual Real. 2001, 5, 167–184. [Google Scholar] [CrossRef]
  69. Williams, R.L.; Srivastava, M.; Conaster, R.; Howell, J.N. Implementation and evaluation of a haptic playback system. Haptics-e Electron. J. Haptics Res. 2004. Available online: http://hdl.handle.net/1773/34888 (accessed on 27 March 2022).
  70. Teklemariam, H.G.; Das, A. A case study of phantom omni force feedback device for virtual product design. Int. J. Interact. Des. Manuf. 2017, 11, 881–892. [Google Scholar] [CrossRef]
  71. Salvietti, G.; Meli, L.; Gioioso, G.; Malvezzi, M.; Prattichizzo, D. Multicontact Bilateral Telemanipulation with Kinematic Asymmetries. IEEE/ASME Trans. Mechatron. 2017, 22, 445–456. [Google Scholar] [CrossRef]
  72. Leonardis, D.; Barsotti, M.; Loconsole, C.; Solazzi, M.; Troncossi, M.; Mazzotti, C.; Castelli, V.P.; Procopio, C.; Lamola, G.; Chisari, C.; et al. An EMG-controlled robotic hand exoskeleton for bilateral rehabilitation. IEEE Trans. Haptics 2015, 8, 140–151. [Google Scholar] [CrossRef] [PubMed]
  73. Leonardis, D.; Solazzi, M.; Bortone, I.; Frisoli, A. A wearable fingertip haptic device with 3 DoF asymmetric 3-RSR kinematics. In Proceedings of the 2015 IEEE World Haptics Conference (WHC), Evanston, IL, USA, 22–26 June 2015; pp. 388–393. [Google Scholar]
  74. Minamizawa, K.; Fukamachi, S.; Kajimoto, H.; Kawakami, N.; Tachi, S. Gravity grabber: Wearable haptic display to present virtual mass sensation. In Proceedings of the ACM SIGGRAPH 2007 Emerging Technologies, San Diego, CA, USA, 5–9 August 2007; p. 8. [Google Scholar]
  75. Prattichizzo, D.; Chinello, F.; Pacchierotti, C.; Malvezzi, M. Towards wearability in fingertip haptics: A 3-dof wearable device for cutaneous force feedback. IEEE Trans. Haptics 2013, 6, 506–516. [Google Scholar] [CrossRef] [PubMed]
  76. Maisto, M.; Pacchierotti, C.; Chinello, F.; Salvietti, G.; De Luca, A.; Prattichizzo, D. Evaluation of wearable haptic systems for the fingers in augmented reality applications. IEEE Trans. Haptics 2017, 10, 511–522. [Google Scholar] [CrossRef] [Green Version]
  77. Pacchierotti, C.; Salvietti, G.; Hussain, I.; Meli, L.; Prattichizzo, D. The hRing: A wearable haptic device to avoid occlusions in hand tracking. In Proceedings of the 2016 IEEE Haptics Symposium (HAPTICS), Philadelphia, PA, USA, 8–11 April 2016; pp. 134–139. [Google Scholar]
  78. Baldi, T.L.; Scheggi, S.; Aggravi, M.; Prattichizzo, D. Haptic guidance in dynamic environments using optimal reciprocal collision avoidance. IEEE Robot. Autom. Lett. 2017, 3, 265–272. [Google Scholar] [CrossRef] [Green Version]
  79. Chinello, F.; Malvezzi, M.; Pacchierotti, C.; Prattichizzo, D. Design and development of a 3RRS wearable fingertip cutaneous device. In Proceedings of the IEEE International Conference on Advanced Intelligent Mechatronics (AIM), Busan, Korea, 7–11 July 2015; pp. 293–298. [Google Scholar]
  80. Hayward, V.; Astley, O.R.; Cruz-Hernandez, M.; Grant, D.; Robles-De-La-Torre, G. Haptic interfaces and devices. Sens. Rev. 2004, 24, 16–29. [Google Scholar] [CrossRef]
  81. Pacchierotti, C.; Meli, L.; Chinello, F.; Malvezzi, M.; Prattichizzo, D. Cutaneous haptic feedback to ensure the stability of robotic teleoperation systems. Int. J. Robot. Res. 2015, 34, 1773–1787. [Google Scholar] [CrossRef]
  82. Salazar, S.V.; Pacchierotti, C.; de Tinguy, X.; Maciel, A.; Marchal, M. Altering the stiffness, friction, and shape perception of tangible objects in virtual reality using wearable haptics. IEEE Trans. Haptics 2020, 13, 167–174. [Google Scholar] [CrossRef] [Green Version]
  83. Kreimeier, J.; Hammer, S.; Friedmann, D.; Karg, P.; Bühner, C.; Bankel, L.; Götzelmann, T. Evaluation of different types of haptic feedback influencing the task-based presence and performance in virtual reality. In Proceedings of the 12th ACM International Conference on PErvasive Technologies Related to Assistive Environments, Rhodes, Greece, 5–7 June 2019; pp. 289–298. [Google Scholar]
  84. Heeneman, S.; Oudkerk Pool, A.; Schuwirth, L.W.; van der Vleuten, C.P.; Driessen, E.W. The impact of programmatic assessment on student learning: Theory versus practice. Med. Educ. 2015, 49, 487–498. [Google Scholar] [CrossRef]
  85. Kamińska, D.; Zwoliński, G.; Wiak, S.; Petkovska, L.; Cvetkovski, G.; Barba, P.D.; Mognaschi, M.E.; Haamer, R.E.; Anbarjafari, G. Virtual Reality-Based Training: Case Study in Mechatronics. Technol. Knowl. Learn. 2021, 26, 1043–1059. [Google Scholar] [CrossRef]
  86. Fucentese, S.F.; Rahm, S.; Wieser, K.; Spillmann, J.; Harders, M.; Koch, P.P. Evaluation of a virtual-reality-based simulator using passive haptic feedback for knee arthroscopy. Knee Surg. Sport. Traumatol. Arthrosc. 2015, 23, 1077–1085. [Google Scholar] [CrossRef] [PubMed]
  87. Yurdabakan, İ. The view of constructivist theory on assessment: Alternative assessment methods in education. Ank. Univ. J. Fac. Educ. Sci. 2011, 44, 51–78. [Google Scholar] [CrossRef]
  88. Schuwirth, L.W.; Van der Vleuten, C.P. Programmatic assessment: From assessment of learning to assessment for learning. Med. Teach. 2011, 33, 478–485. [Google Scholar] [CrossRef] [PubMed]
  89. Vraga, E.; Bode, L.; Troller-Renfree, S. Beyond self-reports: Using eye tracking to measure topic and style differences in attention to social media content. Commun. Methods Meas. 2016, 10, 149–164. [Google Scholar] [CrossRef]
  90. Alemdag, E.; Cagiltay, K. A systematic review of eye tracking research on multimedia learning. Comput. Educ. 2018, 125, 413–428. [Google Scholar] [CrossRef]
  91. Wu, C.; Cha, J.; Sulek, J.; Zhou, T.; Sundaram, C.P.; Wachs, J.; Yu, D. Eye-tracking metrics predict perceived workload in robotic surgical skills training. Hum. Factors 2020, 62, 1365–1386. [Google Scholar] [CrossRef] [Green Version]
  92. Da Silva, A.C.; Sierra-Franco, C.A.; Silva-Calpa, G.F.M.; Carvalho, F.; Raposo, A.B. Eye-tracking Data Analysis for Visual Exploration Assessment and Decision Making Interpretation in Virtual Reality Environments. In Proceedings of the 2020 22nd Symposium on Virtual and Augmented Reality (SVR), Porto de Galinhas, Brazil, 7–10 November 2020; pp. 39–46. [Google Scholar]
  93. Pernalete, N.; Raheja, A.; Segura, M.; Menychtas, D.; Wieczorek, T.; Carey, S. Eye-Hand Coordination Assessment Metrics Using a Multi-Platform Haptic System with Eye-Tracking and Motion Capture Feedback. In Proceedings of the 2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Honolulu, HI, USA, 17–21 July 2018; pp. 2150–2153. [Google Scholar] [CrossRef]
  94. Sanfilippo, F. A multi-sensor system for enhancing situational awareness in offshore training. In Proceedings of the IEEE International Conference on Cyber Situational Awareness, Data Analytics and Assessment (CyberSA), London, UK, 13–16 June 2016; pp. 1–6. [Google Scholar]
  95. Sanfilippo, F. A multi-sensor fusion framework for improving situational awareness in demanding maritime training. Reliab. Eng. Syst. Saf. 2017, 161, 12–24. [Google Scholar] [CrossRef]
  96. Ziv, G. Gaze behavior and visual attention: A review of eye tracking studies in aviation. Int. J. Aviat. Psychol. 2016, 26, 75–104. [Google Scholar] [CrossRef]
  97. Chen, Y.; Jermias, J.; Panggabean, T. The role of visual attention in the managerial Judgment of Balanced-Scorecard performance evaluation: Insights from using an eye-tracking device. J. Account. Res. 2016, 54, 113–146. [Google Scholar] [CrossRef]
  98. Fan, S.; Shen, Z.; Jiang, M.; Koenig, B.L.; Xu, J.; Kankanhalli, M.S.; Zhao, Q. Emotional attention: A study of image sentiment and visual attention. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–22 June 2018; pp. 7521–7531. [Google Scholar]
  99. Sanfilippo, F.; Bla’auskas, T.; Gird’i’na, M.; Janonis, A.; Kiudys, E.; Salvietti, G. A Multi-Modal Auditory-Visual-Tactile e-Learning Framework. In Proceedings of the 4th International Conference on Intelligent Technologies and Applications (INTAP 2021), Grimstad, Norway, 11–13 October 2021. accepted for publication. [Google Scholar]
Figure 1. The multi-sensory learning approach, which involves visual, auditory, kinesthetic, and tactile (VAKT) feedback.
Figure 2. Learner aptitude towards learning.
Figure 3. Opportunities offered by immersive learning.
Figure 4. Integrating immersive learning into the learning process.
Figure 5. Immersive learning maturity model.
Figure 6. Setup for multi-sensory learning. The user wears a head-mounted display to access the visual content, along with wearable haptic interfaces on the right hand that reproduce tactile cues.
Figure 7. Assessment and evaluation concept using VR with eye-tracking capability.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
