Article

Augmented and Virtual Reality Experiences for Learning Robotics and Training Integrative Thinking Skills

1 Faculty of Education in Science and Technology, Technion-Israel Institute of Technology, Haifa 3200003, Israel
2 Faculty of Industrial Engineering and Management, Technion-Israel Institute of Technology, Haifa 3200003, Israel
* Author to whom correspondence should be addressed.
Robotics 2022, 11(5), 90; https://doi.org/10.3390/robotics11050090
Submission received: 25 July 2022 / Revised: 29 August 2022 / Accepted: 1 September 2022 / Published: 6 September 2022
(This article belongs to the Special Issue Advances and Challenges in Educational Robotics II)

Abstract

Learning through augmented reality (AR) and virtual reality (VR) experiences has become a valuable approach in modern robotics education. This study evaluated this approach and investigated how 99 first-year industrial engineering students explored robot systems through such online experiences while staying at home. The objective was to examine learning in the AR/VR environment and evaluate its contribution to understanding the robot systems and to fostering integrative thinking. During the AR experiences that we developed using Vuforia Studio, the students learned about TurtleBot2 and RACECAR MN robots while disassembling and modifying their models and by obtaining information about their components. In the VR experience with the RacecarSim simulator, the students explored sensor-based robot navigation. Quizzes were used to assess understanding of robot systems, and a post-workshop questionnaire evaluated the workshop’s contribution to learning about the robots and to training integrative thinking skills. The data indicate that the students gained understanding of the robot systems, appreciated the contribution of the augmented and virtual reality apps, and widely used integrative thinking throughout the practice. Our study shows that AR apps and virtual simulators can be effectively used for experiential learning about robot systems in online courses. However, these experiences cannot replace practice with real robots.

1. Introduction

In educational literature, there is a lively debate on utilizing the concepts of virtual reality (VR) and augmented reality (AR) environments in which learners interact with virtual objects as part of their learning process [1]. In VR, learning concentrates on interaction only with virtual objects [2,3]. In AR, virtual objects are superimposed on the real environment, and learners interact with real and virtual objects, while these two kinds of objects do not interact among themselves [4]. In mixed reality (MR), learning includes interactions among real and virtual objects [5]. In this paper, we will focus on learning in AR and VR environments.

1.1. Augmented Reality in Education

Augmented reality enhances traditional education and provides several advantages to the learning process. Ref. [6] performed a meta-analysis of studies published between 2007 and 2017 to determine the effectiveness of AR applications for learning under different variables. The findings indicated that learning with AR positively impacts learning outcomes. The reported educational benefits and added values of AR included more attractive learning environments, improved subject comprehension, and making abstract concepts more concrete, leading to better understanding, recall, and concentration.
Regarding the implementation of AR in instructional laboratories, Ref. [7] studied the effects of using AR environments on learning and cognitive load in university physics laboratory courses, and Ref. [8] carried out a similar study in an electrical circuits course. In both studies, the courses were taught to two groups of undergraduate students, one with a traditional lab-work format and the other with AR-assisted lab-work. Results of the post-course knowledge tests in the first study did not reveal significant differences in learning outcomes of the groups, while in the second study they indicated that AR-supported learning led to higher conceptual knowledge gains than non-AR learning. The post-course questionnaires in both studies showed that the cognitive load of the students studying in the AR-assisted lab was significantly lower than that in the traditional lab.

1.2. Augmented Reality in Engineering Education and Robotics

Learning activities in augmented reality are used effectively in engineering education. For example, in the study [9], 60 first-year undergraduate students majoring in electronics and electrical engineering participated in a laboratory course and learned to use an oscilloscope and a function generator. The students were divided into an experimental group and a control group. The experimental group practiced operating the equipment in an augmented reality environment before operating the actual equipment, while the control group was taught with the traditional approach using the laboratory manual. Results of the study indicated significant advantages of the experimental group over the control group in terms of operation skills, lower cognitive load, and higher appreciation of the learning experience.
Recent studies considered the use of augmented reality environments for learning mobile robots. The study [10] presents an experiment in which students developed a mobile robot controlled by the GoGo Board and operated it in augmented reality using data from virtual sensors supplied through a HoloLens app. The participants were 36 university students, divided into experimental and control groups, who practiced operating robots to perform maze navigation tasks. For students of the experimental group, the sensor data were presented through AR, while for the control group they were displayed on the computer screen. The study found that students' achievements in learning robot operation were significantly higher in the experimental group than in the control group.
The study [11] considers a lab exercise in which 20 undergraduate students practiced path planning and navigation of the Khepera II mobile robot using information supplied through AR. The participants appreciated the acquired experience in using AR and the contribution of the AR tools to learning the subject.
While demonstrating that AR apps can be used in robotics education, the discussed case studies did not consider the pedagogical aspects of learning the subject in AR environments. In the study presented in this paper, we focused on these aspects and developed and implemented AR experiences as an online, remotely accessible practice in exploring modern robot systems. The AR experiences were applied not only to help students learn concepts, but also to foster the development of integrative thinking skills for solving problems in robot construction, programming, and operation. Modern intelligent robots are complex integrated systems of components that implement different disruptive technologies. Learning how to develop and operate such robots requires the students to make connections among the concepts and develop a holistic view of the robot system, i.e., apply integrative thinking [12].
In the study presented in this paper, we developed, implemented, and evaluated a workshop “Mobile Robots in Augmented Reality”, in which the students explored two different mobile robots: TurtleBot2 of Yujin Robotics [13] and RACECAR MN developed at MIT [14]. The students, while at home, practiced with the robots in augmented and virtual reality environments. This setting for the workshop practice was chosen due to the social distancing restrictions caused by the pandemic. In the workshop, the students used their own mobile devices to “place” virtual robots on top of real tables at their homes. In the AR experiences, the students investigated the architecture and the components of the two robots, while in the VR experience, they used the RacecarSim simulator to explore the sensor-based navigation of the RACECAR. The evaluation of the workshop outcomes focused on the contribution of the experiences to understanding the robot systems and encouraging the students to apply integrative thinking.

1.3. Learning with Understanding

Learning with understanding (LWU) is characterized by making connections among the learned concepts and prior knowledge [15]. LWU includes the ability to use the knowledge to solve practical problems [16]. Ausubel et al. [17] pointed out that LWU can occur when the learning material is understandable and the learners have prior knowledge needed to understand the subject, are willing to make an effort to understand it, and are scaffolded to apply appropriate learning strategies. De Weck et al. [18] (p. 168) emphasized that solving real problems related to a complex engineering system requires deep understanding of the system’s structure, functionality, and principles of its operation. Understanding an engineering system requires a careful assessment of the system’s scale and scope, its function, its structure (or architecture), its dynamics, and changes over time.
Shen and Zhu [19] noted that complex engineering systems encompass concepts from different fields and that the scaffolding needed to facilitate learning of such systems should take forms adequate to these concepts. The authors proposed to provide scaffolding for learning concepts of complex systems in physical and virtual environments in the forms of presentation, conceptualization, and problem-solving. Presentation scaffolding helps students to understand the learned concepts by using simulations, animations, and other forms of tangible illustration. Conceptual scaffolding contributes to student understanding by providing explicit explanations of the concepts that underlie the learned system. Problem-solving scaffolding enables the students to understand a system more deeply by involving them in solving practical problems related to its architecture and functionality. In this study, we implemented all three scaffolding techniques to assist student learning of complex robot systems.

1.4. Integrative Thinking

Researchers from different fields interpreted the meaning of integrative thinking (IT) differently, depending on the context of their studies [20,21]. Martin [22] (p. 13) studied IT in the area of management and defined it as an ability to resolve tensions among opposing ideas and generate a creative solution in the form of a new superior idea. Tynjälä [23] considered the concept of IT in adult education and perceived it as the ability to synthesize different ideas and perspectives, not necessarily opposing ones.
In engineering, integrative thinking skills are vital when dealing with complex technological systems [24]. Engineers apply IT when they analyze systems to comprehend their architecture and functionality and develop new integrated systems. In robotics, the robot operating system (ROS) is an excellent example of how integrative thinking is being applied. ROS allows robot systems to be extended and different robots to be integrated into multi-robot systems [25].
Although IT is crucial for engineering, we could not find a definition of this ability in the engineering education literature. Some authors, e.g., [26], used Martin’s definition even though it does not account for the specific characteristics of integrative thinking when applied to engineering systems. As we explored ways to develop students’ IT skills through practice with robot systems, we found it necessary to define this ability within the context of engineering systems. The following is a working definition of IT that we propose:
Integrative thinking on engineering systems is the cognitive process of considering the systems as structures of devices that by working in coordination provide a system-level functionality beyond the functions of the individual devices.
When describing the ability of integrative thinking, educators distinguish between verbal integrative thinking skills and visual integrative thinking skills [27]. With regard to engineering systems, visual integrative thinking skills allow learners to combine images of the system’s parts and workspace into the comprehensive view needed for solving problems related to the system’s design and operation. Verbal integrative thinking skills allow learners to comprehend and analyze different verbal information on a specific aspect of the system and draw the inferences needed for solving system engineering problems related to that aspect.
Engineering education asserts that integrative thinking skills can be developed through learning practice and that engineering curricula should foster development of these skills [28,29,30]. The experiential learning of engineering systems, and particularly that of robot systems, offers students rich opportunities to develop their IT skills, make connections between different disciplines, and integrate theoretical knowledge and practical skills. Team projects in robotics bring together students with different backgrounds, who apply integrative thinking when collectively performing their project tasks [31]. When the students analyze a robot system or create a new one from components, they need to comprehend the functionality and affordances of each component, and how it interacts with other components, physically or digitally. When programming and operating robots in physical and virtual environments, the students should perceive the affordances of robot operation and monitor its behaviors in the workspace. In these intellectual activities of comprehension, perception, and monitoring, the students can vigorously apply and train their integrative thinking skills.
Educators point out that engineering programs rarely pay explicit attention to fostering and assessing students’ integrative thinking skills [32]. We did not find studies addressing this subject in the educational robotics literature.

2. Materials and Methods

2.1. Research Goal and Questions

The study was conducted in the 2020–2021 academic year in the framework of the workshop “Mobile Robots in Augmented Reality” delivered to first-year students of the Technion Faculty of Industrial Engineering and Management (IEM). The goal of the study was to examine how the students learned through online practice with robot models in virtual and augmented realities and how they evaluated the contribution of this practice to training their integrative thinking. The research questions were:
  • Did the students, involved in the online augmented and virtual reality experiences with robot systems, develop an understanding of the systems?
  • How did the students evaluate the contribution of augmented and virtual reality experiences to training integrative thinking skills?
In this study, we developed augmented reality experiences, conducted the workshop based on these and previously developed experiences, and evaluated students’ learning outcomes.

2.2. Method

This research is an exploratory pilot study which evaluates the feasibility of involving first-year students majoring in industrial engineering in online experiential learning of robot systems outside the robotics lab, through augmented and virtual reality experiences. Because of the limited access to the students caused by the pandemic and course restrictions, the study did not have a control group practicing with the same robot systems in parallel to the experimental group but without AR and VR. To increase the validity of the research findings, we compared the learning outcomes of the experimental group in the current workshop with the relevant outcomes of our previous workshops, presented below in Section 2.3.
The study used a mixed-method design that employs qualitative and quantitative approaches to data collection and analysis of the learning outcomes. According to [33], the outcomes can be divided into cognitive, skill-based, and affective. In our study, the evaluation of cognitive outcomes focused on understanding the robot systems and on how the learning activities engaged the students in applying integrative thinking skills. Evaluation of technical skills concentrated on the ability to explore and operate the robots in AR and VR environments. Evaluation of affective (attitudinal) outcomes considered students’ reflections on their practice with Industry 4.0 technologies.

2.3. The Workshop Intentions

For the last ten years, we have been conducting robotics workshops for the IEM’s first-year students, with the goals of increasing their awareness of industrial engineering, enabling them to experience robot systems for the first time, and developing the generic skills required for engineering studies and practice. Over these years, the workshop’s technological environment and learning activities have undergone significant upgrades.
The first version of the workshop was held in the faculty robotics lab. The students practiced operating Scorbot robot-manipulators in physical and virtual environments to assemble block structures. The assembling tasks were oriented towards developing spatial skills [34].
The second version of the workshop was developed after the laboratory was equipped with an advanced industrial robot Baxter. In the workshop, the students, while in the faculty computer class, practiced operating the robots Scorbot ER5 and Baxter using the RoboCell and Gazebo simulators. The tasks focused on exploring robot manipulation affordances [35]. The third version of the workshop was developed in order to provide practical training to students when the faculty robotics laboratory and computer class were closed because of pandemic restrictions, and classes were conducted only remotely. Therefore, our intentions in designing the workshop evolved and were as follows:
  • Provide student practice with modern robots using the online environment that we developed based on the AR and VR technologies.
  • Facilitate learning of a complex robot system, in which students discover the principles of operation of its components and do not take them as black boxes.
  • Offer opportunities for training integrative thinking skills through practice with the robot systems.
  • Help students to understand the essence of the technological transformation brought by the Fourth Industrial Revolution and the learning opportunities it brings.
  • Test a possible implementation of the above intentions in a short-term online workshop.

2.4. AR Experience with RACECAR MN

In our past study [31] we developed an AR application for studying the TurtleBot2 robot system. Based on this experience, in the current study, we developed a new AR application that allows students to practice with a more complex and advanced RACECAR MN robot. The Rapid Autonomous Complex Environment Competing Ackermann steering Robot Model Nano (RACECAR MN) is an autonomous mini racecar (Figure 1) designed by the MIT Beaver Works Summer Institute (BWSI) program for teaching students the concepts and technologies of self-driving cars [36]. The rationale for developing the new AR application was to support experiential learning about RACECAR MN even when the physical robot is not accessible. The AR application was developed in three stages, in which we created the 3D model, the animations, and the experience, as described below.

2.4.1. Creating a 3D Model of the RACECAR MN

The created model presented the main subsystems of the robot: a Jetson Nano onboard computer; an Intel RealSense color and depth camera with an integrated Inertial Measurement Unit (IMU); a laser imaging, detection, and ranging device (Lidar); and a car chassis, including all the components for driving and steering the car. We developed the model of the robot chassis (Figure 2a) using the SolidWorks software and integrated it with the computer-aided design (CAD) models of other robot components provided by the MIT BWSI into the holistic model of the robot system presented in Figure 2b.

2.4.2. Creating the Animations

At the next step of the AR app development, Creo Illustrate (www.ptc.com/en/products/creo/illustrate, accessed on 30 August 2022) was used to create animations of the robot model. The animations presented how to break down RACECAR MN into subsystems and break down the subsystems into components, while providing information about them on the attached labels.

2.4.3. Creating the AR Experience

At the final stage of the AR app development, Vuforia Studio 8.3.0 (ptc.com/en/products/Vuforia/Vuforia-studio, accessed on 30 August 2022) was used to upload the model and animation to the cloud and create the graphical user interface (GUI) for interacting with the experience. The GUI provides the main view of the entire robot and separate views of each of the three robot layers. In the main view, the animation enabled the user to manage the presentation of the tags attached to the components of the virtual robot. The user could choose to display only the red tags, related to the main component (Figure 3a), only the blue tags of the peripheral components (Figure 3b), or hide all the tags to better observe the model (Figure 3c).
To provide better visualization of the model, the application enables the user to observe each of the three layers of robot components separately: top, middle, and bottom. The top layer displayed in Figure 4a includes the Lidar, monitor, and peripherals. The middle layer includes the Jetson Nano onboard computer, RealSense camera, and peripherals, displayed in Figure 4b. The lower layer is the car chassis (Figure 2a) consisting of the driving and steering mechanisms. The button in the upper right corner of the main view (Figure 3c) opens an option to view each of the layers and get extended information about the main components. In each of the layer views, the animation enables the user to “explode” the layer into discrete components and assemble it back again while controlling the distance between the components using a slider that can be seen at the bottom of Figure 4a,b.
In each of the layer views, pressing on any of the red labels opened an explanatory page consisting of descriptions of the principle of operation of the component and its function in the RACECAR. Figure 4c depicts the explanatory page displayed when pressing the label of the RealSense depth camera.
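The slider-controlled “explode” interaction can be modeled as a linear interpolation of each component along its own offset vector away from the assembled position. The sketch below is illustrative only; the component names, positions, and offsets are invented and do not reflect the actual Vuforia Studio implementation:

```python
def exploded_position(base, offset, t):
    """Return the component position for explode factor t (0 = assembled, 1 = fully exploded)."""
    return tuple(b + t * o for b, o in zip(base, offset))

# Hypothetical component layout: (assembled position, explode offset), in metres.
components = {
    "lidar":   ((0.0, 0.0, 0.30), (0.0, 0.0, 0.25)),
    "jetson":  ((0.0, 0.0, 0.15), (0.0, 0.0, 0.10)),
    "chassis": ((0.0, 0.0, 0.00), (0.0, 0.0, 0.00)),  # anchor component stays put
}

# Slider at the two extremes: fully assembled (t = 0) vs fully exploded (t = 1).
assembled = {name: exploded_position(b, o, 0.0) for name, (b, o) in components.items()}
exploded = {name: exploded_position(b, o, 1.0) for name, (b, o) in components.items()}
```

Intermediate slider values simply produce intermediate positions, which is what lets the user smoothly open and close the layer.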

2.5. The Robotics Workshop

The 6-h workshop was conducted as part of the Introduction to Industrial Engineering and System Integration (IIESI) course. The workshop included a 2-h introductory lecture and two 2-h practical sessions, all given online because of the social distancing restrictions. The outline of the workshop is presented in Figure 5 and is described below.
The lecture consisted of three parts. The first part briefly introduced the subject area of industrial engineering, the concept of the Fourth Industrial Revolution (Industry 4.0), and autonomous robotics in virtual and augmented realities.
In the second part, we presented the TurtleBot2 mobile robot, the robot operating system ROS, and the RGB-D camera. Then, we provided guidelines on how to prepare for the lab sessions: install the Vuforia View software on the mobile device, run the AR app, and access the instructional resources. After that, we presented the three activities included in the AR experience with TurtleBot2 [31]: disassembling the robot into components, attaching a basket on top of the robot, and replacing the single board computer.
At the beginning of the third part, we introduced the concept of smart transportation and presented the RACECAR MN robot system and its main components: NVIDIA Jetson Nano computer, Intel RealSense depth camera, LIDAR sensor, and PWM motor controller. Then, the AR experience, which included disassembling the robot and learning about its components, was presented. Finally, the guidelines on how to install and use the 3D driving simulator RacecarSim were provided.
The students, who stayed at home, used the Vuforia View app on their smartphones and laptops to set up the augmented workspace we created using Vuforia Studio. In the practical sessions, the students were engaged in experiential learning of two robot systems: TurtleBot2 and RACECAR MN.
The first session offered the augmented reality experience with TurtleBot2 that we developed and first implemented in 2020 [31]. The students first set up their AR workspace using personal mobile devices and laptops. Then they performed exercises specified in the worksheet and answered the quiz questions.
In the first exercise, the students virtually dismantled TurtleBot2, learned its main components, and composed the block diagram of the robot system. Then, the students virtually assembled the robot, composed a to-do list for assembling its motion sub-system, and answered questions related to the robot components and their functions. In the second exercise, the assignment was to attach to the robot a suitable basket container for cargo transportation. The students analyzed the available containers and chose one that could be firmly assembled on top of the robot. The assignment for the third exercise was to replace the Raspberry Pi computer of TurtleBot2 with one from the list of available single board computers (SBCs). The students observed the characteristics of the boards presented in the worksheet, compared them with the characteristics of the Raspberry Pi displayed on the mobile device, and selected the SBC whose characteristics were compatible with those of the Raspberry Pi.
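The SBC replacement task amounts to a compatibility check of each candidate board against the Raspberry Pi’s characteristics. The following sketch illustrates the kind of comparison the students made; the board names, specifications, and compatibility criteria are hypothetical placeholders, not the actual worksheet data:

```python
# Reference characteristics of the board being replaced (illustrative values).
RASPBERRY_PI = {"ram_gb": 1, "usb_ports": 4, "supply_v": 5}

# Hypothetical candidate boards from the worksheet list.
candidates = {
    "board_a": {"ram_gb": 2, "usb_ports": 4, "supply_v": 5},
    "board_b": {"ram_gb": 1, "usb_ports": 2, "supply_v": 12},
    "board_c": {"ram_gb": 0.5, "usb_ports": 4, "supply_v": 5},
}

def compatible(board, reference):
    """A board is compatible if it matches the supply voltage and offers
    at least the reference RAM and number of USB ports."""
    return (board["supply_v"] == reference["supply_v"]
            and board["ram_gb"] >= reference["ram_gb"]
            and board["usb_ports"] >= reference["usb_ports"])

matches = [name for name, spec in candidates.items()
           if compatible(spec, RASPBERRY_PI)]
```

Here only `board_a` passes all three checks, mirroring the students’ task of eliminating boards that fail on any single characteristic.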
The second session implemented our new online lab, composed of experiences with RACECAR MN in virtual and augmented realities. In the virtual experience, the students operated the RacecarSim simulator and used the keyboard to manually navigate the robot and explore the surroundings in different scenarios. Throughout the tasks, the students monitored the outputs of all robot sensors and observed how they changed dynamically during robot movement. Figure 6a depicts the simulator screen in which the virtual racecar traverses the track, while the three monitors on the left side of the screen display the readings of the virtual LIDAR sensor, depth camera, and color camera. The LIDAR monitor presents the robot’s position by a green triangle in the center of the circle, while obstacles in the surroundings are represented by red lines. The depth camera monitor displays an image in which distances to the objects in front of it are represented by different colors.
The learning tasks focused on understanding the data from the virtual robot sensors provided by the simulator. In the first step, the students drove the virtual robot and figured out the maximal velocity indicated by the inertial measurement unit (IMU) in the top right of the simulator screen. Then, they calibrated the LIDAR and depth camera readings, which appear with no scales or units. To calibrate the LIDAR sensor, the students drove the robot at the maximal velocity towards a distant object. They measured the time from the moment when the LIDAR detected the object and displayed the red line seen on top of the LIDAR monitor in Figure 7a until the moment when the robot reached the object (Figure 7f).
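The calibration logic reduces to distance = velocity × time: the distance covered from first detection to contact, divided by the object’s initial on-screen range, gives a scale factor for the unitless LIDAR display. A minimal sketch, with illustrative numbers rather than measured workshop values:

```python
def lidar_scale(v_max_mps, travel_time_s, detection_range_units):
    """Metres per LIDAR display unit: true distance travelled from first
    detection to contact, divided by the object's initial on-screen range."""
    true_distance_m = v_max_mps * travel_time_s
    return true_distance_m / detection_range_units

# Example (invented values): max velocity 2.5 m/s, 4 s from first detection
# to contact, object first drawn 100 display units from the robot marker.
scale = lidar_scale(2.5, 4.0, 100)

def display_to_metres(range_units, scale):
    """Convert any subsequent unitless LIDAR reading to metres."""
    return range_units * scale
```

With this scale in hand, every later on-screen LIDAR range can be converted to metres, which is what made the depth camera color calibration in the next task possible.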
In the augmented experience, the students loaded the virtual model of RACECAR MN on their mobile devices and placed it on the worktable in their physical environments (Figure 6b). The students inspected the virtual robot from different viewpoints, disassembled it, and learned about its main components from virtual tags. We focused learning through AR experiences on the notion of the robot as a system of components, each of which performs a certain function, and which altogether provide the robot functionality. As most students did not have prior knowledge of robotics, the workshop started with learning basic concepts in the lecture, and then the students applied them to construct new understandings during the practical sessions. The experience acquired from the practice with the simpler TurtleBot2 system in the first session helped the students understand the more advanced RACECAR MN robot.
We utilized the three scaffolding techniques noted in Section 1.3. The AR apps together with the simulator provided presentation scaffolds that included a variety of tangible visualizations and opportunities for the students to observe the robot systems from different viewpoints and in different motion conditions. Conceptual scaffolds were given in the lecture, where all the studied concepts were explicitly explained, and in the explanatory pages incorporated in the AR visualizations. Problem-solving scaffolds were offered while the students were engaged in a series of practical problem-solving activities with the robot systems.
The workshop prompted the students to use integrative thinking in different ways. Table 1 lists the workshop tasks and related learning activities, along with the applied integrative thinking skills.
The table indicates that the students applied IT skills, both visual and verbal, when setting up and practicing in the AR environment (task 1), creating block diagrams of the robot system (task 2), determining the order of assembly operations (tasks 3), replacing existing and attaching new robot components (tasks 4 and 5), exploring sensor fusion (task 6) and navigating the robot (task 7).

2.6. Evaluation of Learning Outcomes

We followed up on the workshop and collected data from students’ answers to the two online quizzes that were part of the learning activities directed by the worksheet. The follow-up aimed to examine how the students learned through online practice with robot models in virtual and augmented realities and how they evaluated the contribution of this practice to training their integrative thinking.
The evaluation study involved the 99 first-year students who participated in the workshop. Most of them were 20–24 years old; 60% were female and 40% male. Less than 13% of the students had learned robotics before the course, in school or other settings.

2.6.1. Data Collection

We collected and analyzed qualitative and quantitative data on the workshop outcomes using the mixed method. We also collected students’ evaluations of practices with TurtleBot and RACECAR MN robot systems and compared the evaluations by applying the method of comparative case studies [37]. For data collection, we used two online learning quizzes included in the worksheet: one related to the experience with TurtleBot2, and another related to RACECAR MN. Moreover, the workshop evaluation questionnaire was administered at the end of the workshop, as an online Google Form.
The quiz related to TurtleBot2 was the same as that developed and used in our previous study [31]. It included twelve questions intended to evaluate understanding of the concepts acquired in the workshop and the ability to apply integrative thinking. For example, one of the questions addressed the robot motion system, explored through an AR animation of disassembling/assembling the robot system. The students were asked to compose a to-do list for assembling the motion system of TurtleBot2.
The quiz related to learning with RACECAR MN included ten questions. Questions 1–5 addressed the augmented reality experience, and questions 6–10 referred to robot operation using RacecarSim. In question 1, the students were given an empty block diagram of a robot and asked to specify in each block the name of the actual component of RACECAR MN and indicate by arrows the connections between the robot components. Questions 2–5 evaluated understanding the principles of operation of the main robot components. For example, question 2 was as follows:
In the PWM motor controller of the RACECAR MN:
  • If the on-time of the pulses increases and the duty cycle decreases, then the speed of the motor increases.
  • If the on-time of the pulses decreases and the duty cycle increases, then the speed of the motor increases.
  • If the on-time of the pulses decreases and the duty cycle remains the same, then the speed of the motor decreases.
  • If the on-time of the pulses remains the same and the duty cycle increases, then the speed of the motor decreases.
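The relationship this question probes is that, for a fixed supply voltage, a DC motor’s average drive voltage, and hence roughly its speed, is proportional to the PWM duty cycle (the fraction of each period the signal is on). The following Python sketch illustrates this with a simplified first-order model; the supply voltage `v_supply` and speed constant `k` are hypothetical values, not RACECAR MN parameters.

```python
def duty_cycle(on_time, period):
    """Fraction of each PWM period the signal is high."""
    return on_time / period

def motor_speed(duty, v_supply=12.0, k=100.0):
    """Simplified model: speed is proportional to average voltage,
    which equals supply voltage times duty cycle. k is a hypothetical
    motor constant used only for illustration."""
    return k * v_supply * duty

# With a fixed period, a longer on-time means a higher duty cycle
# and therefore a higher motor speed.
slow = motor_speed(duty_cycle(on_time=2, period=10))
fast = motor_speed(duty_cycle(on_time=8, period=10))
```

In this model, changing the on-time alone does not determine the speed; only the resulting duty cycle does, which is the distinction the quiz options test.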
Question 3 was devoted to the relative efficiency of the central and graphics processing units. Questions 4 and 5 examined understanding of the functionality of the depth camera and the LIDAR sensor. Questions 6–10 related to the tasks performed with the RacecarSim simulator. For example, in Question 8, the students were asked to estimate the distance represented by each of the colors displayed by the depth camera by comparing the colors to known distances determined using the LIDAR sensor (Figure 7).
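The task in Question 8 can be viewed as a simple calibration problem: pair LIDAR distance readings with the depth-camera colors observed at the same points, then estimate the distance an arbitrary color represents by interpolating between the calibrated samples. The Python sketch below uses a hypothetical one-dimensional hue value to stand in for the displayed color; it illustrates the idea only and is not part of the RACECAR MN software.

```python
def calibrate(pairs):
    """pairs: list of (hue, lidar_distance_m) calibration samples.
    Returns the samples sorted by hue for interpolation."""
    return sorted(pairs)

def estimate_distance(hue, table):
    """Estimate distance for a hue by linear interpolation between
    the two nearest calibrated samples; clamp beyond the last one."""
    lo = table[0]
    for hi in table[1:]:
        if hue <= hi[0]:
            frac = (hue - lo[0]) / (hi[0] - lo[0])
            return lo[1] + frac * (hi[1] - lo[1])
        lo = hi
    return table[-1][1]

# Hypothetical calibration: warm hues (low values) = far, cool = near.
table = calibrate([(240, 0.0), (120, 2.0), (0, 4.0)])
```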
In the questionnaire, the students were asked to specify the technological tools they used (smartphone/tablet, Apple/Android, Vuforia View/video substitute). They also shared their reflections on the robots, the AR/VR technologies, and the difficulties in using them. The students evaluated the contribution of the AR/VR experiences to learning about TurtleBot2 and RACECAR MN. They were also asked to evaluate to what extent the practice with each robot helped them to understand it as an integrated system and fostered their integrative thinking. Answers were given on a five-point Likert-style scale, from 1 (‘no contribution’) to 5 (‘high contribution’). The students were also asked to explain their answers.

2.6.2. Data Analysis

Data analysis focused on the research questions of the study. To answer the first research question, we evaluated students’ understanding of robot organization and functionality based on their scores on the two quizzes. To evaluate students’ performance, for each quiz we assigned weights to the questions and developed rubrics for evaluating the answers. For example, the rubric for the question about the block diagram of TurtleBot2 and RACECAR MN (included in each of the two quizzes) evaluated to what extent the diagram correctly presented all the components and the connections among them. In the answers to the question on the TurtleBot2 motion system, we evaluated the presence of all relevant components and the correctness of their order in the assembly sequence. In the answers to the multiple-choice questions related to practice with the virtual RACECAR MN, we checked whether the selected options were correct.
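A scoring scheme of this kind, question weights combined with per-answer rubric scores, can be expressed as a weighted sum. The sketch below is a hypothetical illustration; the paper does not publish the actual weights or rubric values.

```python
def quiz_grade(weights, rubric_scores):
    """Weighted quiz grade on a 0-100 scale.
    weights: per-question weights (e.g., harder questions weigh more);
    rubric_scores: per-answer rubric scores in [0, 1], 1 = fully correct."""
    if len(weights) != len(rubric_scores):
        raise ValueError("one rubric score per question is required")
    total = sum(w * s for w, s in zip(weights, rubric_scores))
    return 100.0 * total / sum(weights)
```

For instance, with hypothetical weights [1, 1, 2] and rubric scores [1.0, 0.5, 1.0], the grade is 87.5.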
Some of the questions in the workshop evaluation questionnaire related to the first research question and addressed students’ reflections on how the VR/AR environment supported understanding of the robot systems. We analyzed the use of hardware and software tools, categorized the difficulties experienced by the students, and summarized their evaluations of the contribution of the online practice to learning about the robots and their components. The students’ evaluations of the current workshop were compared with those of the previous workshops [31].
To answer the second research question, we considered students’ answers to the questions of the workshop evaluation questionnaire in which they evaluated how the learning activities contributed to the development of the targeted integrative thinking skills. Statistical analysis was performed on the responses to the multiple-choice questions, including a test for correlations between the scores on different aspects of the workshop contribution. The qualitative responses were used to corroborate and explain the quantitative results. The results obtained for the two robots were compared.

3. Results

3.1. Workshop Assignments

Regarding the first research question, all the students completed the workshop assignments and answered the questions of the related quizzes. Students’ mean scores on the TurtleBot2 and RACECAR MN quizzes were 92.8 and 94.6, respectively. Lower percentages of correct answers were obtained for the question on the TurtleBot2 motion system in the first quiz (49.4%) and for the RACECAR MN block diagram question in the second quiz (74.5%). Mean scores for all other questions in both quizzes were above 97.

3.2. Technological Tools

The findings regarding the use of hardware and software tools by the students are presented in Figure 8. As shown, 94 students noted their participation in the AR experience in the workshop evaluation questionnaire. Of these, 85% practiced with the AR app through Vuforia View; the rest reported that they failed to run the app for technical reasons and instead watched the video-recorded experience. Among the AR app users, 84% used smartphones and the rest used tablets. Moreover, 53% of the mobile devices ran Apple’s iOS, and the rest, from other manufacturers, ran Android.
The findings indicate that all the students who could not run the AR app and instead watched the video were Android users. Some of the students who succeeded in running the app reported difficulties in using it. Specifically, 27% of the students reported difficulties in “placing” the robot in the workspace and viewing it from different perspectives, as well as problems with panning and zooming the image. Further, 14% of all the students who participated in the workshop reported that they were challenged by online learning of a subject that was completely new to them.

3.3. AR/VR Experience

Despite the difficulties, the students highly appreciated the contribution of the VR/AR practice to understanding the robots and their components: 81% evaluated this contribution as notable, and 46% of them considered it high or very high. The written reflections support the quantitative results. The students noted several advantages of the AR experiences compared to teacher explanations:
“The AR experiences helped us understand the robot structure and components much better than the verbal explanations.”
They acknowledged the opportunities for robot exploration provided by the AR app:
“Using AR, we were able to observe from different angles how the robots are built, just as if we were looking at them in real life. In addition, the explanation given for each component helped to understand the robot better.”
“By observing the robot being disassembled into parts and then being reassembled, we were able to better understand the relationships between the different components and their functions.”
The students also appreciated the practice with the RacecarSim. They wrote:
“Interesting simulation! Through it, we understood the importance of each of the sensors and their meaning.”
“Working on the simulator helped us better understand the different components, such as the laser sensor and the distance camera.”
While appreciating the AR/VR experience, many students noted that it cannot fully substitute for practice with real robots. A recurrent reflection was:
“Augmented reality allowed me to get to know the components well, but if they could be physically handled, connected, and disconnected, that would be a great improvement.”

3.4. Difficulties in AR/VR Experience

Though most of the students succeeded in gaining an understanding of the robot structure and components from the AR experiences, several students reported difficulties that they experienced in this practice. Some of them attributed these difficulties to a lack of technical background. One student wrote:
“I think we need more prior knowledge of the parts that are building the robot. I don’t know about motors, etc., so the issue was difficult for me.”
Certain visualizations were not completely clear to the students for several reasons. One was that the small smartphone screen made it difficult to see details. In this regard, one of the students noted:
“It is difficult to understand on a small screen of a cellphone where each part connects.”
Another difficulty was in observing disassembling and assembling components that had the same color, as indicated in the following reflection:
“It was difficult to understand what each component was because they are all black.”
A further difficulty was that the components’ labels were not presented in the visualizations of assembling and disassembling the robots. A relevant reflection:
“Because the explanatory labels were absent, it was difficult to understand how the parts were connecting.”

3.5. Contribution to Understanding Robot Structure

We compared the evaluations of the workshop contribution to understanding the robot structure and components in the current study with those from the previous studies. This contribution was evaluated as notable by the same percentage of students (81%) in both workshops. However, the percentage of students who evaluated the contribution as high was significantly larger in the current workshop (46%) than in the past workshop (29%).

3.6. Understanding Robots as Integrated Systems

To answer the second research question, we focused on students’ evaluation of how the different activities of the workshop engaged them in applying integrative thinking skills. Students’ evaluations of the workshop are summarized in Table 2. The first column lists the aspects of the workshop contribution. The second and third columns present the percentage of the students who evaluated the workshop contribution to each learning outcome as notable or high. Aspects 1–4 relate to integrative thinking about the studied robot systems, while Aspect 5 addresses the possible integration of AR/VR experiences as experiential activities in learning about robotics in industrial engineering.
Table 3 presents Spearman correlation coefficients calculated to test for a relationship between students’ evaluations of the workshop contribution for understanding the robot structure and components (Aspect 1) and the three other aspects of the contribution (Aspects 2–4). As found, the correlations were relatively strong with p < 0.001.
As mentioned above and as follows from Table 2, most students positively evaluated the contribution of the AR and VR practices to understanding the robot structure and components. From students’ reflections:
“The workshop gave me a lot of practice in integrative thinking.”
“The workshop helped me a lot in understanding the whole subject since I did not have knowledge about robots before. I could see the robot falling apart and how it connects. What each part is and to what category it belongs.”
In addition, 84% evaluated the contribution to understanding the interactions among robot components as notable, and half of the students evaluated this contribution as high. A typical reflection:
“As I observed the decomposition and assembly of the robots, I gained a better understanding of the relationships among the different components and their functions.”
Most students noted that the practices contributed to their understanding of how TurtleBot2 and RACECAR MN operate as integrated systems (77% and 91%, respectively). The Wilcoxon signed-rank test indicated that the scores for understanding RACECAR MN as an integrated system were significantly higher (Z = −4.4, p < 0.001) than those for TurtleBot2.
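The reported Z = −4.4 comes from the normal approximation to the Wilcoxon signed-rank statistic for paired samples. The minimal stdlib sketch below illustrates that approximation (no tie or zero-difference corrections; real analyses should use `scipy.stats.wilcoxon`, and the paired scores here are hypothetical).

```python
import math

def wilcoxon_z(x, y):
    """Normal-approximation Z for the Wilcoxon signed-rank test on
    paired samples x, y. Drops zero differences and assumes no tied
    absolute differences; a teaching sketch, not production code."""
    d = [b - a for a, b in zip(x, y) if b != a]
    n = len(d)
    # rank absolute differences from smallest (rank 1) to largest
    order = sorted(range(n), key=lambda i: abs(d[i]))
    w_plus = sum(rank + 1 for rank, i in enumerate(order) if d[i] > 0)
    mean = n * (n + 1) / 4
    sd = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    return (w_plus - mean) / sd
```

A large positive Z indicates that the second member of each pair tends to score higher; swapping the arguments flips the sign.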
In their reflections, the students wrote that from first-hand experience with the two instructional robots, they came to understand the basic principles of robot construction and operation:
“By experimenting with the robots and exploring most of their parts, we were able to generate more general ideas, such as how larger and more useful robots work, how they are operated, and what their basic parts are. Thus, the private experience with a small number of robots has contributed to understanding general ideas related to other robots.”
“The experience has made me think about the solutions robots can offer in many different areas, and about streamlining the processes that using robots can bring.”

3.7. Student Evaluations Compared

We compared the student evaluations of the workshop given in the current study with those in the previous study [31]. As found, the evaluation of the workshop contribution in the current study was significantly higher than in the previous study. High evaluations of the contribution were given by 46% vs. 29% for understanding the robot structure and components, 40% vs. 14% for understanding the robot as an integrated system, and 68% vs. 45% for understanding the interactions among the components.

4. Discussion

In this paper, we explored the potential for online experiential learning about robot systems in virtual and augmented reality environments, learning that is particularly relevant in times of social distancing. Previously published studies evaluated the knowledge gained through such educational processes but did not consider the integrative thinking skills that are applied and can be fostered by this learning practice. Our study demonstrated that such practice can facilitate understanding of the structure and functionality of robots, expose novice engineering students to novel digital technologies, and provide rich opportunities for applying integrative thinking skills.
The study included technological, pedagogical, and educational research parts. In the technological part, we developed an augmented reality experience with the RACECAR MN robot system. Our motivation was to use it for learning the structure of the system and the functionality of its components. This was required because, under the social distancing restrictions, the students did not have access to the physical robot and practiced only with the RacecarSim simulator, which adequately represents robot kinematics but not its structure.
We coped with three technological challenges. The first was creating an authentic 3D model that represents the whole RACECAR MN system and its components in full detail. The second was creating the AR animation that allows students to isolate each robot subsystem and component, observe them from different viewpoints, and learn about them. The third was creating an interactive cloud-based AR experience for students to explore the robot system.
In the pedagogical part of the study, we developed and conducted a workshop in which the students learned about the TurtleBot2 and RACECAR MN robot systems through augmented and virtual reality experiences. Our motivation was to engage the students in the study of robot systems through online experimentation in VR and AR. We had to cope with two main pedagogical challenges. The first was teaching advanced technological concepts to students, most of whom had no background in robotics. The second was using the VR and AR experiences to teach about real robot systems that could not be accessed physically.
In the educational research part, we examined the learning through online exploration of robot systems in VR and AR and evaluated the contribution of this practice to fostering integrative thinking, based on the students’ feedback. The motivation was to verify whether novice engineering students can understand the structure and functionality of robot systems through VR and AR experiences and whether this practice can foster their integrative thinking skills. The first challenge in conducting this research was that we could not observe student learning directly. The second was that the students used a variety of mobile devices and computers, some of which could not support the developed AR experiences; this diversity affected the learning process and its outcomes. The third challenge was the lack of a widely accepted notion of integrative thinking in the literature, as well as the absence of tools for its assessment in the area of engineering systems.
To answer the first research question, we examined the results of the workshop quizzes and students’ self-evaluations of the learning experience to determine whether they understood the structure and functionality of the studied robot systems. The collected data indicate that most students comprehended the structure and functionality of the robot systems they learned about. The students received high scores on both TurtleBot2 and RACECAR MN knowledge quizzes and highly appreciated the contribution of the experiences with the AR app and the RacecarSim simulator to understanding the structure, components, and functionality of the robots. The students, however, noted that these experiences cannot completely replace actual practice with robots. Some students mentioned difficulties in working with augmented reality related to the lack of technical skills, using incompatible smartphones, and misunderstanding certain visualizations. We will take these comments into account when updating the AR apps.
To answer the second research question, we examined students’ reflections on the use of integrative thinking during the workshop activities. The students highly appreciated the experience gained in learning the structure and functionality of the robots. The high scores for the workshop’s contribution to the understanding of the two robots as integrated systems, and their relatively strong correlation with the overall contribution score, confirm that integrative thinking was a major component of the learning experience. The students applied integrative thinking from the beginning of the learning activities, when they set up a system consisting of a mobile device, a laptop, and the Vuforia View software, and used the AR app to ‘place’ the virtual robot on the real home table. The students systematically analyzed the robot structure, the components, and their interactions, applying integrative thinking skills to make block diagrams of the robot systems. They explored disassembling and assembling the robots, modified them, and performed maintenance operations. In the practice with RacecarSim simulations, the students used integrative thinking to explore the dynamic behavior of the virtual racecar. The high scores given by the students for the usefulness of the workshop experience for studying real engineering systems indicate that through practice with the instructional robots they gained a basic understanding of the architecture and operation of robot systems.
The current version of the workshop differed from the two previously conducted workshops described in Section 2.3. We compared the results of the current and previous workshops in terms of the percentages of students who succeeded in the workshop tasks and who highly evaluated the workshop contributions. The results of the comparison are presented in Table 4.
The first row of Table 4 presents the evaluation of students’ performance on the workshop tasks. Rows 2–4 present the percentages of students who highly evaluated the workshops’ contributions (scores 3–5 on the 5-point scale). The comparison of evaluations indicates that the current version contributes as much as the previous versions.

5. Conclusions

In conclusion, our study demonstrates the feasibility and effectiveness of experiential learning of engineering systems through the exploration of their digital models in augmented and virtual reality environments. There are still some technical difficulties with the approach, but the results are promising. The approach is particularly suitable in conditions of student isolation and social distancing restrictions. Despite the restrictions, the use of AR, VR, and online technologies enabled us to provide students with meaningful practice in learning about the mobile robots. In our study, the knowledge quizzes indicated that through the AR experience the students gained understanding of the robot systems and their components. Through the practice with the RacecarSim simulator, the students developed their robot operation skills and learned to navigate the robot based on information from different sensors.
Our study proposed to foster the integrative thinking skills of first-year engineering students through practice with robot systems and showed a possible approach to such a practice in AR and VR environments. From the theoretical perspective, the study contributes to shaping the concept of student integrative thinking in the context of learning about robot systems. Based on our study, we recommend further research and development of AR experiences and virtual simulations that will combine the experiential learning of engineering systems with the development of generic skills needed for modern engineering.
In further R&D, robotics can serve as a testbed for developing pedagogical strategies for online learning practice with engineering systems, introducing students to innovative digital technologies, supporting the learning of STEM subjects, and developing the cognitive and social skills students need in the age of Industry 4.0.
A decade ago, Alimisis [38] analyzed the state of educational robotics and identified several challenges it faces as a tool to develop students’ cognitive and social skills and support learning STEM subjects. Here we formulate our perception of the new challenges that educational robotics has to meet in the current times of digital transformation and the pandemic:
  • Develop innovative technology-rich learning environments accessible for different types of experiential practice with instructional models of modern robots.
  • Transform learning activities that treat robots as black boxes into ones in which students explore the robot’s architecture and functionality and use it as a platform for making.
  • Prioritize learning activities that foster the cognitive and social skills required for the use and development of modern engineering systems.
  • Develop new programs in educational robotics based on the understanding of its central role in preparing students for life in the modern technological world.
  • Make the assessment of learning outcomes, based on students’ progress in both knowledge and skills, an integral part of educational robotics programs.
Our intentions for the robotics workshop and the educational study, as described in Section 2.2, reflect our endeavor to meet these challenges. We call for wider involvement in addressing the new challenges in educational robotics programs and research.

Author Contributions

Conceptualization and methodology, I.V. and D.C.; software, S.G.; validation, A.P.; formal analysis, I.V. and A.P.; investigation, D.C., S.G. and I.V.; data curation, A.P., S.G. and H.P.-V.; writing—original draft preparation, I.V., D.C. and H.P.-V.; editing, I.V., D.C. and A.P.; supervision, I.V.; project administration, I.V. and D.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by PTC Inc. Grant 2022409 and the Technion Autonomous Systems Program Grant 86600233.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Institutional Behavioral Sciences Research Ethics Committee of the Technion–Israel Institute of Technology (Approval no. 2020-103, 8 December 2020).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study. The goal and method of our study were explained to the participating students in advance. The participants were first-year engineering students from the Technion Faculty of Industrial Engineering and Management. The participants’ identities were kept anonymous in all our publications; no information was disclosed that could be used to identify any specific individual. The researchers collected the research data with the consent of the course lecturer and with the permission of the faculty. All the authors of the manuscript agreed with its content, gave explicit consent to submit it, and obtained consent from the responsible authorities at the Technion—Israel Institute of Technology, where the work was carried out.

Data Availability Statement

Data and materials will not be published.

Conflicts of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

References

  1. Huber, A.M.; Waxman, L.K.; Dyar, C. Using systems thinking to understand the evolving role of technology in the design process. Int. J. Technol. Des. Educ. 2022, 32, 447–477. [Google Scholar] [CrossRef]
  2. Zhou, N.N.; Deng, Y.L. Virtual reality: A state-of-the-art survey. Int. J. Autom. Comput. 2009, 6, 319–325. [Google Scholar] [CrossRef]
  3. Gorman, D.; Hoermann, S.; Lindeman, R.W.; Shahri, B. Using virtual reality to enhance food technology education. Int. J. Technol. Des. Educ. 2022, 32, 1659–1677. [Google Scholar] [CrossRef] [PubMed]
  4. Hoenig, W.; Milanes, C.; Scaria, L.; Phan, T.; Bolas, M.; Ayanian, N. Mixed reality for robotics. In Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, 28 September–2 October 2015. [Google Scholar]
  5. Song, P.; Yu, H.; Winkler, S. Vision-based 3D finger interactions for mixed reality games with physics simulation. In Proceedings of the ACM SIGGRAPH International Conference on Virtual Reality Continuum and Its Applications in Industry, Singapore, 8–9 December 2008. [Google Scholar]
  6. Ozdemir, M.; Sahin, C.; Arcagok, S.; Demir, M.K. The effect of augmented reality applications in the learning process: A meta-analysis study. Eurasian J. Educ. Res. 2018, 18, 165–186. [Google Scholar] [CrossRef]
  7. Thees, M.; Kapp, S.; Strzys, M.P.; Beil, F.; Lukowicz, P.; Kuhn, J. Effects of augmented reality on learning and cognitive load in university physics laboratory courses. Comput. Hum. Behav. 2020, 108, 106316. [Google Scholar] [CrossRef]
  8. Altmeyer, K.; Kapp, S.; Thees, M.; Malone, S.; Kuhn, J.; Brünken, R. The use of augmented reality to foster conceptual knowledge acquisition in STEM laboratory courses—Theoretical background and empirical results. Br. J. Educ. Technol. 2020, 51, 611–628. [Google Scholar] [CrossRef]
  9. Singh, G.; Mantri, A.; Sharma, O.; Dutta, R.; Kaur, R. Evaluating the impact of the augmented reality learning environment on electronics laboratory skills of engineering students. Comput. Appl. Eng. Educ. 2019, 27, 1361–1375. [Google Scholar] [CrossRef]
  10. AlNajdi, S.; Alrashidi, M.; Almohamadi, K. The effectiveness of using augmented reality (AR) on assembling and exploring educational mobile robot in pedagogical virtual machine. Interact. Learn. Environ. 2020, 28, 964–990. [Google Scholar] [CrossRef]
  11. Borrero, A.; Márquez, J. A pilot study of the effectiveness of augmented reality to enhance the use of remote labs in electrical engineering education. J. Sci. Educ. Technol. 2012, 21, 540–557. [Google Scholar] [CrossRef]
  12. Verner, I.; Cuperman, D.; Polishuk, A. Inservice teachers explore RACECAR MN in physical and augmented environments. In Proceedings of the 2022 17th Annual System of Systems Engineering Conference (SOSE), Rochester, NY, USA, 7–11 June 2022; pp. 228–230. [Google Scholar]
  13. TurtleBot2. Open-Source Robot Development Kit. Available online: https://www.turtlebot.com/turtlebot2/ (accessed on 28 August 2022).
  14. MITLL RACECAR-MN. Available online: https://mitll-racecar-mn.readthedocs.io/en/latest/ (accessed on 28 August 2022).
  15. Wang, X.; Mayer, R.E.; Zhou, P.; Lin, L. Benefits of interactive graphic organizers in online learning: Evidence for generative learning theory. J. Educ. Psychol. 2021, 113, 1024–1037. [Google Scholar] [CrossRef]
  16. Clark, D.; Linn, M.C. Designing for knowledge integration: The impact of instructional time. J. Learn. Sci. 2003, 12, 451–493. [Google Scholar] [CrossRef]
  17. Ausubel, D.P. A subsumption theory of meaningful verbal learning and retention. J. Gen. Psychol. 1962, 66, 213–224. [Google Scholar] [CrossRef] [PubMed]
  18. De Weck, O.L.; Roos, D.; Magee, C.L. Engineering Systems: Meeting Human Needs in a Complex Technological World; MIT Press: Cambridge, MA, USA, 2011; pp. 168–184. [Google Scholar]
  19. Shen, Z.; Zhu, Y. Complex engineering system learning through study of engineering failure cases using 3D animations. In Proceedings of the ASEE 2011 Annual Conference and Exposition, Vancouver, BC, Canada, 26–29 June 2011. [Google Scholar]
  20. Schörger, D.; Sewchurran, K. Towards an interpretive measurement framework to assess the levels of integrated and integrative thinking within organizations. Risk Gov. Control Financ. Mark. Inst. 2015, 5, 44–66. [Google Scholar]
  21. Kallio, E. Integrative thinking is the key: An evaluation of current research into the development of adult thinking. Theory Psychol. 2011, 21, 785–801. [Google Scholar] [CrossRef]
  22. Martin, R.L. The Opposable Mind: How Successful Leaders Win through Integrative Thinking; Harvard Business School Publishing: Boston, MA, USA, 2009. [Google Scholar]
  23. Tynjälä, P.; Kallio, E.K.; Heikkinen, H.L. Professional expertise, integrative thinking, wisdom, and phronesis. In Development of Adult Thinking, 1st ed.; Routledge: London, UK, 2020; pp. 156–174. [Google Scholar]
  24. Qadir, J.; Yau, K.L.A.; Imran, M.A.; Al-Fuqaha, A. Engineering education, moving into 2020s: Essential competencies for effective 21st century electrical & computer engineers. In Proceedings of the 2020 IEEE Frontiers in Education Conference (FIE), Uppsala, Sweden, 21–24 October 2020. [Google Scholar]
  25. Quigley, M.; Conley, K.; Gerkey, B.; Faust, J.; Foote, T.; Leibs, J.; Wheeler, R.; Ng, A.Y. ROS: An open-source robot operating system. In Proceedings of the ICRA Workshop on Open-Source Software, Kobe, Japan, 12–13 May 2009. [Google Scholar]
  26. Pasek, Z.J. Helping engineers develop and exercise creative muscles. In Proceedings of the Canadian Engineering Education Association Conference (CEEA), Toronto, ON, Canada, 4–7 June 2017. [Google Scholar]
  27. Berlin, N.; Tavani, J.L.; Beasançon, M. An exploratory study of creativity, personality, and schooling achievement. Educ. Econ. 2016, 24, 536–556. [Google Scholar] [CrossRef]
  28. Malik, A.; Setiawan, A. The development of higher order thinking laboratory to improve transferable skills of students. In Proceedings of the 2015 International Conference on Innovation in Engineering and Vocational Education, Bandung, Indonesia, 14 November 2015. [Google Scholar]
  29. Asok, D.; Abirami, A.M.; Angeline, N.; Lavanya, R. Active learning environment for achieving higher-order thinking skills in engineering education. In Proceedings of the 2016 IEEE 4th International Conference on MOOCs, Innovation and Technology in Education (MITE), Innovation, and Technology in Education (MITE), Madurai, India, 9–10 December 2016. [Google Scholar]
  30. Rawat, K.S.; Massiha, G.H. A hands-on laboratory-based approach to undergraduate robotics education. In Proceedings of the IEEE International Conference on Robotics and Automation, New Orleans, LA, USA, 26 April–1 May 2004. [Google Scholar]
  31. Cuperman, D.; Verner, I.; Perez, H.; Gamer, S.; Polishuk, A. Fostering integrative thinking through an online AR-based robot system analysis. In Proceedings of the World Engineering Education Forum, Madrid, Spain, 15–18 November 2021. [Google Scholar]
  32. Ortega, P.E.; Lagoudas, M.Z.; Froyd, J.E. Overview and comparison of assessment tools for integrative thinking. In Proceedings of the 2017 ASEE Annual Conference & Exposition, Columbus, OH, USA, 24–28 June 2017. [Google Scholar]
  33. Kraiger, K.; Ford, J.K.; Salas, E. Application of cognitive, skill-based, and affective theories of learning outcomes to new methods of training evaluation. J. Appl. Psychol. 1993, 78, 311–328. [Google Scholar] [CrossRef]
  34. Verner, I.M.; Gamer, S. Robotics laboratory classes for spatial training of novice engineering students. Int. J. Eng. Educ. 2015, 31, 1376–1388. [Google Scholar]
  35. Verner, I.; Cuperman, D.; Gamer, S.; Polishuk, A. Exploring affordances of robot manipulators in an introductory engineering course. Int. J. Eng. Educ. 2020, 36, 1691–1707. [Google Scholar]
  36. Chen, S.; Fishberg, A.; Shimelis, E.; Grimm, J.; van Broekhoven, S.; Shin, R.; Karaman, S. A Hands-on Middle-School Robotics Software Program at MIT. In Proceedings of the IEEE Integrated STEM Education Conference, Princeton, NJ, USA, 1 August 2020. [Google Scholar]
  37. Goodrick, D. Comparative Case Studies; Methodological Briefs: Impact Evaluation 9; United Nations Children’s Fund (UNICEF): Florence, Italy, 2014; pp. 1–17. [Google Scholar]
  38. Alimisis, D. Educational robotics: Open questions and new challenges. Themes Sci. Technol. Educ. 2013, 6, 63–71. [Google Scholar]
Figure 1. The RACECAR MN.
Figure 2. The robot’s CAD models: (a) The chassis CAD model; (b) RACECAR MN virtual model.
Figure 3. Model main view: (a) Labels of main components; (b) Labels of peripherals; (c) No labels.
Figure 4. Layer views: (a) Top layer; (b) Middle layer fully exploded; (c) Exploratory page.
Figure 5. Workshop outline.
Figure 6. The robot: (a) In virtual reality; (b) In augmented reality.
Figure 7. The color scale of the depth camera readings at different distances: (a) 4 m, (b) 3 m, (c) 2 m, (d) 1 m, (e) 0.5 m, (f) 0 m.
Figure 8. Use of technological tools.
Table 1. Workshop tasks, learning activities, and applied integrative thinking skills.
| Task | Students’ Activities | Applications of IT |
| Set up a personal AR workspace | Placing the virtual robot on the home table using the mobile device screen | Creating an integrated view of real and virtual objects |
| Make a block diagram of the robot system | Exploring the robot structure, components, and their interactions | Creating a concept map of the robot system |
| Set the robot motion system assembly order | Determining the assembly sequence for the robot motion system | Creating a visual representation of an assembly |
| Replace the on-board computer of the robot | Replacing the robot computer with a selected alternative one | Selecting an item by analysis of its technical characteristics |
| Attach a container to the robot | Upgrading the robot system by attaching a suitable container | Selecting an item by analysis of its shape and dimensions |
| Explore robot sensors and their fusion | Measuring distances in the simulated environment using the robot sensors | Creating a workspace image based on multi-sensor data |
| Navigate the robot to avoid obstacles | Determining the path and speed of robot motion using sensor fusion | Dynamic integration of spatial and kinematics data |
Table 2. Students’ evaluations of the workshop contribution (%).
| Aspects of the Contribution | Notable (%) | High (%) |
| Understanding the robot structure and its components | 81 | 46 |
| Understanding TurtleBot2 as an integrated system | 77 | 40 |
| Understanding RACECAR MN as an integrated system | 91 | 60 |
| Understanding the interactions among robot components | 84 | 51 |
| Experience that can be used for studying real robot systems | 90 | 68 |
Table 3. Spearman’s correlations among the aspects of the contribution.
| Aspects of the Contribution | Understanding TurtleBot2 as an Integrated System | Understanding RACECAR MN as an Integrated System | Understanding the Interactions among Robot Components |
| Understanding the robot structure and its components | 0.657 | 0.584 | 0.398 |
Table 4. The students who succeeded in the workshop and highly evaluated its contributions (%).
| | First Workshop | Second Workshop | Third Workshop |
| Successfully performed the task | 100% | 86% | Over 90% |
| Contribution to learning the subject | – | – | 83% (Robotic manipulations); 81% (Mobile robot system) |
| Contribution to learning about robotics in IE | 78% (Robot-manipulators) | 82% (Robot-manipulators) | 90% (Mobile robots) |
| Contribution to training the skill | 46% (Spatial skills) | 85% (Spatial skills) | Over 80% (Integrative thinking skills) |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Verner, I.; Cuperman, D.; Perez-Villalobos, H.; Polishuk, A.; Gamer, S. Augmented and Virtual Reality Experiences for Learning Robotics and Training Integrative Thinking Skills. Robotics 2022, 11, 90. https://doi.org/10.3390/robotics11050090
