Article

Fostering Performance in Hands-On Laboratory Work with the Use of Mobile Augmented Reality (AR) Glasses

1 Centre for University Teaching and Learning (HYPE), University of Helsinki, 00014 Helsinki, Finland
2 Division of Pharmaceutical Biosciences, Faculty of Pharmacy, University of Helsinki, 00014 Helsinki, Finland
3 Division of Pharmaceutical Chemistry and Technology, Faculty of Pharmacy, University of Helsinki, 00014 Helsinki, Finland
* Author to whom correspondence should be addressed.
Educ. Sci. 2021, 11(12), 816; https://doi.org/10.3390/educsci11120816
Submission received: 12 October 2021 / Revised: 26 November 2021 / Accepted: 10 December 2021 / Published: 16 December 2021

Abstract

The learning of laboratory skills is essential in science education, but students often get too little individual guidance in this area. Augmented reality (AR) technologies are a promising tool to tackle this challenge and promote students’ high-level learning and performance in science laboratories. Thus, the purpose of this study was (1) to design an AR-assisted learning environment to support individual knowledge construction, (2) to investigate students’ learning processes and learning outcomes and (3) to examine the usability of the system. Pharmacy students (n = 16) were assigned to experimental (n = 10) and control (n = 6) groups and performed the same laboratory work together with pre- and post-tests. The experimental group worked with AR glasses that provided additional support and timely guidance during the work with additional info-screens, questions related to choosing correct reagents and laboratory tools and think-aloud questions, whereas the control group worked in a traditional laboratory context. The results showed that AR was more effective in fostering performance in the science laboratory compared to traditional laboratory instruction and prevented most of the mistakes. The AR group considered the guidance and feedback provided by AR to be beneficial for their learning. However, no apparent differences were found in tasks measuring students’ understanding of the content knowledge. Thus, an AR environment embedded with supportive tools could partly replace the teacher in science teaching laboratories by providing individual and timely guidance for the students.

1. Introduction

Augmented reality (AR), mixed reality (MR) and virtual reality (VR) technologies have begun to be used more frequently in education in recent years, and they are likely to have a considerable influence on education [1]. However, most previous studies have focused mainly on technical approaches, and often only superficial learning outcomes have been investigated. In educational settings, AR refers to a learning environment that bridges and enriches the real world with digital components, such as digital texts, pictures, learning tasks, animations or other objects in real time, with the aim of supporting learning [2]. The literature presents numerous potential benefits and advantages of using AR-based instruction [3,4,5]. AR applications have been reported to facilitate learning in several ways: they illustrate content and make it concrete by, for example, visualizing abstract 3D structures [6]; increase students’ motivation [7]; and engage students in scientific thinking and argumentation [8]. On the other hand, it has been argued that AR imposes an extra cognitive load on students and causes usability problems [7], leading, in fact, to weaker learning results [9]. However, as stated in a previous review [3], rather superficial learning outcomes, such as fact memorizing, have typically been investigated. Additionally, only a limited number of educational contents utilizing innovative technology have been developed to support the learning of laboratory skills in the higher education context, for example in pharmacy [10].
Most previous research has focused on developing and studying technical solutions (usually AR implemented on smartphones) from usability and technology points of view, whereas the underlying learning theories on which the solutions are based have been mostly ignored [5]. With little research evidence available, instructional design decisions are often made based on practical or economic considerations rather than evidence-based arguments [9]. On the other hand, in studies that focus specifically on learning processes and outcomes, the development of the technical solutions has been investigated less rigorously. Studies that aim to improve these aspects hand in hand are scarce. We still lack systematic, empirical research evidence on how AR (or MR/VR) technology could and should be implemented in education to support high-level learning [9,11,12]. Thus, there is a clear need to design, develop and explore pedagogically meaningful AR applications and solutions that aim to support students’ individual learning processes of practical skills, for example in the laboratory environment.

1.1. AR Assisted Learning of Laboratory Skills in Higher Education

To enhance learning at universities, instruction must focus on the development of adequate knowledge structures and skills in students; this means that teaching, training, coaching, modelling and practicing should adapt to the actual knowledge organization of the student and provide opportunities to practice authentic tasks that will be faced later in working life [13]. However, the challenge for higher education is that students typically study large amounts of theoretical knowledge but have few opportunities to practice linking this theoretical knowledge with practical skills. For example, laboratory practicals are an essential part of science education, yet they are expensive, time-consuming and require substantial teaching resources, posing a challenge for many universities. These constraints restrict students’ opportunities to practice and learn from their mistakes and may even lead to students graduating from university ill-prepared for working life. However, the use of a cutting-edge learning technology, such as AR, in science labs offers opportunities for students to practice the necessary hands-on skills that are otherwise difficult, or even impossible, to practice sufficiently during their studies.
Research on technology use in higher education science laboratories has yielded positive outcomes [14,15,16]. For example, a study [17] revealed that AR technology improved students’ laboratory skills and helped students to acquire more positive attitudes to laboratory work in higher education. It has generally been noted that instructional strategies should be shifted towards promoting an active role for students in the teaching/learning process, and here, AR technology offers potential learning affordances [3]. Combining virtual and physical labs has the potential to promote science learning, because this allows students to make use of virtual elements and receive adaptive and personal guidance, support and feedback in a timely fashion during the processing of the task without being separated from the real laboratory environment and equipment [18,19,20,21].
Studies have also shown that AR technology may enhance educational outcomes [18] and students’ practical laboratory skills in higher education [17,22]. AR has been used successfully to engage students in scientific thinking, including argumentation, by pushing students to develop and argue scientific explanations in gamified learning environments [8]. AR has also been shown to increase students’ motivation, to foster their learning, and to improve performance in physical tasks [7,23]. However, the cognitive outcome evaluated in most quantitative studies is knowledge acquisition, utilizing questionnaires and surveys [1,3]. Therefore, more research is needed on how AR-based tools should be designed to promote the development of high-level learning, such as problem-solving and skill-based outcomes, and how to assess these outcomes [3,24].

1.2. Designing a Pedagogically Meaningful AR Environment

In order to design pedagogically meaningful technologies and learning environments, it is essential to define the learning theoretical foundation on which the development and decisions have been based, but this is often neglected [5]. In this study, we apply constructivist learning theory, according to which learning means the active processing of information and linking it to the learner’s prior knowledge base. Thus, learning processes are subjective and require active knowledge construction by the students, as well as timely individual guidance [25]. Therefore, instruction should adapt to the purposes of different learners and support students’ self-regulation and problem-solving skills effectively.
Effective scaffolding mechanisms that assist students to actively regulate their own learning processes are beneficial in promoting the learning of scientific content and methods [3]. Furthermore, training activities that allow students to make errors, together with the provision of timely and personal feedback during the processing, help create an adequate and flexible knowledge base [26,27,28]. Feedback supports students to monitor and evaluate their own learning, allowing them to succeed in evolving circumstances and adaptive problem-solving situations [29]. AR can potentially foster students’ learning of laboratory skills by providing adaptive instruction and guidance, well-timed feedback and opportunities to obtain extra support when needed. However, the potential of AR to facilitate learning with high-quality feedback is barely mentioned in previous studies.
To summarize, AR applications have the potential to support learning in science laboratories by providing students with timely and personal support, guidance and feedback. However, empirical research related to such AR applications is scarce.

1.3. Aims

We designed an AR-assisted learning environment embedded with aids that provide individual adaptive guidance, support and feedback to promote students’ active and individual knowledge construction and learning of hands-on skills during a laboratory practical. We examined whether the AR-assisted environment, embedded with supportive tools, can promote students’ high-level learning and guide students’ performance in the laboratory more effectively than a traditional learning environment. Students’ opinions of AR technology, as well as their user experiences regarding the use of AR glasses, are also subjects of interest.
The research questions of this study are:
  • How does the AR-assisted learning environment, compared with a traditional learning environment, support university students’ learning of content knowledge?
  • How does the AR-assisted learning environment, compared with a traditional learning environment, guide university students’ successful performance in the learning of laboratory skills?
  • What are students’ user experiences of working with the innovative AR-assisted learning environment?

2. Materials and Methods

2.1. Participants

A total of 16 second-year pharmacy students (women n = 11, men n = 5) participated in the study in November–December 2019 at the University of Helsinki. The participants were randomly divided into experimental (n = 10) and control (n = 6) groups. Participation in the study was voluntary, and informed consent was obtained from the participants. The study was conducted according to the ethical regulations of the university.

2.2. Context—The Laboratory Work

The laboratory work was carried out during a laboratory course focusing on microbiology and asepsis. AR technology was implemented during a laboratory task of antimicrobial susceptibility testing using the broth microdilution method. The susceptibility testing was carried out on a 96-well plate with the addition of resazurin dye as a cell viability indicator. The aim of the laboratory work was to determine the minimum inhibitory concentration (MIC) of an antimicrobial substance against the tested bacteria. This particular task was chosen because it contains steps and elements that are new to most students in their second year of pharmacy studies, highlighting the need for further guidance and instant feedback.
Susceptibility testing is one of the most demanding laboratory tasks in the entire course due to its novelty to the students and its requirement for aseptic technique. Typically, with all the preparations, it takes approximately 1.5 h for students to carry out the susceptibility testing.
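The read-out logic behind this task can be made concrete with a short sketch. The following Python snippet is not part of the study materials; it only illustrates, with made-up stock concentration and plate readings, how a ten-fold dilution series (cf. Table 2) and the resazurin color read-out (blue = no growth, pink = growth; cf. Table 1) determine the MIC as the lowest concentration that still inhibits growth.

```python
# Minimal sketch (not the course protocol): determining a MIC from a
# resazurin-based broth microdilution read-out. Concentrations and
# well readings below are illustrative values only.

def dilution_series(stock_mg_per_ml: float, steps: int, factor: float = 10.0):
    """Ten-fold serial dilution concentrations, highest first."""
    return [stock_mg_per_ml / factor**i for i in range(steps)]

def minimum_inhibitory_concentration(concentrations, colors):
    """MIC = lowest concentration whose well stays blue (no bacterial growth).

    `colors` holds the resazurin read-out per concentration:
    'blue' = no growth, 'pink' = growth.
    """
    inhibitory = [c for c, color in zip(concentrations, colors) if color == "blue"]
    return min(inhibitory) if inhibitory else None  # None: no tested concentration inhibited growth

if __name__ == "__main__":
    concs = dilution_series(stock_mg_per_ml=1.0, steps=6)       # 1.0, 0.1, 0.01, ... mg/mL
    readout = ["blue", "blue", "blue", "pink", "pink", "pink"]   # hypothetical plate reading
    print(f"MIC = {minimum_inhibitory_concentration(concs, readout)} mg/mL")
```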
The AR environment provided a step-by-step electronic protocol with various interactive elements to foster the experimental group students’ performance of the task by providing timely support and feedback and by ensuring that the students were working with the correct materials and preparing the correct solutions. The control group worked with traditional paper instructions.

2.3. Measures

2.3.1. Pre-and Post-Tests of Content Knowledge

Before and after the laboratory work, the students in both the experimental and control groups answered six open-ended questions without the help of course material, measuring their understanding of the content knowledge underlying the laboratory work (Table 1) (e.g., Why do you prepare dilutions with different concentrations from the antimicrobial substance? What is the purpose of the positive and negative control?). In the post-test after the work, the students had the opportunity to complete and correct their pre-test answers.

2.3.2. Questionnaire on Usability of the AR Environment

To measure the usability of the AR equipment, a five-point Likert scale (1 = strongly disagree to 5 = strongly agree) surveyed the experimental group students’ opinions on whether the AR environment provided guidance and feedback in a timely manner that supported learning (see Appendix A). The questionnaire was filled in after the laboratory work. In an open question, the students had an opportunity to comment on the overall user experience (such as comfort or any inconvenience they encountered with the new technology).

2.4. AR Apparatus and Environment

The AR system consisted of interactive software developed in multidisciplinary collaboration between researchers from the University of Helsinki and Sciar Company Ltd. (Finland). Vuzix® augmented reality smart glasses (Vuzix M300XL) were integrated with laboratory goggles (Figure 1). The AR glasses provide a natural human–machine interface, with the ability to present necessary information right in front of the user’s eye, in parallel with the observation of the real world, and to communicate with the system through simple gestures. The equipment was controlled via an external wireless keyboard or buttons integrated in the equipment.
The AR environment was designed to provide a step-by-step electronic interactive laboratory protocol in the student’s field of vision. Several tools were embedded to guide and regulate the laboratory work, refining the traditional instructions on paper. During the laboratory work, the AR environment prompted the student to perform various tasks to ensure that the student was working with the correct materials, preparing the correct solutions and simultaneously promoting the student’s learning via feedback and extra guidance (Table 2). The AR environment presented the student with (A) gate questions (n = 4) to pause the work until the student could respond with the correct answers and continue with the work. The gate questions checked whether the student had made the correct calculations before proceeding with the work. The student also had to choose the correct reagents and laboratory tools. The reagents and tools, most critical for the successful laboratory work performance, were tagged with (B) Quick Response (QR) codes (n = 11). In order to avoid leading the student’s selection, some incorrect or unnecessary tools and reagents were marked with inactive QR codes. The student’s choices of reagents and tools were recorded with a QR code reader that was implemented within the AR equipment to eliminate further mistakes from the student. The QR check would indicate whether the reagent or equipment chosen for the current task was correct or incorrect. If the student’s selection was wrong, they had to find the correct one before the AR system would allow them to proceed. Additionally, the system provided additional (C) info-screens (n = 8) and (D) feedback (n = 7) throughout the work. The info-screens contained further information about the specific steps of the work, or contained hints regarding gate questions. The feedback system was designed to update the student on the progress of the work and to make uplifting comments to motivate the student. The AR protocol also provided (E) think aloud tasks (n = 8). The think aloud tasks presented students with questions about the test, encouraging them to think about the test in greater detail and to reflect upon their work with knowledge previously learned during the course. The answers to the think aloud tasks were recorded by an external (GoPro) action camera. During the laboratory work, the AR equipment generated a digital log about the student’s answers to the gate questions, as well as their choices of reagents and tools. From the digital log, it could be seen which info-screens had been visited, and how often.
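The article does not publish the AR software itself, so the following is only a hypothetical sketch (in Python) of how the guidance elements described above could be wired together: gate questions and QR checks block progress until the correct input is given, info-screens offer hints instead of ready answers, and every interaction is written to a log. All step texts, expected answers and identifiers are illustrative and are not taken from the actual Sciar/Vuzix implementation.

```python
# Hypothetical sketch of the step logic described above (gate questions,
# QR checks, info-screens, feedback). This is NOT the study's software;
# step texts, expected answers and QR payloads are illustrative only.
from dataclasses import dataclass, field

@dataclass
class Step:
    prompt: str
    expected: str | None = None      # expected gate answer or QR payload; None = non-blocking step
    kind: str = "info"               # "gate", "qr", "info", "feedback", "think_aloud"
    info_screen: str | None = None   # optional hint shown on request or after a wrong answer

@dataclass
class ARProtocol:
    steps: list[Step]
    log: list[dict] = field(default_factory=list)

    def run_step(self, step: Step, user_input: str | None = None) -> bool:
        """Return True when the student may proceed to the next step."""
        if step.kind in ("gate", "qr"):
            ok = user_input == step.expected
            self.log.append({"prompt": step.prompt, "input": user_input, "correct": ok})
            if not ok and step.info_screen:
                print(f"Hint: {step.info_screen}")   # extra guidance instead of the answer
            return ok                                 # block until the correct answer / QR code
        self.log.append({"prompt": step.prompt, "shown": True})
        return True                                   # info, feedback and think-aloud steps never block

# Illustrative usage: a gate question followed by a QR check of the reagent.
protocol = ARProtocol(steps=[
    Step("How many mL of stock solution do you add to the broth?", "0.5", "gate",
         info_screen="Recall that each dilution is ten times more dilute."),
    Step("Scan the QR code of the reagent for this phase.", "AMPICILLIN_STOCK_1.0", "qr"),
    Step("You are progressing well! Only one step to go.", kind="feedback"),
])
assert not protocol.run_step(protocol.steps[0], "5")   # wrong answer: work pauses, hint shown
assert protocol.run_step(protocol.steps[0], "0.5")     # correct: the student may proceed
```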
The control group received paper instructions for the test. These instructions included all of the information needed to conduct the test correctly and could be consulted during the test, but the students in the control group did not receive any extra guidance.

2.5. Procedure

A quasi-experimental pre-test/post-test with a control group design was employed. The experimental group carried out the laboratory work with the aid of mobile AR glasses. The control group worked in a traditional laboratory environment with paper instructions. The design and procedure of the study are presented in Figure 2.
All participants first answered a set of open-ended tasks measuring their understanding of the content knowledge related to the laboratory work. Following this, the students of the experimental group were shown a short video introducing the AR system and were given a demonstration of how to operate it. The students could practice using the AR system and its functions on their own for a while. When the students felt comfortable with the equipment, they were instructed to begin the laboratory work and proceed according to the AR environment. The students in the control group were instructed to accomplish the task using traditional paper instructions. The work of both the experimental and control groups was recorded on video with an action camera to assist in detecting mistakes and verifying students’ actions during the work. After the laboratory work, the students completed the same open-ended tasks and had the opportunity to complete and correct their pre-test answers. The students of the experimental group were also asked to fill in the AR usability evaluation form with both Likert-scale items and open-ended questions. The participants were not allowed to ask the teacher or other students for help related to the performance of the work. The teacher could intervene if the safety of the student was at risk or if the actions of the student might have substantially harmed other students in the laboratory.

2.6. Data Analysis

2.6.1. Analysis of Pre-Test/Post-Test Content Knowledge

An expert (the teacher responsible for the laboratory course, with experience in susceptibility testing) evaluated and scored the pre-test and post-test answers on a scale of 0–5 using the following criteria: 0 points—the question has not been answered or the answer is incorrect; 1 point—the answer has major deficiencies and/or is in major part incorrect or unclear; 2 points—exceeds score 1 but does not reach score 3; 3 points—the answer is correct and presented with sufficient accuracy and understandability; 4 points—exceeds score 3 but does not reach score 5; 5 points—the answer is detailed and without errors.
To reduce bias, the scores were re-evaluated by a group of experts consisting of the authors and a student, all with experience in susceptibility testing. The re-evaluation revealed a few cases where the experts disagreed slightly with the points given. In these cases, the evaluation was discussed until a consensus was reached.

2.6.2. Analysis of the Performance during the Laboratory Work

The students’ performance was analyzed by examining the recorded videos and AR logs (the AR logs of two participants failed). Mistakes made during the process by the control group and the AR-assisted group were identified and categorized as critical or non-critical, with a note on whether the mistake was preventable or non-preventable by the AR system. Mistakes were categorized as critical if they disrupted the work or had a negative impact on the result of the laboratory work. Non-critical mistakes would allow the work to succeed but carried a risk of failure: for example, the incorrect use of equipment would increase the risk of failure. Additionally, it was noted whether the AR system prevented or corrected a mistake the student was about to make. We also counted how many times the students in the experimental group opened additional info-screens during the test.
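As an illustration of this categorization, the following sketch (with made-up annotations, not the study’s data) shows how observed mistakes could be tallied by group, criticality and AR-preventability.

```python
# Hypothetical sketch of the mistake tally used in the video/log analysis:
# each observed mistake is annotated as critical/non-critical and
# AR-preventable/non-preventable, then counted per group.
# The annotations below are made-up examples, not the study's data.
from collections import Counter
from typing import NamedTuple

class Mistake(NamedTuple):
    group: str              # "AR" or "control"
    critical: bool          # would it disrupt the work or spoil the result?
    prevented_by_ar: bool   # did (or would) the AR system intercept it?

observed = [
    Mistake("AR", critical=True, prevented_by_ar=True),     # e.g., wrong reagent caught by QR check
    Mistake("AR", critical=False, prevented_by_ar=False),   # e.g., unorthodox use of a pipetting reservoir
    Mistake("control", critical=True, prevented_by_ar=False),
]

counts = Counter((m.group, "critical" if m.critical else "non-critical") for m in observed)
prevented = sum(m.prevented_by_ar for m in observed if m.group == "AR")
print(counts)
print(f"Mistakes intercepted by the AR system: {prevented}")
```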

2.6.3. Analysis of Usability of the AR Environment

The frequencies of the students’ answers for the Likert-scale measure of user experience were calculated. The answers for the open-ended question concerning the AR environment were categorized as positive and negative experiences, and citations are presented in the results section.

3. Results

3.1. Content Knowledge in Pre- and Post-Tests

The students’ content knowledge was evaluated with pre- and post-tests before and after the experiment. The students of the experimental group performed somewhat better in the pre-test compared to the students of the control group (Table 3). Students in both groups were able to improve their performance in the post-test compared to the pre-test. The final score of the experimental group was slightly higher than that of the control group.
In total, 24 revisions were made by the experimental group students and 19 by the control group students. The evaluation score improved in 15 and 9 of these revisions for the experimental and control students, respectively. In proportion to the number of students in each group, the experimental group students made an average of 2.4 revisions and the control group students 3.2 revisions (data not shown). Of these, an average of 1.5 revisions per student resulted in improved evaluation scores in both groups (data not shown).
The highest average evaluation scores before the laboratory work (PRE) were obtained for question 6. This applied to both the experimental and the control group. Although the PRE-score was higher in the control group (4.8) than in the experimental group (4.0), the POST-score improved to 4.2 for the experimental group, whereas for the control students the score remained the same (4.8). The experimental group students obtained an equally good PRE-score (4.0) for question 1; here, the POST-score remained the same (4.0).
The lowest average score before the laboratory work was seen for question 3, at 1.8 and 1.7 for the experimental and control group, respectively. In addition to the rather similar PRE-scores, the POST-score for question 3 was the same for both groups (3.5). Questions 2 and 4 also had notably lower scores. The POST-score of question 2 improved by 0.3 points in the experimental group and by 0.5 points in the control group. For question 4, the POST-score of the experimental group improved by 1.4 points and that of the control group by 1.1 points.
When summing up the average scores of questions 1–6, the experimental group improved their POST-score by 3.7 points and the control group by 4.5 points. The final total score of the experimental group (22.8) was slightly higher than that of the control group (22.4). Our results showed that both groups learned the content knowledge related to the laboratory task equally well and, based on the average scores of questions 1–6, no substantial differences between the experimental and the control group occurred. However, the benefits of the interactive AR environment were visible in the revision pattern of questions 1–6.
The least revised question was question 6, as its content had been discussed with all of the students earlier, in relation to another susceptibility testing laboratory session. However, the only correction with an increased score was made by an experimental group student, which may reflect the learning support provided by think-aloud task 3, which clearly overlapped with question 6. It is noteworthy that all of the experimental group students revised their answers to questions 3 and 4, which is likely related to the think-aloud tasks, particularly those focusing on pipetting and asepsis. Obtaining readable results from the susceptibility testing may also have played a role, aided by the AR environment, which ensured a proper laboratory work sequence and the use of correct tools and reagents via the QR codes. In addition, the AR environment promoted the answering of question 2, which showed a superior POST-score compared to the control group. In susceptibility testing on a well plate, the core content is to understand the purpose of each column and the interpretation of the dye’s color; this content was especially targeted in questions 2–4.

3.2. AR Environment Supporting Performance in Teaching Laboratory

Both the experimental and the control group faced certain task-related challenges during the laboratory work. The types of mistakes and the ability of the AR system to intervene are shown in Figure 3. In the control group, six mistakes in total were made by six students. Of these mistakes, the AR system would have detected and prevented four. In the experimental group, ten potential mistakes were made by six students. Thus, four students in the experimental group faced no problems during the work, whereas some students faced several problems. Of the ten potential mistakes in the experimental group, five were prevented by the AR environment. Both groups faced similar challenges during their work, but the AR prevented 50% of all mistakes and most (70%) of the critical mistakes that would have drastically affected the experiment. These mistakes were similar to those observed in the control group, such as selecting the wrong reagent and using incorrect amounts or wrong concentrations of reagents and prepared solutions. Critical mistakes that were not preventable by the AR were pipetting into the wrong well, using the wrong reagents in the pipetting reservoirs, and insufficient mixing of the bacterial suspension. Other non-preventable mistakes, attributed to incorrect interpretation of the instructions, were an incorrect pipetting scheme and the unorthodox use of the pipetting reservoir. These mistakes did not affect the outcome of the students’ work and were therefore classified as non-critical. The students of the experimental group varied in how much they utilized the additional info-screens during their performance: three out of ten students opened none of the additional info-screens, and five students opened one to six additional info-screens.

3.3. Usability of the AR Environment

When the students were asked their opinion of the guidance provided by the AR after the laboratory work, the highest individual item scores, indicating agreement, were obtained for the following statements: “The AR equipment gave guidance that was useful to me” (average 4.3 ± 0.48); “The guidance given by the AR equipment did benefit me in working” (4.3 ± 0.67) and “The AR equipment checked my working in the right stages of the work” (4.0 ± 0.94) (Figure 4; Appendix A). In an open question, the students’ attitudes towards the new technology were mainly positive. One example of a student’s positive comment was: “Pipetting proceeded better as the correct volumes and columns were always in the field of vision while doing serial dilutions or pipetting to the well-plate”, and the respondents were eager to use the equipment in the future as well: “An innovative invention which will surely be (hopefully as soon as possible) very useful some day in the future”. However, a few students noted that the physical properties of the equipment should be improved, as they had experienced some discomfort in using the glasses: “The AR equipment [used with laboratory safety goggles in the study] was heavy and in long-term use it started to press on the ears, which caused some pain”.

4. Discussion

In this study, we designed an AR-assisted learning environment that provided timely adaptive guidance to support students’ individual knowledge construction in a pharmacy teaching laboratory. We investigated the effectiveness of the AR solution in a quasi-experimental study of students’ performance, learning and experiences. Increasing interest in cutting-edge technologies such as AR inspired us to investigate whether new technological tools can provide students with more opportunities to practice hands-on skills (AR headsets are less expensive than much of the specialized equipment needed in pharmaceutical laboratory work). However, more empirical research is needed, because in several previous studies, AR applications have been designed from a technological rather than a pedagogical angle, and learning has been measured mainly at the level of knowledge acquisition [3].

4.1. AR in Guiding the Laboratory Work

The results of our study showed that the AR managed to guide the laboratory work more successfully than traditional instruction. This became apparent when many students of the experimental group managed to perform the task without mistakes or potential mistakes that needed intervention from the AR, whereas all students from the control group made mistakes. In addition, the AR prevented most of the critical mistakes that the control group struggled with. Thus, the AR was more effective in guiding and controlling students’ performance, and hence could partly replace the teacher in science teaching laboratories in the future. The result is encouraging, considering that the challenge in several science disciplines is that while physical laboratory work is considered an essential part of education, it is costly and time-consuming to arrange, and the students have only limited opportunities to receive individual guidance and support related to the learning of laboratory skills.
Previous research has shown the potential of AR in promoting science learning [12,17,18], but contradictory results have also been found [9,15]. In our study, even though the students of the experimental group were encouraged to process the content more than the control group (via, for example, think aloud tasks) during the laboratory work, they still did not succeed better in tasks measuring understanding of the content knowledge compared to the control group. Similar results have been reported; for example, a previous study [9] showed that although working in an immersive virtual science laboratory enhanced university students’ motivation to perform lab tasks, it did not improve their learning—the high-immersive VR group of students learned less than those who used the low-immersion version of the simulation on a desktop computer. Thus, more research is still needed on how technological tools could help students to link underlying content knowledge and hands-on laboratory activities.
Several previous studies have stated that AR is often complicated for students to use [7,8], and usability problems have been recognized as one of the biggest barriers to using AR in educational settings [30]. Our findings did not reflect these problems: on the contrary, the students in this study had positive experiences of using AR in their laboratory work [31]. Additionally, the participants in this study were happy with the timing, quality and amount of feedback that the system provided throughout the laboratory work. Thus, it seems that AR-assisted technology can be applied to teaching in a manner that students find easy to use. However, it needs to be pointed out here that “liking is not learning” [9], meaning that activities or tools that make learning more fun do not necessarily increase student learning. On the other hand, low motivation and/or negative attitudes can undermine opportunities for learning; hence, usability aspects are important in developing new learning environments.
The AR-assisted environment of this study was designed to guide and monitor students’ performance and simultaneously support their self-regulation skills by not providing ready answers, but rather giving warnings to stop and reconsider if the student was about to make a mistake. This opportunity to notice one’s potential mistakes and learn from them is essential for supporting the development of students’ pharmaceutical expertise. Being able to make mistakes while learning allows students to find flaws in their thinking and uncover assumptions. However, pharmaceutical laboratory work is a situation in which learning from one’s mistakes can have unacceptable consequences in the real world. AR technology could therefore transform higher education by offering opportunities to make mistakes and learn from them, to practice, and to receive timely personal feedback related to skills that are otherwise difficult or even impossible to practice sufficiently during higher education studies.

4.2. Limitations of the Study and Future Study Ideas

This study has some limitations regarding the reliability, impact and generalizability of the results. Firstly, our sample size was small and focused on a specific group of students, because this type of research is time-consuming to conduct. The study should be repeated with a larger and different sample. Secondly, although it offers many advantages, AR poses some challenges that must be considered. A downside of today’s smart glass technology is the limited field of view. Based on previous studies, a limited field of view combined with large amounts of information to master may increase students’ cognitive load and inhibit learning [32,33].
AR technology in educational settings is still in its early stages of development, which limits its usefulness. Headsets are typically relatively large and heavy, with limited battery life and cables that may interfere with their use. In this study, the participants mentioned these factors, and our study would have benefitted from lighter and better-fitting AR equipment. However, based on the feedback, Sciar Company Ltd. has already adjusted the AR equipment for greater comfort in future studies. The students in our study participated voluntarily, and it is possible that they represented more technologically oriented students; those with a more reserved attitude towards new technologies, or towards laboratory practicals in general, possibly did not participate. Hence, although this study gives some preliminary insight into the effectiveness of AR in guiding laboratory practicals, future work should involve other types of learning tasks, larger and more diverse samples, different contexts and even delayed tests.

4.3. Conclusions

The present study offers a step towards developing meaningful learning environments that make use of cutting-edge educational technology to support learning in science labs. The development of learning environments that enable students to practice the necessary hands-on skills, make mistakes and learn from them without suffering from harmful consequences is of the utmost importance in supporting their development of expertise in higher education. The results of our experimental study support the idea of the potential benefit of AR in laboratory courses and have encouraged us to continue the development of AR-supported science laboratory environments further. Our results also offer insights for other higher education instructors and software developers who wish to utilize AR technology in promoting the learning of practical laboratory skills.

Author Contributions

Conceptualization, I.S., N.K., K.K., P.L. and M.S.; methodology, I.S., N.K., K.K., P.L. and M.S.; software, I.S., N.K., K.K., P.L. and M.S.; formal analysis, I.S., N.K., K.K., P.L. and M.S.; investigation, I.S., N.K., K.K., P.L. and M.S.; resources, K.K.; data curation, I.S., K.K., P.L., M.A. and M.S.; writing—original draft preparation, I.S., N.K., K.K., P.L. and M.S.; writing—review and editing, I.S., N.K., K.K., P.L., M.A. and M.S.; visualization, K.K., P.L. and M.S.; funding acquisition, I.S. and M.S. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the University of Helsinki via the Digital Leap in Education project (2017–2020) and the Cultivating Expertise in Learning of Life Sciences (CELLS) project (Research Funds of the University of Helsinki, HY/716/05.01.07/2018).

Institutional Review Board Statement

Voluntary participation, informed consent, and anonymity of the participants were ensured in the research process. The study did not involve intervention in the physical integrity of the participants, deviation from informed consent, studying children under the age of 15 without parental consent, exposure to exceptionally strong stimuli, causing long-term mental harm beyond the risks of daily life, or risking participants’ security (cf. Finnish Advisory Board on Research Integrity 2019). Consequently, this study did not require a Finnish ethics review.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Acknowledgments

Open access funding provided by University of Helsinki. We thank Sciar Company Ltd. for its collaboration in the development of the AR environment and for providing access to the AR technology. We also thank the students who participated in this study.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

AR user experience adapted from [34]. Values are described as average item scores (±standard deviation) for each individual statement presented to the students (n = 10) after the AR experience.
Statement | POST Mean (SD)
The AR equipment gave guidance that was useful to me. | 4.3 (0.48)
The guidance given by the AR equipment did benefit me in working. | 4.3 (0.67)
The AR equipment gave me enough guidance. | 3.9 (1.2)
The AR equipment gave feedback in the right stages of the work. | 3.9 (0.88)
The AR equipment gave feedback that was useful to me. | 3.7 (0.48)
The AR equipment gave me enough feedback. | 3.6 (1.1)
Working with the AR equipment supported my learning. | 3.6 (1.1)
I would have learned better without the AR equipment. | 2.6 (0.84)
The AR equipment checked my working in the right stages of the work. | 4.0 (0.94)
The respondents specified their level of agreement on a five-point Likert scale with the descriptors (1) strongly disagree; (2) disagree; (3) neither agree nor disagree; (4) agree; (5) strongly agree.

References

  1. Arici, F.; Yildirim, P.; Caliklar, Ş.; Yilmaz, R.M. Research trends in the use of augmented reality in science education: Content and bibliometric mapping analysis. Comput. Educ. 2019, 142, 103647.
  2. Billinghurst, M.; Kato, H.; Poupyrev, I. The MagicBook: A transitional AR interface. Comput. Graph. 2001, 25, 745–753.
  3. Ibáñez, M.-B.; Delgado-Kloos, C. Augmented reality for STEM learning: A systematic review. Comput. Educ. 2018, 123, 109–123.
  4. Merchant, Z.; Goetz, E.T.; Cifuentes, L.; Keeney-Kennicutt, W.; Davis, T.J. Effectiveness of virtual reality-based instruction on students’ learning outcomes in K-12 and higher education: A meta-analysis. Comput. Educ. 2014, 70, 29–40.
  5. Radianti, J.; Majchrzak, T.A.; Fromm, J.; Wohlgenannt, I. A systematic review of immersive virtual reality applications for higher education: Design elements, lessons learned, and research agenda. Comput. Educ. 2020, 147, 103778.
  6. Wu, H.-K.; Lee, S.W.-Y.; Chang, H.-Y.; Liang, J.-C. Current status, opportunities and challenges of augmented reality in education. Comput. Educ. 2013, 62, 41–49.
  7. Radu, I. Augmented reality in education: A meta-review and cross-media analysis. Pers. Ubiquitous Comput. 2014, 18, 1533–1543.
  8. Squire, K.D.; Jan, M. Mad City Mystery: Developing Scientific Argumentation Skills with a Place-based Augmented Reality Game on Handheld Computers. J. Sci. Educ. Technol. 2007, 16, 5–29.
  9. Makransky, G.; Terkildsen, T.; Mayer, R.E. Adding immersive virtual reality to a science lab simulation causes more presence but less learning. Learn. Instr. 2019, 60, 225–236.
  10. Coyne, L.; Merritt, T.A.; Parmentier, B.L.; Sharpton, R.A.; Takemoto, J.K. The Past, Present, and Future of Virtual Reality in Pharmacy Education. Am. J. Pharm. Educ. 2019, 83, 7456.
  11. Jang, S.; Vitale, J.M.; Jyung, R.W.; Black, J.B. Direct manipulation is better than passive viewing for learning anatomy in a three-dimensional virtual reality environment. Comput. Educ. 2017, 106, 150–165.
  12. Strzys, M.P.; Thees, M.; Kapp, S.; Knierim, P.; Schmidt, A.; Lukowicz, P.; Kuhn, J. Smartglasses as assistive tools for undergraduate and introductory STEM laboratory courses. In Perspectives on Wearable Enhanced Learning (WELL); Springer: Cham, Switzerland, 2019; pp. 35–58.
  13. Lubarsky, S.; Dory, V.; Audétat, M.-C.; Custers, E.; Charlin, B. Using script theory to cultivate illness script formation and clinical reasoning in health professions education. Can. Med. Educ. J. 2015, 6, e61–e70.
  14. Olympiou, G.; Zacharia, Z.C.; de Jong, T. Making the invisible visible: Enhancing students’ conceptual understanding by introducing representations of abstract objects in a simulation. Instr. Sci. 2013, 41, 575–596.
  15. Moro, C.; Smith, J.; Finch, E. Improving stroke education with augmented reality: A randomized control trial. Comput. Educ. Open 2021, 2, 100032.
  16. Trundle, K.C.; Bell, R.L. The use of a computer simulation to promote conceptual change: A quasi-experimental study. Comput. Educ. 2010, 54, 1078–1088.
  17. Akçayır, M.; Akçayır, G.; Pektaş, H.M.; Ocak, M.A. Augmented reality in science laboratories: The effects of augmented reality on university students’ laboratory skills and attitudes toward science laboratories. Comput. Hum. Behav. 2016, 57, 334–342.
  18. Chiu, J.L.; DeJaegher, C.J.; Chao, J. The effects of augmented virtual science laboratories on middle school students’ understanding of gas properties. Comput. Educ. 2015, 85, 59–73.
  19. de Jong, T.; Linn, M.C.; Zacharia, Z.C. Physical and Virtual Laboratories in Science and Engineering Education. Science 2013, 340, 305–308.
  20. Zacharia, Z. Comparing and combining real and virtual experimentation: An effort to enhance students’ conceptual understanding of electric circuits. J. Comput. Assist. Learn. 2007, 23, 120–132.
  21. Zacharia, Z.C.; Olympiou, G. Physical versus virtual manipulative experimentation in physics learning. Learn. Instr. 2011, 21, 317–331.
  22. Barrett, R.; Gandhi, H.A.; Naganathan, A.; Daniels, D.; Zhang, Y.; Onwunaka, C.; Luehmann, A.; White, A.D. Social and Tactile Mixed Reality Increases Student Engagement in Undergraduate Lab Activities. J. Chem. Educ. 2018, 95, 1755–1762.
  23. Sotiriou, S.; Bogner, F.X. Visualizing the Invisible: Augmented Reality as an Innovative Science Education Scheme. Adv. Sci. Lett. 2008, 1, 114–122.
  24. Zydney, J.M.; Warner, Z. Mobile apps for science learning: Review of research. Comput. Educ. 2016, 94, 1–17.
  25. Bransford, J.D.; Brown, A.L.; Cocking, R.D. How People Learn: Brain, Mind, Experience, and School: Expanded Edition; National Academy Press: Washington, DC, USA, 2000.
  26. Carbonell, K.B.; Stalmeijer, R.E.; Könings, K.; Segers, M.; van Merriënboer, J.J. How experts deal with novel situations: A review of adaptive expertise. Educ. Res. Rev. 2014, 12, 14–29.
  27. Ericsson, A.; Pool, R. Peak: Secrets from the New Science of Expertise; Mariner Books Houghton Mifflin Harcourt: Boston, MA, USA; New York, NY, USA, 2016.
  28. Evans, C. Making Sense of Assessment Feedback in Higher Education. Rev. Educ. Res. 2013, 83, 70–120.
  29. Ferguson, P. Student perceptions of quality feedback in teacher education. Assess. Eval. High. Educ. 2011, 36, 51–62.
  30. Bacca, J.; Baldiris, S.; Fabregat, R.; Graf, S.; Kinshuk. Augmented Reality Trends in Education: A Systematic Review of Research and Applications. Educ. Technol. Soc. 2014, 17, 133–149. Available online: http://www.jstor.org/stable/jeductechsoci.17.4.133 (accessed on 15 September 2021).
  31. Czerkawski, B.; Berti, M. Learning experience design for augmented reality. Res. Learn. Technol. 2021, 29.
  32. Baumeister, J.; Ssin, S.Y.; Elsayed, N.A.M.; Dorrian, J.; Webb, D.P.; Walsh, J.; Simon, T.M.; Irlitti, A.; Smith, R.; Kohler, M.; et al. Cognitive Cost of Using Augmented Reality Displays. IEEE Trans. Vis. Comput. Graph. 2017, 23, 2378–2388.
  33. Cheng, K.H.; Tsai, C.C. Affordances of Augmented Reality in Science Learning: Suggestions for Future Research. J. Sci. Educ. Technol. 2013, 22, 449–462.
  34. Lee, E.A.-L.; Wong, K.W.; Fung, C.C. How does desktop virtual reality enhance learning outcomes? A structural equation modeling approach. Comput. Educ. 2010, 55, 1424–1442.
Figure 1. A student working in a teaching laboratory with Vuzix® AR glasses integrated with laboratory goggles. See also the video: https://youtu.be/fhT4r47a-es (accessed on 14 November 2021).
Figure 2. Design of the procedure (experimental group n = 10; control group n = 6).
Figure 3. The (potential) mistakes made by the experimental (AR) group and the control group.
Figure 4. AR user experience. The students’ (n = 10) opinions were surveyed after the use of equipment on a five-point Likert scale. The number of responses is given for each statement in the bar, with the values above the midpoint (3) indicating agreement, and values below for disagreement.
Table 1. Open-ended questions measuring underlying content knowledge, intended learning outcomes of the questions and expected keywords in answers evaluated with the maximum score of 5.
No | Question | Intended Learning Outcome | Expected Keywords if the Answer Evaluated with the Score 5 *
1 | Why do you prepare dilutions with different concentrations from the antibiotic? | Understanding the aim, methodology and performance of the susceptibility test on a well plate. | The smallest concentration that is still bioactive; which concentration inhibits the growth.
2 | What is the purpose of the positive and negative control? | Understanding and evaluation of the susceptibility test methodology and result. | Indicates the color with and without bacterial growth; indicates the color of bioactive antibiotic concentration.
3 | What does the blue and pink-purple color mean? | Evaluation and analysis of the susceptibility test result. Understanding the mechanism of action of resazurin dye. | Blue indicates no bacterial growth; pink indicates bacterial growth.
4 | From which column and well would you determine the MIC value? | Understanding the methodology of the susceptibility test on a well plate. Understanding the mechanism of action of the antimicrobial reagent and resazurin dye. | Column containing the smallest bioactive concentration.
5 | Why is pipetting the samples in the correct order important in this laboratory work? | Applying the necessity of asepsis in the susceptibility test. Understanding the methodology of the susceptibility test. Appraising the use of laboratory tools. | Prevention of contamination and mixing up concentrations; economical use of tools.
6 | Why is it necessary to check the turbidity of the bacterial suspension? | Understanding susceptibility testing methodology and interpretation of results. Evaluation and understanding of the life cycle of the microbe. | To verify sufficient bacterial growth.
* Answers to the questions were evaluated and scored in points of 0–5 using the following criteria: 0, the question has not been answered or the answer is incorrect; 1, the answer has major deficiencies and/or is in major part incorrect or unclear; 2, exceeds score 1 but does not reach score 3; 3, the answer is correct and presented with sufficient accuracy and understandability; 4, exceeds score 3 but does not reach score 5; 5, the answer is detailed and without errors.
Table 2. The operations to support and guide students’ performance in laboratory work in the AR protocol.
Operation | Purpose | Example
A: Gate questions | To facilitate students’ thinking and ensure a correct process | “Please answer the following question: I will add X ml of stock solution of antimicrobial substance to Y ml of nutrient broth”
B: Quick Response (QR) codes | To enhance the correct use of reagents and instruments | “Please read the correct QR code of the reagent needed in the following work phase” (e.g., Ampicillin stock 1.0 mg/mL)
C: Info-screens | To give more information and extra guidance about the test | “Note that the concentration of the antimicrobial substance is ten times more dilute in each dilution.”
D: Feedback | To give information about the progress of the test and to motivate students | “Correct answer! You can now do the pipetting.” “You are progressing well! Only one step to go.”
E: Think aloud tasks | To facilitate students’ learning and understanding of the content of the test | “In which order did you just pipette the dilutions of the antimicrobial substance: from most concentrated to more dilute, or the other way? Why? Could you have done it the other way?”
Table 3. Scores per task (from 0 for fail to 5 for excellent response) and total score (max = 30) before (PRE) and after the laboratory work (POST) of tasks measuring understanding of content knowledge. Scores are given as Pre/Post.
Experimental group
id | Q1 | Q2 | Q3 | Q4 | Q5 | Q6
1 | 5/5 | 2/2 | 2/5 | 3/5 | 4/4 | 5/5
2 | 3/3 | 5/5 | 2/5 | 5/5 | 4/4 | 5/5
3 | 5/5 | 2/5 | 0/4 | 5/5 | 4/4 | 5/5
4 | 4/4 | 3/3 | 3/3 | 0/2 | 2/2 | 5/5
5 | 4/4 | 2/2 | 0/1 | 5/5 | 3/3 | 2/2
6 | 5/5 | 3/3 | 4/4 | 5/5 | 4/4 | 5/5
7 | 2/2 | 1/1 | 0/3 | 0/3 | 4/4 | 4/4
8 | 5/5 | 1/1 | 1/3 | 2/5 | 3/3 | 2/2
9 | 4/4 | 5/5 | 3/4 | 5/5 | 3/4 | 5/5
10 | 3/3 | 4/4 | 3/3 | 0/4 | 4/4 | 2/4
Average | 4.0/4.0 | 2.8/3.1 | 1.8/3.5 | 3.0/4.4 | 3.5/3.6 | 4.0/4.2
Sum of average PRE scores: 19.1; sum of average POST scores: 22.8
Control group
id | Q1 | Q2 | Q3 | Q4 | Q5 | Q6
11 | 3/4 | 2/2 | 1/5 | 5/5 | 3/3 | 5/5
12 | 5/5 | 2/5 | 0/3 | 0/5 | 4/4 | 5/5
13 | 3/3 | 1/1 | 3/3 | 3/3 | 4/4 | 5/5
14 | 2/4 | 5/5 | 0/4 | 0/2 | 0/4 | 5/5
15 | 4/4 | 3/3 | 3/3 | 5/5 | 4/4 | 4/4
16 | 5/5 | 1/1 | 3/3 | 0/0 | 4/4 | 5/5
Average | 3.7/4.2 | 2.3/2.8 | 1.7/3.5 | 2.2/3.3 | 3.2/3.8 | 4.8/4.8
Sum of average PRE scores: 17.9; sum of average POST scores: 22.4
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Södervik, I.; Katajavuori, N.; Kapp, K.; Laurén, P.; Aejmelaeus, M.; Sivén, M. Fostering Performance in Hands-On Laboratory Work with the Use of Mobile Augmented Reality (AR) Glasses. Educ. Sci. 2021, 11, 816. https://doi.org/10.3390/educsci11120816
