Article

Digital Escape Rooms as Game-Based Learning Environments: A Study in Sex Education

by Lena von Kotzebue 1,*,†, Joerg Zumbach 2,*,† and Anna Brandlmayr 3

1 Biology Education, University of Salzburg, 5020 Salzburg, Austria
2 Digital Learning Research Group, University of Salzburg, 5020 Salzburg, Austria
3 Musikmittelschule 2 Lambach, Paris-Lodron University, 4650 Lambach, Austria
* Authors to whom correspondence should be addressed.
† These authors contributed equally to this work.
Multimodal Technol. Interact. 2022, 6(2), 8; https://doi.org/10.3390/mti6020008
Submission received: 17 December 2021 / Revised: 13 January 2022 / Accepted: 14 January 2022 / Published: 18 January 2022
(This article belongs to the Special Issue Innovations in Game-Based Learning)

Abstract: Game-based learning is becoming increasingly popular in education. The playful experience in particular promises a high degree of student motivation. In this research, we examine the influence of sequential scaffolding within a digital educational escape room game. Escape rooms are games in which players have to escape from a room within a given time limit by completing different tasks and quests. Building on this format, we developed an educational virtual escape room for biology classes focusing on the topic of sex education. In an experiment, we modified this learning environment and created two conditions: in one escape room, scaffolding was implemented using sequential learning aids; in the other escape room, which served as the control condition, no additional learner support was provided. The main objective of this quantitative study is to measure the escape room's impact on learning and cognitive load; in addition, motivation, flow experience and experience of immersion are analyzed. A comparison between the two escape rooms shows that additional scaffolding neither significantly increases cognitive load nor has any effect on learning. Overall, the results show that motivation and knowledge acquisition can be successfully supported by game-based learning with escape rooms.

1. Introduction

1.1. Game-Based Learning

Learning with digital technologies covers a broad field of instructional and technological approaches. One of the most frequently encountered approaches is using simulations for educational purposes, which, in turn, is closely related to using digital game-based scenarios for digital teaching and learning, i.e., so-called “game-based learning” (GBL). GBL does not necessarily need to involve digital technologies [1]. Nevertheless, this research focuses on digital GBL. Digital GBL, at its core, is concerned with designing computer and video games that enable players to achieve pre-determined learning objectives. Even though simulation-based games are very common, there is a huge variety of digital educational game genres [2]. In addition, the type of gameplay and whether it is more in the foreground or background can vary. Often, the term “serious games” is used if the gameplay—or story—is in the background. Nevertheless, the most important feature of GBL is the combination of gameplay and knowledge acquisition, which distinguishes it from gamification. Gamification refers to approaches that enrich learning environments with game elements and/or competitive elements, thus highlighting the learning environment instead of the gameplay [3,4].
GBL itself is an umbrella term for several (digital) learning environment approaches that are primarily games. Nonetheless, the principal objective is knowledge acquisition; hence, these games are often described as serious games [5]. Using games as learning environments therefore places two demands on them: first, learners need to be motivated (and/or entertained), and second, learners need to reach the intended learning outcomes covered by the game.
One major advantage of GBL is that learners can monitor their own actions, which is supported by interactive formats [6]. Game-based learning and learning with simulations often cannot be separated clearly, because simulations (e.g., flight simulators, railroad simulators, etc.) are also widespread and diverse in the gaming sector. Both types are interactive and represent digital (learning) environments [7,8]. Nevertheless, Vogel et al. [7] (p. 231) distinguished games from simulations as follows: “A computer game is defined as such by the author, or inferred by the reader because the activity has goals, is interactive, and is rewarding (gives feedback). Interactive simulation activities must interact with the user by offering the options to choose or define parameters of the simulation then observe the newly created sequence rather than simply selecting a pre-recorded simulation.” Consequently, the main difference is that in games the sequence of actions is more or less pre-defined, whereas in simulations the action results from the interaction between the system and the learner. Since this distinction is not fully sufficient, Sitzmann [9] introduced “simulation games” as a mixed category. In short, learners’ perception determines whether a game is seen as a game, a simulation or something else. All of these approaches expect learners to make decisions in a digital learning environment and to learn from the resulting consequences [9,10]. However, this applies primarily to complex games and less to games that convey basic information rather than requiring complex problem solving [11]. In fact, game-based learning is not always effective in terms of knowledge transfer [8]. Kim et al. [12] identified six processes they argue learners need to go through in order to ensure sustainable learning:
  • During the game, learners must identify with one or several problems as well as the content areas covered by those problems.
  • Learners must develop different solution and action strategies.
  • Learners must select a problem solving strategy.
  • Feedback from the learning environment must be responded to or utilized by learners.
  • Useful strategies must be pursued by learners.
  • Learners must modify less helpful strategies.
These strategies can be applied to many types of games, even to those outside the educational sector, and extended to learning environments which focus on learning through problem solving. A meta-analysis conducted by Wouters and van Oostendorp [8] suggests that additional instructional support or scaffolding (e.g., metacognitive support) significantly increases learning success in GBL environments of this kind. Kim et al. [12] compared three supportive strategies by dividing participants into three groups: learners in Group One were asked to keep a learning log during the game (within the domain of economics); in Group Two, examples were given; Group Three was asked to use thinking aloud as a strategy. Results show that both giving examples and thinking aloud are helpful strategies because they do not disrupt the flow of the game, whereas keeping a log does and consequently proves to be obstructive. In addition, Sung and Hwang [13] showed that collaborative game play can be more effective than individual game play. However, this finding is not supported by other studies [14] or by the meta-analysis by Clark et al. [10]. Nevertheless, Sung and Hwang [13] also showed that additional support in the form of visualizing one’s own problem solving strategies (as a metacognitive reflection strategy) leads to better learning outcomes. Other factors contributing to the success of GBL concern the degree of subjective challenge. According to Hamari et al. [15], success is greater when learners feel challenged. Moreover, the focus must always be on learning: if learners are aware of this, learning success is higher than if the entertainment aspect is in the foreground [16].
Individual meta-analyses also indicate that game-based learning leads to positive effects within specific content areas. For instance, Chiu et al. [17] examined studies on digital games for learning English as a foreign language. They showed that GBL leads to large effects on learning, though unpublished studies report lower effect sizes (i.e., publication bias). The authors also pointed out that the use of games seems to be more appropriate for more complex problem solving tasks than for less complex practice tasks (e.g., game-based vocabulary trainers). Especially in vocabulary learning, the type of game seems to be an important factor for success. Chen et al. [18] reported large effect sizes of games compared to traditional digital learning environment approaches (e.g., computer-based training); still, the perceived challenge of the game is a determining factor: the greater the experienced challenge, the greater the learning outcome. Another meta-analysis on the use of digital games in mathematics also showed advantages of GBL in comparison to traditional (analogue) approaches; however, effect sizes were rather small in this study [19].
Experienced challenge is a significant factor determining the success or failure of game-based learning. Chen et al. [20] found a medium effect size for using games across many genres and content areas. Promoting challenge and competition is particularly effective in subjects such as mathematics, science and languages, but less so in areas such as social sciences. Within game genres, this factor proves effective for simulation, role-playing, strategy and puzzle games, but not for action games.
In conclusion, these results show that GBL can contribute to effective learning and increased learner motivation, depending on the game genre, the actions and tasks involved and the specific domain. GBL seems to be particularly effective in science education when it is challenging and requires problem solving. Escape room games are a specific game genre that could meet these requirements.

1.2. Escape Rooms

The first escape rooms (ERs) were documented in Japan in 2007. Soon, they spread over numerous countries and became very popular [21]. ERs are team-based live-action games in which players have to solve numerous tasks within a certain time limit [22]. If all challenges are completed successfully and on time, the players win and are able to leave the room. If they do not finish within the given time limit, they lose [23,24]. The games often involve role plays that are driven by a narrative [21]. All activities/tasks in an ER are called puzzles. These always follow the same principle: challenge, solution and reward (e.g., a code for a lock or information needed to solve the next puzzle [23]). Cognitive puzzles are prevalent in ERs [25]. These may include encoded messages, combination locks, ciphers and jumbles [26].
ERs promote numerous skills, including teamwork, communication and logic, as well as critical thinking, searching, observation, reasoning, pattern recognition, problem solving, creativity, application of knowledge and coping with time pressure [24,27,28]. ER puzzles can be organized in four different ways: open-ended, sequential, path-based and hybrid [25]. As players must place themselves in the game context, immersion plays a central role in ERs. Immersion is the process of being mentally drawn into the story or into a particular problem [29]. This can motivate players to solve challenges and complete tasks [30].
ERs were developed for recreational purposes and most ERs still serve this purpose; however, using ERs in education is becoming more and more popular. Educational escape rooms (EERs) have now reached all grade levels; they are particularly popular in professional development programs because, in addition to fostering cognitive skills, they are designed to stimulate collaboration and social skills [31,32]. While EERs seem to be a new instructional approach to promoting situated problem solving, their foundations are not new. Approaches such as anchored instruction [33,34,35] or goal-based scenarios [36,37] have already implemented applied problem solving within learning environments that are close to what is now promoted as EERs.
The implementation of ERs in education began with enthusiastic teachers who adapted recreational escape rooms for their classrooms [31]. On platforms such as Breakout EDU, members can share, adopt or modify ERs [31,38]. Recreational and educational ERs differ in several ways. One of the main differences is that recreational ERs are designed to appeal to a broad audience, whereas EERs are developed for a specific target audience with clearly defined learning objectives [23,39]. Furthermore, the two types of ERs differ in the number of participants that can play simultaneously. While recreational ERs average 3–7 players [22], teachers must organize EERs to accommodate an entire class or course of up to hundreds of students [40,41]. Another difference is space. While EERs take place in educational facilities (course rooms) and usually need to be prepared in a short period of time, recreational ERs take place in several interconnected fixed spaces; in the latter case, more preparation time and a higher budget are needed. In addition, the role of the game leader or teacher is different. For example, in recreational ERs the game leader directs the game from an adjoining room, whereas in EERs the teacher and the players are in the same room, with the teacher ensuring that the players can act autonomously [23].
When designing an EER, care should be taken to ensure that students have positive learning experiences while solving the puzzles and that as many students as possible can “escape” the EER. Therefore, the puzzles need to be challenging in order to prevent boredom; at the same time, they must not be too difficult in order to prevent frustration [23,39,41]. Additionally, EER breakout activities can be created in the classroom; both scenarios represent promising learner-centered and collaborative activities [42]. Until now, most EER implementations have been documented in the health/medical domain, followed by STEM, with play time usually taking about 60 min (e.g., [21,23,43,44,45,46]). ERs have been documented especially in medicine and nursing professions because, besides the mostly analyzed increase in knowledge, ERs help promote nursing skills such as teamwork and communication [44]. GBL and serious games are also used in sex education. For instance, Chu et al. [47] showed that adolescents/students in Hong Kong were able to increase their sexual knowledge through an interactive game application. Moreover, their self-reported attitudes towards sex and relationships and their awareness of making wise sexual choices improved. Haruna and colleagues [48] also found an increase in motivation, attitudes, knowledge and engagement in sexual health among Tanzanian students following a lesson based on GBL or gamification compared to a traditional lesson. Combining the interactive game-based approach with traditional instruction led to teachers and students engaging in more collaborative discussions during and after the game [49]. However, research on the implementation of (D)EERs in the field of sex education is still relatively new.
Digital elements, such as QR codes, AR and VR, are increasingly embedded in EERs. It is important to ensure that these elements positively complement the physical EER elements [21]. In the context of increasing digitalization and in light of the COVID-19 pandemic, there are more and more EERs that are completely digital, so-called digital educational escape rooms (DEERs) [21]. DEERs offer multiple benefits, such as low-cost and flexible learning experiences [50]. Similar to EERs, they seek to provide students with immersive, dynamic, exciting and action-oriented online learning experiences [21]. They are typically created from a combination of free web-based applications [51]. When working on EERs, 21st-century soft skills and inquiry-based learning are fostered, as students take on an active role and search for new data/knowledge to answer (research) questions independently [32,52].
Enthusiastic teachers have adopted recreational ERs in the educational context; hence, little is yet known about their theoretical basis. In addition, the use of EERs has rarely been evaluated [23]. Since EERs are based on game structure and design, they are closely related to gamification methodology [21]. They share many characteristics with educational games and, therefore, it is possible to use GBL theory as a reference point to develop a theoretical background for EERs [23]. Research on GBL shows that, in order to foster an increase in knowledge as well as students’ motivation, pedagogical as well as game design aspects need to be considered when developing EERs [23,53,54,55]. These aspects may include clear feedback and affirmation as well as progressive challenges during the learning process [23]. In addition, a narrative guiding the players through the game as well as active and autonomous roles of the players are important [29,54,55].
Over the past few years, several studies on ERs have been conducted across diverse subject areas and educational contexts. Veldkamp et al. [23] provide an overview of EER studies, such as an analysis of students’ information seeking behavior [56], an analysis of learning processes in student teams [57] and the use of teamwork and leadership skills among students [58]. In addition, students developing ERs in order to foster their design skills have been examined [59,60]. The majority of studies on EERs addresses the advancement of domain-specific skills and knowledge [26,40,61,62]. Despite these reported studies, however, the literature highlights a clear need to evaluate EER implementations and to define guidelines that lead to the development and implementation of learning environments which are as motivating and effective as possible [23,39,63]. In particular, DEERs have been poorly reported and evaluated. The question remains whether and how a DEER, as an instance of GBL, can be supported using scaffolding [8]. Such support can relate to several different variables, such as metacognition or the direct elaboration of content.

1.3. Open Research Questions

Our research aims at applying findings from research on GBL and learning with simulations to a digital educational escape room (DEER). More precisely, the question is how a DEER contributes to knowledge acquisition and motivation within the area of sex education in biology learning and teaching. We assume that such a learning environment contributes to knowledge acquisition (Hypothesis 1) and that it increases subject-related situational interest as a motivational variable when comparing pre- and post-test results (Hypothesis 2). Both hypotheses are in line with prior research on GBL [8,23,54]. Additionally, the influence of scaffolding in the form of sequential hints (graded learning aids) is examined. We assume that this type of learner support can help reduce extraneous cognitive load during problem solving, while intrinsic and germane cognitive load remain similar to a condition without this support (Hypothesis 3). Moreover, we expect sequential hints to act as metacognitive support [12] for monitoring one’s own progress, thus contributing to learners’ knowledge-related self-confidence (Hypothesis 4), while at the same time reducing the experience of flow and immersion by interrupting the gameplay (Hypothesis 5). Finally, we assume an aptitude–treatment interaction, which requires controlling for prior knowledge, biology grade, general intrinsic and extrinsic motivation and knowledge-related self-confidence prior to exposure to the DEER. Consequently, these variables are used as covariates in this study.

2. Materials and Methods

The aim of this study is to investigate the effect of a digital educational escape room (DEER) as an alternative teaching method. We analyze the impact of a DEER on the topic of sex education with graded learning aids and compare it to a DEER without graded learning aids regarding learning outcomes, cognitive load and performance motivation, as well as immersion and flow experience.

2.1. Sample

A total of 84 middle school students aged between 12 and 16 (M = 13.15; SD = 0.93) voluntarily participated in this study; 48 of them were female and 36 were male. The middle school, an Austrian compulsory school, is located in a rural area, and the students’ socioeconomic status is rather high. No reward was given, and each participant was randomly assigned to one of the escape rooms. The study took place during a regular lesson; thus, students from an entire class participated at the same time.

2.2. Design

In this study, there was one independent variable, namely, the provision of sequential learning aids in the DEER. The two rooms were exactly the same in terms of content and puzzles. The only difference was that the second DEER was equipped with sequential learning aids and the first did not offer any additional aids. The dependent variables were knowledge acquisition; knowledge-related self-confidence; self-reported cognitive load and situational motivation, including the sub-dimensions situational interest and experienced challenge; as well as immersion and flow experience after completing the escape room. The control variables were prior knowledge, knowledge-related self-confidence and intrinsic and extrinsic motivation, as well as achievement motivation with the sub-dimensions situational interest and experienced challenge.

2.3. Material

Learning Environments

Two DEERs were created with Google Sites and Google Forms. The puzzles were integrated into the DEERs using LearningApps. The graded learning aids were also created with Google Sites and were linked directly in the DEERs. With Google Sites, a website can be created, and Google Forms offers the possibility to design surveys or quizzes; for example, a password or code had to be cracked and entered in order to proceed within each DEER. LearningApps offers different puzzle types, such as search puzzles, crossword puzzles, assignments, quizzes and many more. The links to these puzzles were integrated directly in the DEER so that the students only had to click on them to solve the puzzle. The DEER itself could be called up very easily via a link, and students could start working on it straight away. The students received iPads with the DEER already open and were able to start immediately; if the link was not yet open, they either had to enter the link or scan a QR code. An Internet connection was also necessary because the DEER is hosted online. After opening the DEER, a video explaining the context and setting started. Then, the user saw a fictitious room (laboratory) in which they were “locked”. The various puzzles and clues were hidden in this room. After clicking on the first clue, the user was redirected to a new page where the puzzle appeared (e.g., a LearningApp). When the students successfully completed this puzzle, they received the next clue, which could be another puzzle, a code or a password. For a better idea of what these might look like, examples of puzzles are shown in Figure 1 and Figure 2. The puzzle in Figure 1 shows four contraceptives and how they work: a condom, a birth control pill, an IUD and a hormone patch. Students were required to correctly match the names with the descriptions. After solving the puzzle, the following information appeared: “Great, you’ve put everything back in the correct order. However, this word hasn’t been put in the right order yet?! ‘epocsorcim’ If you need help, follow this clue: https://sites.google.com/view/…”. The next clue could then be discovered under the microscope. In the puzzle shown in Figure 2, the indicated segments of the uterus had to be named correctly. For this purpose, the students were given a selection of names and descriptions which they had to assign correctly.
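The DEERs themselves were assembled from these off-the-shelf tools rather than from custom code. Purely to illustrate the sequential gating principle described above (each solved puzzle yields a code that unlocks the next one), the following minimal Python sketch models such a puzzle chain; apart from the reversed word ‘epocsorcim’ quoted above, all puzzle titles, codes and function names are hypothetical and do not reproduce the actual DEER content.

```python
# Minimal sketch (not the actual DEER, which was built with Google Sites,
# Google Forms and LearningApps): a chain of puzzles unlocked by codes,
# played against the 45-minute countdown described in the next paragraph.
import time

TIME_LIMIT_SECONDS = 45 * 60  # the DEER used a 45-min countdown

# All titles and codes except the reversed "epocsorcim" are invented examples.
PUZZLES = [
    {"title": "Match the contraceptives to their descriptions",
     "code": "epocsorcim"[::-1]},          # reverses to "microscope", as in the quoted clue
    {"title": "Label the segments of the uterus",
     "code": "oviduct"},                   # hypothetical code
    {"title": "Assign the text boxes to the months of pregnancy",
     "code": "trimester"},                 # hypothetical code
]

def run_escape_room() -> bool:
    """Work through the puzzles in fixed (sequential) order within the time limit."""
    deadline = time.monotonic() + TIME_LIMIT_SECONDS
    for number, puzzle in enumerate(PUZZLES, start=1):
        print(f"Puzzle {number}: {puzzle['title']}")
        while True:
            if time.monotonic() > deadline:
                print("Time is up - the research results are destroyed!")
                return False
            attempt = input("Enter the code hidden in the clue: ").strip().lower()
            if attempt == puzzle["code"]:
                print("Great, the next clue is unlocked!\n")
                break
            print("Oh dear, I'm afraid this is wrong. Try again.")
    print("You found the research results and escaped the laboratory!")
    return True

if __name__ == "__main__":
    run_escape_room()
```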
The DEER had to be worked through in a linear sequence, which means that the puzzles had to be solved in the given order (sequential ER [22]), and under time pressure: the countdown was visible on a screen in the classroom. The topic of the DEER was sex education, more precisely, pregnancy and contraceptives. Sex education is a central topic in biology and, at the same time, a delicate one: biology teachers tend to find it difficult to teach, since talking about it is often difficult for students and adults alike, and students often react with embarrassment or laughter. The DEERs were presented in German and both groups received the same instructions at the beginning; the groups were divided randomly after the instruction. During the instruction, the concept of an escape room was explained in whole-class, teacher-led instruction. For this purpose, both rooms were opened and shown to the students using a projector in order to explain the principle of an ER and the procedure. Afterwards, a video describing the context and setting was watched. The mission was described as follows: “For years we have been researching a drug that massively increases women’s fertility and now we are on the verge of a breakthrough. Last night we were attacked and the research results were stolen. The thieves left the following message: ‘The research results are hidden in our laboratory, you have 45 min to find them, otherwise we will destroy them!’”. Following this instruction, the countdown started and the participants had to begin their search.
In one condition, additional sequential learning aids were implemented. Examples of these aids were: “Take another look at the virtual lab and search for this female sex organ!”, “First, sort the text boxes by information. Now try to correctly assign the text boxes to the months.” or “This link will lead you to an explanation, which will make the task easier for you: …” (see Figure 3). These learning aids were linked with instructions such as “If you need help, click here”. If a code or password was entered incorrectly, the following message came up: “Oh dear, I’m afraid this is wrong. Get help here”. Within the puzzles (e.g., LearningApps), there was a lightbulb icon to click on in order to get help; the corresponding link then led to sequential learning aids on Google Sites. If a learning aid did not help with the puzzle, further aid buttons could be clicked.
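The graded aids follow a simple sequential principle: each further request for help reveals the next, more specific hint. The sketch below is a hypothetical Python illustration of this principle only; the class and the hint ordering are invented, and the hint texts are adapted from the examples quoted above.

```python
# Hypothetical illustration of the sequential ("graded") learning-aid principle:
# every additional click on the help button reveals the next, more specific hint.
from typing import List

class GradedHints:
    def __init__(self, hints: List[str]) -> None:
        self._hints = hints
        self._revealed = 0  # how many hints the learner has already seen

    def request_help(self) -> str:
        """Return the next hint; once all are used, repeat the most specific one."""
        if not self._hints:
            return "No hints available."
        if self._revealed < len(self._hints):
            self._revealed += 1
        return self._hints[self._revealed - 1]

# Hint texts adapted from the aids quoted above; their assignment to one puzzle is illustrative.
hints = GradedHints([
    "Take another look at the virtual lab and search for this female sex organ!",
    "First, sort the text boxes by information.",
    "Now try to correctly assign the text boxes to the months.",
    "This link will lead you to an explanation, which will make the task easier for you: ...",
])

print(hints.request_help())  # first, least specific hint
print(hints.request_help())  # next, more specific hint
```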

2.4. Instruments

In order to assess pre-knowledge and knowledge acquisition, a test based on the learning environment’s objectives was developed. It consisted of 9 multiple-choice questions (e.g., “Which contraceptive is the most reliable one?”—condom, birth control pill, diaphragm or hormone patch), a short essay task (“Explain the process of fertilization up to the implantation of the fertilized egg”) and a graphical task, where participants had to label the female reproductive organs within a given graphic. The same test was administered in the pre- and post-test. In the pre-test, the additional option “I do not know” was given. All correct answers were added up to an overall score.
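As a concrete illustration of how such an overall score can be computed, the sketch below sums one point per correct multiple-choice item plus rubric points for the essay and labelling tasks. The exact point split for the two open tasks is an assumption made here for illustration; only the overall range of 0–15 points (see Table 1) and the nine multiple-choice items are taken from the study.

```python
# Illustrative scoring sketch for the knowledge test (not the authors' exact key).
# Assumption: 9 MC items at 1 point each plus up to 3 rubric points for the essay
# and up to 3 for the labelling task, giving the 0-15 range reported in Table 1.
from typing import List

def score_knowledge_test(mc_correct: List[bool], essay_points: int, labelling_points: int) -> int:
    """Add up all correct answers to one overall score between 0 and 15."""
    assert len(mc_correct) == 9, "the test contained nine multiple-choice items"
    assert 0 <= essay_points <= 3 and 0 <= labelling_points <= 3, "assumed rubric ranges"
    return sum(mc_correct) + essay_points + labelling_points

# Example: 6 of 9 MC items correct, 2 rubric points each for essay and labelling -> 10 points.
print(score_knowledge_test([True] * 6 + [False] * 3, essay_points=2, labelling_points=2))
```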
Knowledge-related self-confidence was assessed by a single item in the pre- and post-test (“How confident are you about your knowledge within the area of female fertilization and contraception?”) using a 5-point Likert scale (from 1 = “not confident at all” to 5 = “very confident”).
Motivation as a control variable was measured in the pre-test via the Motivated Strategies for Learning Questionnaire (MSLQ; [64]). This questionnaire has two sub-scales: one for assessing intrinsic motivation (4 items, e.g., “I prefer challenging tasks in school that motivate me, even when they are difficult”; Cronbach’s Alpha = 0.48) and one for assessing extrinsic motivation (4 items, e.g., “Getting good grades is most important for me”; Cronbach’s Alpha = 0.66; all 5-point Likert scales). Due to insufficient scores regarding reliability, these scales were excluded from further analyses.
Participants’ situational interest in the topic and experienced challenge were assessed via the Questionnaire on Current Motivation (QCM; [65]). The QCM uses 18 items to measure anxiety, probability of success, situational interest and experienced challenge. In the present study, only the sub-scales situational interest (five items, e.g., “I like such riddles”; 5-point Likert-scale from 1—completely disagree—to 5—completely agree; Cronbach’s Alpha = 0.80) and experienced challenge (four items, e.g., “I like such riddles”; 5-point Likert-scale from 1—completely disagree—to 5—completely agree; Cronbach’s Alpha = 0.43) were used. Moreover, due to insufficient reliability, experienced challenge was excluded from further analyses. Situational interest was assessed in the pre- and post-test.
Immersion and flow experience as dependent variables were operationalized by using two sub-scales of the ARI (Augmented Reality Immersion Questionnaire; [66]). The first sub-scale assesses immersion experience (4 items, e.g., “I experienced the learning environment as real”; Cronbach’s Alpha = 0.67 after exclusion of 2 items; all 5-point Likert scales). The second assesses flow experience (3 items, e.g., “I did not think about anything else during the learning process”; Cronbach’s Alpha = 0.64). Due to reasons of internal consistency, only the immersion sub-scale was used for further analyses.
In order to assess cognitive load as a dependent variable, the instrument provided by [67] was used in the post-test. This self-report questionnaire with 5-point Likert scales has three sub-scales: intrinsic cognitive load (3 items, e.g., “This task was complex”; Cronbach’s Alpha = 0.61 after exclusion of 1 item), germane cognitive load (3 items, e.g., “I tried to understand everything correctly”; Cronbach’s Alpha = 0.60 after deleting one item) and extraneous cognitive load (3 items, e.g., “It was hard to find the important information”; Cronbach’s Alpha = 0.62). Due to these insufficient reliability scores, these variables were also excluded from further analyses.
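Several sub-scales were thus excluded because of low internal consistency. As a sketch of how such a reliability check can be computed from the raw item responses (the data frame and item values below are hypothetical), Cronbach’s alpha for a sub-scale can be obtained as follows:

```python
# Sketch of a Cronbach's alpha check of the kind used to decide which sub-scales to retain.
# The item data below are invented; rows = participants, columns = Likert items.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha = k/(k-1) * (1 - sum of item variances / variance of the sum score)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

responses = pd.DataFrame({      # hypothetical 5-point Likert responses for one sub-scale
    "item1": [4, 5, 3, 4, 2, 5],
    "item2": [4, 4, 3, 5, 2, 4],
    "item3": [3, 5, 2, 4, 3, 5],
})
print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
# Sub-scales with insufficient alpha were excluded from further analyses (as described above).
```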

2.5. Procedure

The study was conducted during regular school time in the students’ biology classes. At the beginning, students were informed about the nature of the study and its content. Afterwards, they were able to choose whether or not to participate (with the alternative of working on a similar, self-directed biology topic). Participating students were randomly assigned to one of the two conditions. In class, all students worked simultaneously on the pre-test and were then divided into the two experimental groups. The post-test was administered directly after the treatment. Overall, participation took about 100 min.

3. Results

The descriptive results (see Table 1) indicate that both groups increased their performance on the knowledge test from pre- to post-test. Furthermore, situational interest and knowledge-related self-confidence increased in both groups from pre- to post-test. The descriptive data show only small differences between both groups regarding all variables.
A MANCOVA was conducted to analyze differences between the experimental and the control group. The covariates, namely the pre-test knowledge score, knowledge-related self-confidence in the pre-test, the last final grade in biology and situational interest in the pre-test, were included. Originally, it was planned to also include the scores of further motivational scales and of cognitive load; however, due to their low reliability, these variables were dropped from further analyses (see Table 2 for an overview).
Scores in the knowledge post-test, knowledge-related self-confidence and situational interest in the post-test, as well as the experience of immersion, were used as dependent variables.
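The authors do not report the software used for this analysis. Purely as a hedged sketch of how such a MANCOVA can be specified in Python with statsmodels (the column and file names are hypothetical), the covariates enter the model formula alongside the group factor:

```python
# Sketch of a MANCOVA specification (hypothetical column/file names, not the original analysis).
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

df = pd.read_csv("deer_study.csv")  # placeholder: one row per student

# Dependent variables: post-test knowledge, self-confidence, situational interest, immersion.
# Factor: condition (with vs. without sequential learning aids); covariates: pre-test scores
# and the last biology grade.
model = MANOVA.from_formula(
    "knowledge_post + confidence_post + interest_post + immersion"
    " ~ C(condition) + knowledge_pre + confidence_pre + interest_pre + biology_grade",
    data=df,
)
print(model.mv_test())  # multivariate tests (Wilks' lambda, Pillai's trace, ...) per term
```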
Among the covariates, the knowledge pre-test score shows a significant influence in the overall model (F(4, 75) = 4.65, p = 0.002; ηp2 = 0.20). Its effect is significant with regard to the knowledge post-test score (F(1, 78) = 136.87, p < 0.001; ηp2 = 0.18). The pre-test knowledge score correlates positively and significantly with the knowledge post-test score (r = 0.46; p < 0.001).
Situational interest in the pre-test is also significant in the overall model (F(4, 75) = 6.82, p < 0.001; ηp2 = 0.27). This variable significantly affects situational interest in the post-test (F(1, 78) = 25.37, p < 0.001; ηp2 = 0.25). Situational interest in the pre-test correlates positively and significantly with situational interest in the post-test (r = 0.58; p < 0.001).
Another significant covariate is knowledge-related self-confidence in the pre-test (F(4, 75) = 10.32, p < 0.001; ηp2 = 0.36). It significantly affects scores of knowledge-related self-confidence in the post-test (F(1, 78) = 21.65, p < 0.001; ηp2 = 0.22), immersion experience (F(1, 78) = 9.77, p = 0.002; ηp2 = 0.11) and situational interest in the post-test (F(1, 78) = 7.11, p = 0.009; ηp2 = 0.08).
Pre-test knowledge-related self-confidence correlates positively and significantly with the same score in the post-test (r = 0.51; p < 0.001), negatively with immersion experience (r = −0.33; p = 0.002) and negatively with situational interest in the post-test (r = −0.24; p = 0.03). The biology grade as a covariate was not statistically significant (F(7, 72) = 2.43, p = 0.06; ηp2 = 0.12).
Multivariate comparison across both conditions shows no significant difference (F(4, 75) = 1.43, p = 0.23; ηp2 = 0.07). Consequently, there were no significant differences between both groups regarding experience of immersion (Hypothesis 5).
A second analysis of variance with repeated measurement across both conditions and with the variables situational interest, knowledge and knowledge-related self-confidence showed a significant overall difference between pre-test and post-test (F(3, 80) = 118.70, p < 0.001; ηp2 = 0.82). Univariate analyses reveal significant increases from pre- to post-test in situational interest (F(1, 82) = 9.67, p = 0.003; ηp2 = 0.11; Hypothesis 2), knowledge (F(1, 82) = 324.71, p < 0.001; ηp2 = 0.80; Hypothesis 1) and knowledge-related self-confidence (F(1, 82) = 87.41, p < 0.001; ηp2 = 0.52; Hypothesis 4).
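For the univariate F tests reported here, the partial eta squared values follow directly from the F statistic and its degrees of freedom; for example, the knowledge effect of F(1, 82) = 324.71 corresponds to roughly 0.80:

```latex
\eta_p^2 = \frac{F \cdot df_{\mathrm{effect}}}{F \cdot df_{\mathrm{effect}} + df_{\mathrm{error}}},
\qquad \text{e.g.} \quad
\eta_p^2 = \frac{324.71 \cdot 1}{324.71 \cdot 1 + 82} \approx 0.80 .
```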

4. Discussion

In this research, we examined the impact of a digital educational escape room (DEER) on cognitive and motivational outcomes under controlled conditions. We assumed that a DEER, as a form of GBL, can contribute to effective problem solving and, thus, knowledge acquisition, while not only maintaining intrinsic motivation but also enhancing it. Previous research has shown that GBL is an effective instructional device which contributes to learning and fun [8,23,55]. Indeed, this study’s outcomes reveal that the DEER developed for this experimental setting leads to significantly improved performance from pre- to post-test (Hypothesis 1). At first glance, this seems like a rather trivial outcome. However, it is not: if the entertainment factor is dominant, learning objectives may disappear from learners’ focus. Pre-test scores show a high level of interest on the part of the learners, which increased significantly from pre- to post-test after using the DEER (Hypothesis 2). This corresponds with findings in most GBL research, stating that learners’ motivation can be maintained or even enhanced [16]. Nevertheless, this does not happen automatically but is influenced by learners’ prior motivation as well as by the design of the game environment itself [68]. Consequently, using a DEER such as the one in this study seems to support maintaining and increasing motivation compared to other genres of GBL.
Regarding the independent variable, no significant differences were found concerning learning motivation (Hypothesis 3) and the experience of immersion (Hypothesis 5). This contradicts results such as those reported in [69], which show that while scaffolding may contribute to learning, it may also reduce motivation. One possible explanation might be that the sequential scaffolds were either not needed or did not interfere with the gameplay of the DEER. Consequently, learners might not have used this kind of scaffolding and, thus, were neither supported nor distracted by it. Nevertheless, these issues have to be discussed carefully, especially because the low internal consistency of the cognitive load and flow measures did not allow including these variables in the analyses.
Concerning knowledge-related self-confidence, results suggest that the DEER supports learners in becoming aware of what they have learned in this learning environment (Hypothesis 4). There is no difference between the groups; however, both show an overall significant increase from pre- to post-test, highlighting the effective design of the intervention.
In addition, results reveal that there are some, but not many, aptitude–treatment interactions. Evidently, covariates such as prior knowledge do have an impact on performance in the knowledge post-test and on knowledge-related self-confidence, revealing a Matthew effect. Consequently, part of the knowledge gains could be explained by prior knowledge rather than by the intervention alone: learners with greater prior knowledge can more easily assimilate incoming information than learners with less prior knowledge. Nevertheless, the missing significant impact of the biology grade is an indicator that the DEER has been designed effectively enough to be beneficial not only to gifted students but to all students.

5. Conclusions

In conclusion, the DEER, as developed and investigated here, is an effective and motivating example of digital GBL. There are some methodological shortcomings in this research (e.g., insufficient internal consistency of some scales and the single-shot design), but the findings still encourage the use of DEERs in order to provide a different way of science teaching and learning. The major findings suggest that the DEER is a meaningful and empirically supported approach to fostering learning (here within the domain of sex education) and that it equally contributes to enhancing learners’ motivation. Additional learning support, as provided here, is not necessarily needed when the DEER design meets the prerequisites of the learners and is neither too demanding nor too easy. In sum, DEERs, when designed appropriately, are an effective and motivating form of game-based learning.

Author Contributions

Conceptualization, methodology, formal analysis, investigation, resources, data curation, writing—original draft preparation, writing—review and editing, L.v.K., J.Z. and A.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Ethical review and approval were waived for this study, due to non-person-related data acquisition. As this study was a voluntary part of biology learning in school, permission by school administration was granted.

Informed Consent Statement

Informed consent was obtained from all people involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Prensky, M. Digital Game-Based Learning; McGraw-Hill: New York, NY, USA, 2001.
2. Plass, J.L.; Homer, B.D.; Kinzer, C.K. Foundations of Game-Based Learning. Educ. Psychol. 2015, 50, 258–283.
3. Dicheva, D.; Dichev, C.; Agre, G.; Angelova, G. Gamification in education: A systematic mapping study. J. Educ. Technol. Soc. 2015, 18, 75–88.
4. Nah, F.F.H.; Zeng, Q.; Telaprolu, V.R.; Ayyappa, A.P.; Eschenbrenner, B. Gamification of education: A review of literature. In International Conference on HCI in Business; Nah, F.F.-H., Ed.; Springer: Cham, Switzerland, 2014; pp. 401–409.
5. Djaouti, D.; Alvarez, J.; Jessel, J.P. Classifying serious games: The G/P/S model. In Handbook of Research on Improving Learning and Motivation through Educational Games: Multidisciplinary Approaches; Felicia, P., Ed.; IGI Global: Hershey, PA, USA, 2011; pp. 118–136.
6. Egenfeldt-Nielsen, S. Third generation educational use of computer games. J. Educ. Multimed. Hypermed. 2007, 16, 263–281.
7. Vogel, J.J.; Vogel, D.S.; Cannon-Bowers, J.; Bowers, C.A.; Muse, K.; Wright, M. Computer gaming and interactive simulations for learning: A meta-analysis. J. Educ. Comput. Res. 2006, 34, 229–243.
8. Wouters, P.; van Oostendorp, H. A meta-analytic review of the role of instructional support in game-based learning. Comput. Educ. 2013, 60, 412–425.
9. Sitzmann, T. A meta-analytic examination of the instructional effectiveness of computer-based simulation games. Pers. Psychol. 2011, 64, 489–528.
10. Clark, D.; Tanner-Smith, E.; Killingsworth, S. Digital Games, Design and Learning: A Systematic Review and Meta-Analysis; SRI International: Menlo Park, CA, USA, 2014.
11. Esquembre, F. Computers in physics education. Comput. Phys. Commun. 2002, 147, 13–18.
12. Kim, B.; Park, H.; Baek, Y. Not just fun, but serious strategies: Using meta-cognitive strategies in game-based learning. Comput. Educ. 2009, 52, 800–810.
13. Sung, H.Y.; Hwang, G.J. A collaborative game-based learning approach to improving students’ learning performance in science courses. Comput. Educ. 2013, 63, 43–51.
14. Meluso, A.; Zheng, M.; Spires, H.A.; Lester, J. Enhancing 5th graders’ science content knowledge and self-efficacy through game-based learning. Comput. Educ. 2012, 59, 497–504.
15. Hamari, J.; Shernoff, D.J.; Rowe, E.; Coller, B.; Asbell-Clarke, J.; Edwards, T. Challenging games help students learn: An empirical study on engagement, flow and immersion in game-based learning. Comput. Hum. Behav. 2016, 54, 170–179.
16. Erhel, S.; Jamet, E. Digital game-based learning: Impact of instructions and feedback on motivation and learning effectiveness. Comput. Educ. 2013, 67, 156–167.
17. Chiu, Y.; Kao, C.; Reynolds, B. The relative effectiveness of digital game-based learning types in English as a foreign language setting: A meta-analysis. Br. J. Educ. Technol. 2012, 43, 104–107.
18. Chen, M.H.; Tseng, W.T.; Hsiao, T.Y. The effectiveness of digital game-based vocabulary learning: A framework-based view of meta-analysis. Br. J. Educ. Technol. 2018, 49, 69–77.
19. Tokac, U.; Novak, E.; Thompson, C.G. Effects of game-based learning on students’ mathematics achievement: A meta-analysis. J. Comput. Assist. Learn. 2019, 35, 407–420.
20. Chen, C.; Shih, C.C.; Law, V. The effects of competition in digital game-based learning (DGBL): A meta-analysis. Educ. Technol. Res. Dev. 2020, 68, 1–19.
21. Makri, A.; Vlachopoulos, D.; Martina, R. Digital Escape Rooms as Innovative Pedagogical Tools in Education: A Systematic Literature Review. Sustainability 2021, 13, 4587.
22. Nicholson, S. Creating Engaging Escape Rooms for the Classroom. Child. Educ. 2018, 94, 44–49.
23. Veldkamp, A.; Grint, L.; Knippels, M.-C.; van Joolingen, W. Escape education: A systematic review on escape rooms in education. Educ. Res. Rev. 2020, 31, 100364.
24. Wiemker, M.; Elumir, E.; Clare, A. Escape room games: Can you transform an unpleasant situation into a pleasant one? In Game Based Learning; Haag, J., Weißenböck, J., Gruber, M.W., Christian, M., Freisleben-Teutscher, F., Eds.; Fachhochschule St. Pölten GmbH: St. Pölten, Austria, 2015; pp. 55–68.
25. Nicholson, S. Peeking Behind the Locked Door: A Survey of Escape Room Facilities. 2015. Available online: https://scottnicholson.com/pubs/erfacwhite.pdf (accessed on 23 December 2021).
26. Eukel, H.N.; Frenzel, J.E.; Cernusca, D. Educational gaming for pharmacy students—Design and evaluation of a diabetes-themed escape room. Am. J. Pharm. Educ. 2017, 81, 1–5.
27. Glavas, A.; Stascik, A. Enhancing positive attitude towards mathematics through introducing Escape Room games. In Mathematics Education as a Science and a Profession; Element: Osijek, Croatia, 2017; pp. 281–294.
28. Pan, R.; Lo, H.; Neustaedter, C. Collaboration, Awareness, and Communication in Real-Life Escape Rooms. In Proceedings of the 2017 Conference on Interaction Design and Children, Stanford, CA, USA, 27–30 June 2017; pp. 1353–1364.
29. Douglas, J.Y.; Hargadon, A. The pleasures of immersion and engagement: Schemas, scripts and the fifth business. Digit. Creat. 2001, 12, 153–166.
30. Annetta, L.A. The “I’s” have it: A framework for serious educational game design. Rev. Gen. Psychol. 2010, 14, 105–113.
31. Sanchez, E.; Plumettaz-Sieber, M. Teaching and learning with escape games from debriefing to institutionalization of knowledge. In Proceedings of the International Conference on Games and Learning Alliance, Athens, Greece, 27–29 November 2019; pp. 242–253.
32. Kinio, A.E.; Dufresne, L.; Brandys, T.; Jetty, P. Break out of the Classroom: The Use of Escape Rooms as an Alternative Teaching Strategy in Surgical Education. J. Surg. Educ. 2019, 76, 134–139.
33. Bottge, B.A.; Toland, M.D.; Gassaway, L.; Butler, M.; Choo, S.; Griffen, A.K.; Ma, X. Impact of enhanced anchored instruction in inclusive math classrooms. Except. Child. 2015, 81, 158–175.
34. The Cognition and Technology Group at Vanderbilt. Technology and the design of generative learning environments. Educ. Technol. 1991, 31, 34–40.
35. The Cognition and Technology Group at Vanderbilt. The Jasper series as an example of anchored instruction: Theory, program, description, and assessment data. Educ. Psychol. 1992, 27, 291–315.
36. Schank, R.C.; Fano, A.; Bell, B.; Jona, M. The design of goal-based scenarios. J. Learn. Sci. 1994, 3, 305–345.
37. Zumbach, J.; Reimann, P. Enhancing learning from hypertext by inducing a goal orientation: Comparing different approaches. Instr. Sci. 2002, 30, 243–267.
38. Breakout EDU. 2018. Available online: http://www.breakoutedu.com/ (accessed on 23 December 2021).
39. Veldkamp, A.; Daemen, J.; Teekens, S.; Koelewijn, S.; Knippels, M.P.J.; Van Joolingen, W.R. Escape boxes: Bringing escape room experience into the classroom. Br. J. Educ. Technol. 2020, 51, 1220–1239.
40. Cain, J. Exploratory implementation of a blended format escape room in a large enrollment pharmacy management class. Curr. Pharm. Teach. Learn. 2019, 11, 44–50.
41. Hermanns, M.; Deal, B.; Campbell, A.M.; Hillhouse, S.; Opella, J.B.; Faigle, C.; Campbell, R.H. IV Using an “escape room” toolbox approach to enhance pharmacology education. J. Nurs. Educ. Pract. 2018, 8, 89–95.
42. Guckian, J.; Eveson, L.; May, H. The great escape? The rise of the escape room in medical education. Future Health J. 2020, 7, 112–115.
43. Hawkins, J.E.; Wiles, L.L.; Tremblay, B.; Thompson, B.A. Behind the Scenes of an Educational Escape Room. Am. J. Nurs. 2020, 120, 50–56.
44. Roman, P.; Rodriguez-Arrastia, M.; Molina-Torres, G.; Márquez-Hernández, V.V.; Gutiérrez-Puertas, L.; Ropero-Padilla, C. The escape room as evaluation method: A qualitative study of nursing students’ experiences. Med. Teach. 2019, 42, 403–410.
45. Charlo, J.C.P. Educational Escape Rooms as a Tool for Horizontal Mathematization: Learning Process Evidence. Educ. Sci. 2020, 10, 213.
46. Alonso, G.; Schroeder, K.T. Applying active learning in a virtual classroom such as a molecular biology escape room. Biochem. Mol. Biol. Educ. 2020, 48, 514–515.
47. Chu, S.; Kwan, A.; Reynolds, R.; Mellecker, R.; Tam, F.; Lee, G.; Hong, A.; Leung, C. Promoting Sex Education among Teenagers through an Interactive Game: Reasons for Success and Implications. Games Health J. 2015, 4, 168–174.
48. Haruna, H.; Hu, X.; Chu, S.K.W.; Mellecker, R.R.; Gabriel, G.; Ndekao, P.S. Improving Sexual Health Education Programs for Adolescent Students through Game-Based Learning and Gamification. Int. J. Environ. Res. Public Health 2018, 15, 2027.
49. Arnab, S.; Brown, K.; Clarke, S.; Dunwell, I.; Lim, T.; Suttie, N.; Louchart, S.; Hendrix, M.; De Freitas, S. The development approach of a pedagogically-driven serious game to support Relationship and Sex Education (RSE) within a classroom setting. Comput. Educ. 2013, 69, 15–30.
50. Ang, J.W.J.; Ng, Y.N.A.; Liew, R.S. Physical and Digital Educational Escape Room for Teaching Chemical Bonding. J. Chem. Educ. 2020, 97, 2849–2856.
51. Kroski, E. What Is a Digital Breakout Game? Libr. Technol. Rep. 2020, 56, 5–7.
52. Hou, H.T.; Chou, Y.S. Exploring the technology acceptance and flow state of a chamber escape game-Escape the Lab© for learning electromagnet concept. In Proceedings of the 20th International Conference on Computers in Education ICCE, Singapore, 26–30 November 2012; pp. 38–41.
53. Connolly, M.T.; Boyle, A.Z.; MacAuthor, E.; Hainey, T.; Boyle, M.J. A systematic literature review of empirical evidence on computer games and serious games. Comput. Educ. 2012, 59, 661–686.
54. Jabbar, A.I.; Felicia, P. Gameplay engagement and learning in game-based learning: A systematic review. Rev. Educ. Res. 2015, 85, 740–779.
55. Subhash, S.; Cudney, E.A. Gamified learning in higher education: A systematic review of the literature. Comput. Hum. Behav. 2018, 87, 192–206.
56. Choi, D.; An, J.; Shah, C.; Singh, V. Examining information search behaviors in small physical space: An escape room study. Proc. Assoc. Inf. Sci. Technol. 2017, 54, 640–641.
57. Järveläinen, J.; Paavilainen-Mäntymäki, E. Escape room as game-based learning process: Causation—Effectuation perspective. In Proceedings of the 52nd Hawaii International Conference on System Sciences, Maui, HI, USA, 8–11 January 2019; Volume 6, pp. 1466–1475.
58. Warmelink, H.; Haggis, M.; Mayer, I.; Peters, E.; Weber, J.; Louwerse, M. AMELIO: Evaluating the team-building potential of a mixed reality escape room game. CHI PLAY 17 Ext. Abstr. 2017, 111–123.
59. Li, P.Y.; Chou, Y.K.; Chen, Y.J.; Chiu, R.S. Problem-based learning (PBL) in interactive design: A case study of escape the room puzzle design. In Proceedings of the IEEE International Conference on Knowledge Innovation and Invention (ICKII), Jeju, Korea, 23–27 July 2018.
60. Ma, J.P.; Chuang, M.H.; Lin, R. An innovated design of escape room game box through integrating STEAM education and PBL Principle. In Proceedings of the International Conference on Cross-Cultural Design, Las Vegas, NV, USA, 15–20 July 2018; pp. 70–79.
61. Adams, V.; Burger, S.; Crawford, K.; Setter, R. Can you escape? Creating an escape room to facilitate active learning. J. Nurs. Prof. Dev. 2018, 34, E1–E5.
62. Brown, N.; Darby, W.; Coronel, H. An escape room as a simulation teaching strategy. Clin. Simul. Nurs. 2019, 30, 1–6.
63. Jenkin, I.; Fairfurst, N. Escape room to operating room: A potential training modality? Med. Teach. 2019, 42, 596.
64. Pintrich, P.R.; Smith, D.A.F.; Garcia, T.; McKeachie, W.J. A Manual for the Use of the Motivated Strategies for Learning Questionnaire (MSLQ); University of Michigan: Ann Arbor, MI, USA, 1991.
65. Rheinberg, F.; Vollmeyer, R.; Burns, B.D. FAM: Ein Fragebogen zur Erfassung aktueller Motivation in Lern- und Leistungssituationen (QCM: A questionnaire to assess current motivation in learning situations). Diagnostica 2001, 2, 57–66.
66. Georgiou, Y.; Kyza, E.A. The development and validation of the ARI questionnaire: An instrument for measuring immersion in location-based augmented reality settings. Int. J. Hum.-Comput. Stud. 2017, 98, 24–37.
67. Klepsch, M.; Schmitz, F.; Seufert, H. Development and Validation of Two Instruments Measuring Intrinsic, Extraneous, and Germane Cognitive Load. Front. Psychol. 2017, 8, 18.
68. Eseryel, D.; Law, V.; Ifenthaler, D.; Ge, X.; Miller, R. An investigation of the interrelationships between motivation, engagement, and complex problem solving in game-based learning. J. Educ. Technol. Soc. 2014, 17, 42–53.
69. Chen, C.H.; Law, V. Scaffolding individual and collaborative game-based learning in learning performance and intrinsic motivation. Comput. Hum. Behav. 2016, 55, 1201–1212.
Figure 1. Characteristics of contraceptives.
Figure 2. Labelling of segments of the uterus.
Figure 3. Example of a sequential learning aid.
Table 1. Mean values (and standard deviations in brackets) of dependent variables.

                                      Without Support (n = 42)        With Support (n = 42)
                                      Pre-Test       Post-Test        Pre-Test       Post-Test
Last grade in biology                 1.62 (0.76)    N/A              1.93 (1.05)    N/A
Situational interest                  4.11 (0.61)    4.31 (0.55)      3.79 (0.88)    4.04 (0.79)
Knowledge test                        2.30 (2.23)    8.74 (3.19)      2.23 (2.63)    7.82 (3.41)
Knowledge-related self-confidence     2.79 (1.09)    3.86 (0.72)      2.36 (1.10)    3.31 (0.84)
Experience of immersion               N/A            4.10 (0.79)      N/A            3.98 (0.90)

Note: All scales are 5-point Likert scales (1 minimum to 5 maximum) except the grade in biology (1 = very good, 5 = failed) and the knowledge test (0 minimum to 15 maximum).
Table 2. Outcomes of the MANCOVA.

                                               F         p          ηp2
Between-group comparison                       1.43      0.23       0.07
Last grade in biology                          2.43      0.06       0.12
Situational interest                           6.82      <0.001     0.27
Knowledge test                                 4.65      0.002      0.20
Knowledge-related self-confidence              10.32     <0.001     0.36
Repeated measurement across all conditions     118.70    <0.001     0.82
