Article

Effects of Gamification on the Benefits of Student Response Systems in Learning of Human Anatomy: Three Experimental Studies

by Juan J. López-Jiménez 1,*, José L. Fernández-Alemán 1, José A. García-Berná 1, Laura López González 2, Ofelia González Sequeros 2, Joaquín Nicolás Ros 1, Juan M. Carrillo de Gea 1, Ali Idri 3 and Ambrosio Toval 1

1 Department of Informatics and System, Faculty of Computer Science, University of Murcia, 30100 Murcia, Spain
2 Department of Human Anatomy, Faculty of Medicine, University of Murcia, 30100 Murcia, Spain
3 Software Project Management Research Team, ENSIAS, Mohammed V University in Rabat, Rabat 10000, Morocco
* Author to whom correspondence should be addressed.
Int. J. Environ. Res. Public Health 2021, 18(24), 13210; https://doi.org/10.3390/ijerph182413210
Submission received: 25 October 2021 / Revised: 17 November 2021 / Accepted: 22 November 2021 / Published: 15 December 2021

Abstract:
This paper presents three experiments to assess the impact of gamifying an audience response system on students' perceptions and educational performance. An audience response system called SIDRA (Immediate Audience Response System in Spanish) and two audience response systems with gamification features, R-G-SIDRA (gamified SIDRA with ranking) and RB-G-SIDRA (gamified SIDRA with ranking and badges), were used in a General and Descriptive Human Anatomy course. In an empirical study, a total of 90 students used RB-G-SIDRA in the academic year 2019–2020, 90 students employed R-G-SIDRA in 2018–2019, and 92 students used SIDRA in 2017–2018. Statistically significant differences were found between the final exam grades obtained using RB-G-SIDRA and SIDRA, U = 39.211, adjusted p = 0.001, and between RB-G-SIDRA and R-G-SIDRA, U = 31.157, adjusted p = 0.015, thus providing strong evidence of the benefit of the badges used in RB-G-SIDRA. Moreover, in the students' SIDRA system scores, statistically significant differences were found between RB-G-SIDRA and SIDRA, U = −90.521, adjusted p < 0.001, and between R-G-SIDRA and SIDRA, U = −87.998, adjusted p < 0.001. Significant correlations between individual and team scores were also found in all of the tests in RB-G-SIDRA and R-G-SIDRA. The students expressed satisfaction, engagement, and motivation with SIDRA, R-G-SIDRA, and RB-G-SIDRA, with final average assessments of 4.28, 4.61, and 4.47 out of 5, respectively. Students perform better academically with gamified than with non-gamified audience response systems. These findings can be used to build a gamified adaptive learning system.

1. Introduction

Clickers are an interactive learning tool used to ask students questions in class. These tools can be used to assess the academic achievement of students over a short period of time [1]. The first clickers were handheld devices on which students had to answer questions proposed by professors in class. Clickers have evolved into web-based systems [2,3], which allow students to use their smartphone as the handheld device, thus resulting in classroom response systems (CRSs).
Interactive learning activities have been shown to improve learning outcomes. In particular, there is evidence that CRSs promote conceptual knowledge [4]. CRSs are therefore a valuable instrument for education in the health sciences and a reliable, objective evaluation resource with which professors can assess complex capabilities and understanding.
Gamification is associated with the adoption of game mechanics, techniques, and game theory in non-gaming contexts [5,6]. Feedback, challenges, social sharing, rewards, leaderboards (rankings), points, tips, levels, avatars, badges, and user-generated content are gamification elements that have been employed successfully in the literature [7]. Although a comprehensive list of the different types of game elements has been published in the grey literature [8], there is a lack of consensus with regard to the terminology used for game elements [9]. For example, different terms are used for rewards: badges, donuts, or iPads.
A large number of studies have used gamification approaches in health professions education. However, research is ongoing as to when and why gamification can be a suitable educational tool [10,11]. Gamification features can be added to CRSs to increase student concentration and active participation. Game principles have been applied to CRSs such as Kahoot and Socrative to promote fun learning. Gamified CRS sessions are perceived as more interesting than traditional e-learning quizzes [12].
This paper presents three experiments to evaluate the impact of ranking, badges, teams, and points in a gamified mobile CRS on students’ academic performance and perceptions. To the best of the authors’ knowledge, no other studies have compared different gamification elements used in a CRS. The results of this experiment will help designers and developers to build more effective CRSs in teaching in general, and human anatomy education in particular. As suggested by Ahmad et al. [13], learning techniques used in the teaching of human anatomy must be modernized to take advantage of 21st century technology. Our work adds to the corpus of knowledge of digital learning innovations in the teaching of human anatomy [14,15].

2. Related Work

CRSs have been successfully used in pharmacy [16,17], pediatrics [18,19], advanced nursing therapeutic [20], multidisciplinary healthcare providers [21], nursing health assessment [22], medical-surgical [23], family medicine residents [24], ethics [25], anatomy and physiology [26], pathophysiology [27], anticoagulation [28], emergency [29], physical basis of medicine [30], clinical medicine [31], cardiology [32], medical prescription [33], pre-clinical medicine [12], and histology [34]. CRSs can employ many kinds of questions: multiple-choice questions (MCQs), find on image, quiz by combining items, fill in blanks, true/false questions, find a number, and word cloud, among others. CRSs such as Socrative, Yammer [16], and Kahoot [12,34] have been used in health sciences.
Gamification has also been widely employed in a variety of healthcare courses: psychiatry [35], COPD (Chronic Obstructive Pulmonary Disease) treatment [36], oncology [37], obstetrics [38], urology [39], surgery [40,41,42,43,44], emergency medicine [45], physiology [46,47,48], gynecology [49], internal medicine [50], resuscitation principles [51], anatomy [48,52], urine catheterization [53], radiology [54], and pediatrics [55]. The most frequently used gamification elements in healthcare are scoring [37,38,39,41,42,45,46,50,52,53,55,56,57,58,59,60,61,62,63,64,65,66] and competition [37,38,39,41,42,44,49,50,52,57,58,59,64,65,66,67,68,69]. Rewards [36,41,43,47,48,54,67,68,69], signposting [36,62], and time [45,47,53,54,60,61,62,68] are also frequently used. Other gamification elements less often employed in the teaching of health sciences are puzzles [35,70], role playing [35,61,71,72], achievements [73], missions [73], avatars [36,47], levels [36], quizzes [36,73], badges [50,56], levelling [45,56,62,63], quests [56,65], awards [40,74], teams [59,60,61,67], mystery characters [51,60,68], progress [44], social networks [44,58], and storytelling [65]. Certainly, game elements motivate and attract users in teaching activities [50,52,53,56]. All of them aim at ensuring user commitment to the learning activities. Evidence on the impact of gamification on student academic outcomes has been reported in a meta-analysis of 24 empirical studies involving a total of 3202 participants [75].

3. Materials and Methods

Three experiments were designed to assess the educational effectiveness of four gamification elements: ranking, badges, teams, and points. Two experiments employed a gamified CRS and one employed a non-gamified CRS. The methodology is presented in the following subsections.

3.1. Participants and Data Collection

The participants were enrolled in a first-year medical course named General Anatomy of Human Musculoskeletal System (GAHMS) at the University of Murcia. This course is taught during the first 15 weeks of the academic year. GAHMS introduces human anatomy, especially the bone, joint, and muscle systems. Three thematic blocks are addressed in the course: Unit 1: Description of gross anatomy and introduction to the musculoskeletal anatomy of the pelvis, abdomen, and thorax; Unit 2: Overview of the musculoskeletal anatomy, including both lower and upper limbs; Unit 3: Introduction to the musculoskeletal anatomy of the head and neck. GAHMS is a six ECTS (European Credit Transfer and Accumulation System) credit course organized into lectures of four hours per week and skills practice in human cadaveric dissection of two hours per week to examine each of the structures of the human body. Students could opt out of the study at any time without detriment to their final marks. The participants in the experiment were not repeaters, and they all had the same background; they were therefore all in the same condition to perform the experiment. None of the participants dropped out of the experiment.
The recruitment process started with a verbal presentation and the delivery of a document describing the goal, the procedures, and the tools used in the study. It is worth noting this study passed the approval of the Ethics Committee of University of Murcia.

3.2. Instruments

G-SIDRA (Gamified Immediate Audience Response System in Spanish) is an evolution of an audience response system (https://docentis.inf.um.es/sidra/) called SIDRA (Immediate Audience Response System in Spanish) that endows the tool with gamification elements [76]. In 2018, R-G-SIDRA (gamified SIDRA with ranking) was built by adding three gamification elements (ranking, teams, and points). This extension was used in the academic course 2018/2019. The gamification process was organized into four stages [77,78]: (1) business modeling and requirements, in which the tool and business goals are evaluated and documented; (2) proposal of the gamification design; (3) implementation of the software artifacts based on step 2 and testing of their functionality; and (4) monitoring and adaptation, to measure business goal achievement and carry out subsequent design modifications if needed. In phase 2, Gamicards were used to support the gamification design process [79]. The gamification elements ranking, teams, and points were chosen to motivate two of the three most common user types (socializer and achiever) [80]. The Hexad user types scale was employed for this purpose [81]. In phase 3, a self-built solution to support the gamification strategies was adopted for the sake of adaptation flexibility and to retain control of the whole gamification engine. As reported in [82], experts prefer self-built solutions to monitor such systems rather than general gamification platforms.
In 2019, a non-digital gamification element was added to promote the gamification process: metal badges representing gold, silver, and bronze medals, which were awarded at the end of each MCQ test. This system is identified as RB-G-SIDRA (gamified SIDRA with ranking and badges) and was employed in the academic course 2019/2020.
Table 1 shows the game elements used in each SIDRA system. Figure 1 shows the board and the badges used in the RB-G-SIDRA system. Observe that the rows denote the MCQs and the columns represent the teams.
In the evolved system, a test is formed by a list of MCQs about a specific topic. The client-server architecture of the SIDRA system provides the instructor with the possibility to gather and evaluate answers to MCQs sent from any device connected to the Internet. A professor can also add respondents, build and launch an MCQ test, download the test results, and display the students' responses along with a ranking of groups or individuals. Access is granted to professors by sending a G-SIDRA account request to the administrator. A respondent can check the MCQs, complete the questionnaire, and see the percentage of correct answers for each question. All of these actions can be done online during the lecture via the web or a mobile app. Figure 2 depicts the mobile interface of G-SIDRA. This interface is common to all gamified SIDRA extensions. Figure 3 and Figure 4 illustrate the gamification elements used in R-G-SIDRA and RB-G-SIDRA: individual ranking, badges, points, and classification of 10 teams, which can be viewed at the end of each test.

3.3. Design

Three versions of SIDRA were implemented for comparison in the context of the anatomy of the locomotor system. The sample was split into three groups: a group of 90 students used RB-G-SIDRA in the academic year 2019–2020, another group of 90 participants employed R-G-SIDRA in the academic year 2018–2019, and a third group comprising 92 students used SIDRA in the academic year 2017–2018. The three groups were taught by the same professors using the same teaching method. Moreover, similar training was given concerning GAHMS skills and competences.
Data corresponding to the students' answers in seven, four, and seven MCQ tests taken in the academic years 2017/2018, 2018/2019, and 2019/2020, respectively, were collected. The questions dealt with gross anatomy and musculoskeletal anatomy. Moreover, the students responded to a questionnaire, scoring each question on a five-point Likert scale, in order to capture their experience of using SIDRA, R-G-SIDRA, and RB-G-SIDRA.
Up-to-date literature on currently recommended medical practices was considered when proposing the questionnaire. Furthermore, MCQ-writing recommendations were taken into account [83]. All of the questionnaires consisted of 10 to 14 questions, thus avoiding the fatigue effect.

3.4. Hypotheses

The following hypotheses were investigated in order to assess the impact on the learning process of students through the use of the aforementioned CRSs. Table 2 depicts a summary of the statistical treatments carried out in this study.
H1. Students using RB-G-SIDRA will obtain higher final exam grades compared to students who used R-G-SIDRA and SIDRA. EducationalTool was the independent variable, with three values: RB-G-SIDRA (academic course 2019/2020), R-G-SIDRA (academic course 2018/2019) and SIDRA (academic course 2017/2018). A dependent variable (Performance, measured using final exam grades) was defined to test the statistical hypothesis.
H2. The students using RB-G-SIDRA will obtain higher MCQ scores than those who used R-G-SIDRA and SIDRA. Again, EducationalTool was the independent variable, with three values: RB-G-SIDRA (academic course 2019/2020), R-G-SIDRA (academic course 2018/2019) and SIDRA (academic course 2017/2018). The dependent variable was Score, which measured the number of correct answers in four MCQ tests. The resulting averages were normalized to a 0–10 scale.
H3. The students with higher MCQ scores will achieve higher final exam grades. A grouping variable called SIDRAScore was used as the independent variable, which gave the low scores (between 0 and first tertile) a value of “1”, the medium scores (between first tertile and second tertile) a value of “2” and the high scores (between second tertile and 10) a value of “3”. Mark in the final exam was entered under the variable name Performance (the dependent variable). The relation between these variables was studied in the three academic courses.
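The tertile-based grouping used for H3 can be sketched as follows. This is a minimal illustration in Python rather than the study's SPSS procedure, and the sample scores are invented; only the grouping rule (1 = low, 2 = medium, 3 = high, cut at the first and second tertiles) comes from the text above.

```python
import numpy as np

def tertile_groups(scores):
    """Assign each normalized SIDRA score (0-10) to a tertile group:
    1 = low (up to first tertile), 2 = medium, 3 = high."""
    scores = np.asarray(scores, dtype=float)
    t1, t2 = np.percentile(scores, [100 / 3, 200 / 3])  # tertile cut points
    groups = np.where(scores <= t1, 1, np.where(scores <= t2, 2, 3))
    return groups, (t1, t2)

# Example: normalized MCQ scores of nine hypothetical students
groups, cuts = tertile_groups([2.1, 3.5, 4.0, 5.2, 6.0, 6.8, 7.5, 8.7, 9.4])
```

The returned `groups` array is the grouping variable (called SIDRAScore in the study), suitable as the independent variable of a between-groups test on final exam marks.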
H4. The gamification element individual ranking had an encouraging effect on the students. Ranking variations between each pair of consecutive tests were calculated for each student, thus resulting in three (VR1_18_19, VR2_18_19, VR3_18_19) and six (VR1_19_20, VR2_19_20, VR3_19_20, VR4_19_20, VR5_19_20, VR6_19_20) variables for the academic courses 2018/2019 and 2019/2020, respectively. For example, if a student is ranked third in test 1 and first in test 2, a variation of two is stored in variable VR1_18_19 in the academic course 2018/2019.
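The ranking-variation computation for H4 amounts to a difference between consecutive per-test ranks. A minimal sketch (the function name and example ranks are illustrative, not from the study's data):

```python
def ranking_variations(ranks_by_test):
    """Given one student's rank in each consecutive test, return the
    variation between each pair of consecutive tests. A positive value
    means the student moved up (e.g. 3rd -> 1st gives +2)."""
    return [prev - curr for prev, curr in zip(ranks_by_test, ranks_by_test[1:])]

# A student ranked 3rd, then 1st, then 4th across three tests:
vrs = ranking_variations([3, 1, 4])  # one VR value per consecutive pair
```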
H5. The results of the team had an encouraging effect on the results of the individuals. Two variables were used: TeamScoreTx with the average of the team to which the student belongs in test Tx and IndividualScoreTx, which is the MCQ score of the students in test Tx.
H6. Students will be satisfied with RB-G-SIDRA, R-G-SIDRA, and SIDRA. A questionnaire concerning the students' experience with the SIDRA systems was completed by the participants in the experiments. A five-point Likert-type scale (5 = very high; 4 = high; 3 = medium; 2 = low; 1 = very low) was used in a nine-question questionnaire, plus a Yes/No question.

3.5. Statistical Analysis

The tools SPSS 24.0 (IBM Corporation, Armonk, NY, USA) and Office Excel 2020 (Microsoft Corporation, Redmond, WA, USA) were used to analyze the data and generate the figures. A conventional significance level of 0.05 was used to detect statistically significant differences. The Kolmogorov–Smirnov test was used to verify whether the study groups followed a normal distribution. When the data of the dependent variable were not normally distributed, non-parametric tests were used. In particular, the Mann–Whitney U test was used to compare differences between the medians of two independent groups, and the Kruskal–Wallis H test (the one-way ANOVA on ranks) was performed to compare the medians of more than two independent groups. Spearman's correlation was also employed to measure the strength and direction of association between two variables representing paired observations that are not normally distributed.
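As a rough sketch, the test-selection logic described above might look as follows in Python with SciPy. The study used SPSS; the function name, the normality-gate structure, and the synthetic grades are assumptions made for illustration.

```python
import numpy as np
from scipy import stats

ALPHA = 0.05  # conventional significance level used in the study

def compare_groups(*groups):
    """Compare independent groups, falling back to non-parametric tests
    when any group departs from normality according to a
    Kolmogorov-Smirnov test against a normal fitted to that sample."""
    normal = all(
        stats.kstest(g, "norm", args=(np.mean(g), np.std(g, ddof=1))).pvalue > ALPHA
        for g in groups
    )
    if len(groups) == 2:
        test = stats.ttest_ind if normal else stats.mannwhitneyu
    else:
        test = stats.f_oneway if normal else stats.kruskal
    return test(*groups)

# Synthetic grades loosely resembling two course cohorts (illustrative only):
rng = np.random.default_rng(0)
course_a = rng.normal(6.4, 1.8, 90)
course_b = rng.normal(7.4, 1.3, 90)
result = compare_groups(course_a, course_b)
```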

4. Results

H1. Performance (final exam score) varies across academic courses, with Kruskal–Wallis χ2(2) = 14.349, p = 0.001. The highest average score was obtained by students using RB-G-SIDRA in the academic course 2019/2020 (M = 7.44; SD = 1.33) and the lowest by students using SIDRA in the academic course 2017/2018 (M = 6.43; SD = 1.78). Post-hoc paired comparisons were applied using Mann–Whitney U tests (non-parametric). Statistically significant differences were found between RB-G-SIDRA and SIDRA, U = 39.211, adjusted p = 0.001, and between RB-G-SIDRA and R-G-SIDRA, U = 31.157, adjusted p = 0.015.
H2. The MCQ score varies across academic courses, with Kruskal–Wallis χ2(2) = 96.217, p < 0.001. The highest average score was obtained by students using RB-G-SIDRA in the academic course 2019/2020 (M = 6.67; SD = 1.11) and the lowest by students using SIDRA in the academic course 2017/2018 (M = 3.98; SD = 1.42). Post-hoc paired comparisons were applied using Mann–Whitney U tests (non-parametric). Statistically significant differences were found between RB-G-SIDRA and SIDRA, U = −90.521, adjusted p < 0.001, and between R-G-SIDRA and SIDRA, U = −87.998, adjusted p < 0.001. However, no statistically significant difference was found between RB-G-SIDRA and R-G-SIDRA, U = −2.523, adjusted p = 1.
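The post-hoc procedure reported for H1 and H2 (pairwise Mann–Whitney U tests with adjusted p-values) can be approximated as below. The Bonferroni adjustment is an assumption, since the paper does not name the correction used, and the grades shown are invented, not the study's data.

```python
from itertools import combinations
from scipy import stats

def pairwise_mannwhitney(named_groups):
    """Pairwise Mann-Whitney U tests over a dict {name: scores},
    with Bonferroni-adjusted p-values capped at 1 (an assumed correction)."""
    pairs = list(combinations(named_groups, 2))
    results = {}
    for a, b in pairs:
        u, p = stats.mannwhitneyu(named_groups[a], named_groups[b])
        results[(a, b)] = (u, min(p * len(pairs), 1.0))  # adjusted p
    return results

# Hypothetical per-course final exam grades (illustrative only):
results = pairwise_mannwhitney({
    "SIDRA": [5.1, 6.0, 6.4, 5.8, 7.0],
    "R-G-SIDRA": [6.2, 6.9, 7.1, 6.5, 7.4],
    "RB-G-SIDRA": [7.2, 7.5, 8.0, 6.9, 7.8],
})
```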
H3. Table 3 shows the average final exam score for each group formed by tertiles in SIDRA, R-G-SIDRA, and RB-G-SIDRA.
Academic course 2017–2018. There was a statistically significant difference between groups as determined by one-way ANOVA (F(2, 71) = 11.243, p < 0.001). A Tukey post hoc test revealed that the final exam mark was statistically significantly higher in the group of students with high scores (7.70 ± 1.22 points) in SIDRA compared to the groups with medium scores (6.59 ± 1.29 points, p = 0.030) and low scores (5.71 ± 1.85 points, p < 0.001) in SIDRA. There was no statistically significant difference between the medium-score and low-score groups (p = 0.101).
Academic course 2018–2019. There was a statistically significant difference between groups as determined by one-way ANOVA (F(2, 64) = 15.096, p < 0.001). A Tukey post hoc test revealed that the final exam mark was statistically significantly lower in the group of students with low scores (5.23 ± 2.02 points) in R-G-SIDRA compared to the groups with medium scores (6.96 ± 1.23 points, p = 0.001) and high scores (7.73 ± 1.28 points, p < 0.001) in R-G-SIDRA. There was no statistically significant difference between the medium-score and high-score groups (p = 0.226).
Academic course 2019–2020. There was not a statistically significant difference between groups as determined by the Kruskal–Wallis H test χ2(2) = 4.042, p = 0.133.
H4. Ranking variations between each pair of consecutive tests were calculated for each student in RB-G-SIDRA and R-G-SIDRA, which included the gamification element ranking. Figure 5 and Figure 6 show two box diagrams depicting the dispersion of the data. The dispersion of the ranking variations revealed a slight decreasing trend as the tests were taken during the academic year, which means that the classification shows some tendency to stabilize. Notice that in the R-G-SIDRA diagram (academic course 2018/2019), more than half of the students obtained negative ranking variations in the last test. In contrast, in the RB-G-SIDRA diagram (academic course 2019/2020), some students obtained remarkable ranking increases in the last test (first quartile).
H5. Spearman’s rank correlation coefficients between individual and team scores for each MCQ test in RB-G-SIDRA and R-G-SIDRA were calculated. Significant correlations between individual and team scores were found in all of the tests as shown in Table 4. Notice that the correlations become stronger as the tests progress. These findings revealed that the inertia of the team can have a crucial influence on the individual performance of each team member.
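The per-test correlation computed for H5 can be sketched as follows; the helper name and the six-student data are invented for illustration, and only the method (Spearman's rank correlation between individual scores and team averages, per MCQ test) comes from the text.

```python
from scipy import stats

def team_individual_correlations(individual_by_test, team_by_test):
    """Spearman's rank correlation between individual MCQ scores and the
    average score of each student's team, computed separately per test."""
    out = []
    for ind, team in zip(individual_by_test, team_by_test):
        rho, p = stats.spearmanr(ind, team)
        out.append((rho, p))
    return out

# Two hypothetical tests, six students each; team values are the
# averages of the team each student belongs to:
corrs = team_individual_correlations(
    [[4, 5, 6, 7, 8, 9], [3, 5, 6, 6, 8, 9]],
    [[5, 5, 6, 6, 8, 8], [4, 4, 6, 6, 9, 9]],
)
```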
H6. Table 5 presents several statistical parameters, such as the means, standard deviations, and medians, of the scores obtained from the 87, 71, and 38 students who used RB-G-SIDRA, R-G-SIDRA, and SIDRA, respectively. The use of the three SIDRA systems was positively evaluated by the students, with a median of 4 or 5 in all of the questions for the three systems, confirming hypothesis H6. Moreover, the gamification elements used in the learning of human anatomy (ranking, badges, teams, and points) were positively evaluated as a motivational factor in the classroom (median 4 in Q6). The system allows trainees to better understand theoretical and practical concepts at the same time (median 4 or 5 in Q3 in the three systems). Teamwork was also highly valued (median 5 and 4 in Q7 in R-G-SIDRA and RB-G-SIDRA, respectively). Significant differences were also found in the assessment of the climate in class (one-point difference in medians in Q8).
Finally, there was a dichotomous question asking whether the students would use the SIDRA system in other courses. A total of 96%, 100%, and 99% of the students (using RB-G-SIDRA, R-G-SIDRA, and SIDRA, respectively) would like the system to be used in more subjects.

5. Discussion

In this section, the main findings on hypotheses investigated to assess the impact of the use of gamified and non-gamified CRSs on the learning process of students are examined, analyzed, and compared with those of other studies.

5.1. Improving Learning Outcomes

H1 hypothesis testing revealed that in the final exam of the anatomy course, the marks of the students who used RB-G-SIDRA were significantly better than those of the SIDRA group. These results confirmed previous research in which the use of gamified CRSs was studied [50,52,84]. Increased knowledge has been reported by a high number of experiments [39,40,43,47,50,51,52,55,56,58,63,64,65,67,68,69]. It is observed that the positive effect on students’ knowledge is independent of age and gender [85]. Gamification has been widely used in healthcare education [86].
In particular, an experiment studying the impact of points and a leaderboard in computer science and psychology education reported a statistically significant increase in users' performance [87], which suggests that the gamification elements adopted in RB-G-SIDRA are effective. In contrast, there were no statistically significant differences for the ranking element alone in our experiment (H1), that is to say, no statistically significant differences between R-G-SIDRA and SIDRA.
Gamification has also been successfully implemented in human anatomy education [87,88]. The highest post-test versus pre-test scores were found in a group that adopted a gamified approach, in contrast to the non-gamified approach used in the other two groups [89]. Nevertheless, an experiment on leaderboards and badges revealed negative effects on the final exam marks of students attending a communication course [90]. Notice that ranking can generate both stress from the competition and feelings of inferiority in students, resulting in a reduced sense of autonomy and competence [91], thus negatively impacting student performance. Those who fail to go up the ranking table may feel less competent, which could lead to discouragement [92]. Therefore, lower-performing students may not benefit from the gamified presentation [93]. That was the case in our experiment (H1), as previously mentioned, since R-G-SIDRA did not improve student performance compared to SIDRA. To mitigate this limitation, R-G-SIDRA displayed the leaderboard only when each test had finished, and the scores were cleared at the start of each test.
Our study found significant differences between SIDRA and RB-G-SIDRA, which leads us to conclude that badges have a positive influence on learning outcomes. Previous research [50,56] revealed that students who receive badges are more likely to achieve better marks. To avoid the comparative progress tracking provided by leaderboards/rankings, badges are an excellent alternative game mechanic. These gamification elements allow instructors to show failure to the student without imposing punishment [94]. Moreover, badges reinforce certain learning behaviors such as perseverance. Notice that scientific evidence supports the use of the dopamine reward system as a powerful physiological ally for effective learning. Dopamine, which produces satisfaction, is released each time the student responds correctly and receives a badge [95]. Students strive to increase their mastery of course content with the ultimate goal of maintaining the flow of satisfaction. Flow occurs when students are engaged in an activity (physical, mental, or both) in such a way that they lose track of time and the outside world [96]. After the initial excitement of earning badges, students may find them less motivating than the leaderboard [50] as they lose interest over time. For this reason, this flow must be considered by design [56] and gamification must be planned to keep students continuously satisfied. Any additional classroom tasks, such as textbook reading and professor handouts, must be integrated into the gamification activities to minimize the interruption of flow [97]. Our proposal addresses this point in the gamification process followed to keep students continuously satisfied.
The results obtained show no statistically significant difference between the groups formed by tertiles based on the RB-G-SIDRA score. The final exam averages and SIDRA system score intervals are significantly higher in the three groups formed for RB-G-SIDRA than in the groups formed for R-G-SIDRA and SIDRA, as observed in Table 3. As an example, the third tertile interval in RB-G-SIDRA (8.7 ≤ SCORE ≤ 10), with M = 7.989, is higher than that in R-G-SIDRA (8 ≤ SCORE ≤ 10), with M = 7.733, and in SIDRA (6.8 ≤ SCORE ≤ 10), with M = 7.702. The same occurred in the other tertile score intervals. We can conclude that the badges included in RB-G-SIDRA allow students to achieve better and more homogeneous learning outcomes during the course. This finding is confirmed by previous research [56].

5.2. Effect of Rankings and Teams

Social Comparison Theory (SCT) affirms that each individual possesses an inherent drive to obtain accurate self-evaluations in order to ascertain the validity of their own opinions and judgments [98]. Previous research has highlighted the important role played by social comparison in academic performance [99]. Academic competition allows instructors to underpin a learning environment with social comparison. Notice that 57 out of 90 students achieved a number of positive ranking variations higher than or equal to their number of negative ranking variations in the academic course 2019/2020, an indicator of the motivation behind the competition. In contrast, 33 out of 90 students had more negative than positive ranking variations. This group of students may be frustrated and have feelings of incompetence and dependency [91], thus falling into a cycle of disinterest in the subject [92]. These students obtained lower performance, with an average final exam score of 7.14, which is below the average score of the whole group (M = 7.44). They do not benefit from the gamified activities, as confirmed in previous research [93]. Obviously, the motivating factor of competition may vary depending on many factors such as ethnicity, society, age, and individual learning-style preferences [64,100]. This duality of competition with respect to student motivation has been confirmed in other experiments [37]. Finally, regarding the ranking variations, conclusions similar to those of the academic course 2019/2020 were found for the academic course 2018/2019.
Part of the work of health professionals involves working in teams in different clinical environments [61]. Therefore, learning and understanding the dynamics of teamwork is an added value provided by the gamification element team. For example, questions are formulated to allow students to explore and discuss aspects of theory and practice in a range of situations common in a hospital. The benefit is mutual among team members, as evidenced by the positive correlation between team ratings and the individual ratings of each team member in our study, confirmed in H5. In the learning environment proposed in the gamified systems R-G-SIDRA and RB-G-SIDRA, team competition was adopted using one device per student, as this is the preferred modality for students [60]. Observe that CRSs promote social cohesion in classrooms by letting students view responses sent by peers over time and know what their classmates think [60]. The data generated by CRSs can be used to spark discussion [101] and to develop communication skills to learn from and with each other. In addition to being enjoyable [59], teams allow instructors to foster the idea of a social fabric, since students build a higher level of confidence and have a greater willingness to collaborate after playing games together [102]. Competition by teams also endowed SIDRA with an educational instrument that allowed a balance between cooperation and competition [64].

5.3. Survey

Satisfaction in using gamification has been widely recognized in previous studies on health professions education [46,52,55,73], in general, and using gamified CRS [12], in particular. This is confirmed in our survey in which students highly rated the use of the system in the classroom (question Q1 in academic courses 2018/19 and 2019/20).
Our survey showed that the gamified systems were more motivating than the non-gamified system in the students' learning process (question Q2). This finding is consistent with a previous experiment in which significant differences in motivation were found between students who took lectures with a gamified CRS and those who took lectures with a non-gamified CRS [103]. In most educational innovations, students are very enthusiastic when using a CRS for the first time; however, the novelty and its benefits fade after several uses [104].
Notice that the evaluation of the feedback provided by the instructor is notably superior in RB-G-SIDRA and R-G-SIDRA to that in SIDRA (a two-point difference in medians in Q4). The same feedback was given by the same instructors in the three systems; the students probably valued highly the discussion groups created in RB-G-SIDRA and R-G-SIDRA.
In a survey answered by students enrolled in an undergraduate human anatomy course, 50% of participants felt that the competitive situation motivated them, whereas 25% did not agree [52]. In our survey (question Q6), the results varied according to whether or not badges were used: RB-G-SIDRA (M = 4.34) versus R-G-SIDRA (M = 3.86).
Our survey (question Q7) achieved results similar to a previous study based on a simulation game, in which 94% of participants considered teamwork important for their nursing learning activities [61]. An educational ultrasound event named the Sound Games was also used for medical training in emergency medicine [59]; most participants (93.75%) agreed or strongly agreed that working in a team was enjoyable. Health disciplines can benefit from this game element to understand the dynamics of many clinical environments. Finally, our survey revealed intentions to continue using gamification elements in other subjects, in percentages similar to other surveys in pediatric primary care (100%) [67] and blood grouping (98%) [73].
Fun is another benefit reported in the literature [35,59,61,73]. Q8 shows that classes were more dynamic and fun when using RB-G-SIDRA (M = 4.66) and R-G-SIDRA (M = 4.77); the results were slightly lower with the non-gamified system. This is in line with a study in which 99% of students reported having fun using an online blood grouping game [73].
Notice that the field of study can influence students' perception of gamified CRSs: students in technological disciplines can perceive a CRS as a more useful tool than students in social science disciplines [103].

6. Conclusions

This paper reported the effects of three experiences, two with a gamified CRS and one with a non-gamified CRS, on student performance and perceptions in a course on the anatomy of the locomotor system. The findings support that the use of rankings, badges, teams, and points in a CRS had a positive, statistically significant effect on the students' final exam marks. Strong evidence was found for the benefit of the badges in RB-G-SIDRA compared with R-G-SIDRA. Moreover, statistical tests revealed that team activity can have an important impact on the individual performance of each team member. Perceptions collected in a survey about gamification confirmed higher motivation to participate in the classroom with RB-G-SIDRA than with R-G-SIDRA.
The improvement in the learning outcomes of the course can be summarized as follows: students were able to identify more easily the axes and planes of orientation and their relationship with the most important anatomical structures, as well as the topographical regions of interest. In addition, they were able to use anatomical terminology adequately with respect to the morphology and global structure of the human body, especially the bones, muscles, and joints, acquiring these concepts more easily. The academic results showed that the use of RB-G-SIDRA led to an improvement in the achievement of the learning objectives.
Comparing rankings and badges, the latter gamification element allows instructors to reward students without the stress and possible feelings of inferiority produced by competition. For students who are lagging behind, rankings can negatively impact performance; badges, however, provide instructors with an excellent resource to signal failure to a student without imposing a penalty such as being at the bottom of a ranking. Our results confirm the evidence found in most of the scientific literature on the effects of gamification on health science students' academic performance, motivation, and engagement. New experiments should be designed to compare the impact of the different gamification elements, taking into account the types of learners and players. As a result, a gamified adaptive learning system could be built to address the different types of learning.
The integration of gamification elements into a CRS is a feasible solution for tackling overcrowded classrooms, which prevent adequate communication with students. Moreover, these systems enable safe and sustainable education in the face of the new reality caused by COVID-19 [105]. In synchronous education, a gamified CRS can be used in live interactive lessons via video calls, whereby instructors and students interact in real time. A gamified CRS satisfying educational standards such as the IMS ("Instructional Management System") Content Packaging and SCORM ("Shareable Content Object Reference Model") specifications can be integrated into Learning Management Systems (LMSs) such as Sakai or Moodle, which are widely used in educational centers. Visits to the academic organization can be drastically reduced when learner attendance is not required. In future work, we intend to integrate G-SIDRA into an LMS such as Sakai in order to facilitate the adoption of this type of environment.

Author Contributions

J.J.L.-J., J.L.F.-A., L.L.G. and O.G.S. contributed to the conception and design of the study, acquisition of data, analysis and interpretation of data, drafting of the article, and approval of the submitted version. A.T., J.N.R., J.M.C.d.G., J.A.G.-B. and A.I. contributed to the analysis and interpretation of data, drafting of the article, and approval of the submitted version. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Spanish Ministry of Science, Innovation and Universities and the European Regional Development Fund (ERDF). This research is part of the BIZDEVOPS-GLOBAL-UMU (RTI2018–098309-B-C33) project, and the Network of Excellence in Software Quality and Sustainability (TIN2017–90689-REDT).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Harden, R.M. Student feedback from MCQ examinations. Med. Educ. 1975, 9, 102–105. [Google Scholar] [CrossRef] [PubMed]
  2. Syerov, Y.; Fedushko, S.; Loboda, Z. Determination of development scenarios of the educational web forum. In Proceedings of the 2016 XIth International Scientific and Technical Conference Computer Sciences and Information Technologies (CSIT), Lviv, Ukraine, 6–10 September 2016; pp. 73–76. [Google Scholar] [CrossRef]
  3. Kelsey, A.H.C.M.; McCulloch, V.; Gillingwater, T.H.; Findlater, G.S.; Paxton, J.Z. Anatomical sciences at the University of Edinburgh: Initial experiences of teaching anatomy online. Transl. Res. Anat. 2020, 19, 100065. [Google Scholar] [CrossRef]
  4. Cheng, L.T.W.; Wang, J.W. Enhancing learning performance through Classroom Response Systems: The effect of knowledge type and social presence. Int. J. Manag. Educ. 2019, 17, 103–118. [Google Scholar] [CrossRef]
  5. Wenk, N.; Gobron, S. Reinforcing the difference between simulation, gamification, and serious game. In Proceedings of the Gamification & Serious Game Symposium (GSGS), Neuchâtel, Switzerland, 30 June–1 July 2017; pp. 1–3. Available online: https://www.stephane-gobron.net/Core/Publications/Papers/2017_GSGS17-1.pdf (accessed on 14 November 2021).
  6. Deterding, S.; Sicart, M.; Nacke, L.E.; O’Hara, K.; Dixon, D. Gamification. Using game-design elements in non-gaming contexts. In Proceedings of the Extended Abstracts on Human Factors in Computing Systems (CHI EA ’11), Vancouver, BC, Canada, 7–12 May 2011. [Google Scholar] [CrossRef]
  7. Johnson, D.; Horton, E.; Mulcahy, R.; Foth, M. Gamification and serious games within the domain of domestic energy consumption: A systematic review. Renew. Sustain. Energy Rev. 2017, 73, 249–264. [Google Scholar] [CrossRef] [Green Version]
  8. Marczewski, A. 52 Gamification Mechanics and Elements. 2017. Available online: https://gist.github.com/Potherca/0c732e23fc0f1d0b94497faa0d0e08ba (accessed on 14 November 2021).
  9. Gorbanev, I.; Agudelo-Londoño, S.; Gonzalez, R.; Cortes, A.; Pomares, A.; Delgadillo, V.; Yepes, F.J.; Muñoz, Ó. A systematic review of serious games in medical education: Quality of evidence and pedagogical strategy. Med. Educ. Online 2018, 23, 1438718. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  10. Dicheva, D.; Dichev, C.; Agre, G.; Angelova, G. Gamification in education: A systematic mapping study. Educ. Technol. Soc. 2015, 18, 75–88. Available online: https://www.jstor.org/stable/10.2307/jeductechsoci.18.3.75 (accessed on 14 November 2021).
  11. Landers, R.N.; Bauer, K.N.; Callan, R.C.; Armstrong, M.B. Psychological theory and the gamification of learning. In Gamification in Education and Business; Springer: Cham, Switzerland, 2015; pp. 165–186. [Google Scholar] [CrossRef]
  12. Ismail, M.A.-A.; Ahmad, A.; Mohammad, J.A.-M.; Fakri, N.M.R.M.; Nor, M.Z.M.; Pa, M.N.M. Using Kahoot! as a formative assessment tool in medical education: A phenomenological study. BMC Med. Educ. 2019, 19, 230. [Google Scholar] [CrossRef] [Green Version]
  13. Ahmad, K.; Khaleeq, T.; Hanif, U.; Ahmad, N. Addressing the failures of undergraduate anatomy education: Dissecting the issue and innovating a solution. Ann. Med. Surg. 2021, 61, 81–84. [Google Scholar] [CrossRef]
  14. Chimmalgi, M. Interactive Lecture in the Dissection Hall: Transforming Passive Lecture into a Dynamic Learning Experience. Anat. Sci. Educ. 2019, 12, 191–199. [Google Scholar] [CrossRef]
  15. Maresky, H.S.; Oikonomou, A.; Ali, I.; Ditkofsky, N.; Pakkal, M.; Ballyk, B. Virtual reality and cardiac anatomy: Exploring immersive three-dimensional cardiac imaging, a pilot study in undergraduate medical anatomy education. Clin. Anat. 2019, 32, 238–243. [Google Scholar] [CrossRef]
  16. Munusamy, S.; Osman, A.; Riaz, S.; Ali, S.; Mraiche, F. The use of Socrative and Yammer online tools to promote interactive learning in pharmacy education. Curr. Pharm. Teach. Learn. 2019, 11, 76–80. [Google Scholar] [CrossRef]
  17. Slain, D.; Abate, M.; Hodges, B.M.; Stamatakis, M.K.; Wolak, S. Aninteractive response system to promote active learning in the doctor of pharmacy curriculum. Am. J. Pharm. Educ. 2004, 68, 1–9. [Google Scholar] [CrossRef]
  18. Berry, J. Technology support in nursing education: Clickers in the classroom. Nurs. Educ. Perspect. 2009, 30, 295–298. [Google Scholar] [CrossRef]
  19. Uhari, M.; Renko, M.; Soini, H. Experiences of using an interactive audience response system in lectures. BMC Med. Educ. 2003, 3, 1–6. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  20. DeBourgh, G.A. Use of classroom ‘clickers’ to promote acquisition of advanced reasoning skills. Nurse Educ. Pract. 2008, 8, 76–87. [Google Scholar] [CrossRef] [PubMed]
  21. Latessa, R.; Mouw, D. Use of an audience response system to augment interactive learning. Fam. Med. 2005, 37, 12–14. Available online: https://pubmed.ncbi.nlm.nih.gov/15619147/ (accessed on 14 November 2021). [PubMed]
  22. Meedzan, N.; Fisher, K.L. Clickers in nursing education: An active learning tool in the classroom. Online J. Nurs. Inform. 2009, 13, 1–19. Available online: www.ojni.org/13_2/Meedzan_Fisher.pdf (accessed on 14 November 2021).
  23. Patterson, B.; Kilpatrick, J.; Woebkenberg, E. Evidence for teaching practice: The impact of clickers in a large classroom environment. Nurse Educ. Today 2010, 30, 603–607. [Google Scholar] [CrossRef] [PubMed]
  24. Schackow, T.E.; Chavez, M.; Loya, L.; Friedman, M. Audience response system: Effect on learning in family medicine residents. Fam. Med. 2004, 36, 496–504. Available online: https://pubmed.ncbi.nlm.nih.gov/15243831/ (accessed on 14 November 2021).
  25. Smith, D.A.; Rosenkoetter, M.M. Effectiveness, challenges, and perceptions of classroom participation systems. Nurse Educ. 2009, 34, 156–161. [Google Scholar] [CrossRef] [PubMed]
  26. Stein, P.S.; Challman, S.D.; Brueckner, J.K. Using audience response technology for pretest reviews in an undergraduate nursing course. J. Nurs. Educ. 2006, 45, 469–473. [Google Scholar] [CrossRef] [PubMed]
  27. Stevenson, F. Clickers: The Use of Audience Response Questions to Enliven Lectures and Stimulate Teamwork. J. Int. Assoc. Med. Sci. Educ. 2007, 17, 106–111. Available online: http://njms.rutgers.edu/education/office_education/faculty/prot/documents/AudienceResponseArticle.pdf (accessed on 14 November 2021).
  28. Trapskin, P.J.; Smith, K.M.; Armitstead, J.A.; Davis, G.A. Use of an audience response system to introduce an anticoagulation guide to physicians, pharmacists, and pharmacy students. Am. J. Pharm. Educ. 2005, 69, 190–197. [Google Scholar] [CrossRef]
  29. Boyle, M.; Williams, B. The use of interactive wireless keypads for interprofessional learning experiences by undergraduate emergency health students. Int. J. Educ. Dev. Using Inf. Commun. Technol. 2008, 4, 41–48. Available online: https://www.learntechlib.org/p/42212/ (accessed on 14 November 2021).
  30. Nájera, A.; Villalba, J.M.; Arribas, E. Student peer evaluation using a remote response system. Med. Educ. 2010, 44, 1146. [Google Scholar] [CrossRef] [PubMed]
  31. Hashim, M. Standard setting using an audience response system with ‘clickers’. Med. Educ. 2013, 47, 530. [Google Scholar] [CrossRef] [PubMed]
  32. Schick, P.; Abramson, S.; Burke, J. Audience response technology: Under-appreciated value of post hoc analysis. Med. Educ. 2011, 45, 1157–1158. [Google Scholar] [CrossRef]
  33. Garbutt, J.; DeFer, T.; Highstein, G.; Mcnaughton, C.; Milligan, P.; Fraser, V. Safe Prescribing: An Educational Intervention for Medical Students. Teach. Learn. Med. 2006, 18, 244–250. [Google Scholar] [CrossRef] [PubMed]
  34. Felszeghy, S.; Pasonen-Seppänen, S.; Koskela, A.; Nieminen, P.; Härkönen, K.; Paldanius, K.M.A.; Gabbouj, S.; Ketola, K.; Hiltunen, M.; Lundin, M.; et al. Using online game-based platforms to improve student performance and engagement in histology teaching. BMC Med. Educ. 2019, 19, 273. [Google Scholar] [CrossRef] [PubMed]
  35. Ballon, B.; Silver, I. Context is key: An interactive experiential and content frame game. Med. Teach. 2004, 26, 525–528. [Google Scholar] [CrossRef] [PubMed]
  36. Chia, P. Using a virtual game to enhance simulation based learning in nursing education. Singap. Nurs. J. 2016, 40, 21–26. Available online: https://www.researchgate.net/publication/303146066_Using_a_virtual_game_to_enhance_simulation_based_learning_in_nursing_education (accessed on 15 November 2021).
  37. Janssen, A.; Shaw, T.; Bradbury, L.; Moujaber, T.; Nørrelykke, A.M.; Zerillo, J.A.; LaCasce, A.; Co, J.P.T.; Robinson, T.; Starr, A.; et al. A mixed methods approach to developing and evaluating oncology trainee education around minimization of adverse events and improved patient quality and safety. BMC Med. Educ. 2016, 16, 91. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  38. Kalin, D.; Nemer, L.B.; Fiorentino, D.; Estes, C.; Garcia, J. The Labor Games: A Simulation-Based Workshop Teaching Obstetrical Skills to Medical Students [2B]. Obstet. Gynecol. 2016, 127, 19S. [Google Scholar] [CrossRef]
  39. Kerfoot, B.P.; Baker, H.; Pangaro, L.; Agarwal, K.; Taffet, G.; Mechaber, A.J.; Armstrong, E.G. An online spaced-education game to teach and assess medical students: A multi-institutional prospective trial. Acad. Med. 2012, 87, 1443–1449. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  40. Leach, M.E.H.; Pasha, N.; McKinnon, K.; Etheridge, L. Quality improvement project to reduce paediatric prescribing errors in a teaching hospital. Arch. Dis. Child.-Educ. Pract. Ed. 2016, 101, 311–315. [Google Scholar] [CrossRef] [PubMed]
  41. van Dongen, K.W.; van der Wal, W.A.; Rinkes, I.H.M.B.; Schijven, M.P.; Broeders, I.A.M.J. Virtual reality training for endoscopic surgery: Voluntary or obligatory? Surg. Endosc. 2008, 22, 664–667. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  42. El-Beheiry, M.; McCreery, G.; Schlachta, C.M. A serious game skills competition increases voluntary usage and proficiency of a virtual reality laparoscopic simulator during first-year surgical residents’ simulation curriculum. Surg. Endosc. 2017, 31, 1643–1650. [Google Scholar] [CrossRef] [PubMed]
  43. Kerfoot, B.; Kissane, N. The Use of Gamification to Boost Residents’ Engagement in Simulation Training. JAMA Surg. 2014, 149, 1208–1209. [Google Scholar] [CrossRef] [Green Version]
  44. Petrucci, A.M.; Kaneva, P.; Lebedeva, E.; Feldman, L.S.; Fried, G.M.; Vassiliou, M.C. You Have a Message! Social Networking as a Motivator for FLS Training. J. Surg. Educ. 2015, 72, 542–548. [Google Scholar] [CrossRef] [PubMed]
  45. Lin, D.T.; Park, J.; Liebert, C.A.; Lau, J.N. Validity evidence for Surgical Improvement of Clinical Knowledge Ops: A novel gaming platform to assess surgical decision making. Am. J. Surg. 2015, 209, 79–85. [Google Scholar] [CrossRef] [PubMed]
  46. Longmuir, K.J. Interactive computer-assisted instruction in acid-base physiology for mobile computer platforms. Adv. Physiol. Educ. 2014, 38, 34–41. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  47. Lameris, A.; Hoenderop, J.; Bindels, R.; Eijsvogels, T. The impact of formative testing on study behaviour and study performance of (bio)medical students: A smartphone application intervention study. BMC Med. Educ. 2015, 15, 72. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  48. Rondon-Melo, S.; Sassi, F.; Andrade, C. Computer game-based and traditional learning method: A comparison regarding students’ knowledge retention. BMC Med. Educ. 2013, 13, 30. [Google Scholar] [CrossRef] [Green Version]
  49. Nemer, L.B.; Kalin, D.; Fiorentino, D.; Garcia, J.J.; Estes, C.M. The labor games. Obstet. Gynecol. 2016, 128, 1S–5S. [Google Scholar] [CrossRef]
  50. Nevin, C.R.; Westfall, A.O.; Rodriguez, J.M.; Dempsey, D.M.; Cherrington, A.; Roy, B.; Patel, M.; Willig, J.H. Gamification as a tool for enhancing graduate medical education. Postgrad. Med. J. 2014, 90, 685–693. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  51. Adami, F.; Cecchini, M. Crosswords and word games improve retention of cardiopulmonary resuscitation principles. Resuscitation 2014, 85, e189. [Google Scholar] [CrossRef] [PubMed]
  52. Van Nuland, S.E.; Roach, V.A.; Wilson, T.D.; Belliveau, D.J. Head to head: The role of academic competition in undergraduate anatomical education. Anat. Sci. Educ. 2015, 8, 404–412. [Google Scholar] [CrossRef] [PubMed]
  53. Butt, A.; Kardong-Edgren, S.; Ellertson, A. Using Game-Based Virtual Reality with Haptics for Skill Acquisition. Clin. Simul. Nurs. 2018, 16, 25–32. [Google Scholar] [CrossRef] [Green Version]
  54. Chen, P.-H.; Roth, H.; Galperin-Aizenberg, M.; Ruutiainen, A.T.; Gefter, W.; Cook, T.S. Improving Abnormality Detection on Chest Radiography Using Game-Like Reinforcement Mechanics. Acad. Radiol. 2017, 24, 1428–1435. [Google Scholar] [CrossRef] [PubMed]
  55. Verkuyl, M.; Romaniuk, D.; Atack, L.; Mastrilli, P. Virtual Gaming Simulation for Nursing Education: An Experiment. Clin. Simul. Nurs. 2017, 13, 238–244. [Google Scholar] [CrossRef]
  56. Davidson, S.; Candy, L. Teaching EBP Using Game-Based Learning: Improving the Student Experience. Worldviews Evid. Based Nurs. 2016, 13, 285–293. [Google Scholar] [CrossRef]
  57. Kow, A.W.C.; Ang, B.L.S.; Chong, C.S.; Tan, W.B.; Menon, K.R. Innovative Patient Safety Curriculum Using iPAD Game (PASSED) Improved Patient Safety Concepts in Undergraduate Medical Students. World J. Surg. 2016, 40, 2571–2580. [Google Scholar] [CrossRef]
  58. Lamb, L.; DiFiori, M.; Jayaraman, V.; Shames, B.; Feeney, J. Gamified Twitter Microblogging to Support Resident Preparation for the American Board of Surgery In-Service Training Examination. J. Surg. Educ. 2017, 74, 986–991. [Google Scholar] [CrossRef]
  59. Lobo, V.; Stromberg, A.; Rosston, P. The Sound Games: Introducing Gamification into Stanford’s Orientation on Emergency Ultrasound. Cureus 2017, 9, e1699. [Google Scholar] [CrossRef]
  60. Pettit, R.K.; McCoy, L.; Kinney, M.; Schwartz, F.N. Student perceptions of gamified audience response system interactions in large group lectures and via lecture capture technology. BMC Med. Educ. 2015, 15, 92. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  61. Stanley, D.; Latimer, K. ‘The Ward’: A simulation game for nursing students. Nurse Educ. Pract. 2011, 11, 20–25. [Google Scholar] [CrossRef]
  62. Cook, N.; McAloon, T.; O’Neill, P.; Beggs, R. Impact of a web based interactive simulation game (PULSE) on nursing students’ experience and performance in life support training—A pilot study. Nurse Educ. Today 2011, 32, 714–720. [Google Scholar] [CrossRef] [PubMed]
  63. Finley, J.; Caissie, R.; Hoyt, B. 046 15 Minute Reinforcement Test Restores Murmur Recognition Skills in Medical Students. Can. J. Cardiol. 2012, 28, S102. [Google Scholar] [CrossRef]
  64. Worm, B.; Buch, S. Does Competition Work as a Motivating Factor in E-Learning? A Randomized Controlled Trial. PLoS ONE 2014, 9, e85434. [Google Scholar] [CrossRef]
  65. El Tantawi, M.; Sadaf, S.; AlHumaid, J. Using gamification to develop academic writing skills in dental undergraduate students. Eur. J. Dent. Educ. 2018, 22, 15–22. [Google Scholar] [CrossRef]
  66. Koivisto, J.-M.; Multisilta, J.; Niemi, H.; Katajisto, J.; Haavisto, E.E. Learning by playing: A cross-sectional descriptive study of nursing students’ experiences of learning clinical reasoning. Nurse Educ. Today 2016, 45, 22–28. [Google Scholar] [CrossRef] [PubMed]
  67. Mallon, D.; Vernacchio, L.; Leichtner, A.M.; Kerfoot, B.P. ‘Constipation Challenge’ game improves guideline knowledge and implementation. Med. Educ. 2016, 50, 589–590. [Google Scholar] [CrossRef]
  68. Snyder, E.; Hartig, J.R. Gamification of board review: A residency curricular innovation. Med. Educ. 2013, 47, 524–525. [Google Scholar] [CrossRef]
  69. Scales, C.D., Jr.; Moin, T.; Fink, A.; Berry, S.H.; Afsar-Manesh, N.; Mangione, C.M.; Kerfoot, B.P. A randomized, controlled trial of team-based competition to increase learner participation in quality-improvement education. Int. J. Qual. Health Care 2016, 28, 227–232. [Google Scholar] [CrossRef] [Green Version]
  70. Forni, M.; Garcia-Neto, W.; Kowaltowski, A.; Marson, G. An active-learning methodology for teaching oxidative phosphorylation. Med. Educ. 2017, 51, 1169–1170. [Google Scholar] [CrossRef] [PubMed]
  71. Henry, B.; Douglass, C.; Kostiwa, I. Effects of participation in an aging game simulation activity on the attitudes of allied health students toward older adults. Internet J. Allied Health Sci. Pract. 2007, 5, 5. [Google Scholar] [CrossRef]
  72. Pacala, J.T.; Boult, C.; Hepburn, K. Ten Years’ Experience Conducting the Aging Game Workshop: Was It Worth It? J. Am. Geriatr. Soc. 2006, 54, 144–149. [Google Scholar] [CrossRef]
  73. Bhaskar, A. Playing games during a lecture hour: Experience with an online blood grouping game. AJP Adv. Physiol. Educ. 2014, 38, 277. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  74. Fleiszer, D.; Fleiszer, T.; Russell, R. Doughnut Rounds: A self-directed learning approach to teaching critical care in surgery. Med. Teach. 1997, 19, 190–193. [Google Scholar] [CrossRef]
  75. Bai, S.; Hew, K.; Huang, B. Does gamification improve student learning outcome? Evidence from a meta-analysis and synthesis of qualitative data in educational contexts. Educ. Res. Rev. 2020, 30, 100322. [Google Scholar] [CrossRef]
  76. López-Jiménez, J.J.; Fernández-Alemán, J.L. SIDRA. 2011. Available online: https://docentis.inf.um.es/sidra/index.php (accessed on 15 November 2021).
  77. Herzig, P.; Ameling, M.; Wolf, B.; Schill, A. Implementing gamification: Requirements and gamification platforms BT. In Gamification in Education and Business; Springer: Cham, Switzerland, 2015; pp. 431–450. [Google Scholar] [CrossRef]
  78. Herzig, P.; Schill, A.; Zarnekow, R. Gamification as a Service: Conceptualization of a Generic Enterprise Gamification Platform. 2014. Available online: https://tud.qucosa.de/landing-page/?tx_dlf[id]=https%3A%2F%2Ftud.qucosa.de%2Fapi%2Fqucosa%253A28187%2Fmets (accessed on 15 November 2021).
  79. Ferro, L.S.; Walz, S.P.; Greuter, S. Gamicards—An alternative method for paper-prototyping the design of gamified systems. Lect. Notes Comput. Sci. 2014, 8770, 11–18. [Google Scholar] [CrossRef] [Green Version]
  80. Cunningham, C.; Zichermann, G. Gamification by Design: Implementing Game Mechanics in Web and Mobile Apps; O’Reilly Media, Inc.: Sebastopol, CA, USA, 2011. [Google Scholar]
  81. Tondello, G.F.; Wehbe, R.R.; Diamond, L.; Busch, M.; Marczewski, A.; Nacke, L.E. The gamification user types hexad scale. In Proceedings of the 2016 Annual Symposium on Computer-Human Interaction in Play, Austin Texas, TX, USA, 16–19 October 2016; pp. 229–243. [Google Scholar] [CrossRef] [Green Version]
  82. Heilbrunn, B.; Herzig, P.; Schill, A. Tools for gamification analytics: A survey. In Proceedings of the 2014 IEEE/ACM 7th International Conference on Utility and Cloud Computing, London, UK, 8–11 December 2014; pp. 603–608. [Google Scholar] [CrossRef]
  83. Haladyna, T.M.; Rodriguez, M.C. Developing and validating test items. In Developing and Validating Test Items; Routledge: New York, NY, USA, 2013; pp. 1–446. [Google Scholar] [CrossRef]
  84. Ohn, M.; Ohn, K.-M. An evaluation study on gamified online learning experiences and its acceptance among medical students. Tzu Chi Med. J. 2020, 32, 211–215. [Google Scholar] [CrossRef]
  85. Putz, L.-M.; Hofbauer, F.; Treiblmaier, H. Can gamification help to improve education? Findings from a longitudinal study. Comput. Hum. Behav. 2020, 110, 106392. [Google Scholar] [CrossRef]
  86. Gentry, S.V.; Gauthier, A.; Ehrstrom, B.L.; Wortley, D.; Lilienthal, A.; Car, L.T.; Dauwels-Okutsu, S.; Nikolaou, C.K.; Zary, N.; Campbell, J.; et al. Serious Gaming and Gamification Education in Health Professions: Systematic Review. J. Med. Internet Res. 2019, 21, e12994. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  87. Mekler, E.D.; Brühlmann, F.; Opwis, K.; Tuch, A.N. Do points, levels and leaderboards harm intrinsic motivation? An empirical analysis of common gamification elements. In Proceedings of the First International Conference on Gameful Design, Research, and Applications, Toronto Ontario, ON, Canada, 2–4 October 2013; pp. 66–73. [Google Scholar] [CrossRef]
  88. Ang, E.T.; Chan, J.M.; Gopal, V.; Shia, N.L. Gamifying anatomy education. Clin. Anat. 2018, 31, 997–1005. [Google Scholar] [CrossRef]
  89. Javed, D.K. Teaching anatomy to medical students through flipped classroom with gamification approach. Int. J. Sci. Eng. Res. 2020, 11, 133–137. [Google Scholar] [CrossRef]
  90. Hanus, M.D.; Fox, J. Assessing the effects of gamification in the classroom: A longitudinal study on intrinsic motivation, social comparison, satisfaction, effort, and academic performance. Comput. Educ. 2015, 80, 152–161. [Google Scholar] [CrossRef]
  91. Rutledge, C.; Walsh, C.M.; Swinger, N.; Auerbach, M.; Castro, D.; Dewan, M.; Khattab, M.; Rake, A.; Harwayne-Gidansky, I.; Raymond, T.T.; et al. Gamification in action: Theoretical and practical considerations for medical educators. Acad. Med. 2018, 93, 1014–1020. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  92. Schumacher, D.J.; Englander, R.; Carraccio, C. Developing the master learner: Applying learning theory to the learner, the teacher, and the learning environment. Acad. Med. 2013, 88, 1635–1645. [Google Scholar] [CrossRef] [Green Version]
  93. Sanchez, D.R.; Langer, M.; Kaur, R. Gamification in the classroom: Examining the impact of gamified quizzes on student learning. Comput. Educ. 2020, 144, 103666. [Google Scholar] [CrossRef]
  94. Haskell, C. Understanding Quest-Based Learning. White Paper. Boise State University. 2013. Available online: https://classroomaid.files.wordpress.com/2013/03/qbl-whitepaper_haskell-final.pdf (accessed on 15 November 2021).
  95. Willis, J. A Neurologist Makes the Case for the Video Game Model as a Learning Tool. 2011. Available online: https://www.edutopia.org/blog/neurologist-makes-case-video-game-model-learning-tool (accessed on 15 November 2021).
  96. Mirvis, P.H.; Csikszentmihalyi, M. Flow: The Psychology of Optimal Experience. Acad. Manag. Rev. 1991, 16, 636. [Google Scholar] [CrossRef] [Green Version]
  97. Van Eck, R. Digital Game-Based Learning: It’s Not Just the Digital Natives Who Are Restless. Educ. Rev. 2006, 41, 1–16. Available online: http://edergbl.pbworks.com/w/file/fetch/47991237/digitalgamebasedlearning2006.pdf (accessed on 15 November 2021).
  98. Festinger, L. A Theory of Social Comparison Processes. Hum. Relat. 1954, 7, 117–140. [Google Scholar] [CrossRef]
  99. Marsh, H.W. Big-fish-little-pond effect on academic self-concept. Z. Für Pädagogische Psychol. 2005, 19, 119–127. [Google Scholar] [CrossRef]
  100. Kolb, A.Y.; Kolb, D.A. Learning Styles and Learning Spaces: Enhancing Experiential Learning in Higher Education. Acad. Manag. Learn. Educ. 2005, 4, 193–212. [Google Scholar] [CrossRef] [Green Version]
  101. Hoekstra, A.; Mollborn, S. How clicker use facilitates existing pedagogical practices in higher education: Data from interdisciplinary research on student response systems. Learn. Media Technol. 2011, 37, 303–320. [Google Scholar] [CrossRef]
  102. Schell, J. The Art of Game Design; Schell Games: Pittsburgh, PA, USA, 2008. [Google Scholar] [CrossRef]
  103. Barrio, C.; Organero, M.; Sanchez-Soriano, J. Can Gamification Improve the Benefits of Student Response Systems in Learning? An Experimental Study. IEEE Trans. Emerg. Top. Comput. 2015, 4, 429–438. [Google Scholar] [CrossRef]
  104. Lantz, M. The use of ‘Clickers’ in the classroom: Teaching innovation or merely an amusing novelty? Comput. Hum. Behav. 2010, 26, 556–561. [Google Scholar] [CrossRef]
  105. Jones, V.A.; Clark, K.A.; Puyana, C.; Tsoukas, M.M. Rescuing Medical Education in Times of COVID-19. Clin. Dermatol. 2020, 39, 33–40. [Google Scholar] [CrossRef]
Figure 1. Badges and Team Ranking of RB-G-SIDRA (gamified SIDRA with ranking and badges).
Figure 2. Example of Question Formulated in the G-SIDRA (Gamified Immediate Audience Response System) Mobile Interface.
Figure 3. Individual Ranking of A Test.
Figure 4. Score and Success Rate of A Test by Team.
Figure 5. Box Diagram for Ranking Variations in the Academic Course 2018/2019 R-G-SIDRA (gamified SIDRA with ranking).
Figure 6. Box Diagram for Ranking Variations in the Academic Course 2019/2020 (RB-G-SIDRA).
Table 1. Gamification Elements in SIDRA Systems (Immediate Audience Response System in Spanish). R-G-SIDRA (gamified SIDRA with ranking); RB-G-SIDRA (gamified SIDRA with ranking and badges).

System       Course    Ranking   Badges   Team   Points   Nº of MCQ Tests
SIDRA        2017/18   NO        NO       NO     NO       7
R-G-SIDRA    2018/19   YES       NO       YES    YES      4
RB-G-SIDRA   2019/20   YES       YES      YES    YES      7
Table 2. A Summary of the Statistical Treatments Performed for Each Hypothesis.

Hypothesis   Test                                            Independent Variable             Dependent Variable
H1           Kruskal–Wallis                                  SIDRA system used                Final marks
H2           Kruskal–Wallis                                  SIDRA system used                Total correct answers in SIDRA
H3           ANOVA, Tukey post hoc test and Kruskal–Wallis   Total correct answers in SIDRA   Final marks
H5           Spearman's correlation                          IndividualScoreTx                TeamScoreTx
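For readers unfamiliar with the Kruskal–Wallis test used for H1 and H2, the sketch below computes the H statistic in plain Python. This is an illustrative reimplementation, not the authors' analysis code; in practice a library routine such as scipy.stats.kruskal would be used.

```python
from collections import Counter

def average_ranks(values):
    """Rank values 1..n, giving tied observations the mean of their ranks."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # Extend j over a run of tied values.
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg_rank = (i + j) / 2 + 1  # ranks are 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1
    return ranks

def kruskal_wallis_h(*groups):
    """Kruskal-Wallis H: rank all observations jointly, compare group rank sums."""
    data = [v for g in groups for v in g]
    n = len(data)
    ranks = average_ranks(data)
    h, start = 0.0, 0
    for g in groups:
        r_sum = sum(ranks[start:start + len(g)])
        h += r_sum ** 2 / len(g)
        start += len(g)
    h = 12.0 / (n * (n + 1)) * h - 3 * (n + 1)
    # Standard correction for ties in the pooled sample.
    tie_sum = sum(t ** 3 - t for t in Counter(data).values())
    correction = 1 - tie_sum / (n ** 3 - n)
    return h / correction if correction else h
```

For H1, for example, the three cohorts' final marks would be passed as the three groups, and the resulting H compared against a chi-squared distribution with k − 1 degrees of freedom to obtain a p value.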
Table 3. Descriptive Statistics for Final Exam. "N": Number of students; "M": Mean; "SD": Standard deviation.

Academic Year 2017/2018—Final Exam Score
                                                 N    M       SD
SIDRA SCORE, FIRST TERTILE (0 ≤ SCORE < 5)       25   5.710   1.853
SIDRA SCORE, SECOND TERTILE (5 ≤ SCORE < 6.8)    24   6.596   1.297
SIDRA SCORE, THIRD TERTILE (6.8 ≤ SCORE ≤ 10)    25   7.702   1.223

Academic Year 2018/2019—Final Exam Score
                                                      N    M       SD
R-G-SIDRA SCORE, FIRST TERTILE (0 ≤ SCORE < 6.35)     22   5.235   2.026
R-G-SIDRA SCORE, SECOND TERTILE (6.35 ≤ SCORE < 8)    22   6.960   1.238
R-G-SIDRA SCORE, THIRD TERTILE (8 ≤ SCORE ≤ 10)       23   7.733   1.289

Academic Year 2019/2020—Final Exam Score
                                                        N    M       SD
RB-G-SIDRA SCORE, FIRST TERTILE (0 ≤ SCORE < 7.6)       27   6.803   1.651
RB-G-SIDRA SCORE, SECOND TERTILE (7.6 ≤ SCORE < 8.7)    26   7.443   1.056
RB-G-SIDRA SCORE, THIRD TERTILE (8.7 ≤ SCORE ≤ 10)      27   7.989   1.657
Table 4. Spearman's Rank Correlation Coefficient Results between Individual and Team Score in RB-G-SIDRA (gamified SIDRA with ranking and badges) and R-G-SIDRA (gamified SIDRA with ranking). "Tx": Test x; "CC": Correlation Coefficient; "N": Sample Size; "p": p Value.

Academic Year 2018/2019
      T1      T2      T3      T4
CC    0.589   0.564   0.390   0.829
N     78      78      78      78
p     0.000   0.000   0.000   0.000

Academic Year 2019/2020
      T1      T2      T3      T4      T5      T6      T7
CC    0.468   0.701   0.624   0.607   0.729   0.722   0.660
N     87      87      87      87      87      87      87
p     0.000   0.000   0.000   0.000   0.000   0.000   0.000
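The coefficients in Table 4 are Spearman rank correlations between individual and team scores. As a minimal, self-contained sketch (again an illustration, not the authors' code; scipy.stats.spearmanr is the usual choice), Spearman's rho is the Pearson correlation of the rank-transformed data:

```python
def average_ranks(values):
    """Rank values 1..n, giving tied observations the mean of their ranks."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg_rank = (i + j) / 2 + 1  # ranks are 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman's rho: Pearson correlation computed on the ranks of x and y."""
    rx, ry = average_ranks(x), average_ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

A value near 1, as for T4 in 2018/2019 (CC = 0.829), indicates that students who ranked high individually also tended to belong to high-scoring teams.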
Table 5. Students' Perceptions. "M": Mean; "SD": Standard Deviation; "Md": Median.

                                                                         SIDRA 2017/18    R-G-SIDRA 2018/19   RB-G-SIDRA 2019/20
Id   Question                                                            M     SD    Md   M     SD    Md      M     SD    Md
Q1   Are you pleased with the use of the system in the classroom?        4.29  0.65  4    4.73  0.53  5       4.55  0.61  5
Q2   Does the system motivate you in your learning process?              4.37  0.69  4    4.61  0.64  5       4.41  0.77  5
Q3   Did the system help you to better understand both theoretical
     and practical concepts?                                             4.24  0.65  4    4.39  0.73  5       4.09  0.92  4
Q4   Does the instructor's feedback help you in your learning process?   3.53  1.20  3    4.67  0.53  5       4.42  0.90  5
Q5   Is the time spent on the system-based learning activity
     appropriate?                                                        4.63  0.69  5    4.39  0.76  5       4.15  0.92  4
Q6   Do the gamification elements included in the system motivate
     participation in the classroom?                                     —     —     —    3.86  1.13  4       4.34  1.01  5
Q7   Did teamwork help you to improve in your learning process?          —     —     —    4.38  0.85  5       4.22  0.91  4
Q8   Are classes more dynamic and fun when using the system?             4.32  0.67  4    4.77  0.54  5       4.66  0.66  5
Q9   Your final assessment of the platform is:                           4.28  0.72  4    4.61  0.62  5       4.47  0.63  5