Article

Acquisition and Development of Basic Competences in Collaborative Environments Through Quiz-Based Game Applications

by José Amelio Medina 1,*, Rosa Estriegana 2, Roberto Barchino 1, Rafael Robina-Ramírez 3, Salvador Otón-Tortosa 1 and António Moreira Teixeira 4

1 Departamento de Ciencias de la Computación, Universidad de Alcalá, 28871 Madrid, Spain
2 Departamento de Automática, Universidad de Alcalá, 28871 Madrid, Spain
3 Business and Sociology Department, University of Extremadura, 10004 Cáceres, Spain
4 Laboratory of Distance Education and eLearning, Universidade Aberta Portugal, 1269-001 Lisboa, Portugal
* Author to whom correspondence should be addressed.
Electronics 2024, 13(22), 4500; https://doi.org/10.3390/electronics13224500
Submission received: 30 July 2024 / Revised: 2 November 2024 / Accepted: 13 November 2024 / Published: 15 November 2024

Abstract:
This experimental study aims to examine students’ acceptance of and attitude towards Quiz-Based Game Applications, as well as to analyze how these applications affect the acquisition and development of basic competences. To achieve this purpose, a mixed-methods approach was employed, combining quantitative analysis, using Structural Equation Modeling (SEM), of online questionnaire responses from 166 computer science students with a qualitative methodology based on focus groups and observation. The theoretical framework was based on the widely recognized Technology Acceptance Model (TAM). The findings indicate that students perceive these tools as useful and easy to use, thereby positively influencing their attitude towards the implementation of game-based learning. Furthermore, the study emphasizes the crucial role of game-based learning strategies in the effective development of essential competences for the comprehensive education of computer science students. These findings underscore the importance of considering the Quiz-Based Game Learning Applications (QGBLAs) approach as a valuable educational strategy to enhance learning and develop fundamental skills in students.

1. Introduction

In recent years, online learning applications and educational technologies have experienced unprecedented growth, expanding their usage across all educational levels, including higher education. These tools offer substantial benefits, such as flexibility in content access, increased motivation and performance, and support for experimentation and hands-on learning. Among the diverse range of educational technologies, game-based learning applications (GBLAs) have emerged as a prominent choice, not only to boost student motivation and engagement [1,2,3] but also to facilitate interaction, collaborative learning, and the acquisition of basic competences or “soft skills” [4,5,6].
These basic competences such as teamwork, communication competences, autonomous learning, knowledge management, and analytical and synthesis abilities are presented as essential in today’s context, with an increasing demand from both society and businesses. In response to this need, the European Higher Education Area (EHEA) [7] urges institutions to redefine degree programs, focusing on these types of competences to better prepare graduates for their future roles in society. Higher education institutions have endorsed this initiative by promoting strategies for the development of these crucial competences and incorporating them into course guidelines. However, despite the importance attributed to these basic competences, curriculum plans often lack clear guidelines on strategies, activities, or learning tools that encourage their acquisition. Moreover, while numerous studies explore the benefits of game-based learning tools in terms of motivation and academic performance, little attention has been given to how these tools influence the development of essential competences in students.
On the other hand, the development of these competences, as well as other educational benefits arising from the use of GBLAs, largely relies on the students’ attitude and interest in utilizing such tools [8,9]. Thus, the acceptance and the effective use of technology depend on the motivation to use it, along with factors such as the perception of utility and ease of use.
In this context, the primary objectives of the current research are as follows:
  • To assess students’ perceptions of Quiz-Based Game Applications in collaborative learning environments by analyzing their attitude towards their use.
  • To examine the impact of these tools on the development of basic competences in university students, with a specific focus on computer science majors.
In order to achieve these objectives, a collaborative learning environment was implemented, utilizing game-based learning (GBL) strategies and mobile-based learning games. The analysis of students’ attitudes towards the use of these applications and their impact on competency development has been carried out through a model based on the Technology Acceptance Model (TAM) [10,11].
This paper is structured as follows: Section 2 contains a literature review and the theoretical framework. This section outlines the benefits of strategies and dynamics employing GBL found in multiple studies, as well as the importance of the foundational competences (or soft skills). Section 3 presents the model and its components, and the hypotheses are also presented in this section. Section 4 describes the research methodology, the instrument employed, the participants, and the data collection process, while the data analysis and results are presented in Section 5. A discussion follows in Section 6. Finally, the paper ends with the conclusions drawn from the study.

2. Literature Review and Theoretical Framework

Despite numerous studies exploring the role of technology in education, no articles have been found that evaluate the effects of game-based learning (GBL) on the acquisition of basic competences [12,13,14,15]. Consequently, this study endeavors to fill this void in the existing literature by applying and testing a model to gain insights into students’ attitude towards GBL within collaborative environments, alongside its influence on competence development.

2.1. Game-Based Learning (GBL)

Within the realm of gamification, game-based learning (GBL) stands out for its ability to actively engage students and motivate their participation by incorporating game elements to facilitate and enhance the learning process [16,17]. This approach integrates learning with various resources, such as games, to enhance and enrich the teaching/learning process, as well as student evaluation, through active engagement [5].
Examples of GBLAs such as Quizizz, Quizlet, Kahoot!, or Socrative present questions and challenges for students to regularly respond to using their mobile devices, fostering student engagement and participation both inside and outside the classroom. The gaming elements inherent in these GBLAs, such as challenges involving acquiring badges, unlocking content, customizing avatars, building collections, exchanging gifts, advancing through levels, completing quests, exploring social networks, and obtaining virtual goods, enhance the interactivity and immersion of the educational environment [18,19].
Numerous researchers have explored the positive impact of GBL on student motivation and engagement [1,6,7,17], which correlates with an improved overall learning experience [20] and is also associated with an enhancement in skills [6,20]. Thus, GBL, using rewards such as points and badges, motivates and engages students by providing them with a sense of achievement and progress, which drives them to actively participate and engage more deeply with the learning process. Another key element of GBL that enhances motivation is immediate feedback on student performance [21], enabling students to quickly and effectively understand their strengths and areas for improvement. For example, progress bars provide immediate visual feedback on students’ individual progress, allowing them to monitor their performance and set specific goals for continuous improvement. Additionally, GBL activities also foster healthy competition among students, which, according to [21], enhances student motivation. Competitive dynamics with elements such as leaderboards displaying each player’s relative performance compared to their peers not only stimulate individual performance, encouraging students to strive harder, but also promote collaboration and collective engagement in the educational process.
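As an illustrative sketch (not taken from the study, and with all names hypothetical), the two feedback elements just described, progress bars for individual progress and leaderboards for relative performance, might be rendered for students after each question round as follows:

```python
# Hypothetical sketch of two GBL feedback elements: a text progress bar
# (individual feedback) and a leaderboard (relative performance).

def progress_bar(answered: int, total: int, width: int = 20) -> str:
    """Return a text progress bar showing how far a student has advanced."""
    filled = round(width * answered / total)
    return "[" + "#" * filled + "-" * (width - filled) + f"] {answered}/{total}"

def leaderboard(scores: dict) -> list:
    """Rank players by points, highest first, for display after each round."""
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

print(progress_bar(6, 12))
print(leaderboard({"Team A": 40, "Team B": 55, "Team C": 25}))
```

The design point these elements share is immediacy: both are recomputed and shown after every question, so students can act on the feedback during the activity rather than after it.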
In addition to the benefits already mentioned, GBL strategies provide students with the opportunity to develop both hard and soft skills, as indicated by [20]. Besides motivating and facilitating the achievement of educational objectives in a more dynamic and effective manner, GBL fosters and enhances soft skills such as leadership, team management and time management skills [10], teamwork competence, innovation behaviors [8], cooperative skills [12], communication and negotiation skills, problem-solving skills, as well as critical and creative thinking skills [22].
On the other hand, this type of activity also presents some limitations. According to [23], students encounter technical problems such as unreliable internet connections or difficulties reading on a projected screen, as well as stressful time pressure when providing answers. Teachers share students’ concerns about connectivity issues and express reservations about the limited time to respond, which leads some students to guess without thinking. Moreover, some teachers find it challenging to adapt to using the technology [23]. These limitations can diminish the overall gratification of the experience, highlighting that the effectiveness and potential benefits of GBL heavily rely on the acceptance of lecturers [7] and students [2,9]. While numerous studies confirm the benefits of GBL, there is limited research on students’ acceptance of and attitude towards such tools. Additionally, research on how it affects the acquisition of foundational competences or soft skills is scarce.

2.2. Technology Acceptance Model (TAM)

The Technology Acceptance Model (TAM) [10,11] stands out as the most renowned and extensively utilized framework for comprehending the acceptance and prospective usage intentions of information systems, online learning tools, and technology at large. While advanced models such as TAM2 [24] and the Unified Theory of Acceptance and Use of Technology (UTAUT) [25] introduce additional constructs, such as social influence and facilitating conditions, the original TAM remains highly effective and widely adopted, particularly in educational research where the focus is on understanding critical perceptions like perceived usefulness (PU), perceived ease of use (PEOU), and attitude towards use (ATU). TAM posits that these perceptions are key determinants in explaining user motivation and behavior. These constructs are central to this study, which aims to explore how students’ perceptions of game-based learning (GBL) applications influence the development of essential competences, particularly soft skills.
TAM and its extensions have been widely applied, especially in examining the adoption of diverse technological tools in education. For example, research has examined the acceptance of e-learning [26,27,28], particularly during the COVID-19 pandemic [29,30], as well as post-pandemic [13,31]. Similarly, studies have explored the acceptance of various learning tools, such as learning management systems (LMS) [32,33], mobile technologies [34], cloud computing [35] and online videos [36].
There is also limited research evaluating the acceptance of gamification. For instance, Ref. [3] examines factors determining lecturers’ intention to use digital games using an extended TAM, while Ref. [9] focuses on studying the relationship between students’ acceptance attitudes and variables of the exponential learning model based on digital games. In another study, the authors of [37] evaluate the acceptance of a GBL tool, Kahoot!, alongside a collaborative canvas tool (Padlet) and an annotation tool (Cirrus). They found Kahoot! to be considered the most useful and easy to use for revising learned concepts. However, despite the presence of such studies, there is a noticeable lack of research focusing on student intentions to use GBL in higher education, as well as on students’ perceptions of these tools, their attitude towards using them, and how they impact the development of essential competences or soft skills for university students in general, and for engineering students in particular.

2.3. The Models of Technology Acceptance and Game-Based Learning (GBL)

The integration of game-based learning (GBL) and the Technology Acceptance Model (TAM) has garnered significant attention in educational research, particularly in the context of higher education. As digital technologies continue to evolve, educators and researchers are exploring innovative ways to enhance student engagement and learning outcomes through gamification and digital game-based learning approaches. The TAM, originally developed to predict technology adoption in organizational settings, has been adapted and extended to examine the acceptance and use of educational technologies, including game-based learning tools [8,14,38].
Several studies have investigated the factors influencing teachers’ and students’ acceptance of game-based learning technologies. Ref. [39] examined the adoption of learning management systems among primary and secondary education teachers, while Ref. [40] focused on predicting teachers’ behavioral intention to use educational video games in their courses. These studies highlight the importance of perceived usefulness and ease of use in shaping educators’ attitudes towards GBL technologies.
In the context of higher education, researchers have explored the application of TAM to various game-based learning scenarios. Ref. [41] constructed a virtual reality tour-guiding platform and established a Technology Acceptance Model based on the Unified Theory of Acceptance and Use of Technology (UTAUT). Similarly, Ref. [42] developed a platform for online high-level cooperative games, finding that perceived usefulness and ease of use significantly influenced players’ attitudes and usage intentions.
The integration of augmented reality (AR) and mixed reality (MR) technologies in game-based learning has also been examined through the lens of TAM. Ref. [43] investigated the acceptance of an AR-enhanced board game for health education, while Ref. [44] explored students’ technology acceptance of an educational game prototype integrating mixed reality and concept maps. These studies demonstrate the potential of emerging technologies to enhance the game-based learning experience and increase student engagement.
Researchers have also explored the application of TAM in specific disciplinary contexts. Ref. [45] investigated the effectiveness of combining an interactive game with the concept of productive failure in teaching data structures. Ref. [46] developed an interactive serious programming game for teaching JavaScript, evaluating it using both TAM and the Technology-Enhanced Training Effectiveness Model (TETEM). These studies highlight the potential of game-based learning to enhance student engagement and learning outcomes in technical disciplines.
Several studies have proposed extensions or modifications to the TAM to better capture the unique aspects of game-based learning. Ref. [47] suggested a theoretical model (EdTAM) designed to enhance the adoption of educational technology among teachers, while Ref. [3] examined factors determining the intention of accounting and business lecturers to use digital games in their courses using an extended TAM. These adaptations reflect the ongoing effort to refine and improve our understanding of technology acceptance in educational contexts.
The role of individual characteristics and motivational factors in technology acceptance has also been explored. Ref. [48] incorporated epistemological beliefs into the TAM to explore students’ digital game preferences, while [49] investigated learner satisfaction within the context of marketing simulations, exploring the roles of Performance Expectancy and Effort Expectancy. Ref. [50] examined the interplay between TAM, self-regulation strategies, and academic self-efficacy in the context of remote education using game-based online resources.
Research has also focused on the development and evaluation of specific game-based learning tools. Ref. [51] developed a game application for sex education, while [52] created a gamified reviewer for accounting education. Ref. [53] investigated the use of escape rooms in higher education, finding that enjoyment and perceived usefulness were key factors in facilitating collaborative learning. These studies demonstrate the diverse applications of game-based learning across various educational domains.
The adoption of game-based learning technologies in different cultural contexts has also been examined. Ref. [54] investigated the effectiveness and student attitudes towards an edutainment game for the computer technology curriculum in Saudi Arabia, while [55] evaluated the effectiveness of a tablet-based mobile application for teaching literacy and numeracy in Pakistan. These studies highlight the importance of considering cultural and contextual factors in the design and implementation of game-based learning technologies.
Several systematic reviews and theoretical frameworks have been proposed to synthesize the growing body of research on game-based learning and technology acceptance. Ref. [56] conducted a systematic review to identify the primary drivers and barriers to the use of gamification and game-based learning by university educators. Ref. [57] presented a Theoretical Model of Student-Centric Edu-Gamification Systems, addressing the gap in knowledge regarding how to include learning content and instructor behaviors in examining the impact of gamification systems on learning outcomes.
As the field of game-based learning continues to evolve, researchers are exploring new directions and applications. Ref. [58] investigated pre-service teachers’ perspectives towards the use of digital game-based learning for sustainable development of STEM education and promoting 21st century skills. Ref. [59] presented a framework investigating the impact of non-player characters’ attributes on technology acceptance factors, flow state, and intention to continue using a digital game-based learning environment.
Therefore, the integration of game-based learning and the Technology Acceptance Model has provided valuable insights into the factors influencing the adoption and effectiveness of educational technologies. As digital games and gamification techniques continue to gain prominence in educational settings, ongoing research in this area will be crucial for developing effective, engaging, and widely accepted learning tools and strategies.

2.4. Basic Competences or Soft Skills

Basic competences or soft skills, also referred to as 21st century competences, encompass abilities including, but not limited to, communication, collaboration, teamwork, time management, critical thinking, problem solving, and adaptability. These competences go beyond technical skills, referred to as ‘hard skills’, which represent specific capabilities acquired during academic studies in a particular discipline [60].
Currently, these “soft” skills are highly valued by companies [12,19], serving as a differentiating factor among candidates in selection processes. While companies initially assess the technical skills acquired during academic studies, it is the cross-cutting competences that prove decisive in standing out in an interview and, ultimately, in securing and maintaining employment.
This reality motivates European universities to give importance to and integrate these competences into their academic programs, as part of an educational approach aligned with the principles and guidelines established by educational and governmental authorities, such as those of the European Higher Education Area (EHEA) [7]. In this regard, the objective of the EHEA goes beyond providing a common European framework for education. It also aims to describe achievement levels in order to equip students with skills that transcend the boundaries of their academic disciplines, thus preparing them for effective participation in society and the job market [14].
It is worth noting that these competences may vary depending on the specific academic program. In the field of engineering programs, as highlighted by [60], soft skills acquire significant importance due to the ever-evolving nature of these disciplines and their continuous pursuit of solutions to various societal issues. In this context, extracurricular skills play a crucial role in ensuring professional development that facilitates ongoing skill enhancement through lifelong learning, thereby enabling effective adaptation to future contexts.
Some of the cross-cutting competences included in the teaching guides of the computer science field, which are the focus of this study, are as follows:
  • Communication: Ability to convey ideas clearly and effectively, encompassing both oral and written expression, active listening, and assertiveness.
  • Teamwork: Collaborating effectively to foster a productive and enriching environment.
  • Information Management: Skill to gather, process, and apply information efficiently.
  • Self-directed Learning: Capacity to acquire new knowledge independently.
  • Analysis and Synthesis: Essential skills for problem solving and decision making. These skills involve the ability to break down complex information into manageable components (analysis) and then integrate these components to achieve a comprehensive understanding (synthesis).
In alignment with [14], while acknowledging the crucial role of soft skills in higher education, research into their acquisition and development remains limited. Furthermore, there is a lack of clear guidelines on how to acquire or assess these competences.
Building on this foundation, this study aims to investigate how the implementation of game-based learning (GBL) and, more specifically, Quiz-Based Game Learning Applications (QGBLAs) influences competency development.

3. Research Model and Hypotheses

A theoretical model was constructed to understand students’ attitudes toward QGBLAs, their use through a collaborative learning methodology, and the impact on the acquisition and development of basic competences. Each of the hypotheses presented below corresponds to a path in the structural equation model (SEM) that was applied.

3.1. The Collaborative Learning Environment by Means of Quiz-Based Game Learning Applications (QGBLAs)

Drawing from the literature review and in alignment with various authors [18,21], it is reasonable to propose strategies incorporating Quiz-Based Game Learning Applications (QGBLAs) and motivating elements such as peer collaboration, rewards, and achievement badges, alongside learning-enhancing features like progress bar feedback and healthy peer competition via leaderboards. These are elements that students find easy to handle and useful for enhancing their learning experience and developing skills in a satisfying and gratifying manner. Therefore, our hypotheses suggest that the learning experience using QGBLAs, especially activities conducted through applications like Quizizz and Quizlet, positively influences perceived usefulness (PU) (H1) and perceived ease of use (PEOU) (H2).

3.2. Perceived Usefulness (PU), Ease of Use (PEOU), and Attitude Towards Use (ATU) of Quiz-Based Game Learning Applications (QGBLAs)

Within the TAM framework, Ref. [11] introduced the concept of perceived usefulness (PU), which refers to the extent to which an individual believes that the adoption of a specific system or approach will enhance their performance in a task or role. Following this, Ref. [11] defined perceived ease of use (PEOU) as the degree to which an individual perceives that utilizing a particular system or approach will require minimal effort. Additionally, Ref. [11] outlined the attitude towards use (ATU) as an individual’s overall affective reaction to the use of the system, thus providing a framework for understanding individuals’ responses to systems.
Numerous studies employing the TAM, including most of the 145 articles reviewed by [61], have indicated a direct relationship between perceived ease of use (PEOU), perceived usefulness (PU), and attitude towards use (ATU). They have identified a positive correlation between PEOU and PU, as well as between these factors and the attitude toward technology use. Therefore, we hypothesize that perceiving Quiz-Based Game Learning Applications (QGBLAs) and the collaborative learning methodology as easy to use (PEOU) would positively influence the perceived usefulness of these applications and methodology (PU) (H3). Similarly, we propose that students’ attitude toward the use of QGBLAs is significantly affected by their perception of these platforms and the collaborative methodology used during the learning experience as useful (PU) (H4) and easy to use (PEOU) (H5).

3.3. Competences Development

Based on the review of previous literature, Ref. [20] highlights that GBL strategies provide students with the opportunity to develop essential skills. This observation is supported by other studies that also confirm the effectiveness of GBL in developing basic skills, as indicated by [5,6,12,22]. Therefore, we posit that the use of GBL and students’ perception of its utility (PU) (H6) and ease of use (PEOU) (H7), along with their attitude towards its utilization (H8), significantly influence the development of essential competences, such as knowledge management, analysis and synthesis skills, communication and teamwork ability, as well as autonomous learning capability.
The conceptual research framework is shown in Figure 1.
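The eight hypothesized paths from Sections 3.1, 3.2 and 3.3 can be written down compactly as predictor-outcome pairs. The sketch below is illustrative only; the construct abbreviations follow the text (CLE = collaborative learning environment with QGBLAs, PU = perceived usefulness, PEOU = perceived ease of use, ATU = attitude towards use), and "CD" is an assumed shorthand for competence development:

```python
# Sketch of the hypothesized structural paths (H1-H8) as
# (predictor, outcome) pairs, one per SEM path.

HYPOTHESES = {
    "H1": ("CLE", "PU"),    # learning experience -> perceived usefulness
    "H2": ("CLE", "PEOU"),  # learning experience -> ease of use
    "H3": ("PEOU", "PU"),
    "H4": ("PU", "ATU"),
    "H5": ("PEOU", "ATU"),
    "H6": ("PU", "CD"),     # CD = competence development
    "H7": ("PEOU", "CD"),
    "H8": ("ATU", "CD"),
}

def predictors_of(construct: str) -> list:
    """List the constructs hypothesized to influence a given construct."""
    return [src for src, dst in HYPOTHESES.values() if dst == construct]

print(predictors_of("CD"))  # -> ['PU', 'PEOU', 'ATU']
```

Writing the paths out this way makes the model's layered structure visible: CLE feeds the two TAM perceptions, which feed attitude, and all three in turn feed competence development.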

4. Methodology

This study was developed using the hypothetical–deductive method, aligned with the primary objectives of the project. First, the research assessed students’ perceptions of Quiz-Based Game Applications within collaborative learning contexts by analyzing their attitudes toward these tools. Second, it examined the impact of these applications on the development of fundamental competences among university students, focusing specifically on undergraduate students in computer science.
To conduct this study, a mixed-method approach was applied, combining both quantitative and qualitative analyses. This methodological approach facilitates both breadth and depth in understanding and verifying the factors that influence the acquisition of essential competences.
In order to test the proposed hypotheses, a three-stage research design was implemented as follows. In the first stage, following a literature review, relevant Quiz-Based Game Applications were selected for use in the learning process, course materials were adapted, and the model to be evaluated was developed.
In the second stage, data were collected through an online questionnaire distributed to 185 students to assess their attitudes toward the use of Quiz-Based Game Applications and to examine the impact of these tools on the development of core competences among students, in order to validate the proposed model. Data were analyzed using partial least squares (PLS) through the SmartPLS software v4.1.0.2.
In the third stage, and with the aim of cross-referencing the survey results, additional data were gathered through focus groups involving 8 students who participated in activities with Quiz-Based Game Applications. This allowed for an exploration of their experiences, perceptions, and opinions regarding the impact of these tools on their learning process. Additionally, during sessions conducted in both classroom and laboratory settings, students’ interactions with the applications and with their peers during collaborative activities were recorded. These observations enabled the analysis of key aspects, such as students’ willingness to propose ideas, their adaptability to the automated feedback provided by the applications, and their level of engagement in joint problem solving. Further, notes were taken on verbal and nonverbal language, interaction frequency, and the degree of independence in the use of the tools, providing a detailed context for the data obtained.
Below we detail a formative learning and evaluation experience that incorporates subjective elements where creativity, teamwork, interaction and communication play pivotal roles. The design of this formative experience is underpinned by the literature review presented earlier, which involved a theoretical study and adaptation of the subject to incorporate Quiz-Based Game Applications.

4.1. Methodology Used to Conduct the Literature Review

The literature review was conducted following the methodology proposed by Medina-López et al. [62], which outlines five stages in this process: identification of the field of study and the period to be analyzed, selection of information sources, execution of the search (what, where, and how), management and refinement of the search results, and analysis of the results.
For the first phase, game-based learning (GBL) was identified as the field of study, with the period to be analyzed limited to the last decade. Once the field of study was established, we focused on considering the different sources from which to gather information. We chose to access relevant scientific journal articles and impactful conference papers in the field. This was accomplished through consultation of the Scopus database [63].
For the third stage, concerning the execution of the search itself, explicit criteria were defined to manually select a set of articles. Specifically, the rule we followed was that each article must deal with game-based learning (GBL) and focus on Quiz-Based Game Learning Applications (QGBLAs), the Technology Acceptance Model (TAM), Soft Skills (SS), or Basic Competences.
Subsequently, keywords were combined to obtain references related to factors such as Perceived Usefulness (PU), Collaborative Learning Environment (CLE), Ease of Use (PEOU), and Attitude Towards Use (ATU).
In the next phase, a process of management and refinement of the search results was conducted. This involved classifying the found references and analyzing false positives.
Finally, the obtained results were analyzed, which allowed for the selection of all references used in the literature review presented in the study.
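The keyword combinations described above can be illustrated with a small sketch. The query string below is an assumption, not the authors' actual search; it only shows how the core and factor keywords might be combined into a Scopus-style boolean query restricted to the last decade:

```python
# Hypothetical sketch of the keyword-combination step of the search.
# The term lists and the resulting query are illustrative, not the
# exact query used in the study.

CORE_TERMS = ['"game-based learning"', '"quiz-based game"',
              '"technology acceptance model"', '"soft skills"']
FACTOR_TERMS = ['"perceived usefulness"', '"ease of use"',
                '"attitude towards use"', '"collaborative learning"']

def build_query(core, factors, year_from=2014):
    """Combine core and factor keywords into one boolean search string."""
    core_clause = " OR ".join(core)
    factor_clause = " OR ".join(factors)
    return (f"TITLE-ABS-KEY(({core_clause}) AND ({factor_clause})) "
            f"AND PUBYEAR > {year_from - 1}")

print(build_query(CORE_TERMS, FACTOR_TERMS))
```

Keeping the term lists as data makes the refinement stage reproducible: adding or dropping a keyword regenerates the query, and the discarded false positives can be traced back to the clause that matched them.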

4.2. Experience with Quiz-Based Game Applications

With the premise of analyzing students’ attitude towards QGBLAs and their impact on the development of basic competences, a collaborative experience was conducted. This experience incorporated game-based learning strategies and mobile-based learning games.
Specifically, the study was conducted using the programming course taught in the Computer Engineering and Information Systems degrees at the University of Alcalá. This is the second programming course that students undertake, focusing on learning Object-Oriented Programming (OOP). The course had previously experienced high early dropout rates, which led to the decision to incorporate gamified activities to motivate and engage students while also developing basic competences. Specifically, the Quizlet and Quizizz game-based learning applications were selected for their ability to foster active and evaluative participation [64,65]. Both platforms serve as concrete examples of how gamification can enhance learning by engaging students in a dynamic and motivating way [66,67,68].
Quizlet is a gamified tool based on flashcards that presents terms alongside their definitions. This type of game is highly suitable for facilitating knowledge acquisition both individually and collectively. The key theoretical concepts necessary for understanding the OOP paradigm, such as class, object, inheritance, abstract class, and polymorphism, are introduced to students through this method. In the theory class, gamification is implemented using Quizlet’s “Quizlet Live” and “Flashcards” tools. The instructor directs students to form groups of up to four members, encouraging group discussions and collaborative resolution of concepts, which generates debates among students over the correct answers. In “Quizlet Live” mode, Quizlet selects a set of 12 flashcards from the deck created by the instructor, presenting a term with four possible definitions, among which the correct one is included.
In contrast, in the “Flashcards” mode, Quizlet selects a set of 12 flashcards from the created deck and presents a term, prompting students to identify its correct definition; this process can also be reversed, starting from the definition and requiring students to propose the term.
Several rounds of games are typically played to cover a broader range of concepts. Groups earn points as they answer questions correctly, and the instructor later discusses the most insightful questions. These games are conducted during sessions before exams to reinforce key concepts. After the gamified activities conclude, students receive a link allowing them to practice autonomously. In Quizlet, students not only memorize concepts but can also create their own flashcards or review predefined sets, enabling personalized learning, as shown in Figure 2.
Quizizz, in turn, was employed in the laboratory setting to analyze, evaluate, and solve problems collaboratively. This tool combines gamification elements, such as competition and real-time feedback, with questions designed to deepen students’ understanding through a dynamic and engaging environment (Figure 3). Quizizz also allows students to work in groups, thereby fostering collaboration, motivation, and teamwork, as results are visible to all, creating a competitive atmosphere that reinforces active participation.
Quizizz operates as a multiple-choice quiz, where each question offers four possible answers, one of which is correct. In this study, Quizizz was utilized for learning the object-oriented programming language Java. Each question incorporated theoretical and practical concepts; for instance, some questions presented incomplete code, prompting students to identify the missing segment needed for functionality, or to determine the output of code execution. In the laboratory, gamification was implemented through the “Quizizz Classic” tool. Students were organized into groups of four to complete a quiz where 25 questions were randomly selected, with each group answering and competing with the others. At the end of the gamified activities, students were provided with a link to enable autonomous practice.
The sequence of activities was planned, and the material and activities used in the various QGBL tools were adapted and developed.
The course was divided into two blocks, with the first focusing on basic concepts and the second covering the remainder of the syllabus. These topics served as a guide to promote the acquisition of essential transversal competences for computer science students. Each block included three Quizlet sessions and three Quizizz sessions conducted during the second part of the theory or lab sessions. The sessions were held with randomly assigned student groups, ensuring that they had not previously worked together. This promoted a collaborative environment where group discussions and the resolution of doubts were encouraged through oral and written communication.
In terms of teamwork, group members were encouraged to discuss possible solutions to a code, question, or problem with the aim of selecting the correct answer for each task.
In addition, all materials were available to students to work on individually and/or collectively outside of class, extending learning opportunities beyond the classroom.

4.3. Instrument

Items for each variable in the study were adapted from scales validated in previous studies. Thus, the questions about students’ perceptions of the methodology using GBL and their opinions on the motivating elements of these strategies—such as rewarding points obtained after achieving goals in the game, achievement badges awarded in recognition of completing an objective, leaderboards showing the rank achieved by each player, or progress bars indicating the level of completion of an objective—are based on [69,70].
Scales of perceived usefulness, perceived ease of use, and attitude towards the use of GBL applications and strategies were measured by means of items adapted from the Technology Acceptance Model (TAM) [11,24]. Finally, questions to evaluate the essential transversal competences for computer science students, such as knowledge management skills, the ability to analyze and synthesize, communication skills, teamwork skills, and autonomous learning skills, were adapted from [71,72,73,74].
To carry out this study, an online questionnaire was designed following several criteria as a guide and was adapted considering other reviewed models as recommended by [75].
The questionnaire used a 5-point bipolar Likert scale, with responses ranging from 1 (completely disagree) to 5 (completely agree). To minimize error variance in the items, the questionnaire used simple questions and easy-to-understand language. This questionnaire was subsequently analyzed.
The questions, constructs, variables, and authors are shown in Table 1.

4.4. Participants and Data Collection

The University of Alcalá, with nearly 28,000 students, 1847 professors, and 800 administrative and service staff, offers 45 official degrees, 58 official postgraduate programs, 29 doctorates, and a significant offering of master’s and specialization studies [76]. Due to its size, it is considered a medium-sized university [77], and according to the u-Ranking, it ranks in the medium-high position [78,79].
The experimental study was conducted on the “Programming” course, which is taught in the first year of the Computer Engineering and Information Systems degree at the University of Alcalá. Currently, the number of students studying at the Polytechnic School of the University is 2842, of which 1060 are enrolled in computer science branches.
Data were collected on a voluntary basis once the assessment process was fully completed. The questionnaire was completed by 168 students; incomplete surveys were excluded, leaving a total of 166. The respondents were mostly male (140, compared with 26 female) and aged between 20 and 24.

5. Data Analysis

A regression analysis of latent variables, based on the partial least squares (PLS) optimization technique, was performed to construct the model using SmartPLS 4.1.0.2. Ref. [80] presents this technique as a multivariate method for testing structural models that estimates the model parameters so as to minimize the residual variance of all the model's dependent variables. SmartPLS does not require any parametric conditions and is recommended for small sample sizes [81].
To determine the sample size, it is necessary to specify the expected effect size (ES), the significance level (α), and the statistical power (1 − β). In general terms, an alpha of 0.05 and a power of 80% are acceptable. These three values are then used to calculate the sample size. In this case, a multiple regression study was conducted with four predictors, an average effect size (ES) of 0.15, an alpha of 0.05, and a power of 0.95, in line with [82]. The result of this analysis was N = 119 participants. Given that our available study sample consisted of 166 cases, it exceeded all criteria for performing an analysis of the measurement models and structural model.
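The a priori sample-size calculation described above can be sketched numerically. The following is a minimal Python illustration using SciPy's noncentral F distribution, assuming the common noncentrality convention λ = f²·N (as used by G*Power); the exact N returned can differ slightly from the reported figure depending on the tool and convention used.

```python
# A priori sample-size calculation for multiple regression, sketched with
# SciPy's noncentral F distribution. Settings follow the study: 4 predictors,
# effect size f^2 = 0.15, alpha = 0.05, target power = 0.95.
from scipy.stats import f as f_dist, ncf

def required_n(n_predictors=4, f2=0.15, alpha=0.05, target_power=0.95):
    """Smallest N whose overall F test on the predictors reaches the target power."""
    for n in range(n_predictors + 2, 1000):
        dfn = n_predictors                # numerator df: number of predictors
        dfd = n - n_predictors - 1        # denominator df: residual df
        crit = f_dist.ppf(1 - alpha, dfn, dfd)   # critical F under H0
        nc = f2 * n                       # noncentrality (lambda = f^2 * N)
        power = 1 - ncf.cdf(crit, dfn, dfd, nc)  # power under H1
        if power >= target_power:
            return n
    return None

print(required_n())
```

With these inputs the minimum N lands in the same neighborhood as the study's reported 119 participants, well below the 166 cases actually available.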

5.1. Measurement Model Evaluation

Results of the analysis indicated that the measurement model was satisfactory. The degree of skewness is not severe for any indicator except one, which is one of only two indicators measuring its (reflective) construct; this deviation from normality is not considered an issue, and the indicator is retained. Moreover, all standardized loadings (λ) are greater than 0.707 (Table 2). Consequently, the individual item reliability is adequate [83].
The simple reliability of the measurement scales was calculated using Cronbach's alpha; all values were above 0.70 [84]. The composite reliability values are likewise greater than 0.70 [85], demonstrating a high level of internal consistency reliability among the latent variables.
In the analysis of variance, all the values for the average variance extracted (AVE) were above 0.50 [86], exceeding the minimum acceptable values for validity (Table 3).
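The reliability and convergent-validity indices above can be computed directly from standardized loadings. The sketch below uses hypothetical loadings, not the study's values, to show how composite reliability (CR) and AVE are obtained and checked against the 0.70 and 0.50 thresholds.

```python
# Composite reliability (CR) and average variance extracted (AVE) computed
# from standardized loadings, as in the measurement-model checks above.
# The loadings below are illustrative, not the study's actual values.
import numpy as np

def composite_reliability(loadings):
    lam = np.asarray(loadings)
    errs = 1 - lam**2                     # error variance per indicator
    return lam.sum()**2 / (lam.sum()**2 + errs.sum())

def ave(loadings):
    lam = np.asarray(loadings)
    return np.mean(lam**2)                # mean squared standardized loading

lam = [0.78, 0.82, 0.85, 0.74]            # hypothetical construct loadings
print(f"CR  = {composite_reliability(lam):.3f}")   # should exceed 0.70
print(f"AVE = {ave(lam):.3f}")                     # should exceed 0.50
```

Since every loading exceeds 0.707, each indicator shares more variance with its construct than with error, which is why CR and AVE clear their respective cut-offs.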
Additionally, Ref. [86] suggests that the square root of the AVE of each latent variable can be used to establish discriminant validity: to confirm discriminant validity among the constructs, the square root of the AVE must exceed the correlations between the constructs. Table 4 presents the square roots of the AVE on the diagonal and the correlations among the constructs. These values are larger than the other correlation values among the latent variables, indicating adequate discriminant validity of the measurements.
In addition, as shown in Table 4, discriminant validity was also assessed with the heterotrait–monotrait (HTMT) method [87], which relates the mean of the heterotrait–heteromethod correlations to the geometric mean of the average monotrait–heteromethod correlations of both variables. A conservative criterion of 0.85 has been used, which is associated with sensitivity levels of 95% or over; with construct correlations of 0.70, the specificity rates for HTMT 0.85 are near 100%. The HTMT ratio for perceived usefulness (PU) and attitude towards use (ATU), at 0.799, was below the 0.85 cut-off, and substantially below the 0.95 cut-off recommended for conceptually close constructs [87]. This provides good support for our claims of discriminant validity between our measures.
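The HTMT ratio can be illustrated on a toy item correlation matrix; the construct labels, item assignments, and correlation values below are hypothetical, not the study's data.

```python
# Heterotrait-monotrait (HTMT) ratio for two constructs, computed from an
# item correlation matrix. Values below 0.85 (or 0.95 for conceptually close
# constructs) support discriminant validity. Toy data, not the study's.
import numpy as np

def htmt(R, idx_a, idx_b):
    """R: item correlation matrix; idx_a/idx_b: item indices per construct."""
    R = np.asarray(R)
    hetero = R[np.ix_(idx_a, idx_b)].mean()   # between-construct item correlations
    mono_a = R[np.ix_(idx_a, idx_a)]          # within construct A
    mono_b = R[np.ix_(idx_b, idx_b)]          # within construct B
    # mean of off-diagonal (monotrait-heteromethod) correlations
    m_a = mono_a[np.triu_indices_from(mono_a, k=1)].mean()
    m_b = mono_b[np.triu_indices_from(mono_b, k=1)].mean()
    return hetero / np.sqrt(m_a * m_b)

# Toy correlation matrix: items 0-1 measure PU, items 2-3 measure ATU.
R = np.array([[1.00, 0.72, 0.45, 0.48],
              [0.72, 1.00, 0.50, 0.46],
              [0.45, 0.50, 1.00, 0.69],
              [0.48, 0.46, 0.69, 1.00]])
print(f"HTMT(PU, ATU) = {htmt(R, [0, 1], [2, 3]):.3f}")
```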
Therefore, the analysis confirmed that the measurement model was robust, with high individual item reliability, internal consistency, and convergent validity across constructs. Discriminant validity was also established, affirming that each construct was distinct within the model.

5.2. Structural Model Analysis

The model shown in Figure 4 has resulted from the analysis of the reviewed literature.
The PLS program can generate t-statistics for significance testing of both the inner and outer model using a procedure called bootstrapping [88]. In this procedure, a large number of subsamples (10,000) are taken from the original sample with replacement to give bootstrap standard errors, which in turn give approximate t-values for significance testing of the structural paths.
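The bootstrapping procedure can be sketched for a single path coefficient. The data below are synthetic, and the true coefficient, random seed, and number of resamples (2000 rather than 10,000, for speed) are illustrative assumptions.

```python
# Bootstrap standard errors and approximate t-values for a regression path,
# mirroring the resampling-with-replacement procedure described above.
# Synthetic data; 2000 resamples instead of the study's 10,000, for speed.
import numpy as np

rng = np.random.default_rng(42)
n = 166                                       # matches the study's sample size
x = rng.normal(size=n)
y = 0.5 * x + rng.normal(scale=1.0, size=n)   # true path coefficient = 0.5

def path_coef(x, y):
    """Slope of the simple OLS regression of y on x."""
    return np.cov(x, y, bias=True)[0, 1] / np.var(x)

est = path_coef(x, y)
boot = np.empty(2000)
for b in range(boot.size):
    idx = rng.integers(0, n, size=n)          # resample cases with replacement
    boot[b] = path_coef(x[idx], y[idx])

se = boot.std(ddof=1)                         # bootstrap standard error
t = est / se                                  # approximate t-value
ci = np.percentile(boot, [2.5, 97.5])         # 95% percentile confidence interval
print(f"estimate = {est:.3f}, t = {t:.2f}, 95% CI = [{ci[0]:.3f}, {ci[1]:.3f}]")
```

As in the study's analysis, a path is taken as significant when t exceeds 1.64 (one-tailed) and, under the percentile-bootstrap criterion, when the 95% confidence interval excludes zero.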
After the bootstrapping procedure was completed, the results were as follows. All R² (R-squared) values range from 0 to 1 (Table 5); the higher the value, the greater the model's predictive capacity for that variable. For the model to reach a minimum level of explanatory power, R² values should be greater than 0.10 with a significance of t > 1.64, a criterion that all values satisfy [89].
Figure 4 and Table 6 show the variance explained (R²) of the dependent constructs and the path coefficients of the model. None are less than 0.10, indicating that the independent explanatory variables are adequate.
The hypothesized relationships between constructs are estimated by standardized regression coefficients. For each relationship, the algebraic sign is examined for changes of sign, and the magnitude and statistical significance are assessed against t-statistics greater than 1.64 (t(9999), one-tailed test); the hypotheses are then checked and validated. The relationships were positive, mostly with high significance, as shown in Table 7.
However, when the percentile bootstrap is applied to generate a 95% confidence interval using 10,000 resamples, all hypotheses from H1 to H8 are supported, because their confidence interval does not include zero (Table 7), so these hypotheses are adopted.
All of these results complete a basic analysis of PLS-SEM in our research. The PLS-SEM result is shown in Figure 5.
Finally, Table 8 shows the amount of variance that each antecedent variable explains in each endogenous construct. R² values were greater than 0.334 for all constructs except PEOU, at 0.209, which is still greater than 0.10. Moreover, all Q² values were greater than 0.199, meaning that the cross-validated redundancy measures show that the theoretical/structural model has predictive relevance (Q² > 0).
Hence, the results provide strong support for the significance of the model paths, with sufficient explanatory power and predictive capacity of the independent variables on the dependent constructs. All hypotheses (H1 to H8) are validated, supporting the model’s robustness in explaining the relationships between constructs in the context of Quiz-Based Game Applications.

5.3. Qualitative Analysis

The analysis phases conducted in this study involved several key steps. First, introductory questions were asked to determine the profile of participants (age and academic background) and their interest in attending these sessions. In the second step, the topic of the focus group was introduced, providing context with information about the use of Quiz-Based Game Applications.
In the third step, participants were actively encouraged to contribute responses that aligned with their personal experiences. The facilitator guided a discussion structured around six themes: general perception of Quiz-Based Game Applications, attitudes toward their use in educational contexts, impact on communication and expression skills, contribution to teamwork, support in information management and autonomous learning, and facilitation of analysis and synthesis skills for problem solving.
The focus group questions were as follows: (1) What is your general perception of using Quiz-Based Game Applications? (2) How does the use of Quiz-Based Game Applications influence your attitude toward collaborative learning? (3) Do you feel that using Quiz-Based Game Applications has improved your ability to communicate ideas clearly and effectively? (4) Has your willingness to work in a team changed due to Quiz-Based Game Applications? (5) Do you consider that Quiz-Based Game Applications help you manage information more efficiently and develop autonomous learning skills? (6) Has the use of Quiz-Based Game Applications helped you develop analysis and synthesis skills for problem solving?
There was a high level of interest and participation in the focus group sessions. Each concept was introduced with a guiding question to frame the discussion. The research team took notes on participants’ comments and opinions without including any personal data.
The results obtained for each question are detailed below:
(1) What is your general perception of using Quiz-Based Game Applications?
Participants indicated that these applications are highly effective for reinforcing knowledge. They provide an interactive form of learning that enhances concept retention and makes studying more dynamic and engaging.
(2) How does the use of Quiz-Based Game Applications influence your attitude toward collaborative learning?
Participants reported that these applications positively impacted their attitude toward group learning. By allowing them to compare their answers with those of their peers, the applications motivated them to share ideas and learn from each other.
(3) Do you feel that using Quiz-Based Game Applications has improved your ability to communicate ideas clearly and effectively?
Participants indicated that the use of these tools helped them to express their ideas more clearly during the exercises. They also noted that these tools enabled them to initiate communication with classmates with whom they had not previously interacted, allowing them to overcome initial shyness. Additionally, they highlighted that the need to respond within a limited time encouraged them to be less inhibited when discussing questions and answers with their classmates, thereby fostering effective communication to articulate their decisions.
(4) Has your willingness to work in a team changed due to Quiz-Based Game Applications?
Participants felt that these tools helped them express ideas more clearly during the exercises. They also noted that the applications encouraged them to communicate with classmates they had not previously interacted with, reducing their inhibitions. They highlighted that the time constraints for answering encouraged them to be more decisive and practice effective communication when discussing questions and answers.
(5) Do you consider that Quiz-Based Game Applications help you manage information more efficiently and develop autonomous learning skills?
Participants indicated that these applications helped them organize information more systematically and identify areas needing improvement. Additionally, the publication of quizzes on the virtual platform after classes promoted autonomous learning by allowing them to review and repeat the exercises independently, which fostered their independence.
(6) Has the use of Quiz-Based Game Applications helped you develop analysis and synthesis skills for problem solving?
Participants noted that the exercises conducted through these applications taught them how to break down problems into manageable parts and synthesize relevant information to reach a solution. Additionally, they appreciated the opportunity to observe how other students approached the problems, which allowed them to compare strategies, discover new approaches, and gain confidence in tackling problems more analytically and logically.
Observations during classroom and laboratory sessions revealed a high level of student interaction with Quiz-Based Game Applications and with peers in collaborative activities. Students demonstrated a notable willingness to propose ideas and actively participate, utilizing the applications’ functionalities to share knowledge and enrich group discussions. This proactive attitude reflects a learning environment in which students feel comfortable and motivated to contribute, fostering the development of their communication and collaboration skills.
Moreover, students responded positively to the automated feedback provided by the applications. Most displayed a strong adaptability to immediate feedback, allowing them to adjust their responses and improve their understanding of the topics addressed. This ability to adapt to feedback not only facilitated deeper learning but also reinforced their self-confidence and autonomy in problem-solving processes. Students appeared to use feedback as a continuous improvement tool, evidencing a genuine commitment to their academic development.
In terms of non-verbal communication, positive behaviors were recorded that reflect student engagement. Their interactions—both verbal and gestural—were frequent, and their body language indicated interest and concentration in the activities. Most students initially showed a tendency to work independently with the applications, yet as the activity progressed, they became less inhibited, participating actively without needing constant assistance. This degree of independence suggests that students not only understood how the applications functioned but also integrated them effectively into their learning process, indicating a high level of competence in using educational technology.

6. Discussion

Based on the results, the model proposed for this analysis is highly satisfactory. The reliability of each item, along with the values of simple and composite reliability, met acceptable standards, demonstrating a high level of internal consistency reliability among the latent variables. Additionally, it was found that the values of validity and discriminant validity of the measures were within acceptable ranges. Furthermore, the relationships between the variables were predominantly significant, confirming the validation of all hypotheses.
As shown in Table 7, QGBL applications have a direct positive impact on perceived usefulness (PU) H3, and perceived ease of use (PEOU) H2, explaining up to 33.80% and 20.9% of the variance, as stated in [1,2,3]. Additionally, we can observe that perceived ease of use (PEOU) has a significant positive correlation with perceived usefulness (PU), explaining 13.29% (H5), in line with [11,14,24]. On the other hand, we can observe that perceived usefulness (PU) H7 and perceived ease of use (PEOU) H4 have a direct positive impact on attitude towards use (ATU) H1, explaining 50.10% of the variance among students towards game-based learning applications within formative collaborative learning, as stated in [17,61]. It is thus evidenced that the acceptance and effective use of technology depends on the motivation to use it and factors such as perceived usefulness and perceived ease of use. We can also observe that there is a direct positive relationship of perceived usefulness (PU) H8, perceived ease of use (PEOU) H6, and attitude towards use (ATU) on the acquisition of competences, which largely depends on the attitude and interest of students in using these game-based learning applications within formative collaborative learning [8,9].
However, although QGBL does not have a direct impact on the acquisition of competences, it enhances the learning process by facilitating that acquisition through the motivation of students who perceive the methodology used in this formative collaborative learning process as useful (PU) H3 and easy to carry out (PEOU) H2. Thus, the relationships influence COMP by 11.19% (H1), PEOU by 16.84% (H6), and PU by 11.76% (H8), in line with the findings of other authors such as [20], thereby enhancing soft skills such as teamwork competency [5,6], communication and negotiation, problem-solving skills, critical and creative thinking skills [22], and cooperative skills [12].
In view of the results, we can affirm that students show a positive attitude towards QGBL due to its engaging and interactive nature, which makes learning more enjoyable and motivating, in line with [90]. Additionally, it enhances knowledge acquisition, particularly in subjects like programming and science, by providing interactive and hands-on learning experiences as indicated by [28]. The integration of social elements in QGBL, such as cooperative and competitive modes, can improve learning outcomes and student attitudes by fostering a collaborative and competitive spirit [91]. Another benefit of using QGBL in collaborative learning focuses on improvements in content comprehension, in line with the literature review carried out [92].
These findings are consistent with focus group and observation results, which suggest that the use of Quiz-Based Game Applications in collaborative contexts has been well received and has enabled students to thrive in an active and positive learning environment. The results indicate improvements in students’ collaboration skills, adaptability and autonomy, as well as a genuine inclination towards continuous learning and teamwork.

7. Conclusions

Since COVID-19, game-based learning (GBL) has gained significant traction in education due to its potential to increase student engagement and improve learning outcomes. Understanding students’ attitudes towards these applications can provide insights into their effectiveness and areas for improvement.
The results reveal that students in the experimental study had positive attitudes towards Quiz-Based Game Applications, finding them attractive and effective for enhancing their knowledge and motivation. Furthermore, the study shows that these tools and students’ attitude towards their usage significantly influence the acquisition of essential competences for computer engineering students, such as communication skills, teamwork, self-directed learning, knowledge management, and analytical and synthesis abilities.
While numerous studies have investigated the effects of game-based learning (GBL) on student motivation and academic performance, fewer have examined how Quiz-Based Game Learning Applications (QGBLAs) contribute to the development of critical competences such as communication, teamwork, and analytical skills. Our research fills this gap by exploring how QGBLAs support not only academic achievement but also the acquisition of vital soft skills within a collaborative learning environment. Utilizing the Technology Acceptance Model (TAM), we demonstrate how students’ perceptions of the usefulness, ease of use, and overall attitude towards QGBL applications affect their development of competences in computer science.
Our results indicate the following: Firstly, students perceive Quiz-Based Game Learning Applications (QGBLAs) as useful and easy to use, which positively influences their attitude towards using these tools in educational contexts. Secondly, the implementation of game-based learning strategies plays a crucial role in the effective development of essential competences, such as soft skills, which are critical for the comprehensive education of computer science students. The study underscores the importance of considering the QGBL approach alongside collaborative learning as a valuable educational strategy to enhance learning and develop fundamental skills in computer science students. It is therefore recommended to integrate game-based learning strategies into the broader educational curriculum to enhance student engagement and skill acquisition. Thirdly, the research reinforces the validity of TAM in the context of e-learning, demonstrating that perceived usefulness and ease of use are significant predictors of students' adoption of new technologies. This emphasizes the importance of developing applications that are not only pedagogically effective but also accessible and easy to handle for end-users.
Quiz-Based Game Applications present an innovative opportunity to make the learning process more engaging, motivating, and aligned with the development of competences required for contemporary students. The findings of this study not only contribute to a better understanding of the acceptance of these tools and their impact on competency development, but also provide valuable insights for educational decision making regarding which tools and strategies to employ in the university context.
The results of the study can assist teachers and administrators in integrating QGBL tools into their courses and higher education curricula, rethinking how soft skills are taught and assessed in universities.
While this study offers valuable theoretical and practical insights for education, it also presents several limitations relevant to educational contexts.
Firstly, technical challenges, such as connectivity issues and the need for teacher adaptation, may impact the effectiveness of QGBL. These challenges highlight the importance of institutional support and targeted teacher training to facilitate the successful integration of these technologies in the classroom.
Secondly, the quantitative methodology applied, based on self-reported data, may introduce common method variance. Although the sample met all the necessary criteria for conducting measurement and structural model analyses, an expanded sample would be needed to improve the generalizability of the findings.
Thirdly, the total variance explained by the dependent variables is not fully accounted for, suggesting that certain relevant predictors may have been omitted from the study. Finally, as the sample is limited to computer science students, the findings may have limited applicability to other fields, such as health sciences or business studies. Expanding the study to include students from various academic disciplines would enrich the research and allow for a more comprehensive assessment of the generalizability of the findings.
Accordingly, the following future lines of research are recommended: Firstly, to conduct further investigations that explore in greater depth how variations in QGBLA design may influence different types of learning and diverse student groups. Secondly, to examine the potential for expanding the study to incorporate additional factors that enable a more comprehensive analysis of their impact on educational outcomes. Thirdly, considering that the study was conducted at a medium-sized university, it would be valuable to extend the research to include students from both large and small universities offering computer science degrees, thereby enhancing the generalizability of the findings.

Author Contributions

Conceptualization, methodology, software, validation, formal analysis, investigation, resources, writing—original draft preparation, writing—review and editing, visualization, supervision, project administration, funding acquisition, J.A.M.; Conceptualization, methodology, formal analysis, investigation, resources, writing—original draft preparation, writing—review and editing, visualization, R.E. and R.B.; methodology, software, validation, investigation, resources, data curation, writing—original draft preparation, writing—review and editing, R.R.-R., S.O.-T. and A.M.T. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Comunidad de Madrid through the Multi-Year Agreement with the Universidad de Alcalá in its “Programa de Estímulo a la Investigación de Jóvenes Investigadores”, within the framework of the V PRICIT (V Plan Regional de Investigación Científica e Innovación Tecnológica), with reference number CM/JIN/2021-026.

Data Availability Statement

The data that support the findings of this study are available from the corresponding author upon reasonable request.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Auvinen, T.; Hakulinen, L.; Malmi, L. Increasing Students’ Awareness of Their Behavior in Online Learning Environments with Visualizations and Achievement Badges. IEEE Trans. Learn. Technol. 2015, 8, 261–273. [Google Scholar] [CrossRef]
  2. Hasan, H.F.; Nat, M.; Vanduhe, V.Z. Gamified Collaborative Environment in Moodle. IEEE Access 2019, 7, 89833–89844. [Google Scholar] [CrossRef]
  3. Kuang, T.M.; Agustina, L.; Monalisa, Y. Acceptance of digital game-based learning by accounting and business lecturers: Empirical evidence from Indonesia based on the extended Technology Acceptance Model. Account. Educ. 2023, 33, 391–413. [Google Scholar] [CrossRef]
  4. Garcia, I.; Pacheco, C.; Méndez, F.; Calvo-Manzano, J.A. The effects of game-based learning in the acquisition of “soft skills” on undergraduate software engineering courses: A systematic literature review. Comput. Appl. Eng. Educ. 2020, 28, 1327–1354. [Google Scholar] [CrossRef]
  5. Martín-Hernández, P.; Gil-Lacruz, M.; Gil-Lacruz, A.I.; Azkue-Beteta, J.L.; Lira, E.M.; Cantarero, L. Fostering university students’ engagement in teamwork and innovation behaviours through game-based learning (GBL). Sustainability 2021, 13, 13573. [Google Scholar] [CrossRef]
  6. Sousa, M.J.; Rocha, Á. Game Based Learning Contexts for Soft Skills Development. In Recent Advances in Information Systems and Technologies. WorldCIST 2017; Rocha, Á., Correia, A., Adeli, H., Reis, L., Costanzo, S., Eds.; Advances in Intelligent Systems and Computing; Springer: Cham, Switzerland, 2017; Volume 570. [Google Scholar] [CrossRef]
  7. The European Higher Education Area (EHEA). Available online: https://www.study.eu/article/the-european-higher-education-area-ehea (accessed on 3 June 2024).
  8. Chung, C.; Shen, C.; Qiu, Y. Students’ acceptance of gamification in higher education. Int. J. Game-Based Learn. 2019, 9, 1–19. [Google Scholar] [CrossRef]
  9. Moon, M.; Jung, H. An empirical study of the exponential learning factors in digital game based learning model: Using an extended technology acceptance model (ETAM) approach. Adv. Sci. Lett. 2016, 22, 2035–2042. [Google Scholar] [CrossRef]
  10. Davis, F.D. A Technology Acceptance Model for Empirically Testing New End-User Information Systems: Theory and Results. Ph.D. Thesis, Massachusetts Institute of Technology, Sloan School of Management, Cambridge, MA, USA, 1985. Available online: http://hdl.handle.net/1721.1/15192 (accessed on 3 June 2024).
  11. Davis, F.D. Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Q. 1989, 13, 319–340. [Google Scholar] [CrossRef]
  12. Estriegana, R.; Medina-Merodio, J.; Robina-Ramírez, R.; Barchino, R. Analysis of cooperative skills development through relational coordination in a gamified online learning environment. Electronics 2021, 10, 2032. [Google Scholar] [CrossRef]
  13. Estriegana, R.; Teixeira, A.M.; Robina-Ramirez, R.; Medina-Merodio, J.-A.; Otón, S. Impact of communication and relationships on student satisfaction and acceptance of self- and peer-assessment. Educ. Inf. Technol. 2024, 29, 14715–14731. [Google Scholar] [CrossRef]
  14. Estriegana, R.; Medina-Merodio, J.; Barchino, R. Analysis of competence acquisition in a flipped classroom approach. Comput. Appl. Eng. Educ. 2018, 27, 49–64. [Google Scholar] [CrossRef]
  15. De-Marcos, L.; García-López, E.; García-Cabot, A.; Medina-Merodio, J.-A.; Domínguez, A.; Martínez-Herráiz, J.-J.; Diez-Folledo, T. Social network analysis of a gamified e-learning course: Small-world phenomenon and network metrics as predictors of academic performance. Comput. Hum. Behav. 2016, 60, 312–321. [Google Scholar] [CrossRef]
  16. Werbach, K.; Hunter, D. For the Win: How Game Thinking Can Revolutionize Your Business; University of Pennsylvania: Philadelphia, PA, USA, 2012. [Google Scholar]
  17. Alawadhi, A.Y.; Abu-Ayyash, E.A. Students’ perceptions of Kahoot!: An exploratory mixed-method study in EFL undergraduate classrooms in the UAE. Educ. Inf. Technol. 2021, 26, 3629–3658. [Google Scholar] [CrossRef]
  18. Buckley, P.; Doyle, E. Gamification and student motivation. Interact. Learn. Environ. 2014, 24, 1162–1175. [Google Scholar] [CrossRef]
  19. Wang, A.I. The wear out effect of a game-based student response system. Comput. Educ. 2015, 82, 217–227. [Google Scholar] [CrossRef]
  20. Feroz, H.M.B.; Zulfiqar, S.; Noor, S.; Huo, C. Examining multiple engagements and their impact on students’ knowledge acquisition: The moderating role of information overload. J. Appl. Res. High. Educ. 2022, 14, 366–393. [Google Scholar] [CrossRef]
  21. Kazu, İ.; Kuvvetli, M. A triangulation method on the effectiveness of digital game-based language learning for vocabulary acquisition. Educ. Inf. Technol. 2023, 28, 13541–13567. [Google Scholar] [CrossRef]
  22. Fejes, C.; Ros-McDonnell, L.; Péter, B. Enhancement and assessment of engineering soft skills in a game-based learning environment. In Proceedings of the European Conference on Games-Based Learning, Steinkjer, Norway, 8–9 October 2015; pp. 178–185. [Google Scholar]
  23. Wang, A.I.; Tahir, R. The effect of using kahoot! for learning—A literature review. Comput. Educ. 2020, 149, 103818. [Google Scholar] [CrossRef]
  24. Venkatesh, V.; Davis, F.D. A Theoretical Extension of the Technology Acceptance Model: Four Longitudinal Field Studies. Manag. Sci. 2000, 46, 186–204. [Google Scholar] [CrossRef]
  25. Venkatesh, V.; Morris, M.G.; Davis, G.B.; Davis, F.D. User acceptance of information technology: Toward a unified view. MIS Q. 2003, 27, 425–478. [Google Scholar] [CrossRef]
  26. Al-Azawei, A.; Parslow, P.; Lundqvist, K. Investigating the effect of learning styles in a blended e-learning system: An extension of the technology acceptance model (TAM). Australas. J. Educ. Technol. 2017, 33, 1–13. [Google Scholar] [CrossRef]
  27. Salloum, S.A.; Alhamad, A.Q.M.; Al-Emran, M.; Monem, A.A.; Shaalan, K. Exploring students’ acceptance of e-learning through the development of a comprehensive technology acceptance model. IEEE Access 2019, 7, 128445–128462. [Google Scholar] [CrossRef]
  28. Zhao, Y.; Wang, N.; Li, Y.; Zhou, R.; Li, S. Do cultural differences affect users’ e-learning adoption? A meta-analysis. Br. J. Educ. Technol. 2021, 52, 20–41. [Google Scholar] [CrossRef]
  29. Sukendro, S.; Habibi, A.; Khaeruddin, K.; Indrayana, B.; Syahruddin, S.; Makadada, F.A.; Hakim, H. Using an extended technology acceptance model to understand students’ use of e-learning during COVID-19: Indonesian sport science education context. Heliyon 2020, 6, e05410. [Google Scholar] [CrossRef]
  30. Zalat, M.M.; Hamed, M.S.; Bolbol, S.A. The experiences, challenges, and acceptance of e-learning as a tool for teaching during the COVID-19 pandemic among university medical staff. PLoS ONE 2021, 16, e0248758. [Google Scholar] [CrossRef]
  31. Adejare, B.O.; Olaore, G.O.; Udofia, E.E.; Adenigba, O.A. COVID-19 Pandemic and Business Survival as Mediation on the Performance of Firms in the FMCG-Sector. Athens J. Bus. Econ. 2022, 8, 239–260. [Google Scholar] [CrossRef]
  32. Radif, M.; Fan, I.S.; McLaughlin, P. Employment of Technology Acceptance Model (TAM) to adopt Learning Management System (LMS) in Iraqi universities. In Proceedings of the INTED2016 Proceedings, Valencia, Spain, 7–9 March 2016; pp. 7120–7130. [Google Scholar]
  33. Kaewsaiha, P.; Chanchalor, S. Factors affecting the usage of learning management systems in higher education. Educ. Inf. Technol. 2021, 26, 2919–2939. [Google Scholar] [CrossRef]
  34. Briz-Ponce, L.; Pereira, A.; Carvalho, L.; Juanes-Méndez, J.A.; García-Peñalvo, F.J. Learning with mobile technologies—Students’ behavior. Comput. Hum. Behav. 2017, 72, 612–620. [Google Scholar] [CrossRef]
  35. Arpaci, I. Antecedents and consequences of cloud computing adoption in education to achieve knowledge management. Comput. Hum. Behav. 2017, 70, 382–390. [Google Scholar] [CrossRef]
  36. Nagy, J.T. Evaluation of online video usage and learning satisfaction: An extension of the technology acceptance model. Int. Rev. Res. Open Distrib. Learn. 2018, 19, 160–185. [Google Scholar] [CrossRef]
  37. Dianati, S.; Nguyen, M.; Dao, P.; Iwashita, N.; Vasquez, C. Student perceptions of technological tools for flipped instruction: The case of Padlet, Kahoot! and Cirrus. J. Univ. Teach. Learn. Pract. 2020, 17, 4. [Google Scholar] [CrossRef]
  38. Dele-Ajayi, O.; Strachan, A.E.V.; Victor, A.M. Technology-Enhanced Teaching: A Technology Acceptance Model to Study Teachers’ Intentions to Use Digital Games in the Classroom. In Proceedings of the 2019 IEEE Frontiers in Education Conference (FIE 2019), Covington, KY, USA, 16–19 October 2019. [Google Scholar]
  39. Balkaya, S.; Akkucuk, U. Adoption and Use of Learning Management Systems in Education: The Role of Playfulness and Self-Management. Sustainability 2021, 13, 1127. [Google Scholar] [CrossRef]
  40. Sánchez-Mena, A.; Martí-Parreño, J.; Aldás-Manzano, J. Teachers’ intention to use educational video games: The moderating role of gender and age. Innov. Educ. Teach. Int. 2018, 56, 318–329. [Google Scholar] [CrossRef]
  41. Chiao, H.-M.; Chen, Y.-L.; Huang, W.-H. Examining the usability of an online virtual tour-guiding platform for cultural tourism education. J. Hosp. Leis. Sport Tour. Educ. 2018, 23, 29–38. [Google Scholar] [CrossRef]
  42. Liu, F.-L.; Hong, G.-D.; Shih, J.-L.; Ghinea, G. The Development and Evaluation of the Platform for Online High-Level Cooperative Games. In Proceedings of the 31st International Conference on Computers in Education, ICCE 2023, Matsue, Japan, 4–8 December 2023; Volume I, pp. 177–182. [Google Scholar]
  43. Lin, H.-C.K.; Lin, Y.-H.; Wang, T.-H.; Su, L.-K.; Huang, Y.-M. Effects of Incorporating Augmented Reality into a Board Game for High School Students’ Learning Motivation and Acceptance in Health Education. Sustainability 2021, 13, 3333. [Google Scholar] [CrossRef]
  44. Liu, Y.; Liu, Y.; Yue, K. Investigating the Factors that Influence Technology Acceptance of an Educational Game Integrating Mixed Reality and Concept Maps. In Proceedings of the IEEE 21st International Conference On Advanced Learning Technologies (ICALT 2021), Online, 12–15 July 2021; pp. 409–413. [Google Scholar]
  45. Fernando, O.N.N.; Kannappan, V.T.; Tan, X.; Hong, J.Y.J.; Chattopadhyay, A.; Seah, H.S. La Petite Fee Cosmo: Learning data structures through game-based learning. In Proceedings of the 17th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and Its Applications in Industry (VRCAI 2019), Brisbane, Australia, 14–16 November 2019. [Google Scholar]
  46. Maskeliūnas, R.; Kulikajevas, A.; Blažauskas, T.; Damaševičius, R.; Swacha, J. An Interactive Serious Mobile Game for Supporting the Learning of Programming in JavaScript in the Context of Eco-Friendly City Management. Computers 2020, 9, 102. [Google Scholar] [CrossRef]
  47. Frøsig, T.B. Expanding the Technology Acceptance Model (TAM) to Consider Teachers Needs and Concerns in the Design of Educational Technology (EdTAM). Int. J. Emerg. Technol. Learn. (iJET) 2023, 18, 130–140. [Google Scholar] [CrossRef]
  48. Shiue, Y.-M.; Hsu, Y.-C.; Liang, Y.-C. Investigating elementary students’ epistemological beliefs, game preference by applying game-based learning to a history course. In Proceedings of the IEEE International Conference on Advanced Materials for Science and Engineering (IEEE-ICAMSE 2016), Tainan, Taiwan, 12–13 November 2016; pp. 460–462. [Google Scholar]
  49. Ng, S.C.K.; Tan, L.F.; Lau, P.N. Enhancing Learner Satisfaction in Simulation-Based Learning: The Impact of Learner Characteristics and Expectancy. In Proceedings of the 31st International Conference on Computers in Education, ICCE 2023, Matsue, Japan, 4–8 December 2023; Volume I, pp. 696–703. [Google Scholar]
  50. Zhang, F. Effects of game-based learning on academic outcomes: A study of technology acceptance and self-regulation in college students. Heliyon 2024, 10, e36249. [Google Scholar] [CrossRef]
  51. Chu, S.K.W.; Kwan, A.C.; Reynolds, R.; Mellecker, R.R.; Tam, F.; Lee, G.; Leung, C.Y. Promoting Sex Education among Teenagers Through an Interactive Game: Reasons for Success and Implications. Games Health J. 2015, 4, 168–174. [Google Scholar] [CrossRef]
  52. Gayao, K.D.; Aben, J.P.D.; Remiendo, J.Y.; Palaoag, T.D. Gamified Reviewer Based on the EFM Model for An Effective Learning Environment. In Proceedings of the 2021 1st International Conference in Information and Computing Research (ICORE 2021), Manila, Philippines, 11–12 December 2021; pp. 193–197. [Google Scholar]
  53. Quintana, N.B.; Andonegui, A.R.; Berasaluce, J.P.; de la Serna, A.L. Digital Escape Room for the Development of Collaborative Learning in Higher Education. Educ. Knowl. Soc. 2022, 23, 229–242. [Google Scholar]
  54. Saleh, N.; Prakash, E.; Manton, R. Measuring Student Acceptance of Game Based Learning for Game and Technology Education Curriculum Development. In Proceedings of the 2014 International Conference on Education Technologies and Computers (ICETC), Lodz, Poland, 22–24 September 2014; pp. 79–85. [Google Scholar]
  55. Ishaq, K.; Azan, N.; Rosdi, F.; Abid, A.; Ali, Q. Usefulness of Mobile Assisted Language Learning in Primary Education. Int. J. Adv. Comput. Sci. Appl. 2020, 11, 384–395. [Google Scholar] [CrossRef]
  56. Lester, D.; Skulmoski, G.J.; Fisher, D.P.; Mehrotra, V.; Lim, I.; Lang, A.; Keogh, J.W.L. Drivers and barriers to the utilisation of gamification and game-based learning in universities: A systematic review of educators’ perspectives. Br. J. Educ. Technol. 2023, 54, 1748–1770. [Google Scholar] [CrossRef]
  57. Barber, C.S. When Students are Players: Toward a Theory of Student-Centric Edu-Gamification Systems. J. Inf. Syst. Educ. 2021, 32, 53–65. [Google Scholar]
  58. Gumbi, N.M.; Sibaya, D.; Chibisa, A. Exploring Pre-Service Teachers’ Perspectives on the Integration of Digital Game-Based Learning for Sustainable STEM Education. Sustainability 2024, 16, 1314. [Google Scholar] [CrossRef]
  59. Liew, T.W.; Siradj, Y.; Tan, S.-M.; Roedavan, R.; Khan, M.T.I.; Pudjoatmodjo, B. Game-Changer NPCs: Leveling-Up Technology Acceptance and Flow in a Digital Learning Quest. Int. J. Hum.–Comput. Interact. 2024, 1–22. [Google Scholar] [CrossRef]
  60. García-García, C.; Serrano, J.G.; Escrig, R.I.; Miralles, F.F.; Torres, I.A.; Poch, M.P. Gamification as a tool for acquisition soft skills in the design field. In Proceedings of the INTED2018 Proceedings, Valencia, Spain, 5–7 March 2018; pp. 3569–3578. [Google Scholar]
  61. Yousafzai, S.Y.; Foxall, G.R.; Pallister, J.G. Technology acceptance: A meta-analysis of the TAM: Part 1. J. Model. Manag. 2007, 2, 251–280. [Google Scholar] [CrossRef]
  62. Medina-Lopez, C.; Marin-Garcia, J.A.; Alfalla-Luque, R. A methodological proposal for the systematic literature review. WPOM Work. Pap. Oper. Manag. 2010, 1, 13–30. [Google Scholar] [CrossRef]
  63. Scopus. 2024. Available online: https://www.scopus.com/home.uri (accessed on 3 June 2024).
  64. Zeitlin, B.D.; Sadhak, N.D. Attitudes of an international student cohort to the Quizlet study system employed in an advanced clinical health care review course. Educ. Inf. Technol. 2022, 28, 3833–3857. [Google Scholar] [CrossRef]
  65. Pham, A.T. The impact of gamified learning using Quizizz on ESL learners’ grammar achievement. Contemp. Educ. Technol. 2023, 15, ep410. [Google Scholar] [CrossRef] [PubMed]
  66. Le, M.T.T.; Van Tran, K. University Students’ Engagement with Feedback in a Quiz Platform: A Case Study of Quizizz. J. Univ. Teach. Learn. Pract. 2024, 21. [Google Scholar] [CrossRef]
  67. Boroughani, T.; Behshad, N.; Xodabande, I. Mobile-assisted academic vocabulary learning with digital flashcards: Exploring the impacts on university students’ self-regulatory capacity. Front. Psychol. 2023, 14, 1112429. [Google Scholar] [CrossRef]
  68. Maiti, M.; Priyaadharshini, M. Evaluation of the experiences of learners and facilitators with ICT within the realm of higher education. Cogent Educ. 2024, 11, 2355377. [Google Scholar] [CrossRef]
  69. Codish, D.; Ravid, G. Academic course gamification: The art of perceived playfulness. Interdiscip. J. e-Skills Lifelong Learn. 2014, 10, 131–151. Available online: http://www.ijello.org/Volume10/IJELLOv10p131-151Codish893.pdf (accessed on 3 June 2024). [CrossRef]
  70. Aparicio, M.; Costa, C.J.; Moises, R. Gamification and reputation: Key determinants of e-commerce usage and repurchase intention. Heliyon 2021, 7, e06383. [Google Scholar] [CrossRef]
  71. Macqual, S.M.; Salleh, U.K.M.; Zulnaidi, H. Assessing prospective teachers’ soft skills curriculum implementation: Effects on teaching practicum success. S. Afr. J. Educ. 2021, 41, 1–21. [Google Scholar] [CrossRef]
  72. Hossain, M.; Alam, M.; Alamgir, M.; Salat, A. Factors affecting business graduates’ employability–empirical evidence using partial least squares (PLS). Educ. + Train. 2020, 62, 292–310. [Google Scholar] [CrossRef]
  73. Dogara, G.; Bin Saud, M.S.; Bin Kamin, Y. Work-Based Learning Conceptual Framework for Effective Incorporation of Soft Skills Among Students of Vocational and Technical Institutions. IEEE Access 2020, 8, 211642–211652. [Google Scholar] [CrossRef]
  74. Kosasi, S.; Kasma, U.; Yuliani, I.D.A.E. The Mediating Role of Learning Analytics to Improve Student Academic Performance. In Proceedings of the International Conference on Cybernetics and Intelligent System (ICORIS), Manado, Indonesia, 27–28 October 2020; pp. 1–6. [Google Scholar]
  75. O’Leary, Z. The Essential Guide to Doing Your Research Project; SAGE Publications Ltd.: Thousand Oaks, CA, USA, 2017; ISBN 978-1-4739-5207-2. [Google Scholar]
  76. Universidad de Alcalá, La Universidad en Cifras. Available online: https://www.uah.es/es/conoce-la-uah/la-universidad/la-uah-en-cifras/la-oficina-estadistica/ (accessed on 24 July 2024).
  77. Ministerio de Universidades, Estadística de Estudiantes. Available online: https://www.universidades.gob.es/estadistica-de-estudiantes/ (accessed on 24 July 2024).
  78. Fundacion BBVA, Resultados U-Ranking de Universidades. Available online: https://www.u-ranking.es/ranking (accessed on 25 July 2024).
  79. Universidad de Alcalá. Datos y Cifras. Available online: https://transparencia.uah.es/export/sites/transparencia/es/.galleries/Informes-Planificacion/UAH_Cifras_2023-24.pdf (accessed on 24 July 2024).
  80. Hair, J.F.; Hult, G.T.M.; Ringle, C.M.; Sarstedt, M. A Primer on Partial Least Squares Structural Equation Modeling (PLS-SEM); SAGE: Thousand Oaks, CA, USA, 2014. [Google Scholar]
  81. Hulland, J. Use of partial least squares (PLS) in strategic management research: A review of four recent studies. Strateg. Manag. J. 1999, 20, 195–204. [Google Scholar] [CrossRef]
  82. Cohen, J. The earth is round (p < 0.05). Am. Psychol. 1994, 49, 997–1003. [Google Scholar]
  83. Carmines, E.G.; Zeller, R.A. Reliability and Validity Assessment; Sage Publications: Newbury Park, CA, USA, 1979. [Google Scholar]
  84. Nunnally, J.C.; Bernstein, I.H. Psychometric Theory, 3rd ed.; McGraw-Hill: New York, NY, USA, 1994. [Google Scholar]
  85. Werts, C.E.; Linn, R.L.; Jöreskog, K.G. Intraclass Reliability Estimates: Testing Structural Assumptions. Educ. Psychol. Meas. 1974, 34, 25–33. [Google Scholar] [CrossRef]
  86. Fornell, C.; Larcker, D.F. Evaluating Structural Equation Models with Unobservable Variables and Measurement Error. J. Mark. Res. 1981, 18, 39–50. [Google Scholar] [CrossRef]
  87. Henseler, J.; Ringle, C.M.; Sarstedt, M. A new criterion for assessing discriminant validity in variance-based structural equation modeling. J. Acad. Mark. Sci. 2014, 43, 115–135. [Google Scholar] [CrossRef]
  88. Chin, W.W. The partial least squares approach to structural equation modelling. In Modern Methods for Business Research; Marcoulides, G.A., Ed.; Lawrence Erlbaum: Mahwah, NJ, USA, 1998; pp. 295–336. [Google Scholar]
  89. Falk, R.F.; Miller, N.B. A Primer for Soft Modeling; University of Akron Press: Akron, OH, USA, 1992. [Google Scholar]
  90. Nguyen, M. Perception toward Game-based Learning of Students in Vietnam and Taiwan. Bachelor’s Thesis, I-Shou University, Kaohsiung, Taiwan, 2015. [Google Scholar] [CrossRef]
  91. Ke, F. Alternative goal structures for computer game-based learning. Int. J. Comput. Collab. Learn. 2008, 3, 429–445. [Google Scholar] [CrossRef]
  92. Hussein, M.H.; Ow, S.H.; Cheong, L.S.; Thong, M.-K.; Ebrahim, N.A. Effects of Digital Game-Based Learning on Elementary Science Learning: A Systematic Review. IEEE Access 2019, 7, 62465–62478. [Google Scholar] [CrossRef]
Figure 1. Conceptual research framework.
Figure 2. Example of Quizlet in its Flashcards mode: (a) Term; (b) Definition.
Figure 3. Example of Quizizz in its Quizzes mode.
Figure 4. Structural model results (baseline model).
Figure 5. Results of testing the model significance. * p < 0.05; ** p < 0.01; *** p < 0.001.
Table 1. Questions, constructs, variables, and authors.

Variable | Question | Construct | Based on
ATT1 | I believe using the gamification system is a good idea. | Attitude toward use | [11,24]
ATT2 | I enjoy learning with a gamification system. | Attitude toward use | [11,24]
ATT3 | Whenever possible, I would use the gamification system in future courses. | Attitude toward use | [11,24]
GA1 | Knowing my progress motivates me to complete 100% of the tasks I need to accomplish. | Gamification | [69,70]
GA2 | Tracking my progress helps me understand how I’m doing compared to the tasks I need to complete. | Gamification | [69,70]
GA3 | By being aware of my progress, I would make extra effort to improve it. | Gamification | [69,70]
PEOU1 | I find the gamification system flexible to use. | Perceived ease of use | [11,24]
PEOU2 | The functionality and interface of the gamification system are clear and easy to understand. | Perceived ease of use | [11,24]
PEOU3 | Interacting with the gamification system does not require much mental effort. | Perceived ease of use | [11,24]
PU1 | Using the gamification system is useful in my learning. | Perceived usefulness | [11,24]
PU2 | Using the gamification system enhances my learning performance. | Perceived usefulness | [11,24]
PU3 | The gamification system improves my learning outcomes. | Perceived usefulness | [11,24]
CNK1 | The use of collaborative learning and gamification tools helps me acquire more knowledge in the subjects. | Competences | [71,72,73,74]
CR1 | Collaborative learning and gamification tools improve my communication with peers. | Competences | [71,72,73,74]
CR2 | Using collaborative learning and gamification tools helps me build better relationships with my peers. | Competences | [71,72,73,74]
TW1 | The use of collaborative learning and gamification tools helps me work effectively in a team. | Competences | [71,72,73,74]
Table 2. Outer model loadings.

Item | Construct | Loading
ATT1 | ATT | 0.862
ATT2 | ATT | 0.917
ATT3 | ATT | 0.886
GA1 | GA | 0.904
GA2 | GA | 0.869
GA3 | GA | 0.886
PEOU1 | PEOU | 0.843
PEOU2 | PEOU | 0.864
PEOU3 | PEOU | 0.727
PU1 | PU | 0.865
PU2 | PU | 0.887
PU3 | PU | 0.856
CNK1 | COMP | 0.759
CR1 | COMP | 0.862
CR2 | COMP | 0.878
TW1 | COMP | 0.810
Table 3. Cronbach’s alpha coefficients, rho_A, composite reliability, and average variance extracted (AVE).

Construct | Cronbach’s Alpha | rho_A | Composite Reliability | AVE
ATT | 0.867 | 0.880 | 0.918 | 0.790
GA | 0.864 | 0.867 | 0.917 | 0.786
PEOU | 0.744 | 0.761 | 0.854 | 0.662
PU | 0.839 | 0.844 | 0.903 | 0.756
COMP | 0.847 | 0.848 | 0.897 | 0.687
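As a sanity check, the reliability figures in Table 3 follow from the outer loadings in Table 2 via the standard PLS-SEM formulas: AVE is the mean of the squared loadings, and composite reliability is (Σλ)² / ((Σλ)² + Σ(1 − λ²)). The following sketch reproduces the ATT row:

```python
# Reproduce the ATT reliability figures in Table 3 from the outer
# loadings reported in Table 2 (standard PLS-SEM formulas).
att_loadings = [0.862, 0.917, 0.886]  # ATT1-ATT3 from Table 2

# Average variance extracted: mean of squared loadings.
ave = sum(l**2 for l in att_loadings) / len(att_loadings)

# Composite reliability: (sum of loadings)^2 / ((sum)^2 + sum of error variances).
s = sum(att_loadings)
error = sum(1 - l**2 for l in att_loadings)
cr = s**2 / (s**2 + error)

print(round(ave, 3), round(cr, 3))  # → 0.79 0.918
```

Both values match the published ATT entries (AVE = 0.790, CR = 0.918), and the same computation applies to the other constructs.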
Table 4. Discriminant validity matrix (Fornell–Larcker criterion).

 | ATT | GA | PEOU | PU | COMP
ATT | 0.889 | | | |
GA | 0.472 | 0.886 | | |
PEOU | 0.442 | 0.457 | 0.814 | |
PU | 0.695 | 0.523 | 0.465 | 0.869 |
COMP | 0.516 | 0.512 | 0.523 | 0.525 | 0.829
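The Fornell–Larcker criterion requires each diagonal entry of Table 4 (the square root of the construct’s AVE from Table 3) to exceed every correlation in its row and column. A minimal check over the published values:

```python
import math

# Fornell-Larcker check: the diagonal of Table 4 is sqrt(AVE) from
# Table 3 and must exceed every inter-construct correlation.
ave = {"ATT": 0.790, "GA": 0.786, "PEOU": 0.662, "PU": 0.756, "COMP": 0.687}

# Inter-construct correlations (lower triangle of Table 4).
corr = {("GA", "ATT"): 0.472, ("PEOU", "ATT"): 0.442, ("PEOU", "GA"): 0.457,
        ("PU", "ATT"): 0.695, ("PU", "GA"): 0.523, ("PU", "PEOU"): 0.465,
        ("COMP", "ATT"): 0.516, ("COMP", "GA"): 0.512,
        ("COMP", "PEOU"): 0.523, ("COMP", "PU"): 0.525}

sqrt_ave = {k: math.sqrt(v) for k, v in ave.items()}
# e.g. sqrt(0.790) ≈ 0.889, matching the ATT diagonal entry in Table 4.
ok = all(r < sqrt_ave[a] and r < sqrt_ave[b] for (a, b), r in corr.items())
print(ok)  # → True
```

Every correlation is below both relevant diagonal entries, confirming discriminant validity under this criterion.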
Table 5. Discriminant validity matrix (heterotrait–monotrait ratio criterion).

 | ATT | GA | PEOU | PU
GA | 0.540 | | |
PEOU | 0.549 | 0.567 | |
PU | 0.799 | 0.607 | 0.577 |
COMP | 0.592 | 0.587 | 0.642 | 0.615
Table 6. Structural model results.

Construct | R² | Sample Mean (M) | Standard Deviation (STDEV) | T Statistics (|O/STDEV|) | p Values
ATT | 0.501 | 0.511 | 0.057 | 8.735 | 0.000
PEOU | 0.209 | 0.218 | 0.065 | 3.225 | 0.001
PU | 0.338 | 0.349 | 0.069 | 4.886 | 0.000
COMP | 0.399 | 0.413 | 0.057 | 7.031 | 0.000
Table 7. Structural model results. Path significance using percentile bootstrap 95% confidence interval (n = 10,000 subsamples). Note(s): * p < 0.05, ** p < 0.01, *** p < 0.001; ns: non-significant (based on t(9999), one-tailed test).

Hypothesis | Result | Influence | SPC | Sample Mean (M) | Standard Deviation (STDEV) | T Statistics (|O/STDEV|) | p Values | Change Sign
H1 | Accepted (**) | ATT -> COMP | 0.217 | 0.217 | 0.087 | 2.487 | 0.006 | No
H2 | Accepted (***) | GA -> PEOU | 0.457 | 0.462 | 0.071 | 6.467 | 0.000 | No
H3 | Accepted (***) | GA -> PU | 0.392 | 0.390 | 0.083 | 4.711 | 0.000 | No
H4 | Accepted (*) | PEOU -> ATT | 0.152 | 0.151 | 0.075 | 2.042 | 0.021 | No
H5 | Accepted (***) | PEOU -> PU | 0.286 | 0.290 | 0.072 | 3.978 | 0.000 | No
H6 | Accepted (***) | PEOU -> COMP | 0.323 | 0.327 | 0.065 | 5.004 | 0.000 | No
H7 | Accepted (***) | PU -> ATT | 0.624 | 0.628 | 0.060 | 10.367 | 0.000 | No
H8 | Accepted (**) | PU -> COMP | 0.224 | 0.225 | 0.086 | 2.621 | 0.004 | No
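The bootstrap t-values in Table 7 are simply the path coefficient divided by its bootstrap standard deviation; the sketch below reproduces a few rows from the published (three-decimal) figures, so small differences from the table are rounding artifacts:

```python
# t = path coefficient / bootstrap standard deviation (Table 7 rows).
# Values are the rounded table entries, so computed t differs slightly
# from the published t, which was derived from unrounded estimates.
paths = {  # hypothesis: (standardized path coefficient, bootstrap SD)
    "H1: ATT -> COMP": (0.217, 0.087),
    "H3: GA -> PU":    (0.392, 0.083),
    "H7: PU -> ATT":   (0.624, 0.060),
}
for name, (coef, sd) in paths.items():
    t = coef / sd
    # t > 1.645 corresponds to p < 0.05 for the one-tailed test used here.
    print(f"{name}: t = {t:.2f}, significant = {t > 1.645}")
```

For example, H1 gives t ≈ 2.49 against the published 2.487, well above the one-tailed 5% threshold.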
Table 8. Effects on endogenous variables (extended model).

Dependent Variable | R² | Q² | Antecedent | Path Coefficient | Correlation | Explained Variance (%)
PEOU | 0.209 | 0.199 | | | | 20.90
 | | | H2: GA | 0.457 | 0.457 | 20.88
PU | 0.338 | 0.256 | | | | 33.80
 | | | H3: GA | 0.392 | 0.523 | 20.50
 | | | H5: PEOU | 0.286 | 0.465 | 13.29
ATT | 0.501 | 0.200 | | | | 50.10
 | | | H7: PU | 0.624 | 0.695 | 43.36
 | | | H4: PEOU | 0.152 | 0.442 | 6.71
COMP | 0.399 | 0.225 | | | | 39.90
 | | | H8: PU | 0.224 | 0.525 | 11.76
 | | | H1: ATT | 0.217 | 0.516 | 11.19
 | | | H6: PEOU | 0.322 | 0.523 | 16.84
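The explained-variance decomposition in Table 8 multiplies each antecedent’s path coefficient by its correlation with the dependent variable; the contributions sum (up to three-decimal rounding) to the construct’s R². A quick check for ATT:

```python
# Explained-variance decomposition used in Table 8: each antecedent's
# contribution is path coefficient * correlation with the dependent
# variable; contributions sum approximately to R² given the rounding
# of the table entries.
att_antecedents = {  # (path coefficient, correlation) from Table 8
    "H7: PU":   (0.624, 0.695),
    "H4: PEOU": (0.152, 0.442),
}
contributions = {k: p * r * 100 for k, (p, r) in att_antecedents.items()}
total = sum(contributions.values())
print(contributions)    # PU ≈ 43.4%, PEOU ≈ 6.7%
print(round(total, 1))  # → 50.1
```

The total of about 50.1% matches R² = 0.501 for ATT, with PU contributing roughly six times as much explained variance as PEOU.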
