Article

Incidence of Metaphorical Virtual Classrooms and Interactive Learning Objects in the Interaction of Online Students: An Ecuadorian Case Study

by Erick P. Herrera-Granda 1,*, Jonathan G. Loor-Bautista 1 and Jorge I. Mina-Ortega 2
1 Universidad Politécnica Estatal del Carchi, Posgrado, Av. Universitaria y Antisana, Tulcán 040101, Ecuador
2 Facultad de Industrias Agropecuarias y Ciencias Ambientales, Universidad Politécnica Estatal del Carchi, Av. Universitaria y Antisana, Tulcán 040101, Ecuador
* Author to whom correspondence should be addressed.
Appl. Sci. 2024, 14(15), 6447; https://doi.org/10.3390/app14156447
Submission received: 25 March 2024 / Revised: 15 April 2024 / Accepted: 15 April 2024 / Published: 24 July 2024
(This article belongs to the Special Issue ICTs in Education)

Abstract:
This study explored the incidence of metaphorical virtual classrooms and interactive learning objects on the interaction of students in the online modality. The main objective was to analyze how these digital tools, driven by a set of strategies to promote their use, affect students' interaction in the virtual classroom system and the derived effects. To this end, the latest version of Moodle was implemented, in conjunction with gamification plugins and interactive tools, in the higher education institution used as a case study. The methodology consisted of collecting data through ordinal instruments applied to the teachers and gathering student performance metrics with a plugin developed to extract accurate usage and performance metrics for each student through direct queries to the Moodle database, which were then processed through a neural network. This made it possible to collect standardized data on the actual metrics of each virtual classroom at the end of the teaching of each subject, from both the previous LMS and the newly implemented one. These data were then analyzed using advanced statistical techniques, including Mahalanobis distances, confirmatory factor analysis, and the Wilcoxon signed-rank test. These methods provided a compelling comparison between the old and new systems, revealing significant improvements in the metrics and factors evaluated. The results showed a significant improvement in teachers' perceptions of the usability of the virtual classroom system and an increase in students' academic performance, interaction, progress, and time spent learning in virtual contexts. These results provide solid empirical evidence of the added value of these educational tools as effective strategies for improving student interaction, performance, and motivation in online education.

1. Introduction

Student dropout is a complex and multifactorial problem that affects higher education institutions in different countries, particularly in online or distance education. Depending on the context, different studies identify different specific causes of this phenomenon.
Student dropout in the online learning modality is a global challenge reflecting an intrinsic complexity encompassing academic, economic, social, and technological factors. In previous research, several cases have been cited where this problem has been observed. For example, at the National University of Distance Education (UNED) of Costa Rica, a significant discrepancy was observed between the official dropout rate, 87%, and the perception of dropout by students, 19%, who reported that the main problem was not dropout but the delay in graduation caused by structural deficiencies and a lack of support in the graduation process [1]. According to what was reported in [1], in Spain in 2010, UNED University reported a dropout rate of 80%, with causes that include psychological, academic, economic, social, and organizational aspects, highlighting the variability of determinants and the need for contextualized analysis.
At the regional level, the National Autonomous University of Mexico (UNAM) has reported facing similar challenges, with high dropout rates in open and distance modalities attributed to poor communication with faculty and a lack of clarity in instructions [2]. In Chile, as reported in [3], a combination of economic, methodological, communication, and time management factors were identified as determinants of dropout, suggesting the need for comprehensive approaches to improve student retention. According to [4], the challenges in Colombia include technological, pedagogical, communicative, and economic factors, suggesting solutions such as reviewing curriculum design and developing digital literacy. The study by [5] in Brazil highlights a lack of time to study, financial problems, and technological difficulties as the main reasons for dropping out. The study recommends improving teacher-student interaction, providing financial support, and offering flexible schedules. Several studies have highlighted the need for greater student self-regulation and engagement in Ecuador, suggesting gamification and innovative methodological strategies to increase retention [6]. In addition, the need to understand attrition patterns has been identified, suggesting the use of business intelligence for better information management and decision-making [7].
Regardless of the country and of whether the university where each study was conducted was public or private, all of these studies reveal common challenges associated with attrition in online learning, including academic, economic, social, and technological factors. As in other countries, studies in Ecuador emphasize the need for increased student self-regulation and engagement and suggest innovative strategies such as gamification to improve retention. There is also a common focus on understanding attrition patterns, with suggestions such as using advanced statistical techniques for effective information management and decision-making [7]. These insights underscore the global nature of the dropout problem while highlighting specific strategies relevant to the Ecuadorian context. Thus, at the national level, dropout in the online learning modality is a multi-cause problem that deserves a comprehensive approach. The design of active and innovative methodological proposals is required, supported by technological tools and by friendly, meaningful virtual environments that promote students' motivation and commitment to their educational process [7].
The University of Otavalo (Otavalo, Ecuador), a private higher education institution used as a case study for this research, has a national scope in its online learning modality, as each academic period receives new students from all over Ecuador. Until 2023, this institution had serious dropout problems among online program students due to teachers' poor use of the virtual classroom system and the lack of interaction and feedback for students. In addition, the institution ran its virtual classroom system on an outdated version of Moodle, and the classrooms had no instructional design, so students were easily lost in content that was often not structured or scheduled appropriately, and there were no guidelines for using the LMS. As a result, students were easily demotivated due to a lack of support, guidance, and interactive tools to motivate them to access the content and complete the activities of the virtual learning environment [8,9].
This research was developed with the motivation of contributing to the study of the problem of student dropout in online higher education. This critical and multifaceted challenge significantly affects academic institutions globally. This study emphasized the need for a comprehensive approach to address the problem, given that dropout is not limited to a single factor but rather an amalgamation of interrelated causes. In the case of the University of Otavalo, significant problems were identified related to the inefficient use of virtual classrooms, technological obsolescence, and the lack of effective interaction between students and teachers. The observation of these phenomena highlighted the urgency of designing active and attractive methodological proposals, supported by advanced technological tools, to promote the motivation and commitment of students in their educational process.
As suggested in [10], interaction is a key tool capable of significantly mitigating the problems of lack of motivation, excessive cognitive load, and subsequent dropout. For this reason, the present study proposed implementing an updated Moodle LMS in its latest version, with a focus on metaphorical classrooms and interactive tools such as H5P and LevelUp XP, to encourage interaction between teachers and students and contribute to reducing the institution's dropout problems. This new virtual classroom system was implemented concurrently with the creation of a policy defining how instructors should use the virtual classroom system and with the standardization of a metaphorical virtual classroom template consisting of three units of three weeks each, including sections for content, activities, and assessments. The content included in the template comprised planning, guides, pre-recorded lessons, and additional resources. The activities were designed to be self-directed, hands-on, and teacher-directed and had preset templates. The assessments proposed in the template incorporated each week's learning through pre-designed questionnaires integrated with a question bank and the content. All of these configurations, applied by the teachers and motivated by the regulations established in the institution, promoted the use of a gamification system implemented through the H5P and LevelUp XP plugins, which reward students for each interaction they have in the virtual classroom system and allow them to rise in the ranking of each classroom. Finally, the proposal was evaluated using ordinal instruments applied to teachers and a plugin that queried the institution's Moodle database and processed this information through neural networks to obtain interaction indices and performance metrics for each classroom. Advanced statistical analysis was then applied to the results obtained from the institution's previous system and the new system. The results showed significant improvements of approximately 11.25% in student performance, 110.45% in the completion status of each classroom, 77.82% in the interaction index, and 114.56% in the average time students spend learning in the institution's LMS. This paper is the result of a Master's degree final project developed at the Polytechnic University of Carchi, whose author worked at the Online Learning Unit of the University of Otavalo during the intervention period. The research was carried out in an Ecuadorian higher education institution in the online study modality, so the area of intervention for the contribution is Ecuador. However, the implementations, experiences, and recommendations considered in this manuscript could be useful in other countries and regions with similar educational contexts.

Related Works

The implementation of metaphorical virtual classrooms in higher education has positioned itself in recent years as a promising strategy to transform teaching-learning processes. Various studies developed in universities in Latin America and Spain show the benefits of this model and the challenges to its effective adoption. A detailed review of this background is presented below to support a novel investigation of the potential impact of implementing metaphorical classrooms in Ecuadorian universities; the following works can be mentioned as examples of relevant previous studies supporting the topic's development.
One of the pioneering investigations in the area was developed by [11] at the “Antonio José de Sucre” National Experimental Polytechnic University of Venezuela. The study focused on the design of virtual classrooms based on pedagogical metaphors implemented through the Moodle platform. The authors highlighted that using metaphors allows the presentation of academic content in an entertaining, imaginative, and highly motivating way for students. The proposed activities are facilitated by using concepts and models from everyday scenarios close to reality, leading to understanding the topics, greater immersion, and active participation.
According to the study by [11], the PACIE methodology is ideal for developing this type of metaphorical virtual classroom, given that it harmoniously integrates pedagogical, technological, and instructional design components. The stages of Presence, Scope, Training, Interaction, and E-learning allow the creation of a virtual environment that combines academic and metaphorical spaces to generate significant learning. Among the findings, the importance of carefully selecting the metaphor based on the subject and objectives of the course stands out. They also highlight the need for the teacher to take a leading role in using metaphors to energize students’ interactions and experiences in the virtual classroom.
Along the same lines, [12] carried out research at the Territorial Polytechnic University of the State of Lara “Andrés Eloy Blanco” in Venezuela, specifically focused on designing a metaphorical virtual classroom for teaching algorithms and programming. The virtual environment created was based on the story of “The Little Prince” to be more exciting and motivating for programming students. Following the PACIE methodology, an immersive interface was achieved with missions and challenges related to academic content. The results demonstrated the usefulness of this approach in complementing face-to-face instruction in complex disciplines such as programming. However, the authors emphasize the importance of focusing metaphorical classrooms on those courses and subjects where they can most effectively enhance learning.
A study conducted by [13] at the University of Guayaquil in Ecuador revealed significant deficiencies in the integration of ICT into teaching by university teachers. It was found that 60% of the teachers did not use educational technology tools in their classes, limiting the possibilities for meaningful learning in line with current needs. Faced with this problem, the researchers proposed a teacher training program focused on managing various computer applications such as Word, Excel, PowerPoint, virtual platforms, and the Internet. This background highlights the need to train teachers in the efficient use of technology as an essential step toward implementing innovations such as metaphorical virtual classrooms in universities.
Similarly, at the University of Granada in Spain [14], a study was conducted in response to the confirmation of minimal use of available virtual resources by students and teachers. After developing a complete virtual classroom and conducting a pilot test with students, significant improvements in using virtual tools were obtained, reaching 98% usage among the participants. This result shows the potential of a comprehensive design of virtual classrooms, considering all the technical and pedagogical details, to increase their use and enrich the training processes.
Another relevant work that can be mentioned is the study of [15], which outlined the development and evaluation of online learning platforms tailored to enhance student interactivity and motivation. During the COVID-19 pandemic, this research identified key deficiencies in current e-learning systems, particularly in promoting clear communication and emotional engagement among users. It emphasizes the crucial role of emotions and awareness in online education, highlighting their impact on learner engagement and interaction. Through a comprehensive survey of Latin American educational institutions, the study underscored the need for e-learning platforms to integrate emotional and cognitive aspects to mitigate the feelings of isolation and passivity experienced by students in virtual learning environments.
From a technological perspective, the study of [16] presented a case study evaluating an Immersive Virtual Classroom (IVC) as an augmented reality platform for synchronous distance learning during the COVID-19 pandemic. The study used ANOVA analysis and structural equation modeling to assess engineering students’ preferences and acceptance of different classroom models: face-to-face, traditional virtual, and the IVC. The results indicated that students had a comparable preference for the IVC and face-to-face classrooms, both higher than the conventional virtual classroom. The key to the effectiveness of the IVC was its ability to simulate a face-to-face environment by enhancing the interactivity and didactic capabilities of online learning.
In summary, the reviewed studies provide consistent background information on the benefits of implementing metaphorical virtual classrooms in higher education. The main benefits include: increased student motivation and interest by feeling part of an immersive scenario that combines reality and fiction, development of significant learning by relating metaphors to prior knowledge, increased levels of interaction and collaboration among peers, and more active participation in academic activities [17]. However, several studies agree that technology alone cannot achieve these results. A careful design of the pedagogical aspects, the creation of quality content, the training of teachers, and the involvement of students are key factors that must be addressed for a successful implementation. In this context, the background analyzed lays the foundation and highlights the need for new studies that validate the effectiveness of metaphorical environments in different educational settings, such as those of Ecuadorian universities, so that good practices can be established for broader adoption in the region [18].
In this way, the background review supported the potential of metaphorical virtual classrooms to transform teaching-learning processes in higher education, overcoming the limitations of traditional models. Research in universities in Venezuela, Ecuador, and Spain agrees on the importance of aspects such as the selection of relevant metaphors for students, the training of teachers in the use of ICT, the careful design of classrooms, and the monitoring of their impact through pilot tests. Considering these experiences and recommendations, the need for innovative studies that validate the effectiveness of this model in the context of Ecuadorian universities is evident. The knowledge gained could guide the implementation of metaphorical virtual classrooms, improving the quality of teaching and student learning at a higher level.

2. Materials and Methods

2.1. Study Area Description

The University of Otavalo is a private institution of higher learning located in northern Ecuador. The University has two main study modalities, offering on-site and online programs. The online programs of interest in this study include law, basic education, international business, and business administration, with a total of 1697 students in the first intervention period (semester 2023-A, previous LMS) and 1859 students in the second intervention period (semester 2023-B, new LMS). During the intervention period, 224 teachers participated in the pre- and post-implementation evaluation of the system, distributed over the 113 and 116 virtual classrooms created for each academic period, respectively.

2.2. Implemented System

Prior to the intervention, the University of Otavalo had an outdated virtual classroom system in which each teacher used the virtual classroom assigned to them for each academic period as they saw fit, which produced a non-standardized learning environment that was difficult for students to navigate in each new subject. For these reasons, the proposed solution used the latest version of Moodle available to date (Moodle 4.2), implemented in conjunction with a set of plugins to promote interaction, gamification, and monitoring in the LMS. Among the additional plugins implemented, the following stand out: block_openai_chat, mod_autoattendmod, mod_board, mod_moodleoverflow, mod_pdfannotator, quizaccess_onesession, quizaccess_proctoring, block_analytics_graphs, block_completion_progress, block_dedication, block_messageteacher, block_point_view, block_powerbi, block_xp, filter_filtercodes, h5p, and plagiarism_turnitinsim, among others. In particular, we can highlight the installation of Bootstrap v4, which made it possible to give a personalized look to the entire LMS and each of the classrooms, giving students a sense of immersion in the LMS; the OpenAI Chat Block plugin, which allowed some teachers to use ChatGPT directly in each classroom; and the H5P and LevelUp XP plugins, which allowed the inclusion of interactive objects and gamification in each virtual classroom. In addition to these technological efforts, regulations were established at the university that defined the rules for the use of the classrooms and standardized the design of the virtual classroom system through a virtual classroom template given to each teacher, with an instructional design, structure, and configuration determined by the Teachers' Directorate, subject to a user manual that teachers must follow in the development of their classes.
The structure of the virtual classroom for online study programs had a design of three units corresponding to the three midterms of each semester, with each unit consisting of three weeks of study. Each week was composed of sections of content, activities, and evaluations. The content section presented the planning, the study guide, the asynchronous lessons pre-recorded by the instructor, and additional materials, including class slides and any other resources the instructor deemed necessary. The activities section was arranged so that the teacher could run and schedule the required autonomous activities, activities in contact with the teacher, and experimental practicals, with preconfigured templates such as Task, Forum, Glossary, H5P, and Board set to hidden mode so that the teacher could activate them according to their needs; all of these activities included their evaluation rubric template and the configuration of their contributions to the grade book and classroom competencies. Finally, the assessment section of each week was designed for assessment activities that integrate each week's content, such as questionnaires, lessons, and projects, among others. This section included a preconfigured questionnaire template that consumes the classroom question bank by taking a random subset of the items the teacher creates for each unit within the question tree.
Furthermore, in order to comply with the recommendations of the ADDIE and PACIE models, the designed virtual classroom template had an introduction block where all the documentation of the subject, planning, and access to synchronous classes are presented and a metaphorical menu that allows the student to access all the units, weeks, and classroom tools in an easy, entertaining, and interactive way. In addition, the classroom template used had a closing block where the teacher could provide all the final information on the subject, collect statistics on the quality of the course, and provide final evaluation and recovery activities for their students.
The virtual classroom template, approved and regulated for use by all faculty at the institution, was integrated with the LevelUp XP gamification plugin and configured with a set of rules to reward each student for their interactions with the classroom content. The rules implemented by default in the virtual classroom template included rewards ranging from 10 to 30 points for students for completing various challenges, including the following:
  • Ten points for each chat message sent, resource viewed, folder viewed, or course page viewed.
  • Fifteen points for any content posted on forums, or for messages or threads created in forums.
  • Twenty points for each lesson completed or Microsoft Teams session started (attendance at synchronous classes).
  • Thirty points for each assignment submitted, quiz attempt submitted, glossary entry approved, H5P interactive object completed and scored, or entry posted on a board.
In addition, this integration was equipped with a system of experience levels configured using a logarithmic scale with a base of 300 points and a growth coefficient of 1.4, through which ten experience levels were defined, allowing the student to progress from the Novice rank to the Legendary rank. In this way, the implementation was designed to maximize student use of the virtual classroom system and reward them in a non-traditional way through a ranking system and experience points similar to those used in video game platforms. Sample images of the metaphorical classroom design (Figure 1a) and its integration with the LevelUp XP plugin (Figure 1b) are shown in Figure 1.
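To illustrate how such a level scale behaves, the following sketch (in R, the language used for the statistical analysis in this study) computes hypothetical cumulative point thresholds for ten levels, assuming that the points required for each level-up grow geometrically from a base of 300 with a coefficient of 1.4; the exact thresholds applied in the LMS depend on the LevelUp XP plugin's internal algorithm and configuration.

```r
# Illustrative sketch only: assumes each level-up requires 1.4 times the points
# of the previous one, starting from 300 points to reach level 2. The actual
# values depend on the LevelUp XP plugin's algorithm and settings.
base_xp <- 300   # points assumed for the first level-up (level 1 -> 2)
coef    <- 1.4   # growth coefficient between consecutive level-ups
n_lvl   <- 10    # number of experience levels (Novice ... Legendary)

increments <- base_xp * coef^(0:(n_lvl - 2))   # points required for each level-up
thresholds <- c(0, cumsum(increments))         # cumulative XP needed to reach each level

data.frame(level = 1:n_lvl, xp_required = round(thresholds))
```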

2.3. Data Collection

The analysis of the incidence of the implemented proposal was carried out for the two existing user levels: teachers and students. To collect information from teachers, an ordinal instrument was created to measure their perception of the ease of use that the implemented proposal brought to their work activities, under the hypothesis that a more organized, standardized system equipped with cutting-edge technology should contribute to the daily work of teachers. On the other hand, since students are a population with a less formed and objective criterion than teachers, the impact of the implementation on them was measured not through a survey but through tangible performance metrics extracted directly from the installed system's Moodle database. To do this, a plugin was developed to query the Moodle database for each classroom and calculate accurate performance metrics for each student. This plugin is available as an open-source version in the public repository: https://github.com/erickherreraresearch/MoodleSeguimiento, accessed on 27 March 2024.
An ordinal instrument was created to allow the extraction of relevant and inferential information to verify the impact of the proposal implemented on teachers [19,20]. The development of this instrument was based on the requirements specified by the Teaching Management, the Office of the Academic Vice-Rector, and the Online Education Management. Therefore, it was designed to support the assessment of teachers’ perceptions of ease in aspects such as the use of Moodle, the generation of educational resources, the creation of interactive learning objects, and the application of artificial intelligence in education. Measuring these variables was challenging because they depend on individual perception. To address this problem, the variables were configured as unobserved latent variables within a factorial model, which allowed them to be assessed through a set of items rather than a single question. The layout of the factor structure and the items used are shown in Table 1.
In contrast to the analysis of the incidence on the teaching staff, the evaluation of the incidence on students was based on numerical performance metrics obtained from the Moodle databases of the previous virtual classroom system and the new implementation, with the objective of objectively visualizing the impact generated on the users of the virtual classroom systems. For this purpose, a block-type plugin for Moodle called "MoodleMonitoring" was developed using PHP with the corresponding HTML components and CSS styles, which allowed querying the different components of the database through structured SQL queries. Moodle data are extracted for each student, and these metrics are averaged to determine the results achieved in each classroom. The plugin was developed using PhpStorm v2023.3.1 in a Moodle test environment installed on an Ubuntu virtual machine configured to run PHP 8.1, a PostgreSQL database, and an Nginx web server.
The metrics considered for collecting student information for each subject at the end of the academic period were performance, progress, interaction rate, and the time spent by each student in the classroom. Performance was designed to be obtained by querying the classroom's database for the final grade obtained by each student, considering only the activities enabled and graded by the teacher, the grading rubric of each activity, and its configured contribution to the grade book. The percentage of progress for each student was obtained by querying the database for the number of activities and resources that the student completed in the virtual classroom, divided by the total number of activities created and enabled by the teacher in the classroom, and transformed to a percentage scale, considering the completion rules configured by the teacher for each activity. The student's engagement time was obtained by querying the Moodle log database, matching records against the classroom index, and adding up the accumulated connection times of each student.
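The sketch below illustrates, in R, the kind of direct query that underlies the performance metric; the actual plugin issues equivalent SQL from PHP inside Moodle. The connection parameters and the course id are placeholders, and the table and column names (mdl_grade_items, mdl_grade_grades) are assumed to follow the standard Moodle schema, in which the 'course' grade item stores each student's course total.

```r
library(DBI)
library(RPostgres)

# Placeholder connection to the Moodle PostgreSQL database (credentials are illustrative).
con <- dbConnect(Postgres(), dbname = "moodle", host = "localhost",
                 user = "moodle_read", password = "********")

course_id <- 42  # hypothetical course id

# Final course grade per student, assuming the standard Moodle grade tables.
grades <- dbGetQuery(con, "
  SELECT gg.userid, gg.finalgrade
  FROM   mdl_grade_grades gg
  JOIN   mdl_grade_items  gi ON gi.id = gg.itemid
  WHERE  gi.courseid = $1 AND gi.itemtype = 'course'
", params = list(course_id))

# Classroom-level performance metric: average of the students' final grades.
mean(grades$finalgrade, na.rm = TRUE)

dbDisconnect(con)
```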
Finally, to obtain the interaction index for each student, various aspects regulated by the University’s Online Education Unit were considered in accordance with the interests of the University’s pedagogical model. The Online Education Unit determined that this index should include:
  • The number of activities and resources visited by the student, $n_{AR}$.
  • The total number of activities and resources enabled by the teacher in the classroom, $N_{AR}$, and the number of activities alone enabled by the teacher, $N_{A}$.
  • The number of interactive learning objects and interactive activities completed by the student, $n_{ILOIA}$ (considering interactive objects as those of the H5P, IMS, or SCORM type and interactive activities as questionnaires, lessons, and forums).
  • The total time spent by the student solely on solving interactive objects and interactive and communication activities, $t_{ILOIAC}$ (excluding homework delivery times, consulting calendars or grades, reading or viewing static resources, among others).
  • The number of experience points accumulated by the student, awarded by the LevelUp XP plugin, $p_{EXP}$.
  • The rank achieved in the classroom ranking system, awarded by the LevelUp XP plugin, $r_{EXP}$.
Using all these parameters, a variable named cumulative interaction, $i_c$, was designed; it is obtained from the contribution of most of these metrics on a percentage scale and can be calculated using the expression:
$i_c = 50\,\dfrac{n_{AR}}{N_{AR}} + 50\left(1 - e^{-4.595\,\frac{n_{ILOIA}}{N_{A}}}\right) + \dfrac{50}{1 + e^{-0.00765\,t_{ILOIAC}}} - 50$ (1)
As can be seen in Equation (1), the ratio between the number of activities visited by students and the total number of activities enabled by the teacher was assigned 50% of the weight of this variable, while the proportion of interactive activities completed relative to the total number of activities designed in each subject and the time spent by students on interactive activities were each given a weight of 25%. The number of experience points a student earns in each subject has no defined scale and can grow indefinitely depending on the activities covered by the rule set, and the rank awarded by the LevelUp XP plugin depends on the configurations and thresholds that the teacher defines for each score (set by default in the institutional template but modifiable according to each teacher's needs). For this reason, these two variables, the experience points $p_{EXP}$ and the XP rank $r_{EXP}$ from the LevelUp plugin, were combined with the cumulative interaction $i_c$ using a shallow neural network that produces an output on a scale of 1 to 100, named the interaction index $I_{int}$ [21]. The configuration of this shallow neural network is presented in Figure 2.
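A small R sketch of the cumulative interaction computation is shown below, following Equation (1) as reconstructed above (the coefficients 50, 4.595, and 0.00765 come from that expression, and the time unit for $t_{ILOIAC}$ is assumed); the argument names mirror the notation defined in the list above.

```r
# Cumulative interaction i_c following Equation (1) (reconstructed coefficients).
#   n_AR, N_AR   : activities/resources visited vs. enabled in the classroom
#   n_ILOIA, N_A : interactive objects/activities completed vs. activities enabled
#   t_ILOIAC     : time spent on interactive and communication activities (unit assumed)
cumulative_interaction <- function(n_AR, N_AR, n_ILOIA, N_A, t_ILOIAC) {
  50 * (n_AR / N_AR) +
    50 * (1 - exp(-4.595 * n_ILOIA / N_A)) +
    50 / (1 + exp(-0.00765 * t_ILOIAC)) - 50
}

# Example: a student who visited 18 of 20 items, completed 6 of 12 interactive
# activities, and accumulated 240 time units on interactive work.
cumulative_interaction(n_AR = 18, N_AR = 20, n_ILOIA = 6, N_A = 12, t_ILOIAC = 240)
```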
To obtain this neural network model, a database was created for the variables of interest—experience score, experience rank, and cumulative interaction—comprising the observations of 147 students from different classrooms whose level of interaction was outstanding as observed by their teachers. Next, the teachers who taught these students in the current period were asked to assign an interaction score on a scale of 1 to 100, and the average of these scores for each student was used as the interaction index in the training phase. With these scores, several sequential neural network models were trained using the TensorFlow v2.10.0 and Keras v2.12.0 libraries with RStudio v2023.06.2, using leaky ReLU activation functions in the hidden layer and linear activation in the output layer. An Adam optimizer was used with a learning rate of 0.001 and Mean Absolute Error (MAE) as the cost function (loss). For each trained model, 100 learning epochs were configured, and a training set consisting of 70% of the observations and a validation set of 30% were considered. In this way, tests were carried out with 2 to 20 neurons in the hidden layer, and it was found that the configuration with 14 neurons had the best result, with a loss of 0.1782, a precision of 95.31%, and an MAE of 0.0048. In addition, models with two and three hidden layers were trained and tested with 2 to 20 neurons in these additional layers but did not perform better than the 14-neuron model finally implemented in the Moodle plugin [21,22].
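A minimal sketch of the reported network configuration, using the keras interface in R, is shown below; the training matrices are placeholders, and the hyperparameters follow the setup described above (14 hidden neurons, leaky ReLU activation, linear output, Adam with a learning rate of 0.001, MAE loss, 100 epochs, and a 70/30 train/validation split).

```r
library(keras)

# Placeholder predictors: experience points, XP rank, and cumulative interaction
# for the 147 reference students; the target is the teacher-assigned interaction index.
# x <- as.matrix(train_df[, c("p_exp", "r_exp", "i_c")]); y <- train_df$interaction_index

model <- keras_model_sequential() %>%
  layer_dense(units = 14, input_shape = 3) %>%    # hidden layer with 14 neurons
  layer_activation_leaky_relu() %>%               # leaky ReLU activation
  layer_dense(units = 1, activation = "linear")   # interaction index output (1-100 scale)

model %>% compile(
  optimizer = optimizer_adam(learning_rate = 0.001),
  loss = "mae"   # Mean Absolute Error as the cost function
)

# history <- model %>% fit(x, y, epochs = 100, validation_split = 0.3)
```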

2.4. Statistical Analysis

Data processing: Data collection in scientific research often runs the risk of including missing and atypical data, which underscores the importance of beginning any statistical analysis with a comprehensive data analysis protocol. One of the most popular methods for processing multivariate samples is the use of Mahalanobis distances. This technique quantifies how many standard deviations an observation lies from the mean of a distribution, making it particularly useful for identifying observations that deviate significantly from the norm, since it can distinguish between regular and outlier data. From a geometric point of view, unlike the Euclidean distance, which measures the shortest path between two points without taking into account the correlation between the variables, the Mahalanobis distance incorporates this correlation in its calculation [23,24], offering a more precise measure of the distance between a specific point $x \in \mathbb{R}^p$, generated by a multivariate probability distribution $f_X(\cdot)$, and the mean $\mu = E(X)$ of said distribution. Assuming that the distribution $f_X(\cdot)$ has finite second-order moments, the covariance matrix $\Sigma = E[(X-\mu)(X-\mu)^{T}]$ can be established, thus allowing the precise definition of the Mahalanobis distance as:
$D(X,\mu) = \sqrt{(X-\mu)^{T}\,\Sigma^{-1}\,(X-\mu)}$ (2)
Confirmatory factor analysis: Once the processed data are available, it is possible to perform a Confirmatory Factor Analysis (CFA) to assess the validity and reliability of the instrument used. CFA is a method for analyzing the variables that make up a construct, examining the dependency relationship of each variable on the factor with which it is associated. This technique is particularly effective in the treatment of ordinal variables, allowing us to evaluate the extent to which a $p \times 1$ vector of responses, made up of observable variables, manages to represent one or more latent variables, called factors $\eta$. Through this approach, it is established that each observed variable contributes to explaining the behavior of the latent variable that is intended to be measured and estimated. In this context, a vector of observed responses $Y$ is used to explain the latent variable $\xi$ through the applied model:
$Y = \Lambda\,\xi + \epsilon$ (3)
In the context of the presented model, $Y$ is defined as a $p \times 1$ vector composed of observed random variables, while $\xi$ refers to the unobserved latent variables, and $\Lambda$ denotes a $p \times k$ matrix, with $k$ being the number of unobserved latent variables involved. Since $Y$ is explained by the set of latent variables $\xi$, the model incorporates an error term $\epsilon$. The resolution of this model is facilitated by the maximum likelihood (ML) estimation technique, which is based on the iterative minimization of the function:
$F_{ML} = \ln\left|\Lambda\Omega\Lambda' + I - \operatorname{diag}(\Lambda\Omega\Lambda')\right| + \operatorname{tr}\left\{R\left[\Lambda\Omega\Lambda' + I - \operatorname{diag}(\Lambda\Omega\Lambda')\right]^{-1}\right\} - \ln\left|R\right| - p$ (4)
where $\Lambda\Omega\Lambda'$ symbolizes the variance-covariance matrix derived from the proposed factorial model, while $R$ denotes the observed variance-covariance matrix. In more detail, the parameters associated with the model in the context of the CFA are estimated by minimizing the discrepancy between the theoretical variance-covariance matrix of the model and the observed matrix [20,25,26,27,28,29].
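As an illustration, the four-factor structure of Table 1 can be specified and fitted with the lavaan package in R as sketched below; the data frame and the item names p1-p20 (which follow the numbering later used in Equations (8)-(11)) are placeholders, and the estimator is set to ML as described above.

```r
library(lavaan)

# Four-factor structure corresponding to Table 1 (items p1-p20).
cfa_model <- '
  Moodle  =~ p1 + p2 + p3 + p4 + p5
  EducRes =~ p6 + p7 + p8 + p9 + p10
  IntObj  =~ p11 + p12 + p13 + p14
  AI      =~ p15 + p16 + p17 + p18 + p19 + p20
'

# survey_df is a placeholder for the cleaned data frame of teacher responses.
fit <- cfa(cfa_model, data = survey_df, estimator = "ML")

# Goodness-of-fit indices and item statistics reported in the study.
fitMeasures(fit, c("cfi", "tli", "nnfi", "srmr", "rmsea"))
summary(fit, standardized = TRUE, rsquare = TRUE)   # loadings and r^2 per item
```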
Wilcoxon Signed-Rank Test: The Wilcoxon Signed-Rank test, originally proposed by [30], is a nonparametric method widely used in data analysis to deal with single-sample problems. In the context of paired data, this procedure is used to evaluate the hypothesis that the probability distributions of both samples are identical. Such an evaluation is performed by analyzing statistics derived from the within-pair differences, with the primary objective of determining whether these differences come from a distribution with a median of zero. For the Wilcoxon Signed-Rank Test, the null hypothesis states that the set of differences observed between pairs has a probability distribution centered at zero [31].
To carry out the test, the absolute values of the differences $|d_i|$ are first calculated. These absolute values are then ranked in ascending order, excluding zero values. The Wilcoxon test methodology involves the sum of the ranks corresponding to positive differences, denoted $T^{+}$, as well as the sum of the ranks assigned to negative differences, denoted $T^{-}$. Thus, for a set of $n$ differences, the following relationship holds between both sums:
$T^{-} = \dfrac{n(n+1)}{2} - T^{+}$ (5)
To evaluate the null hypothesis, a rejection region is defined for the test statistic $T^{+}$. The rejection region is determined using the exact null distribution of $T^{+}$. For large samples, the standard normal distribution can be adopted as an approximation for hypothesis testing. In these circumstances, a two-tailed rejection region for the null hypothesis is established based on the statistic $T^{+}$ or $T^{-}$ as:
$Z^{+} = \dfrac{T^{+} - \frac{n(n+1)}{4}}{\left[\frac{n(n+1)(2n+1)}{24}\right]^{1/2}} > Z_{1-\alpha/2}$ (6)
$Z^{-} = \dfrac{T^{-} - \frac{n(n+1)}{4}}{\left[\frac{n(n+1)(2n+1)}{24}\right]^{1/2}} > Z_{1-\alpha/2}$ (7)
Finally, a one-tailed test is performed analogously, comparing against $Z_{1-\alpha}$.
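A brief sketch of the paired test in base R is shown below; the pre/post vectors are illustrative placeholders for the matched scores of one variable, and the manual computation reproduces the normal approximation of Equation (6).

```r
# Illustrative paired observations (e.g., one factor score before and after implementation).
pre  <- c(6.8, 7.1, 5.9, 6.4, 7.3, 6.0, 6.7, 7.5)
post <- c(7.4, 7.9, 6.1, 7.0, 8.2, 6.5, 7.6, 8.1)

# Paired Wilcoxon Signed-Rank Test.
wilcox.test(post, pre, paired = TRUE)

# Normal approximation based on T+ (Equation (6)), for large samples.
d      <- post - pre
d      <- d[d != 0]                 # discard zero differences
rnk    <- rank(abs(d))              # ranks of the absolute differences
T_plus <- sum(rnk[d > 0])           # sum of ranks of positive differences
n      <- length(d)
Z_plus <- (T_plus - n * (n + 1) / 4) / sqrt(n * (n + 1) * (2 * n + 1) / 24)
Z_plus
```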

3. Results

The analysis of the effect of the implemented interactive metaphorical classroom system began with an evaluation of its effect on the institution's teachers. For this purpose, an ordinal instrument was designed and administered to all the institution's teaching staff before and after the implementation of the system and their training in its use. This pre-test and post-test design allowed us to assess teachers' perceptions of their own online teaching skills and abilities before and after using the newly implemented virtual classroom system. In this way, the instrument allowed us to determine whether significant differences existed between how they taught before and after using the new system.

3.1. Validation of the Ordinal Instrument

The statistical analysis began with the validation and application of the ordinal instrument proposed during the experimental phase. This instrument was created based on the evaluation guidelines defined by the Teachers' Directorate, the Office of the Academic Vice Rector, and the Online Education Unit. It was designed to evaluate teachers' perceived ease of using Moodle, creating educational resources, creating interactive learning objects, and using artificial intelligence for education. Evaluating these variables can be challenging because their measurement depends on each individual's perception. For this reason, the variables were arranged as unobserved latent variables in a factor structure that facilitates their measurement [23,29,32]. As this was a new instrument created in the institution, Confirmatory Factor Analysis was applied to verify and guarantee its validity and reliability.
The instrument designed to collect information and perceptions of teachers was applied to a sample of 224 participants, corresponding to the opinions collected from teachers at the University of Otavalo who used the virtual classroom system before and after the intervention carried out through the implementation of the new virtual classroom system and training the staff in its use. In this way, each observation included 24 variables, of which 2 were discrete numerical variables for years of experience in face-to-face and virtual teaching. Two variables were categorical: the time of the intervention and the type of teacher contract. In addition, 20 ordinal variables were structured as factors, as presented in Table 1, to register the perception that teachers had of their ease of use of the Moodle platform for online education, ease of using software to create educational resources, ease of using virtual and interactive learning objects, and ease of using artificial intelligence applied to education. The descriptive statistics for each of the numerical and ordinal variables that made up the survey are presented in Table 2.
Next, the presence of atypical observations in the compiled database was verified using Mahalanobis distances. This evaluation was carried out with the mahalanobis function in R, setting a cut-off threshold of 45.3147, determined from the $\chi^2$ distribution. For this, 99.9% of the distribution was considered to accept distances, thus excluding the 0.1% most distant observations, considered atypical. This process identified ten outlier observations, which were eliminated from the database, resulting in a final database composed of 214 multivariate observations.
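The outlier screening step can be sketched in R as follows, assuming the 20 ordinal items are stored as numeric codes in a placeholder data frame; with 20 variables, the 99.9% $\chi^2$ quantile is approximately 45.31, matching the threshold reported above (mahalanobis returns squared distances, which is what the $\chi^2$ cut-off applies to).

```r
# Outlier screening with Mahalanobis distances on the 20 ordinal items
# (placeholder data frame 'survey_df' with numeric item columns p1-p20).
items <- survey_df[, paste0("p", 1:20)]

d2     <- mahalanobis(items, center = colMeans(items), cov = cov(items))
cutoff <- qchisq(0.999, df = ncol(items))   # 99.9% of the chi-square distribution (~45.31)

survey_clean <- survey_df[d2 <= cutoff, ]   # keep observations below the cut-off
nrow(survey_df) - nrow(survey_clean)        # number of outliers removed
```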
Using the refined database, we verified its parametric assumptions, a critical step given that the chosen instrument validation technique was Confirmatory Factor Analysis (CFA). This method requires adherence to the additivity, normality, linearity, homogeneity, and homoscedasticity assumptions. In this context, the first assumption tested was additivity, for which the multivariate correlation matrix computed for each possible pair of ordinal variables was used. The resulting correlation matrix for the sample in question is shown in Figure 3.
As shown in Figure 3, the bivariate correlation obtained for each pair of items showed adequate correlation values, with no combination of items having a correlation equal to one. This allowed us to confirm that the additivity assumption was met. A fake (spurious) regression analysis was chosen to test the normality, linearity, homogeneity, and homoscedasticity assumptions. This multivariate technique uses a set of theoretical random quantiles derived from the $\chi^2$ distribution to compare with the parameters and quantiles observed in the sample. The results of the fake regression analysis were evaluated using a histogram, a QQ quantile plot, and a scatterplot, as shown in Figure 4 [28].
As can be seen in Figure 4a, where the histogram of the standardized quantiles from the regression model is plotted, the distribution closely resembled a normal distribution, which allowed us to validate the assumption of normality. In Figure 4b, the QQ plot compares the sample quantiles against the theoretical quantiles, showing an alignment close to the ideal 45-degree line, which led to the acceptance of the assumption of linearity. In addition, the scatterplot in Figure 4c plots the standardized residuals of the sample against the residuals fitted by the spurious regression model. This graphical representation showed an even distribution of quantiles in all quadrants, with distances from the horizontal axis remaining within the range of −2 to 2 and no noticeable patterns or groupings. Therefore, the assumptions of homogeneity and homoscedasticity were validated.
After confirming the assumptions in the sample, we proceeded with the application of Confirmatory Factor Analysis (CFA), configuring its factor structure through the lavaan library in the statistical programming language R. The results obtained from the CFA are presented in Figure 5 and Table 3.
As seen in Figure 5 and Table 3, the factor loadings of each item on its corresponding factor exceeded the threshold of 0.5, and no pair of factors showed a perfect correlation, which led to the conclusion that the factor structure did not present signs of invalidity. When examining the goodness-of-fit indices of the model, the CFI (Comparative Fit Index), TLI (Tucker-Lewis Index), and NNFI (Non-Normed Fit Index) exhibited values above 0.9, which, according to [29], classifies the model as excellent. Likewise, the SRMR (Standardized Root Mean Square Residual) and RMSEA (Root Mean Square Error of Approximation) indices yielded values of 0.0317 and 0.0451, respectively, which supports the validity and reliability of the designed instrument.
Additionally, Confirmatory Factor Analysis (CFA) allowed the estimation of the determination coefficients $r^2$ for each item within its factor, which indicate the degree to which each item contributes to explaining the factor to which it belongs. In this way, the determination coefficients provided a perspective on which instrument items have the greatest influence on each unobserved latent variable assessed. The determination coefficients obtained by CFA are shown in Figure 6.
In Figure 6, it can be seen that within the factor of ease of use of the Moodle platform for online education, question 3, “Do you think that the communication blocks and tools of each virtual classroom are optimally configured?”, reached the highest coefficient of determination of 0.970, suggesting that the optimal configuration of tools is critical to improving the overall Moodle experience. For its part, in the factor ease of using software to create educational resources, question 9, “Do you find it easy to create, export, and import question banks in the virtual classroom system?”, reached the highest coefficient of determination of 0.911, which suggests that improvements made in the creation of question banks are conceived as a great advancement that facilitates the creation of educational resources by teachers. In the factor of ease of use of virtual and interactive learning objects, question 13, “Do you think that the virtual classroom system has sufficient tools to facilitate the creation of recreational elements such as word searches, crossword puzzles, and other interactive games?”, reached the highest determination index of 0.955, which suggests that creating implementations that facilitate the creation of recreational objects such as word searches and crossword puzzles is critical in promoting virtual interactivity by teachers. Finally, in the area of artificial intelligence, it was found that question 19, “Do you think that the implemented system facilitates the editing of academic documents assisted by artificial intelligence?”, reached the highest coefficient of determination of 0.979, which suggests that teachers perceive that implementations that facilitate the editing of text documents are critical to promoting the use of artificial intelligence in LMS.
Furthermore, as mentioned in [33], these coefficients of determination can be used to formulate a regression model that facilitates the extraction of scores for each factor based on the contributions of each question involved. Consequently, Equations (8)-(11) were proposed, which allowed the calculation of the observed score of each respondent in each dimension evaluated.
$\mathrm{Moodle} = 2.3496\,(0.934\,p_{1} + 0.674\,p_{2} + 0.970\,p_{3} + 0.765\,p_{4} + 0.913\,p_{5})$ (8)
$\mathrm{Educ.Res.} = 2.4709\,(0.704\,p_{6} + 0.828\,p_{7} + 0.778\,p_{8} + 0.911\,p_{9} + 0.826\,p_{10})$ (9)
$\mathrm{Int.Obj.} = 2.6759\,(0.929\,p_{11} + 0.918\,p_{12} + 0.955\,p_{13} + 0.935\,p_{14})$ (10)
$\mathrm{AI} = 1.7223\,(0.960\,p_{15} + 0.971\,p_{16} + 0.969\,p_{17} + 0.963\,p_{18} + 0.979\,p_{19} + 0.964\,p_{20})$ (11)
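The sketch below applies Equations (8)-(11) in R to compute the weighted score of each respondent in each dimension; the item weights are the $r^2$ values from the CFA, and each leading constant appears to rescale its weighted sum (it equals 10 divided by the sum of the corresponding item weights). The data frame and item columns are placeholders.

```r
# Weighted factor scores per respondent, following Equations (8)-(11).
# 'survey_clean' is a placeholder data frame with numeric item columns p1-p20.
w_moodle <- c(0.934, 0.674, 0.970, 0.765, 0.913)
w_educ   <- c(0.704, 0.828, 0.778, 0.911, 0.826)
w_int    <- c(0.929, 0.918, 0.955, 0.935)
w_ai     <- c(0.960, 0.971, 0.969, 0.963, 0.979, 0.964)

score <- function(df, items, w) {
  drop((10 / sum(w)) * as.matrix(df[, items]) %*% w)   # leading constant = 10 / sum(weights)
}

survey_clean$Moodle  <- score(survey_clean, paste0("p", 1:5),   w_moodle)
survey_clean$EducRes <- score(survey_clean, paste0("p", 6:10),  w_educ)
survey_clean$IntObj  <- score(survey_clean, paste0("p", 11:14), w_int)
survey_clean$AI      <- score(survey_clean, paste0("p", 15:20), w_ai)
```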

3.2. Analysis of the Incidence among Teachers

Based on the results of the validated instrument and the simple factor structure obtained and quantified in the four variables (perceived ease of using Moodle, of generating educational resources, of generating interactive learning objects, and of using artificial intelligence in education), all the relevant tests of difference were performed for these variables. The analysis began by applying the Wilcoxon Signed-Rank Test for dependent samples to each variable, contrasting the observations collected before and after the implementation of the virtual classroom system. The results of the tests for differences are shown in Figure 7 and Table 4.
As seen in Table 4 and Figure 7, the weighted scores obtained through the CFA for the simple structure made it possible to identify significant differences between the pre- and post-implementation observations that make up the sample. Teachers rated the perceived ease of online teaching before and after implementing the new virtual classroom system. Teachers experienced a significant improvement in the ease of use of the Moodle LMS of approximately 12.61%, with a p-value of 0.03028. Additionally, a significant improvement of approximately 14.42% was identified in the ease of creating educational resources, with a p-value of 0.02226. Regarding the perception of ease in creating interactive learning objects, a significant improvement of approximately 66.95% was identified, with a p-value of 5.742 × 10⁻⁵. Finally, regarding the ease of use and integration of artificial intelligence in education, teachers showed a significant improvement of approximately 146.29%, with a p-value of 1.459 × 10⁻⁷. Additionally, Figure 7 shows that the dispersion of these weighted scores for each variable decreased after implementation, which suggests that the opinions of the participants became more homogeneous after the implementation, training, and use of the platform, representing a positive impact on the standardization of the teaching-learning process in the institution.
Finally, significant differences in the sample were evaluated using the two additional categorical variables corresponding to the type of contract under which the teacher works and the area of knowledge to which each individual’s training corresponds. However, concerning the contract type variable, no significant differences were found, with p-values of 0.9731, 0.7360, 0.1138, and 0.5888, respectively, for ease of use of Moodle, creation of educational resources, interactive learning objects, and ease of use and integration with artificial intelligence. The results for these contrasts are shown in Figure 8.
As can be seen in Figure 8, the perception of the ease of using the online teaching platform did not show significant differences according to the type of contract of each teacher. Although slight differences were observed in the ease of creating interactive learning objects and in the use of artificial intelligence, the significance threshold was not reached, so it is concluded that there was not enough evidence to determine whether teachers with permanent or occasional contracts perceived greater or lesser ease in using these tools.

3.3. Analysis of the Incidence in Students

As detailed in Section 2.3, in order to effectively measure the impact of the implemented virtual classroom system, a plugin for Moodle was developed that was capable of querying the LMS database and collecting relevant information about the actual use by students of both the old system and the newly implemented one. This information was summarized as four numerical variables corresponding to the average of the students' scores in each classroom for performance, percentage of progress, percentage of interaction, and time of classroom use (in hours). Using this plugin, developed as a block, observations were collected for 113 classrooms of the previous system (Moodle 3.6) and 116 classrooms of the new system (Moodle 4.2), resulting in a database of 229 classroom-level observations gathered at the end of each subject. The descriptive statistics obtained for these variables are presented in Table 5.
The data were then processed by checking for missing data and identifying atypical observations. No missing data were detected. Using Mahalanobis distances, five atypical observations were detected, applying a cut-off value of 22.4577 obtained from the $\chi^2$ distribution at the 99.999% level, with degrees of freedom determined by the number of variables in the sample. These five outlier observations were removed, so the final sample consisted of the observations collected for 224 virtual classrooms.
Next, we verified the assumptions by applying the Lilliefors test for normality and Snedecor's F test for homogeneity of variances to the data that made up the sample. The Lilliefors normality tests yielded p-values of 4.714 × 10⁻⁶ and 0.002511 for the average performance of the students in each classroom, before and after implementation, respectively. For the average percentage of progress, p-values of 7.529 × 10⁻⁵ and 0.1226 were obtained before and after implementation, respectively. For the average interaction percentage, p-values of 0.2797 and 7.572 × 10⁻⁶ were obtained before and after implementation, respectively. For the average time of use of each virtual classroom, p-values of 4.285 × 10⁻⁵ and 0.01935 were obtained before and after implementation, respectively. For these reasons, the normality assumption was rejected, since the p-values fell below the 0.05 significance threshold in several cases. Furthermore, when the homogeneity tests were run using Snedecor's F, p-values of 0.0627, 0.0002, 0.0037, and 5.162 × 10⁻¹³ were obtained, respectively; only the average performance variable exceeded the 0.05 threshold, while the remaining variables showed significant differences in variance, so the homogeneity assumption was rejected.
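The assumption checks described above can be sketched in R as follows, using the Lilliefors test from the nortest package and Snedecor's F test from base R; the pre/post vectors are illustrative placeholders for the classroom-level values of one metric before and after the implementation.

```r
library(nortest)   # provides lillie.test (Lilliefors normality test)

# Illustrative classroom-level values of one metric (e.g., average performance);
# the real vectors come from the monitoring plugin's export.
pre  <- c(7.1, 6.8, 7.9, 8.0, 6.5, 7.4, 7.8, 6.9, 7.2, 7.6)
post <- c(8.2, 8.5, 7.9, 8.8, 8.1, 8.4, 8.6, 8.0, 8.3, 8.7)

lillie.test(pre)     # Lilliefors normality test, pre-implementation sample
lillie.test(post)    # Lilliefors normality test, post-implementation sample

var.test(pre, post)  # Snedecor's F test for homogeneity of variances
```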
In this way, it was determined that the sample for analyzing student results did not meet parametric assumptions, so the Wilcoxon Signed-Rank Test for dependent samples was used for the analysis [33,34]. The results of applying this test to the four variables are presented in Figure 9 and Table 6.
As shown in Figure 9 and Table 6, when comparing the data of each variable for the previous system and the newly implemented virtual classroom system, a significant improvement in each of the metrics could be observed. Regarding the students' academic performance, the average final grades of the students in each classroom went from a median of 7.55 to a median of 8.4, which represents a significant improvement with a p-value of 7.39 × 10⁻¹¹. The percentage of activities and resources completed by students (progress) in each classroom increased from 32.43% to 68.25%, a significant improvement with a p-value of 2.2 × 10⁻¹⁶. The rate of student interaction in each classroom increased significantly from 44.92% to 79.88%, with a p-value of 2.2 × 10⁻¹⁶. Finally, the average time students spent working in each virtual classroom showed a significant increase from 8.39 h in the previous system to 18.01 h on the new platform, with a p-value of 1.011 × 10⁻¹⁵. Thus, it was evident that the system implemented and the strategies embodied in the design of the LMS and each virtual classroom significantly improved the experience of the end users, who, in this case, are the students.

4. Discussion

The use of metaphorical virtual classrooms and interactive learning objects in the online modality marks a significant milestone in digital education, as evidenced in this study. This work has allowed a detailed analysis of how these tools affect student interaction, in contrast with previous research such as that of [6,17], which also made valuable contributions to online education, albeit from different perspectives. A fundamental difference between the present study and those of [6,12,17] lies in the methodology used. While previous works focused on qualitative analysis of students' perceptions and examined the impact of educational technologies from a more superficial perspective, this study incorporated a robust quantitative approach, using advanced statistical techniques that allowed an accurate assessment of the impact of metaphorical virtual classrooms and interactive learning objects. In addition, implementing a plugin for monitoring and collecting information in virtual classroom systems represented a significant methodological innovation, providing reliable data on students' interaction with the learning elements throughout each course and establishing a replicable model for future research in this area.
In terms of results, while research by [6,12,17] highlighted the importance of digital tools in improving student motivation and engagement, the present study was able to quantitatively demonstrate how metaphorical virtual classrooms and specific interactive learning objects improve student interaction and academic performance. This validates student perceptions gathered in previous studies and provides solid empirical evidence of the added value of these educational tools. The main advantage of the current study over previous ones is its ability to provide detailed quantitative evidence of the effectiveness of metaphorical virtual classrooms and interactive learning objects. This is particularly relevant in online education, where student interaction is a key predictor of academic success.
In summary, the present study’s analysis of the incidence of metaphorical virtual classrooms and interactive learning objects in online student interaction marks a significant advance in understanding learning dynamics in virtual environments. Through the implementation of advanced statistical techniques and the development of innovative tools for monitoring student interaction, this work not only overcomes some of the limitations of previous studies but also establishes a solid foundation for future research in the field of digital education. Thus, it can be highlighted that, based on the experimental protocol, it was observed that the ease of use of the implemented virtual classroom system perceived by the teachers showed significant and valuable improvements in the areas of Moodle use and educational resource creation. In particular, the use of Moodle, the creation of interactive learning objects, and the use and integration of artificial intelligence in their classes. For its part, significant contributions were observed in the educational service that students receive, which was evidenced by significant improvements of approximately:
  • an 11.25% improvement in students' average grades;
  • a 110.45% improvement in the completion status of the activities in each classroom;
  • a 77.82% improvement in the student interaction rate;
  • a 114.56% improvement in students' average time spent on asynchronous learning in each classroom.
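As a quick check, these approximate relative improvements follow directly from the before/after medians reported in Table 6; the short computation below reproduces them up to small rounding differences.

# Relative improvements computed from the before/after medians in Table 6.
medians = {
    "average grade": (7.55, 8.40),
    "completion progress (%)": (32.43, 68.25),
    "interaction rate (%)": (44.92, 79.88),
    "dedication time (h)": (8.3936, 18.0155),
}
for metric, (before, after) in medians.items():
    print(f"{metric}: {100 * (after - before) / before:.2f}% improvement")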
In this way, these findings provide solid evidence that the metaphorical strategy used, which includes gamification components and is reinforced with interactive objects, together with a set of regulations established in the institution, significantly impacts the teaching-learning process in online education. Furthermore, as mentioned in [15], additional variables such as awareness, emotional aspects, gender, age, and culture, among others, could affect the user experience, motivation, and interaction of online students using an LMS; these can be addressed both quantitatively and qualitatively. Nevertheless, this research focused on measuring the incidence of the implemented e-learning system from a quantitative, technological perspective, mainly covering grades, usage times, and completion progress. A qualitative and demographic analysis of the e-learning process is in progress and will be addressed in future work, in which we also plan to evaluate and contrast the instructor and student experience with newer Moodle implementations.

5. Conclusions

This study investigated the impact of metaphorical virtual classrooms and interactive learning objects on student interaction in the online modality. For this purpose, the latest version of Moodle was implemented at the University of Otavalo, along with gamification plugins and interactive tools. The methodology included ordinal instruments applied to teachers and student performance metrics collected through a plugin that performs direct queries to the Moodle database, obtaining the average metrics of each virtual classroom in both the previous system and the newly implemented one. This information was statistically processed using Mahalanobis distances, confirmatory factor analysis, and the Wilcoxon signed-rank test to contrast the previous system with the new one, which allowed significant improvements to be identified in each metric and factor considered.
The findings evidenced significant increases in students' academic performance, interaction, progress, and time spent learning in virtual environments. Implementing metaphorical virtual classrooms and integrating interactive and gamification tools proved to be effective strategies for improving student motivation and engagement in online learning environments. Regarding the teachers, their perception was evaluated after using the new system: they rated it as significantly easier for using Moodle, creating educational resources, creating interactive learning objects, and integrating artificial intelligence into the design of their content and courses, which suggests that the latest educational technologies, implemented from an interactive perspective, constitute a tool that makes teachers' work easier.
Thus, this study contributes to the field of online education by providing empirical evidence on the effectiveness of implementing metaphorical virtual classrooms and interactive learning objects to improve student interaction and achievement. From the results, it can be concluded that the adoption of advanced educational technologies and innovative pedagogical strategies is essential to addressing the challenges of attrition and motivation in online education.

Author Contributions

Conceptualization, J.I.M.-O.; Methodology, E.P.H.-G.; Software, E.P.H.-G.; Formal analysis, E.P.H.-G.; Investigation, E.P.H.-G.; Writing—original draft, E.P.H.-G.; Writing—review & editing, J.I.M.-O.; Supervision, J.G.L.-B. and J.I.M.-O.; Project administration, J.I.M.-O. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the State Polytechnic University of Carchi (www.upec.edu.ec).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

We provide all data gathered for experimental evaluation and the developed Moodle plugin as supplementary material in the GitHub repository for reproducibility: https://github.com/erickherreraresearch/MoodleMonitoring, accessed on 27 March 2024.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Barrientos, Z.; Umaña, R. Deserción estudiantil en posgrados semipresenciales de la Universidad Estatal a Distancia (UNED), Costa Rica: ¿Deserción o retraso? UNED Res. J./Cuad. Investig. UNED 2010, 1, 141–149. Available online: http://www.redalyc.org/articulo.oa?id=515651984002 (accessed on 20 March 2024). [CrossRef]
  2. Hernández Miranda, E.A.; de la Cruz Hernández, K.; Rojas Contreras, D.E.; Guzmán Serna, J.A. La deserción escolar: Un caso de educación en línea. Psicoeduc. Reflex. Propues. 2021, 7, 42–53. Available online: https://psicoeducativa.iztacala.unam.mx/revista/index.php/rpsicoedu/article/view/127 (accessed on 20 March 2024).
  3. García, S.; Díaz, I.; Reche, M.P.; Torres, J.M.; Rodríguez, J.M. Systematic Review of Good Teaching Practices with ICT in Spanish Higher Education. Trends and Challenges for Sustainability. Sustainability 2019, 11, 7150. [Google Scholar] [CrossRef]
  4. González Castro, Y.; Manzano Durán, O.; Torres Zamudio, M. Riesgos de deserción en las universidades virtuales de Colombia, frente a las estrategias de retención. Libr. Empres. 2017, 14, 177–197. [Google Scholar] [CrossRef]
  5. Abbad, G.; Carvalho, R.S.; Zerbini, T. Evasão em curso via internet: Explorando variáveis explicativas. RAC-Eletrônica 2006, 1, 1–15. [Google Scholar] [CrossRef]
  6. Vélez Meza, E.M. Gamificación en Técnicas de Aprendizaje Mediante Aulas Virtuales Metafóricas en Educación Superior Modalidad en Línea; Universidad Técnica del Norte: Ibarra, Ecuador, 2020. [Google Scholar]
  7. Silva Morán, J.J. Desarrollar un Conjunto de Tableros Visuales de Business Intelligence (BI) Mediante la Herramienta de Tableau que Permita Fortalecer la Interpretación de la Información de Deserción Estudiantil en la Universidad Técnica del Norte; Universidad Técnica del Norte: Ibarra, Ecuador, 2022. [Google Scholar]
  8. Ferrer, J.; Ringer, A.; Saville, K.; Parris, M.A.; Kashi, K. Students’ motivation and engagement in higher education: The importance of attitude to online learning. High. Educ. 2022, 83, 317–338. [Google Scholar] [CrossRef]
  9. Chiu, T.K.F.; Lin, T.-J.; Lonka, K. Motivating Online Learning: The Challenges of COVID-19 and Beyond. Asia-Pac. Educ. Res. 2021, 30, 187–190. [Google Scholar] [CrossRef]
  10. Kalyuga, S. Interactive distance education: A cognitive load perspective. J. Comput. High. Educ. 2012, 24, 182–208. [Google Scholar] [CrossRef]
  11. Rodríguez, Y.T. Design of a Virtual Learning Environment Centered in the Educational Metaphor. Univ. Cienc. Tecnol. 2018, 22, 10–20. Available online: https://uctunexpo.autanabooks.com/index.php/uct/article/view/175 (accessed on 20 March 2024).
  12. Santaella, S. Aulas virtuales metafóricas como herramientas para promover el aprendizaje en los estudiantes universitarios. Red. Investig. Educ. 2018, 11, 41–51. Available online: https://revistas.uclave.org/index.php/redine/article/view/1991 (accessed on 20 March 2024).
  13. Herrera, J.L. Recursos Didácticos y Manejo de las TIC en los Procesos de Aprendizaje en la Escuela de Lenguas y Lingüística de la Facultad de Filosofía, Letras y Ciencias de la Educación; Universidad de Guayaquil: Guayaquil, Ecuador, 2013. [Google Scholar]
  14. Gámiz Sánchez, V.M. Entornos Virtuales para la Formación Práctica de Estudiantes de Educación: Implementación, Experimentación y Evaluación de la Plataforma Aulaweb; Universidad de Granada: Granada, Spain, 2009; Available online: https://digibug.ugr.es/handle/10481/2727 (accessed on 20 March 2024).
  15. Collazos, C.A.; Fardoun, H.; AlSekait, D.; Pereira, C.S.; Moreira, F. Designing Online Platforms Supporting Emotions and Awareness. Electronics 2021, 10, 251. [Google Scholar] [CrossRef]
  16. Flórez Marulanda, J.F.; Collazos, C.A.; Hurtado, J.A. Evaluating an Immersive Virtual Classroom as an Augmented Reality Platform in Synchronous Remote Learning. Information 2023, 14, 543. [Google Scholar] [CrossRef]
  17. Yánez Rueda, H.S. Desarrollo de un Aula Virtual Metafórica para la Asignatura de Herramientas Informáticas Aplicadas en la Universidad Tecnológica Indoamérica; Pontificia Universidad Católica del Ecuador Sede Ambato: Ambato, Ecuador, 2017; Available online: https://repositorio.pucesa.edu.ec/handle/123456789/1885 (accessed on 20 March 2024).
  18. Pilamunga Poveda, E.M.; Quizhpi Lupercio, L.P. La Estrategia de Gamificación y el Proceso de Aprendizaje; Universidad Técnica de Ambato: Ambato, Ecuador, 2018; Available online: https://repositorio.uta.edu.ec/handle/123456789/28903 (accessed on 20 March 2024).
  19. Aza-Espinosa, M.; Guerra Torrealba, L.; Herrera-Granda, E.; Aza-Espinosa, M.; Burbano-Pulles, M.; Pozo-Burgos, J. Learning Performance Indicators: A Statistical Analysis on the Subject of Natural Sciences during the COVID-19 Pandemic at the Tulcán District. In Trends in Artificial Intelligence and Computer Engineering; Botto-Tobar, M., Gómez, O.S., Rosero Miranda, R., Díaz Cadena, A., Luna-Encalada, W., Eds.; Springer Nature: Cham, Switzerland, 2023; pp. 139–154. [Google Scholar]
  20. Guevara-Vega, C.P.; Chamorro-Ortega, W.P.; Herrera-Granda, E.P.; García-Santillán, I.D.; Quiña-Mera, J.A. Incidence of a web application implementation for high school students learning evaluation: A case study. Rev. Ibérica Sist. Tecnol. Informação 2020, 2020, 509–523. Available online: https://www.proquest.com/openview/bfe21dc96eab6a1dd96d132373a9eefc/1?pq-origsite=gscholar&cbl=1006393 (accessed on 20 March 2024).
  21. Demuth, H.B.; Beale, M.H.; De Jess, O.; Hagan, M.T. Neural Network Design, 2nd ed.; Martin Hagan: Stillwater, OK, USA, 2014; ISBN 0971732116. [Google Scholar]
  22. Weidman, S. Deep Learning from Scratch, 1st ed.; Porter, M., Ed.; O’Reilly: Sebastopol, CA, USA, 2019; ISBN 9781492041412. Available online: https://www.oreilly.com/library/view/deep-learning-from/9781492041405/ (accessed on 20 March 2024).
  23. Jácome Ortega, A.E.; Caraguay Procel, J.A.; Herrera-Granda, E.P.; Herrera Granda, I.D. Confirmatory Factorial Analysis Applied on Teacher Evaluation Processes in Higher Education Institutions of Ecuador. In Advances in Intelligent Systems and Computing; Springer: Berlin/Heidelberg, Germany, 2020; Volume 1110, pp. 157–170. [Google Scholar]
  24. Ghorbani, H. Mahalanobis Distance and Its Application for Detecting Multivariate Outliers. FACTA Univ. Ser. Math. Inform. 2019, 34, 583–595. [Google Scholar] [CrossRef]
  25. Yang-Wallentin, F.; Jöreskog, K.G.; Luo, H. Confirmatory Factor Analysis of Ordinal Variables With Misspecified Models. Struct. Equ. Model. Multidiscip. J. 2010, 17, 392–423. [Google Scholar] [CrossRef]
  26. Rosseel, Y. lavaan: An R Package for Structural Equation Modeling. J. Stat. Softw. 2012, 48, 1–36. [Google Scholar] [CrossRef]
  27. Jácome-Ortega, A.E.; Herrera-Granda, E.P.; Herrera-Granda, I.D.; Caraguay-Procel, J.A.; Basantes-Andrade, A.V. Análisis temporal y pronóstico del uso de las TIC, a partir del instrumento de evaluación docente de una Institución de Educación Superior. Rev. Ibérica Sist. Tecnol. Inf. 2019, E22, 399–412. Available online: https://www.proquest.com/openview/96910c7cb0c260ae2409940921c7f71b/1?pq-origsite=gscholar&cbl=1006393 (accessed on 20 March 2024).
  28. Imbaquingo, D.E.; Herrera-Granda, E.P.; Herrera-Granda, I.D.; Arciniega, S.R.; Guamán, V.L.; Ortega-Bustamante, M.C. Evaluation of university informatic security systems: Teacher evaluation system a case study. RISTI Rev. Iber. Sist. Tecnol. Inform. 2019, 2019, 349–362. [Google Scholar]
  29. Manuel Batista-Foguet, J.; Coenders, G.; Alonso, J. Análisis factorial confirmatorio. Su utilidad en la validación de cuestionarios relacionados con la salud. Med. Clin. 2004, 122, 21–27. [Google Scholar] [CrossRef] [PubMed]
  30. Wilcoxon, F. Individual Comparisons by Ranking Methods. Biom. Bull. 1945, 1, 80. [Google Scholar] [CrossRef]
  31. Woolson, R.F. Wilcoxon Signed-Rank Test. In Wiley Encyclopedia of Clinical Trials; Wiley: Hoboken, NJ, USA, 2008; pp. 1–3. [Google Scholar]
  32. Uluman, M.; Deha Doğan, C. Comparison of Factor Score Computation Methods In Factor Analysis. Aust. J. Basic Appl. Sci. 2016, 10, 143–151. Available online: http://www.ajbasweb.com/old/ajbas/2016/December/143-151.pdf (accessed on 20 March 2024).
  33. Bang, K.-S.; Kim, S.; Song, M.K.; Kang, K.I.; Jeong, Y. The Effects of a Health Promotion Program Using Urban Forests and Nursing Student Mentors on the Perceived and Psychological Health of Elementary School Children in Vulnerable Populations. Int. J. Environ. Res. Public Health 2018, 15, 1977. [Google Scholar] [CrossRef]
  34. Gunawan, T.J.; Wang, J.; Liao, P.-C. Factors of Project-Based Teaching That Enhance Learning Interest: Evidence from Construction Contract Management Course. Sustainability 2022, 14, 15314. [Google Scholar] [CrossRef]
Figure 1. Structure of the implemented institutional virtual classroom template: (a) interactive menu; and (b) LevelUp XP ranking and gamification system.
Figure 2. Shallow Neural Network used to calculate the Interaction Index.
Figure 3. Multivariate correlation matrix for all pairs of variables in the instrument.
Figure 4. Results of the fake regression analysis: (a) Histogram; (b) QQ Plot; (c) ScatterPlot.
Figure 5. A path diagram for Confirmatory Factor Analysis applied to the factor structure.
Figure 6. Determination coefficients r² for each question within the dimensions of the factor structure.
Figure 7. Box plots contrasting perceptions of system usability before and after implementation, (a) Moodle ease of use, (b) ease in creating educational resources, (c) ease in creating interactive learning objects, and (d) AI usability.
Figure 8. Box plots for contrasting teachers’ perceptions of the ease of use of the platform depending on their contract type, (a) Moodle ease of use, (b) ease in creating educational resources, (c) ease in creating interactive learning objects, and (d) AI usability.
Figure 9. Box plots of the contrast for the average online student performance variables for each class before and after implementation, (a) performance, (b) completion progress, (c) interaction index, and (d) time spent in each class.
Table 1. Ordinal variables composing each factor.
Factor 1: Ease of use of the Moodle platform for online education
p1. Do you consider that the virtual classroom system has an interface that facilitates user navigation?
p2. Are the communication blocks and tools of the virtual classroom system easy to use and access?
p3. Do you consider that each virtual classroom's communication blocks and tools are configured optimally?
p4. Do you consider the publication and configuration of resources in the virtual classroom system simple and user-friendly?
p5. Is creating and configuring activities in the virtual classroom system (assignments, questionnaires, forums, glossaries, lessons, etc.) simple and user-friendly?
Factor 2: Ease of creating, publishing, and uploading educational resources
p6. Is creating and publishing documents in the virtual classroom system easy?
p7. Is recording and publishing educational videos in the virtual classroom system easy?
p8. Do you consider it easy to create, edit, or publish figures and illustrations in the virtual classroom system?
p9. Do you consider it easy to create, export, and import question banks in the virtual classroom system?
p10. Is creating, projecting, and publishing presentations in the virtual classroom system easy?
Factor 3: Ease of creating and using interactive learning objects
p11. Is creating interactive videos (videos that automatically present questions and challenges to the user) in the virtual classroom system easy?
p12. Does the virtual classroom system facilitate the creation of interactive diagrams?
p13. Do you consider that the virtual classroom system has sufficient tools to facilitate the creation of recreational elements such as word searches, crossword puzzles, and other interactive games?
p14. Do you consider that the virtual classroom system's design and tools are sufficient to promote the gamification of the virtual classrooms used to teach the subjects?
Factor 4: Ease of using artificial intelligence applied to education
p15. Do you consider that the system and regulations implemented promote the use of artificial intelligence for text processing, such as ChatGPT?
p16. Do you consider that the implemented system facilitates the creation of educational texts and documents assisted by artificial intelligence?
p17. Does the implemented system facilitate the creation of educational presentations assisted by artificial intelligence?
p18. Does the implemented system facilitate the creation of question banks in a Moodle-compatible format, assisted by artificial intelligence?
p19. Do you consider that the implemented system facilitates the editing of academic documents assisted by artificial intelligence?
p20. Does the implemented system facilitate the creation and editing of scripts for educational videos assisted by artificial intelligence?
Table 2. Descriptive statistics for the ordinal variables of the instrument applied to teachers.
Question | Min | 1st Qu. | Median | Mean | 3rd Qu. | Max. | Sd.
Exp_f2f | 0.00 | 4.75 | 8.00 | 12.45 | 15.00 | 40.00 | 11.1122
Exp_online | 0.000 | 2.000 | 2.500 | 3.554 | 4.000 | 13.000 | 3.2458
Factor 1: Perceived ease of use of the Moodle platform for online education
p1 | 1.000 | 5.750 | 8.000 | 6.935 | 9.000 | 10.000 | 2.2375
p2 | 1.000 | 5.000 | 7.000 | 6.565 | 8.000 | 10.000 | 2.3454
p3 | 1.000 | 6.000 | 8.000 | 7.022 | 9.000 | 10.000 | 2.2869
p4 | 1.000 | 5.000 | 7.000 | 6.663 | 8.000 | 10.000 | 2.3312
p5 | 1.000 | 6.000 | 8.000 | 7.217 | 9.000 | 10.000 | 2.3572
Factor 2: Perceived ease of using software to create educational resources
p6 | 1.000 | 5.000 | 7.000 | 6.587 | 8.000 | 10.000 | 2.3117
p7 | 1.000 | 5.000 | 7.000 | 6.359 | 8.000 | 10.000 | 2.4114
p8 | 1.000 | 4.000 | 7.000 | 6.196 | 9.000 | 10.000 | 2.6446
p9 | 1.000 | 5.000 | 8.000 | 6.957 | 9.000 | 10.000 | 2.5415
p10 | 1.000 | 5.000 | 7.000 | 6.641 | 9.000 | 10.000 | 2.6000
Factor 3: Perceived ease of using interactive learning objects
p11 | 1.000 | 4.000 | 7.000 | 6.022 | 8.000 | 10.000 | 2.4894
p12 | 1.00 | 4.00 | 7.00 | 5.75 | 7.00 | 10.00 | 2.4967
p13 | 1.00 | 4.00 | 7.00 | 5.87 | 8.00 | 10.00 | 2.4683
p14 | 1.000 | 4.000 | 7.000 | 5.707 | 8.000 | 10.000 | 2.5786
Factor 4: Perceived ease of using artificial intelligence applied to education
p15 | 1.000 | 3.000 | 6.500 | 5.674 | 8.000 | 10.000 | 2.9616
p16 | 1.000 | 3.000 | 6.500 | 5.565 | 8.000 | 10.000 | 2.9772
p17 | 1.000 | 3.000 | 7.000 | 5.728 | 9.000 | 10.000 | 3.1661
p18 | 1.00 | 3.00 | 7.00 | 5.63 | 8.00 | 10.00 | 3.0516
p19 | 1.000 | 2.000 | 6.000 | 5.424 | 8.000 | 10.000 | 3.1070
p20 | 1.000 | 2.000 | 6.000 | 5.315 | 8.000 | 10.000 | 3.1305
Table 3. Goodness-of-fit indices of the factor structure obtained through the CFA.
npar | fmin | chisq | df | p-value
46.0000 | 4.0021 | 656.4102 | 164.0000 | 2.2 × 10⁻¹⁶
cfi | tli | nnfi | rmsea | srmr
0.9461 | 0.9423 | 0.9420 | 0.0317 | 0.0451
Table 4. Wilcoxon Signed-Rank Test for each variable of the simple structure obtained through the CFA.
Variable | Moodle Ease of Use | Ease in Creating Educational Resources | Ease in Creating Interactive Learning Objects | Artificial Intelligence Ease of Use
Wilcoxon Signed-Rank Test | W = 673, p-value = 0.03028 *, significant | W = 659, p-value = 0.02226 *, significant | W = 458.5, p-value = 5.742 × 10⁻⁵ *, significant | W = 316.5, p-value = 1.459 × 10⁻⁷ *, significant
95% confidence interval | (−13.77818194, −0.07045598) | (−16.3528372, −0.5115355) | (−35.012105, −7.551537) | (−53.28974, −26.65344)
Before | 71.0972 a | 70.0000 a | 44.9143 a | 32.4810 a
After | 80.0599 b | 80.0976 b | 74.9879 b | 80.0000 b
Medians with different letters in the same column differ significantly according to the Wilcoxon Signed-Rank Test for p-value ≤ 0.05. * Represents the p-values that reached the significance level.
Table 5. Descriptive statistics for the numerical variables obtained through the monitoring plugin applied to 224 classrooms used by students at the end of the teaching of each subject.
Variable | Min | 1st Qu. | Median | Mean | 3rd Qu. | Max. | Sd.
Average performance | 0.750 | 6.980 | 8.000 | 7.586 | 8.530 | 10.000 | 1.6371
Average progress (%) | 0.00 | 27.50 | 47.37 | 49.96 | 71.05 | 100.00 | 25.6404
Average interaction rate (%) | −25.00 | 44.15 | 56.85 | 60.43 | 79.88 | 94.93 | 22.0263
Average dedication time | 0.000 | 7.181 | 11.843 | 14.260 | 19.020 | 67.613 | 595.8164
Table 6. Wilcoxon Signed-Rank Test for each variable of the simple structure for the average metrics obtained from 224 virtual classrooms.
Variable | Average Performance | Average Progress (%) | Average Interaction Rate (%) | Average Dedication Time
Wilcoxon Signed-Rank Test | W = 3113, p-value = 7.39 × 10⁻¹¹ *, significant | W = 1373.5, p-value = 2.2 × 10⁻¹⁶ *, significant | W = 448, p-value = 2.2 × 10⁻¹⁶ *, significant | W = 2379, p-value = 1.011 × 10⁻¹⁵ *, significant
95% confidence interval | (−1.3700626, −0.7599909) | (−40.35009, −31.15001) | (−37.96991, −32.40999) | (−10.406495, −6.522186)
Before | 7.5500 a | 32.4300 a | 44.9200 a | 8.3936 a
After | 8.4000 b | 68.2500 b | 79.8800 b | 18.0155 b
Medians with different letters in the same column differ significantly according to the Wilcoxon Signed-Rank Test for p-value ≤ 0.05. * Represents the p-values that reached the significance level.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
