Article

The Use of AI-Driven Automation to Enhance Student Learning Experiences in the KSA: An Alternative Pathway to Sustainable Education

Department of Information Science, College of Humanities and Social Sciences, King Saud University, P.O. Box 11451, Riyadh 11437, Saudi Arabia
Sustainability 2024, 16(14), 5970; https://doi.org/10.3390/su16145970
Submission received: 29 April 2024 / Revised: 15 June 2024 / Accepted: 6 July 2024 / Published: 12 July 2024
(This article belongs to the Special Issue Sustainable E-Learning and Educational Technology)

Abstract
The relevance of virtual learning platforms has been increasingly recognised, and their merit in contributing to sustainable education is ever growing. The benefits of these virtual platforms became especially evident during the COVID-19 pandemic, and their impact has persisted post-pandemic: virtual learning is now considered a viable option for continuing and sustainable education. Many countries have therefore taken advantage of these virtual platforms to maximise student engagement, as evidenced by reports in the existing literature. However, while these studies have explored how this can best be achieved, very few have examined how the use of virtual platforms can help to deliver an educational approach that prepares young people to address the many and complex sustainability challenges of the future, i.e., the delivery of sustainable education. This study addresses this gap in the literature by exploring the question of how AI-powered automation can enhance student learning experiences in the Kingdom of Saudi Arabia (hereafter, KSA) as an alternative pathway for sustainable education. Data were collected from 1991 undergraduate and postgraduate students across 10 Saudi universities using an online survey. The data were analysed using structural equation modelling (SEM) to examine the relationship between student readiness and the (AI-powered) automation of administrative processes. The findings highlight the transformative potential of AI as an alternative pathway to sustainable education and for streamlining learning management system (LMS) operations. The implications of this study extend beyond the immediate instructional context, offering strategic direction for educators, LMS designers, policymakers, and institutional leaders in harnessing AI to equip individuals with the knowledge, skills, values, and attitudes necessary to contribute to a sustainable future.

1. Introduction

In recent years, the concept of sustainability, in all its forms, has gained significant traction in societal settings across the world. Predominantly, the concept is used in an environmental context, referring to the need to conserve natural resources, reduce all forms of pollution, and protect the planet’s physical environment [1]. However, the concept of sustainability also extends to other related areas of society, such as economic sustainability (the use of practices that support long-term economic growth, without negatively impacting the social and environmental aspects of the community), social sustainability (the creation of infrastructures that support the health and quality of life for all individuals in a society), and cultural sustainability (maintaining and valuing cultural heritage, traditions, and diversity).
Each of these dimensions (also known as the pillars) of sustainability are interrelated, and the cumulative, holistic concept has led to the rise of the connected notion of ‘sustainable education’. This is an approach to pedagogy that seeks to equip younger people with the skills, knowledge, and values required to contribute to a sustainable future [2,3]. By integrating the principles of sustainability into all areas of the educational process, sustainable education extends beyond the orthodox and traditional aspects of environmental issues to encourage a culture of environmental stewardship, social responsibility, and economic viability. By fostering interdisciplinary knowledge, critical thinking, and ethical reasoning, it is an approach to education which seeks to develop a holistic perspective that includes not only ecological, but also economic, social, and cultural dimensions of sustainability. The result, according to many, will be a more just, equitable, and sustainable world [4,5].
Bringing such goals to fruition will be neither a quick nor simple task. In fact, it is likely to require a paradigm shift in the attitudes, processes, and technologies used by educators. However, such a shift has already begun, with the implementation of advanced and innovative educational technologies. Often, these innovative changes are the result of necessity. The COVID-19 pandemic, for example, demanded unprecedented innovations across many sectors of society, including education, which hitherto largely relied on what can be described as traditional methods of teaching and learning [6,7]. Irrespective of context, schools, colleges, and universities alike were compelled to transition from the traditional on-campus models of teaching and learning to virtual platforms using learning management systems (LMSs). While the integration of such online/virtual learning platforms into the learning environment offers significant benefits, such as streamlining course content management to enhance student engagement and alleviating faculty workload [8,9], these platforms also offer major advantages regarding the development of sustainable education, in that they provide powerful tools and methods to teach all aspects of sustainability by improving accessibility, flexibility, inclusivity, and collaboration. This, in turn, helps to advance sustainability education and contribute to a future which is more sustainable across all dimensions of the concept.
In the case of the KSA, the dominant LMS used in Saudi universities was Blackboard Learn (BL), which offered the clear potential to deliver a continuous educational experience through virtual/online interactivity [10,11]. However, this potential is not being fully realised within the KSA’s higher education environment, and its contribution to sustainable education is in doubt [12]. The reasons for this have been explored by a number of studies using frameworks such as the technology acceptance model (TAM) [13], the theory of planned behaviour (TPB) [14], the unified theory of acceptance and use of technology (UTAUT) [15], and the diffusion of innovation (DOI) theory [16]. Individually and collectively, these studies have contributed significantly to the literature, although they have focused predominantly on predicting technology usage through the analysis of behavioural intention, and have ignored the question of its ability, either in principle or in practice, to contribute to educational sustainability [17]. The research models, for example, do not factor in the impact that AI-driven administrative processes can have on user readiness [18,19]. This leaves a notable gap in our current knowledge, as the AI-driven automation of administrative processes within an LMS has the potential to significantly streamline the learning process, thereby promoting sustainable education. The provision of automated features, such as chatbots for query resolution, personalised learning recommendations, and peer interaction, allows students to engage with course content more efficiently and with less bureaucratic friction [20,21,22]. The result is that students can spend less time navigating the administrative aspects of the learning process and more time focusing on actual learning as part of the sustainable education process [23]. Further, AI-automated features, such as immediate grading and personalised feedback, can provide students with timely insights into their academic performance, allowing them to promptly identify areas for improvement and effectively adjust their study strategies. Collectively, such features can contribute to a more tailored and engaging learning experience, potentially leading to better outcomes in terms of awareness and understanding of the various aspects of sustainability and how the students themselves can interact to produce a more sustainable future for society and the environment.
The quest for a more complete understanding of how these benefits of AI-driven automation can impact student readiness is particularly important, especially in the context of an evolving educational landscape, such as the KSA, where the efficiency, clarity, and ease of navigation of administrative processes can significantly influence student engagement. Furthermore, despite the growing body of research on LMS adoption globally, there is a noticeable absence of studies investigating student readiness using the Student Online Learning Readiness (SOLR) model within the KSA context or the sustainability of these strategies over time. This is important for a number of reasons, not least of which is that the KSA’s Vision 2030 articulates an ambitious plan to develop a more autonomous and innovative HE system in order to catalyse a drive towards a more diverse and entrepreneurial economy and workforce [24], as well as to foster a wider appreciation and understanding of the concept of a sustainable society. Student readiness and engagement are therefore not only a priority, but are prerequisites for the viability of the KSA’s HE system [25]. Therefore, this paper explored the use of AI-driven automation to enhance student learning experiences in the KSA as a pathway to develop and enhance sustainable education.

2. Literature Review

The content of this section falls into four areas: (1) the relevance and importance of LMSs, (2) technology acceptance models, (3) e-learning readiness and the SOLR model, and (4) the emerging role of AI-driven automation. We will review each area in turn.

2.1. The Relevance and Importance of LMSs in Higher Education

Historically, the LMS was used merely as a basic content repository, but it has since been transformed into a comprehensive platform that facilitates a wide range of teaching methodologies, both in person and online [26,27]. This evolution has been largely driven by the shift towards student-centred learning approaches, which help to foster the deep understanding, critical thinking, collaboration, and active engagement necessary for addressing sustainability challenges. Such approaches require versatile platforms that can support diverse learning and teaching strategies [28]. Indeed, the LMS has proved to be more than just a solution to changing educational needs; it has become a driver of innovation, helping educators create more interactive and engaging learning experiences through the use of scalable and customisable environments. Such systems enable the incorporation of multimedia resources, discussion boards, and real-time feedback mechanisms that enrich the learning process [29]. Furthermore, LMS platforms have been recognised for their role in supporting the flipped classroom model, which is particularly well-suited to sustainability education because it encourages and supports the active engagement and student-centred learning essential for understanding and addressing complex sustainability issues, and it is therefore being increasingly adopted in HE [30,31].
Beyond educational content delivery, LMS platforms also contribute to administrative efficiency within higher education institutions. For instance, the LMS has the capacity to streamline course management, student assessment, and other administrative tasks, which can lead to more efficient use of institutional resources [32,33,34]. In addition, advanced LMS platforms provide data analytics features that can inform decision-making processes, enhance student performance tracking, and lead to more personalised learning experiences [35]. The comprehensive functionality of Blackboard Learn (the predominant LMS used in KSA universities), which includes robust course management tools, collaborative features, and efficient distribution of learning materials, plays a critical role in creating an accessible and flexible educational environment [35]. This flexibility is vital as higher education institutions strive to increase educational accessibility for all students, including those with disabilities or those who are geographically dispersed [36]. LMS platforms are also very important to students who cannot commit to full-time study and who are juggling educational pursuits with work and family commitments. The ability to access course materials and engage with instructors and peers at any time, and from any location with internet connectivity, plays an important part in helping to address the varied educational needs and lifestyles of a diverse student population [37].
Furthermore, as higher education becomes more globalised, LMS platforms are being increasingly deployed to manage cross-cultural exchanges and collaborative international programmes. These systems are not only used to manage content delivery, but also to foster intercultural competencies and global awareness among students [38]. This aspect of LMS platforms is crucial for preparing students for a globally interconnected world, as well as to promote sustainable global educational values. Yet, despite the numerous advantages of LMS platforms, challenges remain. These include securing faculty buy-in, providing adequate training for both instructors and students, and continuously adapting to the ever-changing landscape of educational technology [39]. The dynamic nature of technological advancement necessitates that LMS platforms must not only be robust and reliable, but also agile and responsive to the evolving needs of the educational community [40]. Therefore, LMS platforms have become foundational for the modern educational landscape in higher education, offering a plethora of tools and capabilities that support a multiplicity of pedagogical models and institutional operations. The use of these systems has expanded the reach and accessibility of educational opportunities, meeting various student needs and situations [38,39]. With the growing emphasis on personalised learning experiences and the increasing importance of data-driven decision making, LMS platforms are poised to continue their evolution, becoming ever more integral to pedagogical strategies that foster an understanding of the importance of sustainability in regards to the environment, society, and the global economy.

2.2. Technology Adoption Models in LMS Research

The acceptance and subsequent adoption of new technologies are clearly critical to maximising their benefits. In order to understand the factors which influence acceptance and adoption behaviours among users, researchers have constructed several theoretical frameworks. Here, we look briefly at some of the models which are widely recognised in the field of higher education, prominent among which is the technology acceptance model (TAM). Introduced by Davis in 1989 [41], TAM posits that perceived usefulness (PU) and perceived ease of use (PEOU) are fundamental determinants of technology acceptance (Figure 1). Widely used to study LMS adoption, the model has provided valuable insights into how attitudes and intentions formed during the initial acceptance phase can shape subsequent engagement behaviours [42]. However, while it is an important model, TAM has been criticised for its failure to capture nuanced user perceptions [43], as well as for oversimplifying the adoption process such that it does not account for specific contextual factors that influence LMS usage, such as organisational support, user training, and technological infrastructure [42,43]. Another criticism of the original TAM concerns its application under mandated use, i.e., where students are not offered a choice regarding the LMS they use. In this case, the application of the TAM is subtly but significantly different from situations in which students are offered a choice of platforms. When the LMS is mandated, for example, learners ‘judge’ PU on the basis of a single LMS experience (i.e., with no benchmark), possibly leading to an unbalanced assessment. Furthermore, in the mandated-use situation, the significance of PEOU can become distorted, as students cannot choose an alternative system if they find the mandated one difficult to use. Likewise, behavioural intention (BI) to use the system becomes less significant, as learners have no choice other than the provided LMS [44,45].
Another widely recognised model in the field of higher education is the theory of planned behaviour (TPB), developed by Ajzen in 1985 [46]. Expanding on the theory of reasoned action [14,47], the TPB includes the construct of perceived behavioural control, accounting for situations where individuals have incomplete control over their behaviour (Figure 2). The TPB has been instrumental in understanding how social pressure (subjective norms) and control factors impact LMS adoption [43], although some scholars have argued that it does not fully account for habitual behaviour and emotional reactions, which can significantly influence the adoption and use of technologies [48,49].
Rogers’ diffusion of innovations (DOI) theory is yet another model which has been used to study the adoption of technologies, including LMS platforms [16]. Specifically, this model examines how attributes such as relative advantage, compatibility, complexity, trialability, and observability can act as drivers of adoption (Figure 3). The DOI has been used to identify barriers to faculty and student adoption [50], though it has been challenged for its lack of differentiation between individual and organisational adoption processes [51]. Critics of the theory have also argued that it tends to focus on individual decision making, without fully considering the complex dynamics of organisational change and cultural resistance that can affect LMS adoption [11].
Further, the unified theory of acceptance and use of technology (UTAUT), proposed in 2003 by Venkatesh et al., has also contributed significantly to the literature regarding technology acceptance [52] (Figure 4). Fundamentally, UTAUT integrates elements from various adoption models to study the acceptance and sustained use of LMS platforms [53,54,55]. Critics have argued that the complex relationships between its constructs can make it difficult to apply in practice, and that its predictive power becomes questionable when extended to non-Western contexts or different types of technologies [53].
Across the board, each of the models discussed above plays an important role in our understanding of technology acceptance and adoption, and each contributes to LMS adoption research. Despite this, however, the student readiness construct is often omitted from these models, or is given little emphasis. This leaves a significant gap in the literature, as student readiness is a multidimensional construct covering a wide range of psychological and emotional aspects, such as internal and external motivation, self-regulation, purposefulness, and planning, all of which are necessary for engaging with LMS platforms [56]. Recent research has called for more nuanced models that can address the limitations of existing frameworks. There is a need for models that take into account the evolving nature of digital literacy and e-learning, as well as the impact of social media and collaborative technologies on LMS adoption [31,57]. Additionally, the increasing use of analytics in LMS platforms suggests a need for adoption models that can encompass data-driven decision making and privacy concerns [58]. Therefore, while technology adoption models like TAM, TPB, DOI, and UTAUT provide a valuable foundation for understanding how LMS platforms are adopted in higher education, there is growing recognition of the need to refine these models. This includes incorporating factors such as student readiness, digital literacy, and the unique contextual challenges of implementing LMS platforms across diverse educational landscapes.

2.3. E-Learning Readiness and the SOLR Model

E-learning readiness refers to the degree to which a learner is prepared to participate in, and benefit from, e-learning processes. As a concept, e-learning readiness includes not just access to technology, but also cognitive, emotional, psychological, and behavioural readiness [59,60,61], and it is often seen as a critical determinant of successful engagement with online learning environments. The multidimensional nature of e-learning readiness, emphasised by Liu and Roberts-Kaye [62], also includes such key factors as self-direction, strategic learning, technological skills, and digital etiquette. These key factors, it has been argued, are essential in order for learners to navigate the online learning space and to use the environment effectively.
The SOLR model (Figure 5), developed by Allam et al. [63], expands on e-learning readiness and the key factors mentioned above. The SOLR model divides readiness into three broad competencies: communication (the ability to communicate effectively in an online environment), social interaction (the capacity to engage with peers and instructors in a virtual setting), and technology use (proficiency in using e-learning platforms and related digital tools to support learning objectives).
The relevance and validity of the SOLR model have been tested in various educational contexts, including a study of Jordanian Massive Open Online Course (MOOC) learners, which showed significant correlations between students’ SOLR competencies and successful course engagement [64]. Another study, of Malaysian adult learners, highlighted self-efficacy as a key component within the SOLR framework, suggesting that self-belief is a strong predictor of e-learning success [65]. Furthermore, a study of Malaysian physiotherapy students also used the SOLR model to show that gender had little impact on overall e-learning readiness [66].

2.4. AI-Driven Administrative Automation in LMSs

The increasing use of AI-driven administrative automation is driving significant changes in LMS platforms by enhancing efficiency and facilitating the personalisation of the learning experience. AI technology achieves these results in a number of ways, ranging from automating routine administrative tasks, such as grading and scheduling, to providing adaptive learning pathways, which can enhance sustainability education by providing personalised, engaging, and flexible learning experiences [21,22,67]. Despite these advancements, however, there is little research on the impact of AI-driven administrative automation on student readiness, particularly within LMS platforms and in specific contexts, such as KSA universities. This leaves a notable gap in the current literature. This study aims to help fill this gap by extending the SOLR model to encompass AI-driven administrative efficiency and by examining the impact of AI automation on student readiness for LMS in the context of higher education in the KSA. The results could prove valuable to administrators and policymakers in the KSA, as well as in similar higher education contexts, by providing insights into the implications of AI for sustainable educational practices and student engagement.

2.5. Research Model and Hypotheses

The research model of this study seeks to assess the effect of AI-driven automation within an LMS environment on student readiness for e-learning in KSA universities. There is significant evidence that AI-driven LMSs can play a key role in helping to develop long-term learner readiness, as opposed to merely helping users to adjust to a lack of readiness in the short term. One study [68], for example, found that learners who used AI-driven adaptive learning tools progressed significantly more rapidly, suggesting that such approaches contribute to the development of long-term competency, as opposed to merely short-term support, while another notable research project [69] underscored how formative feedback (a fundamental feature of AI-powered LMSs) promotes long-term learner readiness by aiding in the development of metacognitive skills. In a related study, Hrastinski [70] supported these findings by showing that the collaborative tools which are integral to the LMS concept help to build long-term learner readiness by enhancing teamwork and communication skills.
The model is adapted from the SOLR framework, which proposes that student readiness for online learning can be quantified through three core competencies—social, communication, and technical. These competencies are particularly relevant in an LMS context, as they could be significantly influenced by the automation of administrative processes.
The model not only assesses the impact of factors such as AI-driven personalised learning on the core competencies, but also the impact of automated tasks, such as grade posting and scheduling. Such tasks can have a direct or indirect effect, not only on technical competencies, but also on social and communication competencies. Rapid grade posting, for example, can help learners quickly identify areas for improvement, thus fostering a proactive approach to discussing progress with peers and teachers. Similarly, automated notifications can help ensure schedule transparency for the learner, which can reduce anxiety and promote open communication with their instructors and classmates. Another example of how AI-automated systems can improve communication is in the area of peer review. Such systems can encourage regular interaction and help learners develop their ability to receive and provide constructive feedback, which is a communication skill that is critical for optimal learning. There is also research which shows that AI can efficiently organise collaborative activities by identifying optimal times for all participants involved, thus encouraging and helping learners to develop their communication skills in a team environment [71,72,73].
The proposed research model, including the hypothesised relationships, is shown in Figure 6.
The study proposes a series of hypotheses aimed at examining the factors that influence student readiness to use the LMS:
H1. 
Technical competency has a positive effect on student readiness for using an LMS.
H2. 
Social competency has a positive effect on student readiness for using an LMS.
H3. 
Communication competency has a positive effect on student readiness for using an LMS.
H4. 
Personalised learning has a positive effect on student readiness for using an LMS.
H5. 
Test administration has a positive effect on student readiness for using an LMS.
H6. 
User support features have a positive effect on student readiness for using an LMS.

3. Research Methodology

3.1. The Survey Instruments

Data for the study were collected using an online survey (Google Forms). This approach was chosen because such surveys have significant geographical reach and typically achieve higher response rates than traditional methods [74]. The use of online surveys is also relatively cost-effective, allows for larger sample sizes, and offers significant flexibility [75]. The online survey comprised two sections. The first section was designed to collect demographic and background information about the participants, which would later be used to contextualise the findings. These data included details concerning gender, the specific college attended, the participant’s current academic programme, and the type of device the student used for accessing the LMS and online learning materials. The second section was designed to evaluate the constructs defined in the research model (Figure 6). Each factor was examined through a series of carefully formulated items (Table 1), and the responses of the participants were measured using a 5-point Likert scale. It is important to acknowledge that the questionnaire used in this study was adapted from those in the literature to suit the specific needs of this research (e.g., [76,77,78,79]). The adaptation involved reviewing existing questionnaires related to the present study and modifying them to ensure relevance and comprehensiveness for the study’s context.

3.2. Sampling

Up to 3000 students, across various levels of study (undergraduate and postgraduate) and actively engaged in online courses facilitated through an LMS at ten KSA universities, were invited to take part in the survey. Of those invited, 2105 students expressed interest, of whom 114 did not meet the selection criteria. The resulting sample of admissible responses was therefore 1991. Table 2 provides a summary of the valid respondent demographics.
Participants were recruited during the peak of the 2022–2023 academic year (April–June 2023), essentially to ensure that they had received significant exposure to the LMS model. The research complied with the relevant ethical standards approved by the Research Ethics Committee of King Saud University. It was also emphasised that all data collection and handling processes would ensure complete anonymity, and that the participants could withdraw from the study at any time. The participants were not offered any direct incentives, financial or otherwise, to encourage their participation. However, a small donation to a charitable cause was promised for every survey completed and deemed valid.

3.3. Non-Response Bias

Approaches suggested by Armstrong and Overton were used to assess the potential impact of non-response bias on the study findings [74]. In practical terms, the analysis compared the earliest and latest quartiles of respondents to identify any disparities attributable to non-response, based on the assumption that late respondents are more similar to non-respondents than early respondents. Using a t-test, no notable differences were found between these cohorts with regard to the primary variables under investigation (p > 0.05). Further, a Chi-squared test [74,80,81] was performed to evaluate differences in gender and age profiles between these groups, and this test similarly revealed no significant discrepancies (p > 0.05). These outcomes therefore suggest that non-response bias is unlikely to have materially affected the results of the study.
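As an illustrative aid (not the authors' actual analysis script), the early/late respondent comparison described above can be sketched in Python with pandas and scipy; the file name, the response-order column, and the variable names are all assumptions.

```python
# Hypothetical sketch of the Armstrong-Overton early/late comparison;
# 'survey_responses.csv' and all column names are assumed, not taken
# from the study's data.
import pandas as pd
from scipy import stats

df = pd.read_csv("survey_responses.csv")
q1 = df["response_order"].quantile(0.25)
q3 = df["response_order"].quantile(0.75)
early = df[df["response_order"] <= q1]   # earliest quartile of respondents
late = df[df["response_order"] >= q3]    # latest quartile of respondents

# t-tests on the primary study variables (names hypothetical)
for var in ["technical", "social", "communication"]:
    t, p = stats.ttest_ind(early[var], late[var], equal_var=False)
    print(f"{var}: t = {t:.2f}, p = {p:.3f}")      # expect p > 0.05

# Chi-squared test on categorical profiles such as gender
both = pd.concat([early.assign(group="early"), late.assign(group="late")])
table = pd.crosstab(both["group"], both["gender"])
chi2, p, dof, expected = stats.chi2_contingency(table)
print(f"gender: chi2 = {chi2:.2f}, p = {p:.3f}")   # expect p > 0.05
```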

3.4. Method of Analysis

Two distinct analytical techniques were employed to examine the data. The initial phase involved the processing and examination of participants’ demographic data using SPSS (Version 29) to produce descriptive statistics. The subsequent phase was conducted in AMOS (Analysis of Moment Structures) using a dual-phase method, i.e., confirmatory factor analysis (CFA) to evaluate the measurement model’s validity, followed by structural equation modelling (SEM) to investigate the interrelationships among the factors and to test the research hypotheses.
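To make the CFA step of this dual-phase workflow concrete, the following is a minimal sketch using the open-source semopy package as a rough analogue of AMOS; the construct and item labels and the input file are hypothetical, not the study's actual variable names.

```python
# Minimal CFA sketch (open-source analogue of the AMOS measurement-model
# step); every construct/item name below is hypothetical.
import pandas as pd
from semopy import Model

cfa_desc = """
Technical     =~ tc1 + tc2 + tc3
Social        =~ sc1 + sc2 + sc3
Communication =~ cc1 + cc2 + cc3
Personalised  =~ pl1 + pl2 + pl3
TestAdmin     =~ ta1 + ta2 + ta3
UserSupport   =~ us1 + us2 + us3
Readiness     =~ rd1 + rd2 + rd3
"""

df = pd.read_csv("survey_items.csv")   # assumed item-level Likert data
cfa = Model(cfa_desc)
cfa.fit(df)
print(cfa.inspect())                   # loadings, variances, p-values
```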

4. Results

4.1. Testing the Measurement Model

The study used factor analysis (FA) to determine latent factors from the variables/items, while also evaluating the model’s fit and its (convergent and discriminant) validity [74,80]. The preliminary step involved verifying the suitability of the sample for FA using the Kaiser–Meyer–Olkin (KMO) measure. This yielded a value of 0.881, which exceeded the recommended baseline (0.7), indicating a sufficient sample size for FA [81,82]. Bartlett’s test of sphericity was also conducted to test the null hypothesis that the variables are uncorrelated. The test yielded a statistically significant outcome (p < 0.05), rejecting this hypothesis and reinforcing the decision to apply FA in this context. With regard to the indices of model fit, all the values obtained were within the acceptable range, consistent with accepted criteria [82,83,84]. These indices are shown in Table 3 below.
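For readers wishing to reproduce these suitability checks outside SPSS, a minimal sketch using the Python factor_analyzer package is shown below; the item-level input file is an assumption.

```python
# KMO and Bartlett's test of sphericity, as reported above
# (KMO = 0.881 in the study); 'survey_items.csv' is an assumed file.
import pandas as pd
from factor_analyzer.factor_analyzer import (
    calculate_kmo,
    calculate_bartlett_sphericity,
)

df = pd.read_csv("survey_items.csv")
kmo_per_item, kmo_total = calculate_kmo(df)
print(f"KMO = {kmo_total:.3f}")                      # should exceed 0.7

chi2, p = calculate_bartlett_sphericity(df)
print(f"Bartlett: chi2 = {chi2:.2f}, p = {p:.4f}")   # significant (p < 0.05)
```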
Table 1 displays the factor loadings for the variables, which were acceptably strong, with values ranging between 0.766 and 0.955. This provides evidence of the convergent validity of the study, confirming that each variable was a robust indicator of its respective factor [80]. To evaluate internal consistency, Cronbach’s alpha (CA) was used, and the results, documented in Table 4, show CA values for the constructs between 0.79 and 0.84. The composite reliability (CR) scores also exceeded the established minimum standard of 0.70, with a range from 0.76 to 0.82. These measures indicate a high level of internal consistency, which suggests that the constructs within the study were measured with a high degree of reliability, as supported by the literature [85].
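The reliability statistics reported here follow standard formulas, sketched below in Python (pingouin for Cronbach's alpha; CR and AVE computed directly from standardised loadings). The item names and loading values are hypothetical placeholders, not the study's figures.

```python
# Cronbach's alpha, composite reliability (CR), and average variance
# extracted (AVE); item columns and loadings below are hypothetical.
import numpy as np
import pandas as pd
import pingouin as pg

df = pd.read_csv("survey_items.csv")        # assumed file name
items = ["tc1", "tc2", "tc3"]               # hypothetical construct items

alpha, ci = pg.cronbach_alpha(data=df[items])
print(f"Cronbach's alpha = {alpha:.2f}")    # study range: 0.79-0.84

def composite_reliability(loadings):
    """CR = (sum(l))^2 / ((sum(l))^2 + sum(1 - l^2))."""
    l = np.asarray(loadings)
    return l.sum() ** 2 / (l.sum() ** 2 + (1 - l**2).sum())

def average_variance_extracted(loadings):
    """AVE = mean of the squared standardised loadings."""
    l = np.asarray(loadings)
    return (l**2).mean()

loadings = [0.81, 0.78, 0.84]               # hypothetical CFA loadings
print(f"CR = {composite_reliability(loadings):.2f}")   # study range: 0.76-0.82
print(f"AVE = {average_variance_extracted(loadings):.2f}")
```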
A test for discriminant validity was also conducted to verify that the constructs differed significantly from each other regarding how they were measured, as per accepted guidelines [80]. This test involves comparing the square roots of the average variance extracted (AVE) from each construct with the respective correlation coefficients. To confirm discriminant validity, the square root of the AVE for any given construct must be larger than any of its correlation coefficients with other factors, and it must be larger than 0.50 [86]. Table 4 shows that this critical threshold has been met for all constructs in the study.
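A small sketch of this Fornell–Larcker comparison follows; the AVE values and correlation matrix are hypothetical placeholders used only to illustrate the decision rule.

```python
# Fornell-Larcker check: sqrt(AVE) of each construct must exceed its
# correlations with every other construct (and exceed 0.50).
# All numbers below are hypothetical.
import numpy as np
import pandas as pd

ave = pd.Series({"Technical": 0.68, "Social": 0.71, "Communication": 0.65})
corr = pd.DataFrame(                       # inter-construct correlations
    [[1.00, 0.42, 0.38],
     [0.42, 1.00, 0.45],
     [0.38, 0.45, 1.00]],
    index=ave.index, columns=ave.index,
)

sqrt_ave = np.sqrt(ave)
for c in ave.index:
    others = corr.loc[c].drop(c)           # correlations with other constructs
    ok = (sqrt_ave[c] > others.max()) and (sqrt_ave[c] > 0.50)
    print(f"{c}: sqrt(AVE) = {sqrt_ave[c]:.2f}, "
          f"max r = {others.max():.2f}, pass = {ok}")
```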
The analysis also considered the potential for multicollinearity (i.e., the occurrence of high intercorrelations among two or more independent variables), which can distort statistical analysis [87]. To assess this potential, the variance inflation factor (VIF) and the tolerance statistic were used. The derived VIF values were found to be less than 3.0, and the tolerance values were greater than 0.2, suggesting that multicollinearity did not unduly influence the study’s results [88,89]. Given the above, it is reasonable to assume that the measurement model offers sufficient validity and reliability, demonstrates a strong model fit, and exhibits substantial convergent and discriminant validity.
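As a hedged illustration, the VIF and tolerance diagnostics can be computed with statsmodels as sketched below; the construct-score input file is an assumption.

```python
# VIF and tolerance (1/VIF) for the predictor constructs; the input of
# composite construct scores is an assumed structure.
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

predictors = pd.read_csv("construct_scores.csv")   # assumed file name
X = sm.add_constant(predictors)                    # add intercept column

for i, name in enumerate(X.columns):
    if name == "const":
        continue
    vif = variance_inflation_factor(X.values, i)
    # rule of thumb used here: VIF < 3 (so tolerance > 0.33 > 0.2)
    print(f"{name}: VIF = {vif:.2f}, tolerance = {1 / vif:.2f}")
```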

4.2. Analysis of Common Method Variance and Bias

The concept of common method variance (CMV) applies to error which is attributable to the measurement method rather than to the construct [87]. As such a concern was applicable to the current investigation, Harman’s single-factor test was applied as a procedural check for CMV [89]. However, the outcome of this test suggested no indication of CMV. At the same time, a test for common method bias (CMB)—a subset of CMV potentially introduced by the implementation of identical response scales [87,88,89]—was conducted using the common latent factor method [90]. This test also proved negative, showing that CMB was not present. These findings for both CMV and CMB provide additional evidence for the reliability and validity of the research findings.
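A minimal sketch of Harman's single-factor test is given below, assuming item-level responses in a DataFrame (file name hypothetical): a single-factor EFA is fitted and the variance it explains is checked against the conventional 50% threshold. The common latent factor check requires a full CFA specification and is omitted here.

```python
# Harman's single-factor test: constrain an EFA to one factor and check
# that it explains well under ~50% of the variance (if so, CMV is not a
# dominant concern). The item data structure is assumed.
import pandas as pd
from factor_analyzer import FactorAnalyzer

df = pd.read_csv("survey_items.csv")           # assumed file name
fa = FactorAnalyzer(n_factors=1, rotation=None)
fa.fit(df)

variance, proportion, cumulative = fa.get_factor_variance()
print(f"single factor explains {proportion[0]:.1%} of variance")  # want < 50%
```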

4.3. Findings of the Research Hypotheses

To measure the psychometric attributes of the measurement model and to test the theoretical hypotheses, this research employed structural equation modelling (SEM). The analysis shows that there is a positive relationship between AI-driven administrative processes and student readiness for e-learning, i.e., the more streamlined and intuitive the administrative processes due to AI automation, the greater the students’ preparedness and willingness to engage with e-learning platforms (Figure 7). It should be noted, however, that the sample consisted of participants with varying levels of LMS experience, and it is possible that this relationship has been influenced by prior student experience. Future research could investigate this possibility through the use of an appropriately (in)experienced sample.
The model successfully explains a significant proportion of the variance (approximately 62.1%) in students’ intentions to fully utilise the LMS for their educational needs. These findings support the hypotheses H1–H6. The statistical significance of the relationships between variables was confirmed by the t-values and standardised path coefficients (Table 5). The extent and impact of these relationships were emphasised by the standardised path coefficients, providing a deeper understanding of the dynamics at play. In summary, the SEM results offer good support for the proposed theoretical framework.
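As an illustrative, non-authoritative sketch of how the hypothesised model maps onto an SEM specification, the snippet below expresses the measurement model and the six structural paths (H1–H6) in the open-source semopy package; all construct and item names, and the input file, are hypothetical.

```python
# Structural model sketch mirroring H1-H6: six predictor constructs
# regressing onto Readiness; names are hypothetical placeholders.
import pandas as pd
from semopy import Model, calc_stats

desc = """
Technical     =~ tc1 + tc2 + tc3
Social        =~ sc1 + sc2 + sc3
Communication =~ cc1 + cc2 + cc3
Personalised  =~ pl1 + pl2 + pl3
TestAdmin     =~ ta1 + ta2 + ta3
UserSupport   =~ us1 + us2 + us3
Readiness     =~ rd1 + rd2 + rd3

# structural paths corresponding to H1-H6
Readiness ~ Technical + Social + Communication + Personalised + TestAdmin + UserSupport
"""

df = pd.read_csv("survey_items.csv")    # assumed item-level data
model = Model(desc)
model.fit(df)
print(model.inspect())                  # path coefficients and p-values
print(calc_stats(model))                # fit indices (CFI, RMSEA, etc.)
```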

5. Discussion

This examination of the effects of AI-driven administrative automation processes (i.e., personalised learning, test administration, and user support features) on e-learning readiness in the context of Saudi Arabian universities, alongside additional readiness dimensions (i.e., technical, social, and communication competencies), has provided some valuable insights. The study has shown these factors to have a significant positive influence on student readiness to engage with an LMS.
In recent years, the relationship between technical competency and e-learning readiness has been the subject of considerable interest in the academic literature (e.g., [20,67,91]), and such studies have drawn attention to the importance of technical skills as a key factor in successful e-learning engagement. However, in an AI-augmented LMS environment, such as that considered in this study, the effective definition of technical competency needs to be re-evaluated. In an AI-augmented environment, technical competency is no longer merely about the basic ability to navigate a digital interface; rather, it becomes a more comprehensive skillset that involves interacting with an intelligent system capable of not only anticipating, but also responding to a diverse array of student administrative needs. This evolution in competency requirements aligns with the expectations of many higher education environments, including that of the KSA, where AI literacy is becoming an indispensable part of its drive towards the sustainability of its academic success.
As students interact with AI-enhanced systems, they effectively engage in a reciprocal dialogue with technology [22,37] in such a way that the system’s anticipatory nature allows it to cater to their individual administrative needs. The system may, for example, make course recommendations, based on the student’s academic history, current progress, and degree requirements, or it can analyse the student’s existing schedule and preferences to propose an optimised class schedule that avoids conflicts and ensures a balanced workload. This responsiveness is indicative of a shift in e-learning paradigms, wherein technical competency involves a blend of strategic interaction with AI and traditional digital navigation skills. Thus, the use of AI-driven automation acts as a catalyst to help students to augment their basic digital literacy with more complex technical skills. In effect, AI serves as a digital learning accelerator, narrowing the gap between the possession of foundational technical skills and the mastery of advanced competencies that are increasingly demanded in the educational environment, as well as in the context of global interaction [92,93].
These developments have some profound and far-reaching implications for sustainable HE in the KSA. They strongly suggest, for example, that training programmes must be re-conceptualised and reformulated to leverage the benefits of AI [22]. These reformulated programmes should provide practical, hands-on experience with AI-enhanced systems to ensure that students are not only familiar with AI capabilities, but also adept at using these capabilities to streamline their learning processes [93,94]. Furthermore, training initiatives should be designed to reflect the integrated nature of AI in the LMS, offering students the opportunity to engage directly with AI functionality. This approach will prepare students for the AI-supported management of administrative tasks, an essential component of technical competency in the AI-enhanced LMS landscape.
The reconceptualisation of learning programmes as described above raises the question of how to encourage and enable students to engage with LMSs. The answer to this question extends beyond the relatively simple issue of user interface (UI) design, as is suggested by many sources which argue that technical competency is inherent due to the increasingly embedded nature of digital technology in society [95,96]. Instead, the full realisation of AI’s potential in e-learning requires an intuitive understanding of AI’s role within the system, coupled with targeted and structured training [96]. This is a key issue: the need to develop technical competency towards a more advanced definition, one that aligns with AI integration, has been recognised in previous research.
Another key factor in the cultivation of an effective e-learning environment is the interdependence of social and communication competencies within the (AI-enabled) LMS framework. The integration of AI-driven administrative automation with an LMS has the potential to enhance these competencies by simplifying and streamlining the logistical and organisational components of the learning process [93,94]. This gives students the opportunity to focus more on meaningful social interactions and the development of their communication skills, which are critical to collaborative learning. Not only can AI provide the opportunity for greater social interaction, but the technology can also play an important role in realising this opportunity by analysing student data to suggest relevant social connections and study groups. This helps to create a more personalised and interconnected learning community. This tailored networking approach can help students identify, and connect with, peers with similar interests and/or academic goals, fostering a sense of belonging and enhancing collaborative learning opportunities.
AI can also provide a similar benefit within a communications context by offering context-aware communication capabilities, such as language translation services or sentiment analysis, to support diverse student populations and to help ensure that all voices are heard and understood [93,94]. These tools can facilitate a more inclusive environment where students from various backgrounds can communicate more effectively. As AI becomes more integrated with e-learning environments, it is important that educational institutions, and particularly those which are digitally progressive, such as those in Saudi Arabia [97], adapt their training programs to meet the evolving demands of technical literacy. By doing so, they will equip students with the necessary competencies to thrive in a modern educational ecosystem where AI-driven automation is rapidly becoming the norm.

5.1. Theoretical Contributions

One of the key theoretical implications of this study is that it extends the boundaries of the current SOLR literature by introducing the concept of AI-driven automation as a significant factor in the development of e-learning readiness. While the SOLR literature has previously focused on readiness factors such as self-efficacy, computer literacy, and student attitudes towards e-learning, this study enhances our understanding of these factors by showing how they are impacted by AI functionalities embedded within the LMS platform. The study also suggests that readiness for e-learning is not a static entity; it evolves as technology advances. This is clear from the results of the study, which show that AI-driven automation has added a new component to readiness, suggesting that students’ ability to interact with intelligent systems is now part of what it means to be prepared for e-learning. By incorporating AI into the SOLR model, the study narrows the gap between educational technology and computer science, inviting interdisciplinary dialogue on how these fields can collaborate to enhance e-learning readiness. This study’s theoretical contribution significantly expands the SOLR literature by integrating the dimension of AI-driven administrative automation.

5.2. Practical Contributions

There are several important practical implications of this research, especially for stakeholders in the Saudi Arabian higher education system, where technology integration is rapidly advancing. One group of stakeholders that can benefit from the insights provided by this study is educators. Knowing that AI can improve technical, social, and communication competencies, educators can develop learning environments that use these improved competencies to maximise student engagement and readiness. AI algorithms, for example, can assist in forming study groups or project teams based on students’ learning styles, performance data, and preferences. This intelligent grouping capability can lead to more effective and harmonious collaboration, as it takes into account the strengths and weaknesses of each student to create balanced groups [94]. Institutions can also use AI to analyse participation patterns, which can help to predict when students might be disengaging. This can prompt instructors or peers to reach out to the learner concerned, an approach which can help to maintain a vibrant and interactive learning community.
Another way that AI can improve engagement with, and effectiveness of, an LMS is by enabling students to concentrate on the human aspect of learning—that is, connecting, communicating, and collaborating with peers and instructors. As has been noted in other research [93,94], these competencies are integral to the student’s e-learning experience, and strengthening them through the use of AI can enrich the overall quality of online education. The Saudi Arabian higher education system, with its growing adoption of online learning, is likely to benefit significantly from these AI advancements.
The study will also be of benefit to policymakers, who can use the findings to inform decisions regarding the allocation of resources within the education systems. A deeper understanding of the positive impact of AI on student readiness could enable investments to be more strategically targeted to areas that will yield the most significant benefit in terms of learning outcomes and student preparedness. Lastly, for designers of LMS systems, the study emphasises the importance of user-centric AI features that support the development of the competencies critical for e-learning readiness.
Overall, the study offers insights for educators, institutional leaders, policymakers, and LMS designers which can aid them in harnessing AI’s potential to increase student readiness for e-learning. Through a deeper understanding of the nuanced impacts of AI, stakeholders can enhance the design, implementation, and utilisation of e-learning platforms, ultimately improving the educational experience and outcomes for students in Saudi Arabia and beyond.

6. Conclusions, Limitations, and Future Research

By applying an evolved version of the SOLR model, this research set out to better understand how the use of AI-driven automation can enhance students’ learning experiences in the KSA as an alternative pathway to sustainable education, as defined earlier, with a particular emphasis on Saudi Arabian universities. The results of the study showed clearly that the integration of AI-driven administrative automation has a significant influence on student readiness to use an LMS. The findings, therefore, point to a clear need for higher education institutions in Saudi Arabia to leverage AI capabilities within their LMSs to help maximise students’ preparedness for, and effectiveness in, utilising the LMS platform.
While the model used in this study has been shown to be robust, it also exhibits some limitations. The complexity of the e-learning readiness construct, for example, suggests that additional elements, such as learners’ self-efficacy and intrinsic motivation, could further refine our understanding and could be usefully examined in future research. Additionally, the adoption of a mixed-method approach in subsequent studies could enrich our understanding of student readiness by facilitating a more nuanced exploration of the interplay between AI-enhanced LMS features and learner engagement. In terms of the approach, it is also important to note that the current study is cross-sectional. It uses the SOLR model to evaluate the impact of technologies on e-learning readiness at a specific point in time. Future researchers could consider the use of a longitudinal approach, which takes into account the learners’ previous experiences with LMSs and how readiness has changed over time due to the introduction of advanced technology.
Moreover, the data were collected from a sample of students following a range of courses at different levels, so the results do not reveal whether, and to what extent, the relationships verified by the research vary between academic domains (subjects). Further, the limited scope of data collection, which occurred at only 10 Saudi Arabian universities, may limit the broader applicability of the study’s conclusions. Additionally, a potential limitation of this study is the inability to discriminate between student readiness as measured by the SOLR model and the readiness gained during the time spent acclimatising to the LMS before the study; this could influence the accuracy of the readiness measurements. Future research could address this limitation by designing studies that separate these two aspects of readiness, providing a clearer understanding of their individual impacts. To enhance the generalisability of these findings, future research could consider using data from a more diverse array of universities and higher education institutions globally. By expanding the sample, the research community can better understand the extent and depth of the impact of AI-driven automation on e-learning readiness across various educational contexts.
To conclude, this study suggests that the integration of AI-driven administrative automation within the LMS environment is a critical factor in shaping student readiness for e-learning in institutions of higher education in Saudi Arabia. It is therefore essential that Saudi educational institutions leverage the potential of AI in enhancing students’ readiness in order to equip students with the necessary tools to navigate and excel in an increasingly digital academic world.

Funding

This research was funded by the Researchers Supporting Project number (RSP2024R233), King Saud University, Riyadh, Saudi Arabia.

Institutional Review Board Statement

The study was carried out in accordance with the principles outlined in the Declaration of Helsinki and received approval from the Institutional Review Board (Human and Social Research) at King Saud University (KSU-HE-12-242-2022).

Informed Consent Statement

All participants involved in the study provided written informed consent.

Data Availability Statement

The data can be made available upon request to ensure that privacy restrictions are upheld.

Acknowledgments

The author would like to extend his sincere appreciation to the Researchers Supporting Project (RSP2024R233), King Saud University, Riyadh, Saudi Arabia.

Conflicts of Interest

The author declares no conflicts of interest.

References

  1. Brundtland, G. Our Common Future: Report of the World Commission on Environment and Development; Oxford University Press: Oxford, UK, 1987. [Google Scholar]
  2. Zhou, L.; Alam, G.M. Commercial Higher Education Strategies for Recruiting International Students in China: A Catalyst or Obstacle for Sustainable Education and Learning? Discov. Sustain. 2024, 5, 33. [Google Scholar] [CrossRef]
  3. Huang, R.X.; Pagano, A.; Marengo, A. Values-Based Education for Sustainable Development (VbESD): Introducing a Pedagogical Framework for Education for Sustainable Development (ESD) Using a Values-Based Education (VbE) Approach. Sustainability 2024, 16, 3562. [Google Scholar] [CrossRef]
  4. Araneo, P. Exploring Education for Sustainable Development (ESD) Course Content in Higher Education; a Multiple Case Study Including What Students Say They Like. Environ. Educ. Res. 2024, 30, 631–660. [Google Scholar] [CrossRef]
  5. Pouresmaieli, M.; Ataei, M.; Nouri Qarahasanlou, A.; Barabadi, A. Building Ecological Literacy in Mining Communities: A Sustainable Development Perspective. Case Stud. Chem. Environ. Eng. 2024, 9, 100554. [Google Scholar] [CrossRef]
  6. Dos Santos, L.M. Online Learning after the COVID-19 Pandemic: Learners’ Motivations. Front. Educ. 2022, 7, 879091. [Google Scholar] [CrossRef]
  7. Aristovnik, A.; Karampelas, K.; Umek, L.; Ravšelj, D. Impact of the COVID-19 Pandemic on Online Learning in Higher Education: A Bibliometric Analysis. Front. Educ. 2023, 8, 1225834. [Google Scholar] [CrossRef]
  8. Asabere, N.Y.; Agyiri, J.; Tenkorang, R.; Darkwah, A. Learning Management System (LMS) Development for Higher Technical Education in Ghana. Int. J. Virtual Pers. Learn. Environ. 2021, 11, 87–109. [Google Scholar] [CrossRef]
  9. Baykasoglu, A.; Felekoglu, B.; Ünal, C. Perceived Usability Evaluation of Learning Management Systems via Axiomatic Design with a Real Life Application. Kybernetes 2024, 53, 83–101. [Google Scholar] [CrossRef]
  10. Syaad, P.; Hidayat, W.N. The Effectiveness of Learning Management System (LMS) on Computer Assisted Learning Course for Informatics Engineering Education Students. Adv. Sci. Lett. 2018, 24, 2642–2645. [Google Scholar] [CrossRef]
  11. Chen, C.K.; Almunawar, M.N. Cloud Learning Management System in Higher Education. In Opening Up Education for Inclusivity across Digital Economies and Societies; IGI Global: Hershey, PA, USA, 2019; pp. 29–51. [Google Scholar] [CrossRef]
  12. Bousbahi, F.; Alrazgan, M.S. Investigating IT Faculty Resistance to Learning Management System Adoption Using Latent Variables in an Acceptance Technology Model. Sci. World J. 2015, 2015, 375651. [Google Scholar] [CrossRef] [PubMed]
  13. Davis, F.D.; Bagozzi, R.P.; Warshaw, P.R. User Acceptance of Computer Technology: A Comparison of Two Theoretical Models. Manag. Sci. 1989, 35, 982–1003. [Google Scholar] [CrossRef]
  14. Ajzen, I. The Theory of Planned Behavior. Organ. Behav. Hum. Decis. Process. 1991, 50, 179–211. [Google Scholar] [CrossRef]
  15. Venkatesh, V.; Davis, F.D. A Model of the Antecedents of Perceived Ease of Use: Development and Test. Decis. Sci. 1996, 27, 451–481. [Google Scholar] [CrossRef]
  16. Rogers, E.M. Diffusion of Innovations; Free Press: New York, NY, USA, 2003. [Google Scholar]
  17. Al-Mamary, Y.H.S. Understanding the Use of Learning Management Systems by Undergraduate University Students Using the UTAUT Model: Credible Evidence from Saudi Arabia. Int. J. Inf. Manag. Data Insights 2022, 2, 100092. [Google Scholar] [CrossRef]
  18. Alamri, M.M. Investigating Students’ Adoption of MOOCs during COVID-19 Pandemic: Students’ Academic Self-Efficacy, Learning Engagement, and Learning Persistence. Sustainability 2022, 14, 714. [Google Scholar] [CrossRef]
  19. Al-Adwan, A.S. Investigating the Drivers and Barriers to MOOCs Adoption: The Perspective of TAM. Educ. Inf. Technol. 2020, 25, 5771–5795. [Google Scholar] [CrossRef]
  20. Money, W.H.; Dean, B.P. Incorporating Student Population Differences for Effective Online Education: A Content-Based Review and Integrative Model. Comput. Educ. 2019, 138, 57–82. [Google Scholar] [CrossRef]
  21. Zhai, X.; Chu, X.; Chai, C.S.; Jong, M.S.Y.; Istenic, A.; Spector, M.; Liu, J.-B.; Yuan, J.; Li, Y. A Review of Artificial Intelligence (AI) in Education from 2010 to 2020. Complexity 2021, 2021, 8812542. [Google Scholar] [CrossRef]
  22. Zhang, K.; Aslan, A.B. AI Technologies for Education: Recent Research & Future Directions. Comput. Educ. Artif. Intell. 2021, 2, 100025. [Google Scholar] [CrossRef]
  23. Nizam Ismail, S.; Hamid, S.; Ahmad, M.; Alaboudi, A.; Jhanjhi, N. Exploring Students Engagement Towards the Learning Management System (LMS) Using Learning Analytics. Comput. Syst. Sci. Eng. 2021, 37, 73–87. [Google Scholar] [CrossRef]
  24. Mohiuddin, K.; Nasr, O.A.; Nadhmi Miladi, M.; Fatima, H.; Shahwar, S.; Noorulhasan Naveed, Q. Potentialities and Priorities for Higher Educational Development in Saudi Arabia for the next Decade: Critical Reflections of the Vision 2030 Framework. Heliyon 2023, 9, e16368. [Google Scholar] [CrossRef] [PubMed]
  25. Nazneen, A.; Elgammal, I.; Khan, Z.R.; Shoukat, M.H.; Shehata, A.E.; Selem, K.M. Towards Achieving University Sustainability! Linking Social Responsibility with Knowledge Sharing in Saudi Universities. J. Clean. Prod. 2023, 428, 139288. [Google Scholar] [CrossRef]
  26. Lee, Y.-H.; Hsieh, Y.-C.; Ma, C.-Y. A Model of Organizational Employees’ e-Learning Systems Acceptance. Knowl. Based Syst. 2011, 24, 355–366. [Google Scholar] [CrossRef]
  27. Ma, Y.; Fang, Y. Current Status, Issues, and Challenges of Blockchain Applications in Education. Int. J. Emerg. Technol. Learn. 2020, 15, 20–31. [Google Scholar] [CrossRef]
  28. Asilevi, M.N.; Kärkkäinen, S.; Sormunen, K.; Havu-Nuutinen, S. A Comparison of Science Learning Skills in the Teacher-Centered Approach and Inquiry-Based Science Fieldwork: Primary School Students’ Perceptions. Int. J. Educ. Math. Sci. Technol. 2023, 12, 1–19. [Google Scholar] [CrossRef]
  29. Dalipi, F.; Zdravkova, K.; Ahlgren, F. Sentiment Analysis of Students’ Feedback in MOOCs: A Systematic Literature Review. Front. Artif. Intell. 2021, 4, 728708. [Google Scholar] [CrossRef]
  30. Kiyabo, K.; Isaga, N. Entrepreneurial Orientation, Competitive Advantage, and SMEs’ Performance: Application of Firm Growth and Personal Wealth Measures. J. Innov. Entrep. 2020, 9, 12. [Google Scholar] [CrossRef]
  31. Hosen, M.; Ogbeibu, S.; Giridharan, B.; Cham, T.-H.; Lim, W.M.; Paul, J. Individual Motivation and Social Media Influence on Student Knowledge Sharing and Learning Performance: Evidence from an Emerging Economy. Comput. Educ. 2021, 172, 104262. [Google Scholar] [CrossRef]
  32. Al-Mamary, Y.H.S. Why Do Students Adopt and Use Learning Management Systems?: Insights from Saudi Arabia. Int. J. Inf. Manag. Data Insights 2022, 2, 100088. [Google Scholar] [CrossRef]
  33. Al-Fraihat, D.; Joy, M.; Masa’deh, R.; Sinclair, J. Evaluating E-Learning Systems Success: An Empirical Study. Comput. Hum. Behav. 2020, 102, 67–86. [Google Scholar] [CrossRef]
  34. Mah, D.-K. Learning Analytics and Digital Badges: Potential Impact on Student Retention in Higher Education. Technol. Knowl. Learn. 2016, 21, 285–305. [Google Scholar] [CrossRef]
  35. Zhang, L.; Carter, R.A.; Qian, X.; Yang, S.; Rujimora, J.; Wen, S. Academia’s Responses to Crisis: A Bibliometric Analysis of Literature on Online Learning in Higher Education during COVID-19. Br. J. Educ. Technol. 2022, 53, 620–646. [Google Scholar] [CrossRef] [PubMed]
  36. Alenezi, A. Barriers to Participation in Learning Management Systems in Saudi Arabian Universities. Educ. Res. Int. 2018, 2018, 9085914. [Google Scholar] [CrossRef]
  37. Okada, A.; Panselinas, G.; Bizoi, M.; Malagrida, R.; Torres, P.L. Fostering Transversal Skills through Open Schooling with the CARE-KNOW-DO Framework for Sustainable Education. Sustainability 2024, 16, 2794. [Google Scholar] [CrossRef]
  38. Badali, M.; Hatami, J.; Banihashem, S.K.; Rahimi, E.; Noroozi, O.; Eslami, Z. The Role of Motivation in MOOCs’ Retention Rates: A Systematic Literature Review. Res. Pract. Technol. Enhanc. Learn. 2022, 17, 5. [Google Scholar] [CrossRef]
  39. Schettino, G.; Capone, V. Learning Design Strategies in MOOCs for Physicians’ Training: A Scoping Review. Int. J. Environ. Res. Public Health 2022, 19, 14247. [Google Scholar] [CrossRef] [PubMed]
40. Henry, M.; Carroll, F.; Cunliffe, D.; Kop, R. Learning a Minority Language through Authentic Conversation Using an Online Social Learning Method. Comput. Assist. Lang. Learn. 2018, 31, 321–345. [Google Scholar] [CrossRef]
  41. Davis, F.D. Perceived Usefulness, Perceived Ease of Use, and User Acceptance of Information Technology. MIS Q. 1989, 13, 319–340. [Google Scholar] [CrossRef]
  42. Ruipérez-Valiente, J.A.; Jaramillo-Morillo, D.; Joksimović, S.; Kovanović, V.; Muñoz-Merino, P.J.; Gašević, D. Data-Driven Detection and Characterization of Communities of Accounts Collaborating in MOOCs. Future Gener. Comput. Syst. 2021, 125, 590–603. [Google Scholar] [CrossRef]
  43. Lu, J.; Yu, C.; Liu, C.; Yao, J.E. Technology Acceptance Model for Wireless Internet. Internet Res. 2003, 13, 206–222. [Google Scholar] [CrossRef]
44. Venkatesh, V.; Davis, F.D. A Theoretical Extension of the Technology Acceptance Model: Four Longitudinal Field Studies. Manag. Sci. 2000, 46, 186–204. [Google Scholar] [CrossRef]
  45. Selim, H.M. Critical Success Factors for E-Learning Acceptance: Confirmatory Factor Models. Comput. Educ. 2007, 49, 396–413. [Google Scholar] [CrossRef]
  46. Ajzen, I. From Intentions to Actions: A Theory of Planned Behavior. In Action Control; Springer: Berlin/Heidelberg, Germany, 1985; pp. 11–39. [Google Scholar] [CrossRef]
47. Fishbein, M.; Ajzen, I. Belief, Attitude, Intention and Behaviour: An Introduction to Theory and Research; Addison-Wesley: Reading, MA, USA, 1975. [Google Scholar]
  48. Littlejohn, A. Re-Using Online Resources: A Sustainable Approach to e-Learning. J. Interact. Media Educ. 2003, 1. [Google Scholar] [CrossRef]
  49. Guglielmino, P.J.; Guglielmino, L.M. Are Your Learners Ready for E-Learning. In AMA Handbook of E-Learning; Piskurich, G., Ed.; American Management Association: New York, NY, USA, 2003; pp. 87–98. [Google Scholar]
  50. Jebeile, S. The Diffusion of E-Learning Innovations in an Australian Secondary College: Strategies and Tactics for Educational Leaders. Innov. J. 2003, 8, 1–21. [Google Scholar]
  51. Lyytinen, K.; Damsgaard, J. What’s Wrong with the Diffusion of Innovation Theory. In Diffusing Software Product and Process Innovations; Springer: Boston, MA, USA, 2001; pp. 173–190. [Google Scholar] [CrossRef]
  52. Venkatesh, V.; Morris, M.G.; Davis, G.B.; Davis, F.D. User Acceptance of Information Technology: Toward a Unified View. MIS Q. 2003, 27, 425–478. [Google Scholar] [CrossRef]
  53. Venkatesh, V.; Bala, H. Technology Acceptance Model 3 and a Research Agenda on Interventions. Decis. Sci. 2008, 39, 273–315. [Google Scholar] [CrossRef]
  54. Neslin, S.A.; Shankar, V. Key Issues in Multichannel Customer Management: Current Knowledge and Future Directions. J. Interact. Mark. 2009, 23, 70–81. [Google Scholar] [CrossRef]
  55. Venkatesh, V.; Speier, C.; Morris, M.G. User Acceptance Enablers in Individual Decision Making About Technology: Toward an Integrated Model. Decis. Sci. 2002, 33, 297–316. [Google Scholar] [CrossRef]
  56. Batanero-Ochaíta, C.; Fernández-Sanz, L.; Rivera-Galicia, L.F.; Rueda-Bernao, M.J.; López-Baldominos, I. Estimation of Interaction Time for Students with Vision and Motor Problems When Using Computers and E-Learning Technology. Appl. Sci. 2023, 13, 10978. [Google Scholar] [CrossRef]
  57. Rosli, M.S.; Saleh, N.S.; Aris, B.; Ahmad, M.H.; Sejzi, A.A.; Shamsudin, N.A. E-Learning and Social Media Motivation Factor Model. Int. Educ. Stud. 2016, 9, 20–30. [Google Scholar] [CrossRef]
  58. Kizilcec, R.F.; Viberg, O.; Jivet, I.; Martinez Mones, A.; Oh, A.; Hrastinski, S.; Mutimukwe, C.; Scheffel, M. The Role of Gender in Students’ Privacy Concerns about Learning Analytics. In Proceedings of the LAK23: 13th International Learning Analytics and Knowledge Conference, Arlington, TX, USA, 13–17 March 2023; ACM: New York, NY, USA, 2023; pp. 545–551. [Google Scholar] [CrossRef]
  59. Zhong, L.; Wang, X.; Yang, W.; Feng, X. Reliability and Validity Assessment of the Chinese Version of the Online Learning Readiness Scale (OLRS) for Nursing Students. Nurse Educ. Today 2023, 128, 105884. [Google Scholar] [CrossRef] [PubMed]
  60. Elewa, A.H.; Mohamed, S. Online Teaching Readiness, Challenges and Satisfaction as Perceived by Nursing Faculty Members during COVID-19 Pandemics. Int. Egypt. J. Nurs. Sci. Res. 2022, 2, 568–579. [Google Scholar] [CrossRef]
  61. Mphahlele, R.S.; Makgato-Khunou, P.; Tshephe, G.; Sethusha, M.J.; Tshesane, M.M.; Wright, R.; Denzil, C. First-Year Student’s E-Readiness to Use Learning Management System: COVID-19 Realities. Int. J. E-Learn. Distance Educ./Rev. Int. Du E-Learn. La Form. Distance 2023, 38. [Google Scholar] [CrossRef]
  62. Liu, J.C.; Kaye, E.R. Preparing Online Learning Readiness with Learner-Content Interaction. In Blended Learning: Concepts, Methodologies, Tools, and Applications; IGI Global: Hershey, PA, USA, 2016; pp. 216–243. [Google Scholar] [CrossRef]
  63. Allam, S.N.S.; Hassan, M.S.; Mohideen, R.S.; Ramlan, A.F.; Kamal, R.M. Online Distance Learning Readiness During COVID-19 Outbreak Among Undergraduate Students. Int. J. Acad. Res. Bus. Soc. Sci. 2020, 10, 642–657. [Google Scholar] [CrossRef] [PubMed]
  64. Samed Al-Adwan, A.; Khdour, N. Exploring Student Readiness to MOOCs in Jordan: A Structural Equation Modelling Approach. J. Inf. Technol. Educ. Res. 2020, 19, 223–242. [Google Scholar] [CrossRef] [PubMed]
  65. T Subramaniam, T.; Suhaimi, N.A.D.; Latif, L.A.; Abu Kassim, Z.; Fadzil, M. MOOCs Readiness. Int. Rev. Res. Open Distrib. Learn. 2019, 20. [Google Scholar] [CrossRef]
  66. Ranganathan, H.; Singh, D.K.A.; Kumar, S.; Sharma, S.; Chua, S.K.; Ahmad, N.B.; Harikrishnan, K. Readiness towards Online Learning among Physiotherapy Undergraduates. BMC Med. Educ. 2021, 21, 376. [Google Scholar] [CrossRef] [PubMed]
  67. Tahiru, F. AI in Education. J. Cases Inf. Technol. 2021, 23, 1–20. [Google Scholar] [CrossRef]
  68. Pane, J.F.; Griffin, B.A.; McCaffrey, D.F.; Karam, R. Effectiveness of Cognitive Tutor Algebra I at Scale. Educ. Eval. Policy Anal. 2014, 36, 127–144. [Google Scholar] [CrossRef]
  69. Shute, V.J. Focus on Formative Feedback. Rev. Educ. Res. 2008, 78, 153–189. [Google Scholar] [CrossRef]
  70. Keller, C.; Hrastinski, S.; Carlsson, S. Students’ Acceptance of e-Learning Environments: A Comparative Study in Sweden and Lithuania. In Proceedings of the European Conference on Information Systems (ECIS), St. Gallen, Switzerland, 7–9 June 2007; pp. 395–406. [Google Scholar]
  71. Kulik, C.-L.C.; Kulik, J.A. Effectiveness of Computer-Based Instruction: An Updated Analysis. Comput. Hum. Behav. 1991, 7, 75–94. [Google Scholar] [CrossRef]
  72. Barbour, M.K.; Reeves, T.C. The Reality of Virtual Schools: A Review of the Literature. Comput. Educ. 2009, 52, 402–416. [Google Scholar] [CrossRef]
  73. Shahabadi, M.M.; Uplane, M. Synchronous and Asynchronous E-Learning Styles and Academic Performance of e-Learners. Procedia Soc. Behav. Sci. 2015, 176, 129–138. [Google Scholar] [CrossRef]
  74. Comrey, A.L.; Lee, H.B. A First Course in Factor Analysis; Erlbaum: Hillsdale, NJ, USA, 1992. [Google Scholar]
  75. Denscombe, M. Web-Based Questionnaires and the Mode Effect. Soc. Sci. Comput. Rev. 2006, 24, 246–254. [Google Scholar] [CrossRef]
  76. Ganesh, K.; Rashid, N.A.; Hasnaoui, R.E.; Assiri, R.; Cordero, M.A.W. Analysis of Female Pre-Clinical Students’ Readiness, Academic Performance and Satisfaction in Online Learning: An Assessment of Quality for Curriculum Revision and Future Implementation. BMC Med. Educ. 2023, 23, 523. [Google Scholar] [CrossRef] [PubMed]
  77. Wang, Y.; Xia, M.; Guo, W.; Xu, F.; Zhao, Y. Academic Performance under COVID-19: The Role of Online Learning Readiness and Emotional Competence. Curr. Psychol. 2023, 42, 30562–30575. [Google Scholar] [CrossRef]
  78. Maan, A.; Malhotra, K. Mapping Students’ Readiness for E-Learning in Higher Education: A Bibliometric Analysis. J. Learn. Dev. 2024, 11, 27–51. [Google Scholar] [CrossRef]
  79. Yu, T. Examining Construct Validity of the Student Online Learning Readiness (SOLR) Instrument Using Confirmatory Factor Analysis. Online Learn. 2019, 22, 277–288. [Google Scholar] [CrossRef]
  80. Tinsley, H.E.; Tinsley, D.J. Uses of Factor Analysis in Counseling Psychology Research. J. Couns. Psychol. 1987, 34, 414–424. [Google Scholar] [CrossRef]
  81. Kaiser, H.F. An Index of Factorial Simplicity. Psychometrika 1974, 39, 31–36. [Google Scholar] [CrossRef]
  82. Henseler, J.; Sarstedt, M. Goodness-of-Fit Indices for Partial Least Squares Path Modeling. Comput. Stat. 2013, 28, 565–580. [Google Scholar] [CrossRef]
  83. Yu, T.-K.; Yu, T.-Y. Modelling the Factors That Affect Individuals’ Utilisation of Online Learning Systems: An Empirical Study Combining the Task Technology Fit Model with the Theory of Planned Behaviour. Br. J. Educ. Technol. 2010, 41, 1003–1017. [Google Scholar] [CrossRef]
  84. Kim, T.T.; Suh, Y.K.; Lee, G.; Choi, B.G. Modelling Roles of Task-Technology Fit and Self-Efficacy in Hotel Employees’ Usage Behaviours of Hotel Information Systems. Int. J. Tour. Res. 2010, 12, 709–725. [Google Scholar] [CrossRef]
  85. Afthanorhan, A.; Awang, Z.; Aimran, N. An Extensive Comparison of CB-SEM and PLS-SEM for Reliability and Validity. Int. J. Data Netw. Sci. 2020, 4, 357–364. [Google Scholar] [CrossRef]
86. Field, A.P. Discovering Statistics Using IBM SPSS Statistics: And Sex and Drugs and Rock 'n' Roll; Sage: Los Angeles, CA, USA, 2013. [Google Scholar]
  87. Kock, F.; Berbekova, A.; Assaf, A.G. Understanding and Managing the Threat of Common Method Bias: Detection, Prevention and Control. Tour. Manag. 2021, 86, 104330. [Google Scholar] [CrossRef]
  88. Chin, W.W.; Thatcher, J.B.; Wright, R.T. Assessing Common Method Bias: Problems with the ULMC Technique. MIS Q. 2012, 36, 1003–1019. [Google Scholar] [CrossRef]
  89. Podsakoff, P.M.; MacKenzie, S.B.; Lee, J.-Y.; Podsakoff, N.P. Common Method Biases in Behavioral Research: A Critical Review of the Literature and Recommended Remedies. J. Appl. Psychol. 2003, 88, 879–903. [Google Scholar] [CrossRef] [PubMed]
  90. Afthanorhan, A.; Awang, Z.; Majid, N.A.; Foziah, H.; Ismail, I.; Al Halbusi, H.; Tehseen, S. Gain More Insight from Common Latent Factor in Structural Equation Modeling. J. Phys. Conf. Ser. 2021, 1793, 012030. [Google Scholar] [CrossRef]
  91. Eachus, P.; Cassidy, S. Development of the Web Users Self-Efficacy Scale (WUSE). Issues Informing Sci. Inf. Technol. 2006, 3, 199–209. [Google Scholar]
  92. Firat, M. Integrating AI Applications into Learning Management Systems to Enhance E-Learning. Öğretim Teknol. Ve Hayat Boyu Öğrenme Derg.—Instr. Technol. Lifelong Learn. 2023, 4, 1–14. [Google Scholar] [CrossRef]
  93. Sayed, W.S.; Noeman, A.M.; Abdellatif, A.; Abdelrazek, M.; Badawy, M.G.; Hamed, A.; El-Tantawy, S. AI-Based Adaptive Personalized Content Presentation and Exercises Navigation for an Effective and Engaging E-Learning Platform. Multimed. Tools Appl. 2023, 82, 3303–3333. [Google Scholar] [CrossRef] [PubMed]
  94. Tang, K.-Y.; Chang, C.-Y.; Hwang, G.-J. Trends in Artificial Intelligence-Supported e-Learning: A Systematic Review and Co-Citation Network Analysis (1998–2019). Interact. Learn. Environ. 2023, 31, 2134–2152. [Google Scholar] [CrossRef]
  95. Jeyanthi, S.; Sathya, C.; Uma Maheswari, N.; Venkatesh, R.; Ganapathy Subramanian, V. AI-Based Development of Student E-Learning Framework. In Artificial Intelligence for Sustainable Applications; Wiley: Hoboken, NJ, USA, 2023; pp. 219–229. [Google Scholar] [CrossRef]
  96. Rohde, N.; Flindt, N.; Rietz, C.; Kassymova, G.K. How e-learning programs can be more individualized with artificial intelligence—A theoretical approach from a pedagogical point of view. Muall. J. Soc. Sci. Humanit. 2023, 7, 1–17. [Google Scholar] [CrossRef] [PubMed]
97. Alotaibi, N.S. The Significance of Digital Learning for Sustainable Development in the Post-COVID-19 World in Saudi Arabia's Higher Education Institutions. Sustainability 2022, 14, 16219. [Google Scholar] [CrossRef]
Figure 1. Technology acceptance model.
Figure 2. The theory of planned behaviour.
Figure 3. Rogers' diffusion of innovations (DOI) theory.
Figure 4. The unified theory of acceptance and use of technology (UTAUT).
Figure 5. The SOLR model.
Figure 6. Research model.
Figure 7. Illustration of the present research model (results of structural model). Note: ***—significant at 0.001.
Table 1. The constructs, along with their respective questionnaire items and related factor loadings.

Construct/Factor | Item | Factor Loading
Technical Competencies | I am usually confident about using computer technology for specific tasks. | 0.842
 | I am experienced in using a wide variety of computer technologies. | 0.869
 | I quickly learn how to use new digital technologies. | 0.875
 | I understand the benefits of using computer technologies in learning. | 0.842
 | I am comfortable with the idea of using computer technologies in my learning activities. | 0.869
 | The use of computer technologies encourages me to study more. | 0.875
Social Competencies | Digital technology helps me develop friendships with my classmates. | 0.945
 | Computer technology makes me think more about other students' actions. | 0.933
 | Computer technology helps me develop my social interaction skills for different situations. | 0.914
 | Digital technology makes me feel less inhibited about contacting others. | 0.933
 | I feel that computer technology encourages respect between fellow students. | 0.914
Communication Competencies | I am comfortable in communicating with others using computer technology. | 0.842
 | I feel comfortable responding to other people's ideas using digital technology. | 0.917
 | Communicating via computers helps me to be clear about what I want to say. | 0.915
 | When using digital technology, I feel more able to give constructive and proactive feedback to others, even when I disagree. | 0.895
Student Readiness for LMS | I look forward to engaging with LMS. | 0.866
 | I feel able to commit the time needed to complete tasks in LMS. | 0.893
 | The use of LMS would not discourage me from enrolling in a class. | 0.877
 | I feel LMS will help me successfully complete my course. | 0.766
 | I would like to learn more about LMS. | 0.893
 | I am comfortable with the idea of online assessments through LMS. | 0.877
 | I am willing to pay for courses that use LMS. | 0.866
Personalised Learning | The automated feedback I receive is clear, constructive, and contributes to my understanding of the course material. | 0.86
 | I feel that the LMS-generated feedback is highly personalised and addresses my specific learning needs. | 0.927
 | Feedback from LMS helps me identify areas for improvement in a constructive manner. | 0.955
 | I am satisfied with the speed and quality of feedback provided by the LMS after assignment submission. | 0.895
Test Administration | Grades are promptly posted by LMS. | 0.907
 | I find the LMS-driven testing and grading process to be clear and fair. | 0.888
 | I trust the accuracy of the grades assigned by LMSs. | 0.833
 | LMS makes it easy for me to access and understand my grades. | 0.788
User Support Features | The automated assignment management of LMS is easy to work with. | 0.893
 | The various reminders provided by LMS are very helpful. | 0.955
 | The scheduling and calendar features of LMS help me manage my studying more effectively. | 0.877
 | The LMS chatbot is very useful in helping to answer questions and resolve administrative issues. | 0.895
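A brief interpretive note on the loadings (added here for orientation; the paper itself does not spell this out): under a standard common-factor reading, the square of a standardised loading is the item's communality, i.e., the share of that item's variance explained by its construct. As a worked illustration using the lowest loading in Table 1:

\[
h_i^2 = \lambda_i^2, \qquad 0.766^2 \approx 0.587,
\]

so even the weakest item ("I feel LMS will help me successfully complete my course") has more than half of its variance accounted for by its construct, above the common 0.50 rule of thumb.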
Table 2. An outline of the demographic characteristics of the participants.

Demographic/Experience | Category | Participants %
Gender | Male | 58
 | Female | 42
Education | Undergraduate | 59
 | Postgraduate | 41
Colleges | Sciences and Engineering Colleges | 45
 | Social Sciences and Humanities Colleges | 29
 | Health Colleges | 26
Nationalities | Saudi | 77
 | Non-Saudi | 23
Most Used Device | PC and Laptop | 51
 | Smart Phone and Tablet | 49
Table 3. The model fit indices.

Fit Measure Category | Fit Measure | Result | Meets Recommended Criteria?
Absolute fit measures | Chi-square (χ²/df) | 2.320 | Yes (<3.0)
 | SRMR | 0.962 | Yes (>0.80)
 | GFI | 0.978 | Yes (>0.90)
 | RMSEA | 0.043 | Yes (<0.05)
Parsimonious fit measures | PGFI | 0.652 | Yes (>0.50)
 | PNFI | 0.671 | Yes (>0.50)
Incremental fit measures | AGFI | 0.938 | Yes (>0.90)
 | IFI | 0.951 | Yes (>0.90)
 | NFI | 0.951 | Yes (>0.90)
 | CFI | 0.965 | Yes (>0.90)
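As a reading aid for these indices (conventional formulas stated as background, not reproduced from the paper; exact values depend on the software's variant): the normed chi-square divides the model chi-square by its degrees of freedom, and RMSEA, in one common formulation, adjusts the same quantity for sample size:

\[
\chi^2_{\mathrm{norm}} = \frac{\chi^2}{df} = 2.320 < 3.0,
\qquad
\mathrm{RMSEA} = \sqrt{\frac{\max\left(\chi^2 - df,\, 0\right)}{df\,(N-1)}} .
\]

By convention, χ²/df below 3 and RMSEA below 0.05 are read as acceptable and close fit, respectively, which is how the "Yes" verdicts in the table should be understood.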
Table 4. Outcomes of correlations for CR, CA, and AVE.

Construct/Factor | CA | CR | AVE | 1 | 2 | 3 | 4 | 5 | 6 | 7
1. Technical Competencies | 0.82 | 0.83 | 0.75 | 0.87 | | | | | |
2. Social Competencies | 0.80 | 0.81 | 0.74 | 0.66 | 0.86 | | | | |
3. Communication Competencies | 0.79 | 0.82 | 0.66 | 0.63 | 0.65 | 0.81 | | | |
4. Student Readiness for LMS | 0.81 | 0.78 | 0.63 | 0.51 | 0.61 | 0.66 | 0.79 | | |
5. Personalised Learning | 0.84 | 0.77 | 0.64 | 0.64 | 0.65 | 0.63 | 0.51 | 0.80 | |
6. Test Administration | 0.79 | 0.76 | 0.67 | 0.63 | 0.66 | 0.56 | 0.56 | 0.52 | 0.81 |
7. Scheduling, Reminders, and Query Resolution | 0.83 | 0.80 | 0.68 | 0.50 | 0.51 | 0.64 | 0.56 | 0.54 | 0.63 | 0.82
Note: Diagonal entries (shown in bold in the original) are the square root of the AVE.
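For readers who want the definitions behind the CA/CR/AVE columns (the conventional formulas, given here as background rather than taken from the paper): with standardised loadings λ₁, …, λₖ on a k-item construct,

\[
\mathrm{CR} = \frac{\left(\sum_{i=1}^{k}\lambda_i\right)^{2}}{\left(\sum_{i=1}^{k}\lambda_i\right)^{2} + \sum_{i=1}^{k}\left(1-\lambda_i^{2}\right)},
\qquad
\mathrm{AVE} = \frac{1}{k}\sum_{i=1}^{k}\lambda_i^{2} .
\]

The diagonal then supports discriminant validity in the usual Fornell–Larcker sense: each √AVE exceeds that construct's correlations with all other constructs. For Technical Competencies, for example, √0.75 ≈ 0.87, which is larger than its highest inter-construct correlation (0.66, with Social Competencies), and the same pattern holds down the table.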
Table 5. The path coefficients and t-test values for the full sample.

Hypothesis | Standardized Path Coefficient | t-Test Value | Support?
H1: Technical competency has a positive effect on student readiness for using an LMS. | 0.43 | 5.43 *** | YES
H2: Social competency has a positive effect on student readiness for using an LMS. | 0.47 | 5.30 *** | YES
H3: Communication competency has a positive effect on student readiness for using an LMS. | 0.41 | 5.19 *** | YES
H4: Personalised learning has a positive effect on student readiness for using an LMS. | 0.52 | 5.41 *** | YES
H5: Test administration has a positive effect on student readiness for using an LMS. | 0.57 | 5.91 *** | YES
H6: Scheduling, reminders, and query resolution have a positive effect on student readiness for using an LMS. | 0.42 | 5.20 *** | YES
Note: ***—significant at 0.001.
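A quick check on the significance markers (standard large-sample reasoning, not an additional analysis from the paper): with n = 1991 respondents, the t-distribution is effectively normal, and the two-tailed critical value at the 0.001 level is approximately 3.291. Every t value in Table 5 clears this comfortably:

\[
\min_{h}\left|t_h\right| = 5.19 > 3.291 \;\Longrightarrow\; p < 0.001 \ \text{for all six paths.}
\]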
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
