Article

Practice Promotes Learning: Analyzing Students’ Acceptance of a Learning-by-Doing Online Programming Learning Tool

Computer Science, Multimedia and Telecommunications Department, Universitat Oberta de Catalunya, 08018 Barcelona, Spain
*
Author to whom correspondence should be addressed.
Appl. Sci. 2022, 12(24), 12613; https://doi.org/10.3390/app122412613
Submission received: 27 October 2022 / Revised: 4 December 2022 / Accepted: 7 December 2022 / Published: 9 December 2022
(This article belongs to the Special Issue STEAM Education and the Innovative Pedagogies in the Intelligence Era)

Abstract
Learning-by-doing is a pedagogical approach that helps in learning skills through practice. An online learning-by-doing tool, CodeLab, has been introduced to students undertaking the digital design and creation bachelor’s degree program at the Universitat Oberta de Catalunya. The tool has been used to facilitate and engage students, not well acquainted with problem-solving techniques, in an introductory programming course. The aim of this study was to examine the factors that play vital roles in students’ acceptance of learning-by-doing tools that facilitate the development of problem-solving skills. The Unified Theory of Acceptance and Use of Technology (UTAUT) model was used for this purpose and extended by adding the factor of motivation, which is essential in educational contexts. The results highlight a strong relationship between acceptance and motivation, implying that students’ use of online learning-by-doing tools, such as CodeLab, depends on how motivated and engaged they are while practicing the learning activities. This positive relationship between motivation and acceptance clearly supports the primary aim of using learning-by-doing tools in problem-solving courses.

1. Introduction

Technology-enhanced learning (TEL) has changed the mechanism of acquiring education [1]. From primary to higher levels of education, TEL has made it possible to learn online, full-time as well as part-time [2]. Online learning has numerous benefits, ranging from self-paced learning to exceptional circumstances when going to educational institutions is not a possibility [3,4]. Despite the numerous benefits of online learning, there are still many problems associated with it. One of the main problems is unsatisfactory educational models and teaching techniques leading to a lack of motivation and engagement among students, resulting in less participation in online courses and, in extreme cases, dropout from online education [5].
Numerous models and strategies for improving education have been presented over time. One effective approach to disseminating education is learning-by-doing [6]. The learning-by-doing approach is defined as learning that results from one’s own actions, efforts, and experiences [7].
It requires students to be actively involved in the learning process. It is not just the teacher who delivers the lectures; rather, students are expected to practice and participate in learning [8]. The learning-by-doing approach also plays a vital role in online learning contexts. Students are expected to practice by performing learning activities as part of their learning process, which depend on the nature of the course [9]. Similar to face-to-face courses, memorization techniques are also usually applied in online courses that aim to disseminate knowledge. In contrast, for online courses that require the development of problem-solving skills, for example, learning to program, solving mathematical problems, and solving scientific problems, the learning-by-doing approach has proven beneficial [10,11]. In the context of online programming courses, especially for novice and non-STEM learners, we believe that a virtual laboratory environment can be used. This environment should enable learners to develop programming skills by practicing, similar to the practice that can be performed in a face-to-face programming laboratory, and should help the learners to achieve their intended pedagogical goals.
An important factor in the evaluation of online learning tools is students’ acceptance of and willingness to use the system [12]. Several studies, drawing on various acceptance models, have evaluated such systems for acceptance [13,14]. In addition to students’ evaluations, teachers’ acceptance evaluations of online learning management systems have been conducted [15]. A widely used and validated model is the Unified Theory of Acceptance and Use of Technology (UTAUT) model, which takes into account the essential constructs for technology acceptance and use [16]. A construct is an idea that is based on theoretical explanations and is empirically verified [17]. Additionally, with acceptance evaluation through UTAUT, the context of usage and the constructs specific to that context can also be evaluated [14].
Given the significance of the learning-by-doing approach and practice in the domain of problem solving and online programming education, a learning tool has been introduced to novice students to help them to learn to code. The tool, providing the opportunity to practice, has been incorporated into an introductory programming course for the Design and Digital Creation bachelor’s degree at the Universitat Oberta de Catalunya (UOC) with the aim of helping and engaging learners. The learners of this degree program are less familiar with programming and problem-solving concepts. The primary aim of this study was to identify the major constructs for acceptance of learning-by-doing practice-based tools in online education using the validated UTAUT model. The model was extended by including the construct of motivation, which is vital for online learning tools and systems. The results of the study highlight that motivation is an important construct to consider when evaluating practice-based online tools regarding acceptance. When exploring the relationships amongst the constructs included in this study, a statistically significant relationship was found between facilitating conditions and motivation. There was also a significant relationship found between effort expectancy, trust in the system, and social influence. These significant relationships highlight the fact that online learning tools based on practice, if designed by taking the significant constructs into account, can be successfully accepted by online learners.
Online education, blended or fully online, has proven its advantages and was a necessity during the COVID-19 pandemic, when going to educational institutes was not possible [18]. Students’ acceptance of and willingness to use practice-based online learning tools in their courses is crucial, as it will help promote student engagement and willingness to learn. Through successful acceptance of practice-based tools, students can experience learning with great potential and achieve their learning goals associated with skill acquisitions, such as learning to code. For instructors and institutions disseminating knowledge, engaging students is important to ensure quality in online higher education.

2. Theoretical Background

Several theories have been proposed for learning and educational contexts. In online education, basic learning theories, such as behaviorism, constructivism, humanism, and connectivism, have been applied [19]. Additionally, other theories, such as social learning [20], transformative learning [21], and experiential learning, have also been used in e-learning. In terms of learning skills, the modern approach to education is a feasible one that encourages the use of creative techniques and problem-solving-based approaches for learning [22]. The educational approach of learning-by-doing is aligned with this modern approach. It is derived from experiential and active learning and encourages learners to practice and acquire hands-on experience in order to learn rather than relying on theory alone [23]. The approach of experiential learning, learning-by-doing, dates back to John Dewey’s theory of progressive education and is still applicable in this era [6]. The idea is to learn by practice, that is, learning is to be achieved through one’s own hands-on experiences, which may be mental or physical [24]. Learning-by-doing has been applied in a variety of contexts and fields of education and has provided favorable results. It holds significance for scientific education, especially when real-life simulations are considered [25]. In situations where learning skills are of interest, a learning-by-doing approach proves to be feasible [24]. Knowledge about a domain can be acquired by traditional means of education, such as reading theory and attending lectures, but domains that require concepts to be developed as skills require practice [26]. Learning a concept through theory first and then applying it several times helps in acquiring a skill. Problem-solving skills are likewise developed by practicing similar problems based on a particular concept [27]. Learning to code is one such skill that can only be developed by virtue of practice [28].
In this context, Nantsou et al. have presented learning-by-doing for physics and electronics education [29]. Similarly, in the domain of marketing education, experiential learning and learning-by-doing has provided beneficial experiences [30]. George et al. explored the effects of applying learning-by-doing in the field of criminal justice. The study based on undergraduate students concluded that there are several benefits of learning-by-doing in terms of professional development [31]. Niiranen also conducted a study that assessed the use of the learning-by-doing approach in the field of craft and technology education [32].
In online education, where there is a lack of face-to-face experience, learning-by-doing has been used. Not only does it help develop skills and effectively teach concepts, it also promotes student engagement [33]. For the learning of problem-solving skills in online contexts, various tools have been discussed in the literature [34,35,36]. In the specific case of online programming tools and environments, which also belong to the category of problem solving and take into account the modern education perspective, several learning tools and systems have been proposed for students, covering various programming languages and ranging from novice [37] to advanced levels [38]. Rossano et al. proposed a tool for the programming of digital logic and design based on learning-by-doing and practice for high-school students [39]. In another study, Hosseini et al. proposed a tool that aims to teach Python to students in higher education through practice [40]. Minecraft is also a learning-by-doing tool that helps learners develop problem-solving skills through gamification [41]. Similarly, W3Schools is an online learning platform that provides the opportunity to learn to code by practicing [42]. Code.org [43] and Codecademy [44] are also based on the ideas of experiential learning. However, not all tools and systems for teaching and learning programming are built upon the idea of encouraging students to learn to code through practice; some instead provide a limited set of activities and focus on assessment alone.
Acceptance of technology and systems has been a concern among the research community for many years now [45]. Several models and theories have been proposed for this purpose [46,47,48]. Unified Theory of Acceptance and Use of Technology (UTAUT) has been used in several domains to evaluate the acceptance of technological systems and products [16,49]. Similarly, in the domain of education in general and online learning in particular, UTAUT has been used extensively to evaluate the factors that lead to the acceptance of systems [50,51,52,53,54]. In the case of a programming course, Morais et al. assessed the introduction of programming practices in a degree program of arts and design [55]. However, most studies on UTAUT in the domain of e-learning and online education evaluate acceptance in terms of a system having the scope to be implemented in a particular university or institution. The educational approach that the system uses is usually not evaluated for acceptance by students. In this study, the focus was on identifying and evaluating the factors that lead to acceptance of online learning tools that are based on the educational approach of learning-by-doing and practicing—an approach that is essential to learning to code and developing problem-solving skills.

3. Research Methodology

3.1. The Study Context

This research centered on a mandatory online course taught as part of the online bachelor’s degree program in “Design and Digital Creation” at the Universitat Oberta de Catalunya (UOC), a fully online university. UOC has over 70,000 students and more than 4500 teaching and research staff and aims to provide online learners with quality, personalized online higher education. To address these aims, the CodeLab programming tool has been designed, developed, and incorporated into the course [56]. The learning tool aims to facilitate and engage novice learners, who typically have little to no programming experience, in their first programming course of the degree. The students are provided with the opportunity to practice the concepts continuously, as learning to code requires practice and reflection. CodeLab considers the cognitive processes and social interactions of the students as they experiment with and test ideas and concepts and communicate with their peers.
CodeLab is based on a practice approach and contains a detailed set of learning itineraries related to coding concepts using the P5.js language. Not only is it possible to write and execute a piece of code, but students can also visualize the output of the executed code. First, the students are introduced to programming concepts that establish a basic understanding; later, they are expected to practice these concepts to acquire the related programming skills. Once they feel confident, they are encouraged to move on to advanced and complex concepts and to keep practicing to continue learning. In the online course considered in this study, five modules were defined in the tool based on specific concepts, each including several learning activities that are assessment-based, recommended, or complementary in nature, with the latter two types encouraging the practice of concepts. In this study, the CodeLab tool was used for one semester spanning 16 weeks, with each module lasting 3 weeks. The students could ask questions, discuss, and enhance their knowledge and experience by interacting with teachers and peers through a communication channel provided within the tool.
Taking into account the unique experience of an online course that mainly involves interacting with an online programming tool, students’ acceptance evaluations of the tool seemed vital. This paper discusses the acceptance evaluation of a pilot study of the CodeLab tool and is part of an ongoing research project on learning-by-doing, practice-based e-learning systems. The research methodology applied in this context is the “design and creation” methodology [57], which provides the opportunity to recognize, articulate, reflect on, and design artifacts that address the problems at issue (Figure 1). The pilot study aimed to improve the educational and interaction design of the tool and to elicit qualitative and quantitative data from the students and teachers. The data generated through the usage of the CodeLab tool were also used for the evaluation of student behavior and engagement. Several e-learning tools and systems have been evaluated for acceptance to help determine their continued usage by students. The acceptance evaluations in this paper are not only specific to the tool but also help to determine the factors that need to be considered when developing and evaluating similar e-learning systems that encourage practice for learning.

3.2. Research Model and Hypothesis Development

The UTAUT model takes into account the behavioral intention (BI) to use a product or system for acceptance evaluation. Calder discusses that a construct is an idea that is based on theoretical explanations; however, in most cases constructs are not limited to theory alone; rather, they are empirically verified [17]. In this study, we considered this point of view, similar to the original UTAUT model, i.e., theoretical explanations were provided first and later we verified the constructs through statistical testing. To evaluate BI, Venkatesh et al. took into account the constructs of performance expectancy (PE), effort expectancy (EE), facilitating conditions (FCs), and social influence (SI) (Figure 2).
Performance expectancy (PE) refers to the degree to which an individual believes that using a system will provide benefits in their job or benefits associated with the intended purpose of using the system. In the case of CodeLab, the intended purpose is to learn effectively while being engaged in the course. As the students are encouraged to learn to code by practicing the programming-based learning activities, we sought to determine whether the tool helps them to learn. Based on the above discussion, the following hypothesis was proposed:
H1. 
Performance expectancy has a positive effect on the behavioral intention to use CodeLab.
Effort expectancy (EE) relates to the degree of ease of use of the system. For e-learning systems, ease of use is an important factor in acceptance and effectiveness [58]. In the specific case of CodeLab, the students were asked whether they found the CodeLab tool easy to use and whether it was easy for them to become skillful in using it.
H2. 
Effort expectancy has a positive effect on the behavioral intention to use CodeLab.
Facilitating conditions (FCs) are the user’s perceptions of the factors in the system’s environment that support its use. In addition to general e-learning systems, FCs have also been used in studies related to tools for teaching programming [59]. The students were asked whether they were provided with the necessary resources and knowledge and whether someone was available to help them in case they encountered difficulties when using CodeLab.
H3. 
Facilitating conditions have a positive effect on the behavioral intention to use CodeLab.
Social influence (SI) is the perceived effect of the opinion of other people and affects the intention to use. The construct of SI was evaluated by inquiring of the students whether the people who influence their behavior and are important, for example, their friends and family, believe that they should use CodeLab.
H4. 
Social influence has a positive effect on the behavioral intention to use CodeLab.
Trust is an important construct that has been used and validated in the context of e-learning systems [60], even though the original UTAUT model does not emphasize trust as a factor that influences the intention to use. For the acceptance of mobile learning systems, trust is an important construct [61]. The construct of trust has also been considered for CodeLab, not in terms of security but in terms of trust in the system (TC), i.e., trust that it will help students to learn to code. Although trust is not a core construct of UTAUT, it has been used and validated in e-learning [62]. In this study, we considered the effect of TC on BI to be mediated through EE and SI, since the ease of use of a system and the opinions of the people who influence a person’s behavior can both inculcate trust in the system.
H5. 
Trust has a positive effect on the behavioral intention to use CodeLab.
H5.a. 
The positive effect of trust in CodeLab on behavioral intention to use will be mediated through effort expectancy and social influence.
Motivation and engagement are considered essential constructs when it comes to education [59] and e-learning tools [63]. Reports in the literature have strongly emphasized the fact that the motivation and engagement of students is essential for learning and is a primary concern of researchers and teachers [64]. As CodeLab is an online educational tool, motivation is to be considered; motivated students will be able to learn productively by being engaged. As this is one of the first studies on the evaluation of acceptance of the tool, the construct of motivation has been taken into account by asking students if they feel motivated to learn to program because they are using the learning tool. In addition to the direct relationship between motivation (M) and BI, a mediating relationship with FCs was also evaluated.
H6. 
Motivation has a positive effect on the behavioral intention to use CodeLab.
H6.a. 
The positive effect of motivation with respect to CodeLab on the behavioral intention to use it will be mediated through facilitating conditions.
Behavioral intention to use (BI) is the construct that represents acceptance of the technological system in the core UTAUT model. However, acceptance can differ depending on the context of usage and the study. In the specific case of the CodeLab tool, the meaning of acceptance is also tied to the usage context. The students who use the tool to study for the course will be using CodeLab for one semester. Acceptance in this scenario is therefore twofold: firstly, acceptance of using CodeLab during the semester, and, secondly, willingness to use similar tools that are based on a learning-by-doing approach.

3.3. Procedure for Data Collection

The questionnaire used in the study was based on UTAUT as this is a validated and widely used model for examining users’ perceptions regarding the acceptance of a system and one that has been used in several domains, including education [49], as mentioned in Section 2. Other acceptance models could have been used, for example, TAM; however, it was observed that UTAUT is validated in the field of education, and the constructs it contains are relevant to this study. The UTAUT questionnaire uses the constructs of PE, EE, SI, FCs, and TC and any additional ones that are vital for the context, and in this case motivation was also used. The basic structure of the questionnaire was maintained; however, where necessary, it was adapted to the CodeLab context. The questionnaire consisted of 33 questions divided into 3 sections. The first section contained questions based on the UTAUT constructs, the second was about the usability of the system, and the third section captured demographic information. For the first section, a 7-point Likert scale ranging from strongly agree to strongly disagree was used to capture the responses. The questionnaire was developed using Qualtrics software [65] and was sent to the students via anonymous links on the classroom board, so that it was voluntary for them to answer. Qualtrics has been used in this context as it provides a variety of features that facilitate the development, distribution, and preparation of data for analysis as compared with other survey creation tools, such as Google Forms and SurveyMonkey. Additionally, it supports various languages and could easily be integrated with the learning environment of the university, facilitating the questionnaire-based research. As for the timing of sending the questionnaire out, it was sent in the final weeks of the semester, when the students were finishing the last module of activities, such that a post-usage evaluation of UTAUT was conducted.

3.4. Sample

The classrooms consisted of 116 students in total: 39 students in the Catalan classroom and 77 in the Spanish classroom. Answering the questionnaire was completely voluntary. A consent form was first given to the students, and those who consented took part in the survey. A total of 87 responses were captured, a 75% response rate. Responses were then screened, and only those without missing data were retained as valid. Of the valid responses, 38% were from male students and the majority, 62%, were from female students. The students were also asked about their course type: 31% were studying full-time and 69% part-time. As regards age, 90% of the students were within the age group of 21 to 45 years.

3.5. Data Filtering Software and Techniques

To test the model and the stated hypotheses, partial least squares (PLS) was used [66]. As mentioned, the overall sample comprised 116 enrolled students and the total number of valid responses was 72; the sample size was adequate for the context of the study, although it can generally be considered low in magnitude. PLS was therefore chosen for the analysis, as it can provide robust results even with smaller sample sizes [67]. The most complex construct in the study had 3 items; therefore, the minimum sample size required was 30, i.e., 10 times the maximum number of items in a construct [66]. The PLS analysis was performed using the smartPLS tool, version 3.3.3. Data collected from the survey were filtered for valid responses, and partially completed responses were removed. The descriptive statistics and data related to gender, age, and student type were evaluated in SPSS.
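The minimum-sample-size heuristic cited above reduces to a one-line check; the per-construct item counts below are illustrative, with only the stated maximum of 3 items taken from the study.

```python
# Hedged sketch of the "10-times rule" for PLS minimum sample size.
# The item counts are illustrative; only the maximum matters.

def min_sample_size_10x(items_per_construct):
    """Minimum sample size = 10 x the largest number of items
    measuring any single construct."""
    return 10 * max(items_per_construct)

# The most complex construct in this study had 3 items:
print(min_sample_size_10x([3, 2, 3, 2, 3, 3]))  # -> 30
```

With 72 valid responses against a required minimum of 30, the PLS analysis remains defensible despite the modest sample.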

4. Results

As the process for analysis with PLS-SEM suggests, first, the measurement model was created and evaluated. In the second step, the structural model was created to verify the relationships between the constructs for statistical significance [66]. The model used in this research was reflective in nature; taking this into account, the consistent version of the PLS algorithm was used [68].

4.1. Variance of the Endogenous Variable: R2 and Indicator Loadings

The coefficient of determination, R2, that is, the in-sample explanatory power for the target dependent variable, was 0.417 for BI. This implies that the six latent variables PE, EE, SI, FCs, TC, and M explained approximately 42% of the variance in BI. For the indirect effects, the coefficient of determination for the latent variable FCs was 0.461, implying that M explained 46% of the variance in FCs. Similarly, TC explained 44% of the variance in SI and 69% of the variance in EE. The R2 values in this study moderately explain the variance of the respective dependent variables [69].
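As a reading aid, the coefficient of determination reported above can be computed as follows; this is a generic sketch with illustrative numbers, not the study's data.

```python
import numpy as np

def r_squared(y_true, y_pred):
    """Coefficient of determination: the share of variance in the
    dependent variable that the model's predictions explain."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return 1.0 - ss_res / ss_tot

# Illustrative values: a perfect fit yields R2 = 1.0, while always
# predicting the mean yields R2 = 0.0.
y = np.array([1.0, 2.0, 3.0, 4.0])
print(r_squared(y, y))                          # -> 1.0
print(r_squared(y, np.full_like(y, y.mean())))  # -> 0.0
```

An R2 of 0.417 thus means the model's predicted BI scores account for roughly 42% of the observed variation in BI.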
Regarding the standardized path coefficients, M had the strongest effect on BI (0.466), followed by EE (0.319), FCs (0.155), SI (0.011), PE (−0.192), and TC (−0.078). The path coefficients for FCs, EE, and M were statistically significant; however, the path coefficient for SI was insignificant, since it was lower than 0.1, and the path coefficients for PE and TC indicated negative relationships with BI [69]. For the mediated effects, the path coefficient from TC to EE was 0.843, and that from TC to SI was 0.664; both were statistically significant. The path coefficient from M to FCs was 0.679 and was also significant. Regarding the total indirect effects, the path TC-EE-BI was 0.0228 and was significant; the path TC-SI-BI was −0.004, showing insignificance and a negative correlation; and the path M-FCs-BI was 0.128, which was close to a significant value.
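The mediated paths reported above follow the usual mediation-analysis arithmetic, in which a specific indirect effect is the product of its component path coefficients; the sketch below uses illustrative values, not the study's estimates.

```python
# Hedged sketch of indirect- and total-effect arithmetic in mediation
# analysis; the coefficients are illustrative placeholders.

def indirect_effect(a, b):
    """Indirect effect of X on Y through mediator M:
    a (path X -> M) multiplied by b (path M -> Y)."""
    return a * b

def total_effect(direct, a, b):
    """Total effect = direct effect + indirect effect."""
    return direct + indirect_effect(a, b)

print(round(indirect_effect(0.6, 0.4), 2))    # -> 0.24
print(round(total_effect(0.2, 0.6, 0.4), 2))  # -> 0.44
```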

4.2. Indicator Reliability

For indicator reliability, the squares of all the outer loadings of the items were considered; a value ≥0.7 was preferred, while 0.4 was acceptable [69]. Table 1 shows that the indicator reliabilities for all the constructs in the UTAUT study for CodeLab were within this range. Internal consistency reliability, depicted in smartPLS as composite reliability and also evaluated by Cronbach’s alpha, should be equal to or higher than 0.5 [70]. In the case of this study, the composite reliability was greater than 0.7 and reasonably high for all the latent variables. Another view on the reliability of PLS constructs was proposed in [68], namely that the rho_A value should be ≥0.7 and ≤1. In this study, the rho_A value for all the latent variables was also within this range (Table 1).
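Cronbach's alpha, used above for internal consistency reliability, can be computed directly from an item-score matrix; the Likert responses below are illustrative, not the study's data.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # sample variance per item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Illustrative 7-point Likert responses for a 3-item construct:
scores = [[7, 6, 7], [5, 5, 6], [6, 6, 6], [4, 5, 4], [7, 7, 6]]
print(round(cronbach_alpha(scores), 2))  # -> 0.89
```

A value like 0.89 would sit comfortably above the thresholds applied in this study.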

4.3. Convergent and Discriminant Validity

Convergent validity depends on the average variance extracted (AVE) values, which should be greater than 0.5 [71]. In this study, the AVE values of all the constructs were greater than 0.5, except for EE, whose AVE was equal to 0.5 (Table 1), indicating convergent validity for all the constructs except EE. Discriminant validity was assessed through the AVE values and the correlations of the latent variables. The Fornell-Larcker criterion was used to evaluate discriminant validity; it requires the square root of the AVE for each latent variable to be greater than its correlations with the other latent variables. In this study, the criterion of [72] was satisfied for all the latent variables except EE, i.e., the square root of the AVE of each latent variable was greater than its correlations (Table 2).
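The discriminant validity check described above (square root of AVE versus inter-construct correlations) reduces to simple computations on the indicator loadings and correlations; the values below are illustrative, not those in Tables 1 and 2.

```python
import numpy as np

def ave(loadings):
    """Average variance extracted: mean of the squared indicator loadings."""
    l = np.asarray(loadings, dtype=float)
    return float(np.mean(l ** 2))

def discriminant_ok(ave_value, correlations_with_others):
    """Check: sqrt(AVE) of a construct must exceed its absolute
    correlations with every other latent variable."""
    return np.sqrt(ave_value) > np.max(np.abs(correlations_with_others))

# Illustrative loadings and inter-construct correlations:
a = ave([0.82, 0.78, 0.75])
print(round(a, 3))                          # -> 0.614
print(discriminant_ok(a, [0.41, 0.55, 0.38]))  # -> True
```

Here sqrt(0.614) is about 0.78, above every listed correlation, so the construct would pass the check.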

4.4. Hypothesis Testing

The measurement model was evaluated for statistical significance through bootstrapping [67]. t-statistics and p-values were generated for the paths between the items of each latent construct and amongst the constructs. When performing bootstrapping in smartPLS, 5000 subsamples were used, with a one-tailed test at a 0.05 significance level. As for the interactions between the latent constructs, the four paths that were significant are highlighted in Table 3.
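Bootstrapping, as used above, estimates the sampling variability of a statistic by repeatedly resampling with replacement. The following is a minimal percentile-bootstrap sketch with illustrative values; smartPLS itself reports t-statistics and p-values, so this only conveys the underlying idea.

```python
import random
import statistics

def bootstrap_ci(sample, stat=statistics.mean, n_boot=5000, seed=42):
    """Percentile bootstrap: resample with replacement n_boot times and
    return the 5th and 95th percentiles of the statistic. For a
    hypothesized positive effect at a one-tailed 0.05 level, the lower
    bound staying above zero indicates significance."""
    rng = random.Random(seed)
    stats = sorted(
        stat([rng.choice(sample) for _ in range(len(sample))])
        for _ in range(n_boot)
    )
    return stats[int(0.05 * n_boot)], stats[int(0.95 * n_boot)]

# Illustrative path-coefficient estimates (not the study's data):
lo, hi = bootstrap_ci([0.42, 0.51, 0.38, 0.47, 0.44, 0.55, 0.40, 0.49])
print(lo > 0)  # -> True: the interval excludes zero
```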
The path from motivation to behavioral intention was significant; it can be concluded that motivation is an important factor in the intention to use the CodeLab tool (Table 4). For the indirect effects, the measurement model was also bootstrapped to identify statistical significance, and, as per the results, the total indirect paths were not significant. The relationship between motivation (M) and the intention to use (BI) was positive and significant (Table 3), verifying H6. It can be concluded that motivation is an important construct when evaluating the intention to use e-learning systems that encourage learning-by-doing.
Additionally, there was a positive and significant relationship between motivation (M) and facilitating conditions (FCs) (Table 3), partially supporting H6.a and implying that the presence of facilitating conditions contributes to the motivation of students when using the system. Facilitating conditions, as previously mentioned, are the factors in the system’s environment that support the learning process. For future evaluations of CodeLab and similar e-learning tools, in-context facilitating conditions can also be considered, for example, course forums, teachers’ feedback, and channels for peer communication that facilitate the learning process.
Similarly, the relationships between trust (TC) and effort expectancy (EE) and between TC and social influence (SI) were positive and significant, partially supporting H5.a, since the total indirect effect was not significant. Regarding the degree of ease of use, EE does instill trust in CodeLab: e-learning systems that are easy to use can be relied upon and trusted in several respects, including those specific to education.
Considering the hypotheses for the PE construct, the relationship for H1 was negative and non-significant. As for the degree of ease of use, the effect of effort expectancy (EE) was positive but non-significant for CodeLab (H2). Similarly, the effect of the facilitating conditions was positive yet non-significant (H3). For the social influence construct (H4), the relationship between SI and acceptance was non-significant.

5. Discussion

Learning-by-doing promotes students’ motivation and engagement [32]. In this study, similar to the study conducted by [47], the relationship between motivation and acceptance was found to be strongly positive [52]. The CodeLab tool motivated the students to learn and actively participate in the course. This implies that if e-learning systems are designed with innovative educational strategies that keep students motivated and encourage their involvement, students are likely to willingly accept these tools and learn. As this was the first UTAUT evaluation of CodeLab, future studies, not just on CodeLab but on other online learning systems that encourage learning-by-doing, can consider educational factors such as student engagement, which is a positive outcome of student motivation.
Regarding the non-significant relationships found in this study, a review by Williams et al. reported that 23 out of 116 UTAUT studies did not find a significant impact of PE on users' acceptance [45]. As mentioned, skills such as programming are acquired by learning-by-doing [28], and even though CodeLab helped the students to acquire programming skills, the non-significant relationship between PE and acceptance can be explained by an observation reported by the course teachers: in the classrooms using CodeLab, the students were completing only the mandatory learning activities, not the recommended and complementary ones, meaning that the students were practicing less than intended. For the next semester, it is planned to make the students aware that the purpose of the CodeLab tool is to help them learn to code by practicing: the more they practice the programming activities provided in CodeLab, the more they will learn.
Similarly, several studies have found a non-significant relationship between the intention to use and effort expectancy [45]. Additionally, as suggested in a UTAUT study of computer programming, effort expectancy can improve over time and may be low for novice learners [55]. In the case of the CodeLab tool, some students encountered minor technical issues, which could have led them to perceive the EE for CodeLab as lower than it actually was. For example, the option to save the code did not work as expected for some students.
Regarding the non-significant relationship with FCs: since CodeLab is an e-learning programming tool, a person, more specifically a teacher, can be considered a facilitator who is available to help the students. Additionally, as the development of the CodeLab tool progresses, other measures can be incorporated that support students in learning to code with the tool. For example, a channel for asynchronous communication, not only between teachers and students but also peer to peer, could help students use CodeLab in the course and enhance their learning.
In the case of online learning tools, the social aspect is vital, as highlighted in [20]. Since the relationship between social influence and acceptance in this study was found to be non-significant, it could be argued that the SI items used here, taken from the core UTAUT model, did not fit the context of CodeLab well. These items suggest that friends and family could influence the students' intention to use (BI). However, reflecting on the results for SI, context-specific people should be considered. In the case of CodeLab, these could be the teachers of the course; friends or colleagues in related domains, including those who have studied online courses based on the learning-by-doing model; and alumni of the course in which CodeLab was used. Future studies of CodeLab and of similar tools that encourage learning through practice should consider the context-specific people who exert social influence on students.

6. Conclusions and Future Work

Learning-by-doing is an educational approach that helps in the acquisition of skills in face-to-face, blended, and fully online contexts by encouraging students to learn through practice. The purpose of this study was to identify the factors that play an important role in the successful acceptance of practice-based online learning tools grounded in the learning-by-doing approach. As learning to program is a skill acquired by practice, the CodeLab tool, based on this approach, was incorporated into a programming course of a bachelor's degree program whose learners are new to programming concepts and are expected to acquire problem-solving skills through the tool. The UTAUT model was used to evaluate CodeLab and identify the factors that lead to students' acceptance of it and of similar e-learning systems that encourage learning-by-doing.
The core constructs of UTAUT, namely performance expectancy (PE), effort expectancy (EE), social influence (SI), and facilitating conditions (FCs), were not significant according to the results of our study. The factors that could have led to non-significant results for these constructs have been discussed, and recommendations for the design of e-learning tools that encourage practice have been provided. It cannot be claimed that these constructs are unimportant in the context of e-learning systems; rather, the design of the e-learning system should be supportive enough for these constructs to play a significant role in its acceptance. The results highlight that motivation is an important construct for the acceptance of practice-based tools. Learning-by-doing tools such as CodeLab are designed and incorporated into online courses precisely to enhance student motivation and engagement, and the aim of incorporating CodeLab into an online programming course was indeed to help the students become actively involved and engaged in the course, which is supported by the results of this study.
In the future, additional constructs that determine the successful acceptance of practice-based tools can be examined. As this was the first pilot study of the CodeLab tool, the results are important, yet additional studies covering more than one course could help determine the factors leading to acceptance more precisely. The relationship between student motivation and acceptance highlighted by this study is noteworthy; further relationships, for example, with student engagement and its various dimensions, with student performance and grades in online classrooms, and with satisfaction, can be examined as determinants of the successful acceptance of online learning-by-doing tools and systems. Beyond adoption and acceptance, other factors, such as student engagement, students' awareness of their progress in the course, and learners' experiences, can also be considered when evaluating such tools. Finally, taking teachers' perspectives on practice-based tools into account could enhance the teaching experience and consequently benefit the entire educational process.

Author Contributions

Conceptualization, S.I., A.-E.G.-R. and E.M.; Methodology, S.I., A.-E.G.-R. and E.M.; Validation, A.-E.G.-R.; Formal analysis, S.I.; Investigation, S.I., A.-E.G.-R. and E.M.; Writing—original draft, S.I.; Writing—review & editing, S.I., A.-E.G.-R. and E.M. All authors have read and agreed to the published version of the manuscript.

Funding

This work is part of the R + D + i project PID2021-128875NA-I00, funded by MCIN/AEI/10.13039/501100011033/ERDF A way of making Europe.

Institutional Review Board Statement

The study was conducted in accordance with the guidelines of the Ethics Committee of UOC (15 October 2019).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data used in this project are confidential and came from the learning management system of UOC, the CodeLab tool, and the questionnaires that included personal identifying information about the participants which cannot be disclosed.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Goodyear, P.; Retalis, S. Technology-Enhanced Learning; Sense Publishers: Rotterdam, The Netherlands, 2010; Volume 6. [Google Scholar]
  2. Appana, S. A review of benefits and limitations of online learning in the context of the student, the instructor and the tenured faculty. Int. J. E-Learn. 2008, 7, 5–22. [Google Scholar]
  3. Rapanta, C.; Botturi, L.; Goodyear, P.; Guàrdia, L.; Koole, M. Online university teaching during and after the Covid-19 crisis: Refocusing teacher presence and learning activity. Postdigit. Sci. Educ. 2020, 2, 923–945. [Google Scholar] [CrossRef]
  4. Verawardina, U.; Asnur, L.; Lubis, A.L.; Hendriyani, Y.; Ramadhani, D.; Dewi, I.P.; Darni, R.; Betri, T.J.; Susanti, W.; Sriwahyuni, T. Reviewing online learning facing the COVID-19 outbreak. Talent. Dev. Excell. 2020, 12, 385–392. [Google Scholar]
  5. Rostaminezhad, M.A.; Mozayani, N.; Norozi, D.; Iziy, M. Factors Related to E-learner Dropout: Case Study of IUST Elearning Center. Procedia Soc. Behav. Sci. 2013, 83, 522–527. [Google Scholar] [CrossRef] [Green Version]
  6. Williams, M. John Dewey in the 21st Century. J. Inq. Action Educ. 2017, 9, 91–102. [Google Scholar]
  7. Reese, H.W. The learning-by-doing principle. Behav. Dev. Bull. 2011, 17, 1. [Google Scholar] [CrossRef]
  8. Niiranen, S.; Rissanen, T. Learning by Doing and Creating Things with Hands: Supporting Students in Craft and Technology Education. PATT Proc. 2017, 34, 150–155. Available online: https://www.iteea.org/File.aspx?id=115739&v=21dfd7a (accessed on 4 December 2022).
  9. Kanakana-Katumba, M.G.; Maladzhi, R. Online Learning Approaches for Science, Engineering and Technology in Distance Education. In Proceedings of the 2019 IEEE International Conference on Industrial Engineering and Engineering Management (IEEM), Macao, China, 15–18 December 2019; pp. 930–934. [Google Scholar]
  10. Frache, G. A Constructively aligned Learning-by-Doing Pedagogical Model for 21st Century Education. Adv. Sci. Technol. Eng. Syst. J. 2017, 3, 1119–1124. [Google Scholar]
  11. Hettiarachchi, E.; Huertas, M.A. Introducing a Formative E-Assessment System to Improve Online Learning Experience and Performance. J. Univers. Comput. Sci. 2015, 21, 1001–1021. [Google Scholar]
  12. Olasina, G. Human and social factors affecting the decision of students to accept e-learning. Interact. Learn. Environ. 2019, 27, 363–376. [Google Scholar] [CrossRef]
  13. Fathema, N.; Shannon, D.; Ross, M. Expanding the Technology Acceptance Model (TAM) to examine faculty use of Learning Management Systems (LMSs) in higher education institutions. J. Online Learn. Teach. 2015, 11. Available online: https://jolt.merlot.org/Vol11no2/Fathema_0615.pdf (accessed on 4 December 2022).
  14. Persada, S.F.; Miraja, B.A.; Nadlifatin, R. Understanding the Generation Z Behavior on D-Learning: A Unified Theory of Acceptance and Use of Technology (UTAUT) Approach. Int. J. Emerg. Technol. Learn. 2019, 14, 20–33. [Google Scholar] [CrossRef]
  15. Dindar, M.; Suorsa, A.; Hermes, J.; Karppinen, P.; Näykki, P. Comparing technology acceptance of K-12 teachers with and without prior experience of learning management systems: A Covid-19 pandemic study. J. Comput. Assist. Learn. 2021, 37, 1553–1565. [Google Scholar] [CrossRef] [PubMed]
  16. Venkatesh, V.; Morris, M.G.; Davis, G.B.; Davis, F.D. User acceptance of information technology: Toward a unified view. MIS Q. 2003, 27, 425–478. [Google Scholar] [CrossRef] [Green Version]
  17. Calder, B.J.; Brendl, C.M.; Tybout, A.M.; Sternthal, B. Distinguishing constructs from variables in designing research. J. Consum. Psychol. 2021, 31, 188–208. [Google Scholar] [CrossRef]
  18. Xie, X.; Siau, K.; Nah, F.F.-H. COVID-19 pandemic–online education in the new normal and the next normal. J. Inf. Technol. Case Appl. Res. 2020, 22, 175–187. [Google Scholar] [CrossRef]
  19. Arghode, V.; Brieger, E.W.; McLean, G.N. Adult learning theories: Implications for online instruction. Eur. J. Train. Dev. 2017, 41, 593–609. [Google Scholar] [CrossRef]
  20. Hill, J.R.; Song, L.; West, R.E. Social learning theory and web-based learning environments: A review of research and discussion of implications. Am. J. Distance Educ. 2009, 23, 88–103. [Google Scholar] [CrossRef] [Green Version]
  21. Boyer, N.R.; Maher, P.A.; Kirkman, S. Transformative Learning in Online Settings: The Use of Self-Direction, Metacognition, and Collaborative Learning. J. Transform. Educ. 2006, 4, 335–361. [Google Scholar] [CrossRef]
  22. Pllana, D. Creativity in Modern Education. World J. Educ. 2019, 9, 136–140. [Google Scholar] [CrossRef]
  23. Zhang, G.X.; Sheese, R. 100 years of John Dewey and education in China. J. Gilded Age Progress. Era 2017, 16, 400. [Google Scholar] [CrossRef]
  24. Gutiérrez-Carreón, G.; Daradoumis, T.; Jorba, J. Integrating learning services in the cloud: An approach that benefits both systems and learning. Educ. Technol. Soc. 2015, 18, 145–157. [Google Scholar]
  25. Bot, L.; Gossiaux, P.-B.; Rauch, C.-P.; Tabiou, S. ‘Learning by doing’: A teaching method for active learning in scientific graduate education. Eur. J. Eng. Educ. 2005, 30, 105–119. [Google Scholar] [CrossRef]
  26. Ekici, D.I. The Use of Edmodo in Creating an Online Learning Community of Practice for Learning to Teach Science. Malays. Online J. Educ. Sci. 2017, 5, 91–106. [Google Scholar]
  27. Samani, M.; Putra, B.A.W.; Rahmadian, R.; Rohman, J.N. Learning Strategy to Develop Critical Thinking, Creativity, and Problem-Solving Skills for Vocational School Students. J. Pendidik. Teknol. Dan Kejuru. 2019, 25, 36–42. [Google Scholar] [CrossRef]
  28. Bareiss, R.; Griss, M. A story-centered, learn-by-doing approach to software engineering education. In Proceedings of the 39th SIGCSE Technical Symposium on Computer Science Education, Portland, OR, USA, 12–15 March 2008; pp. 221–225. [Google Scholar]
  29. Nantsou, T.; Frache, G.; Kapotis, E.C.; Nistazakis, H.E.; Tombras, G.S. Learning-by-Doing as an Educational Method of Conducting Experiments in Electronic Physics. In Proceedings of the 2020 IEEE Global Engineering Education Conference (EDUCON), Porto, Portugal, 27–30 April 2020; pp. 236–241. [Google Scholar] [CrossRef]
  30. Sangpikul, A. Challenging graduate students through experiential learning projects: The case of a marketing course in Thailand. J. Teach. Travel Tour. 2020, 20, 59–73. [Google Scholar] [CrossRef]
  31. George, M.; Lim, H.; Lucas, S.; Meadows, R. Learning by doing: Experiential learning in criminal justice. J. Crim. Justice Educ. 2015, 26, 471–492. [Google Scholar] [CrossRef]
  32. Niiranen, S. Supporting the development of students’ technological understanding in craft and technology education via the learning-by-doing approach. Int. J. Technol. Des. Educ. 2021, 31, 81–93. [Google Scholar] [CrossRef]
  33. Adkins, D.; Bossaller, J.; Brendler, B.; Buchanan, S.; Sandy, H.M. Learning by Doing: Using Field Experience to Promote Online Students’ Diversity Engagement and Professional Development. Expand. LIS Educ. Universe 2018, 17, 123–127. Available online: http://hdl.handle.net/2142/99018 (accessed on 4 December 2022).
  34. Gunawan, G.; Harjono, A.; Sahidu, H.; Herayanti, L. Virtual laboratory to improve students’ problem-solving skills on electricity concept. J. Pendidik. IPA Indones. 2017, 6, 257–264. [Google Scholar] [CrossRef] [Green Version]
  35. Hasibuan, A.M.; Saragih, S.; Amry, Z. Development of Learning Materials Based on Realistic Mathematics Education to Improve Problem Solving Ability and Student Learning Independence. Int. Electron. J. Math. Educ. 2019, 14, 243–252. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  36. Hettiarachchi, E. Technology-Enhanced Assessment for Skill and Knowledge Acquisition in Online Education; Universitat Oberta de Catalunya: Barcelona, Spain, 2014. [Google Scholar]
  37. Sim, T.Y.; Lau, S.L. Online Tools to Support Novice Programming: A Systematic Review. In Proceedings of the 2018 IEEE Conference on e-Learning, e-Management and e-Services (IC3e), Langkawi, Malaysia, 21–22 November 2018; pp. 91–96. [Google Scholar]
  38. Salleh, S.M.; Shukur, Z.; Judi, H.M. Analysis of Research in Programming Teaching Tools: An Initial Review. Procedia-Soc. Behav. Sci. 2013, 103, 127–135. [Google Scholar] [CrossRef] [Green Version]
  39. Rossano, V.; Roselli, T.; Quercia, G. Coding and Computational Thinking: Using Arduino to Acquire Problem-Solving Skills. In Technology Supported Innovations in School Education; Springer: Cham, Switzerland, 2020; pp. 91–114. [Google Scholar] [CrossRef]
  40. Hosseini, R.; Akhuseyinoglu, K.; Brusilovsky, P.; Malmi, L.; Pollari-Malmi, K.; Schunn, C.; Sirkiä, T. Improving Engagement in Program Construction Examples for Learning Python Programming. Int. J. Artif. Intell. Educ. 2020, 30, 299–336. [Google Scholar] [CrossRef]
  41. de Andrade, B.; Poplin, A.; de Sena, Í.S. Minecraft as a Tool for Engaging Children in Urban Planning: A Case Study in Tirol Town, Brazil. ISPRS Int. J. Geo-Inf. 2020, 9, 170. [Google Scholar] [CrossRef] [Green Version]
  42. JSON Syntax. 2012. Available online: https://www.w3schools.com/js/js_json_syntax.asp (accessed on 4 December 2022).
  43. Code.org. Available online: https://code.org/ (accessed on 4 December 2022).
  44. Code Academy. Available online: https://www.codecademy.com/catalog (accessed on 4 December 2022).
  45. Williams, M.D.; Rana, N.P.; Dwivedi, Y.K. The unified theory of acceptance and use of technology (UTAUT): A literature review. J. Enterp. Inf. Manag. 2015, 28, 443–488. [Google Scholar] [CrossRef] [Green Version]
  46. Davis, F.D. Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Q. 1989, 13, 319–340. [Google Scholar] [CrossRef] [Green Version]
  47. Hale, J.L.; Householder, B.J.; Greene, K.L. The theory of reasoned action. Persuas. Handb. Dev. Theory Pract. 2002, 14, 259–286. [Google Scholar]
  48. Tolman, E.C. A cognition motivation model. Psychol. Rev. 1952, 59, 389. [Google Scholar] [CrossRef]
  49. Venkatesh, V.; Thong, J.Y.L.; Xu, X. Unified theory of acceptance and use of technology: A synthesis and the road ahead. J. Assoc. Inf. Syst. 2016, 17, 328–376. [Google Scholar] [CrossRef]
  50. Altalhi, M.M. Towards understanding the students’ acceptance of MOOCs: A unified theory of acceptance and use of technology (UTAUT). Int. J. Emerg. Technol. Learn. (IJET) 2021, 16, 237–253. [Google Scholar] [CrossRef]
  51. Guggemos, J.; Seufert, S.; Sonderegger, S. Humanoid robots in higher education: Evaluating the acceptance of Pepper in the context of an academic writing course using the UTAUT. Br. J. Educ. Technol. 2020, 51, 1864–1883. [Google Scholar] [CrossRef]
  52. Khechine, H.; Raymond, B.; Augier, M. The adoption of a social learning system: Intrinsic value in the UTAUT model. Br. J. Educ. Technol. 2020, 51, 2306–2325. [Google Scholar] [CrossRef]
  53. Ma, M.; Chen, J.; Zheng, P.; Wu, Y. Factors affecting EFL teachers’ affordance transfer of ICT resources in China. Interact. Learn. Environ. 2019, 27, 1–16. [Google Scholar] [CrossRef]
  54. Veiga, F.J.M.; de Andrade, A.M.V. Critical success factors in accepting technology in the classroom. Int. J. Emerg. Technol. Learn. 2021, 16, 4–22. [Google Scholar] [CrossRef]
  55. Morais, E.; Morais, C.; Paiva, J.C. Computer Programming Acceptance by Students in Higher Arts and Design Education. In Proceedings of the ICERI2018, Seville, Spain, 12–14 November 2018; pp. 2058–2065. [Google Scholar]
  56. Garcia-Lopez, C.; Mor, E.; Tesconi, S. Code. In International Conference on Human-Computer Interaction; Springer: Cham, Switzerland, 2021; pp. 437–455. [Google Scholar]
  57. Hevner, A.; Chatterjee, S. Design science research in information systems. In Design Research in Information Systems; Springer: Berlin/Heidelberg, Germany, 2010; pp. 9–22. [Google Scholar]
  58. Rahmi, B.; Birgoren, B.; Aktepe, A. A meta analysis of factors affecting perceived usefulness and perceived ease of use in the adoption of e-learning systems. Turk. Online J. Distance Educ. 2018, 19, 4–42. [Google Scholar]
  59. Wrycza, S.; Marcinkowski, B.; Gajda, D. The enriched UTAUT model for the acceptance of software engineering tools in academic education. Inf. Syst. Manag. 2017, 34, 38–49. [Google Scholar] [CrossRef]
  60. Casey, T.; Wilson-Evered, E. Predicting uptake of technology innovations in online family dispute resolution services: An application and extension of the UTAUT. Comput. Hum. Behav. 2012, 28, 2034–2045. [Google Scholar] [CrossRef]
  61. Almaiah, M.A.; Alamri, M.M.; Al-Rahmi, W. Applying the UTAUT model to explain the students’ acceptance of mobile learning system in higher education. IEEE Access 2019, 7, 174673–174686. [Google Scholar] [CrossRef]
  62. Han, J.; Conti, D. The use of UTAUT and post acceptance models to investigate the attitude towards a telepresence robot in an educational setting. Robotics 2020, 9, 34. [Google Scholar] [CrossRef]
  63. Kahn, P.; Everington, L.; Kelm, K.; Reid, L.; Watkins, F. Understanding student engagement in online learning environments: The role of reflexivity. Educ. Technol. Res. Dev. 2017, 65, 203–218. [Google Scholar] [CrossRef]
  64. Mandernach, B.J. Assessment of student engagement in higher education: A synthesis of literature and assessment tools. Int. J. Learn. Teach. Educ. Res. 2015, 12, 1–14. [Google Scholar]
  65. Qualtrics. Available online: https://www.qualtrics.com/ (accessed on 4 December 2022).
  66. Latan, H.; Noonan, R.; Matthews, L. Partial least squares path modeling. In Partial Least Squares Path Modeling: Basic Concepts, Methodological Issues and Applications; Springer: Cham, Switzerland, 2017. [Google Scholar]
  67. Haenlein, M.; Kaplan, A.M. A Beginner’s Guide to Partial Least Squares Analysis. Underst. Stat. 2004, 3, 283–297. [Google Scholar] [CrossRef]
  68. Dijkstra, T.K.; Henseler, J. Consistent partial least squares path modeling. MIS Q. 2015, 39, 297–316. [Google Scholar] [CrossRef]
  69. Hock, M.; Ringle, C.M. Local strategic networks in the software industry: An empirical analysis of the value continuum. Int. J. Knowl. Manag. Stud. 2010, 4, 132–151. [Google Scholar] [CrossRef]
  70. Hair, J.F.; Black, W.C.; Babin, B.J.; Anderson, R.E.; Tatham, R. Multivariate Data Analysis; Pearson Prentice Hall: Upper Saddle River, NJ, USA, 2006. [Google Scholar]
  71. Anderson, J.C.; Gerbing, D.W. Structural equation modeling in practice: A review and recommended two-step approach. Psychol. Bull. 1988, 103, 411. [Google Scholar] [CrossRef]
  72. Fornell, C.; Larcker, D.F. Structural Equation Models with Unobservable Variables and Measurement Error: Algebra and Statistics; Sage Publications Sage CA: Los Angeles, CA, USA, 1981. [Google Scholar]
Figure 1. Design and creation adapted for CodeLab.
Figure 2. Model adapted from [45].
Table 1. Indicator and composite reliability; AVE values.

Variables  Indicators  Loadings  Indicator Reliability  Composite Reliability  AVE    rho_A
PE         PE_1        0.889     0.790                  0.964                  0.899  0.964
           PE_2        0.930     0.865
           PE_3        1.021     1.042
EE         EE_1        0.677     0.458                  0.759                  0.517  0.777
           EE_2        0.671     0.450
           EE_3        0.843     0.711
SI         SI_1        1.041     1.083                  0.935                  0.829  0.954
           SI_2        0.910     0.828
           SI_3        0.795     0.632
FCs        FC_1        0.791     0.625                  0.872                  0.694  0.873
           FC_2        0.869     0.755
           FC_3        0.836     0.698
TC         T_1         0.867     0.751                  0.889                  0.728  0.891
           T_2         0.884     0.781
           T_3         0.807     0.651
M          M_1         1.067     1.138                  0.893                  0.813  0.966
           M_2         0.698     0.487
BI         BI_1        1.000     1.000                  1.000                  1.000  1.000
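As a sanity check on Table 1, the standard PLS reliability formulas can be reproduced from the reported loadings. The sketch below uses the FC construct's loadings; the formulas (squared loadings for indicator reliability, composite reliability, AVE) are the textbook definitions rather than SmartPLS output, so small rounding differences against the table remain:

```python
# Reliability metrics recomputed from the FC loadings reported in Table 1.
loadings = [0.791, 0.869, 0.836]                     # FC_1..FC_3

# Indicator reliability: the squared loading of each indicator
indicator_reliability = [l ** 2 for l in loadings]

# Composite reliability: (sum of loadings)^2 / ((sum)^2 + sum of error variances)
sum_l = sum(loadings)
errors = [1 - l ** 2 for l in loadings]
cr = sum_l ** 2 / (sum_l ** 2 + sum(errors))

# AVE: mean of the squared loadings
ave = sum(indicator_reliability) / len(loadings)

print(f"CR = {cr:.3f}, AVE = {ave:.3f}")
# -> CR = 0.871, AVE = 0.693 (Table 1 reports 0.872 and 0.694,
#    computed from unrounded loadings)
```

The same check can be applied to any other construct row by swapping in its loadings.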
Table 2. Discriminant validity: the bolded diagonal values are the square roots of the AVE values.

      EE     FC     BI     M      PE     SI     TC
EE    0.179
FC    0.788  0.833
BI    0.592  0.528  1.000
M     0.738  0.641  0.586  0.901
PE    0.695  0.633  0.453  0.812  0.948
SI    0.582  0.491  0.390  0.627  0.648  0.911
TC    0.840  0.776  0.558  0.836  0.771  0.648  0.853
Table 3. Structural model results (* marks significant paths at the 0.05 level, one-tailed; bolded in the original).

Path     t-Statistics  p-Values
EE-BI    1.243         0.107
FC-BI    1.000         0.159
M-FC     6.965         0.000  *
M-BI     1.715         0.043  *
PE-BI    0.593         0.277
SI-BI    0.121         0.452
TC-EE    12.616        0.000  *
TC-BI    0.176         0.430
TC-SI    8.247         0.000  *
Table 4. Hypothesis testing summary.

      Statement                        Measurement Model (Preliminary Significance)  Structural Model (Statistical Significance)
H1    Performance expectancy (PE)     Reject (negative relationship)                Reject
H2    Effort expectancy (EE)          Accept                                        Reject
H3    Facilitating conditions (FCs)   Accept                                        Reject
H4    Social influence (SI)           Reject                                        Reject
H5    Trust in CodeLab (TC)           Reject (negative relationship)                Reject
H5.a  TC, EE, and SI                  Accept                                        Accept
H6    Motivation                      Accept                                        Accept
H6.a  M and FCs                       Accept                                        Accept
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Iftikhar, S.; Guerrero-Roldán, A.-E.; Mor, E. Practice Promotes Learning: Analyzing Students’ Acceptance of a Learning-by-Doing Online Programming Learning Tool. Appl. Sci. 2022, 12, 12613. https://doi.org/10.3390/app122412613
