Academic Teaching Quality Framework and Performance Evaluation Using Machine Learning
Abstract
1. Introduction
2. Electronic Learning and Quality in Teaching
2.1. Electronic Learning
2.2. Quality in Teaching
3. Quality Teaching and Evaluation Framework—QTEF
1. Academic Criteria. Academic criteria are crucial for e-learning evaluation because they ensure that the content delivered is of high quality and complies with educational or industry standards. These criteria ensure that students receive a comprehensive education that prepares them for success in their chosen fields. Critical academic requirements include the integration of the online course with established curricular standards, the quality of the instructional design (including the clarity and organization of course materials), the qualifications of the instructors, and the resources provided to learners. It follows that the course material must be effectively incorporated into the broader educational program and should align with the learning objectives and outcomes listed in the curriculum [25,26,27].
2. Technical Criteria. Technical criteria cover the technical aspects of an e-learning course or program, including the required hardware and software, the delivery platform or method, and the course's accessibility for students with disabilities. These characteristics are crucial for evaluating e-learning because they can affect learners' ability to access and use the course materials. The technological aspects of a course also shape the overall user experience: technical problems such as frequent errors or long loading times frustrate students and can reduce their engagement and retention. Overall, it is critical to consider technological factors when evaluating an e-learning program to ensure that all students can access and benefit from it and that the course materials are delivered smoothly and seamlessly [28,29].
3. Examination Criteria. The purpose of examination criteria in e-learning evaluation is to ensure that assessments are valid, reliable, and fair. Fairness means that all students are given an equal opportunity to demonstrate their knowledge and abilities. Reliability refers to an assessment's consistency and stability: it should yield the same results each time it is used. Validity is the degree to which an assessment captures what it is intended to capture. Several essential assessment criteria should be considered when evaluating e-learning resources, including relevance, objectivity, authenticity, reliability, validity, and fairness. Carefully weighing these criteria when developing and reviewing e-learning products helps ensure that they are credible and effective instruments for evaluating learners' knowledge and skills [30,31,32].
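The three criteria categories above are scored per criterion, with the maximum points listed in the QTEF tables later in the paper. A minimal sketch of how category and overall scores aggregate (the marks an evaluator actually awards per criterion would replace the maxima used here):

```python
# A minimal sketch of QTEF score aggregation. The per-criterion maximum
# points are copied from the three QTEF criteria tables; in practice the
# marks awarded by an evaluator would replace these maxima.

MAX_POINTS = {
    "Academic":    [4, 4, 3, 2, 2, 2, 4, 4, 3, 3, 3, 3, 3],        # AC-1..AC-13
    "Technical":   [3, 3, 3, 3, 3, 3, 2, 3, 3, 3, 2, 3, 2, 2, 2],  # TC-1..TC-15
    "Examination": [3, 2, 2, 3, 2, 3, 3, 2],                       # EC-1..EC-8
}

def category_totals(points):
    """Sum the points in each category, plus an overall total."""
    totals = {cat: sum(vals) for cat, vals in points.items()}
    totals["Overall"] = sum(totals.values())
    return totals

print(category_totals(MAX_POINTS))
# → {'Academic': 40, 'Technical': 40, 'Examination': 20, 'Overall': 100}
```

The maxima sum to 40 + 40 + 20, so a full QTEF evaluation is scored out of 100, which is consistent with the final-score statistics reported in the methodology (maximum observed score 91.00).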
4. Machine Learning and Performance Evaluation
4.1. Machine Learning
4.2. Machine Learning Algorithm
- Simple Linear Regression (LR): a linear regression model in which a single independent variable is used to predict the value of a numerical dependent variable.
- Multiple Linear Regression (MLR): a linear regression model that uses two or more independent variables to predict the value of a numerical dependent variable.
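Both variants can be sketched with scikit-learn on synthetic data (the study's own dataset is not reproduced here; the coefficients and noise level below are illustrative assumptions):

```python
# Simple vs. multiple linear regression with scikit-learn, fitted on
# synthetic data with known coefficients for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Simple linear regression: one predictor.
x = rng.uniform(0, 10, size=(100, 1))
y = 3.0 * x[:, 0] + 5.0 + rng.normal(0, 0.1, size=100)
slr = LinearRegression().fit(x, y)
print(round(slr.coef_[0], 2), round(slr.intercept_, 2))  # ≈ 3.0 and 5.0

# Multiple linear regression: several predictors.
X = rng.uniform(0, 10, size=(100, 3))
y_multi = X @ np.array([1.5, -2.0, 0.5]) + 4.0 + rng.normal(0, 0.1, size=100)
mlr = LinearRegression().fit(X, y_multi)
print(np.round(mlr.coef_, 2))  # ≈ [1.5, -2.0, 0.5]
```

The fitted coefficients recover the generating values because the noise is small; with real evaluation data the residual error would be larger.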
5. Methodology
5.1. Dataset and Data Description
5.2. Correlation
6. Results and Discussion
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
Abbreviation | Definition
---|---
QTEF | Quality Teaching Evaluation Framework
ML | Machine Learning
Colab | Google Colaboratory
MLR | Multiple Linear Regression
LR | Simple Linear Regression
References
- Zhang, T. Design of English Learning Effectiveness Evaluation System Based on K-Means Clustering Algorithm. Mob. Inf. Syst. 2021, 2021, 5937742. [Google Scholar] [CrossRef]
- Marzano, R.J. Teacher evaluation. Educ. Leadersh. 2012, 70, 14–19. [Google Scholar]
- Cheniti-Belcadhi, L.; Henze, N.; Braham, R. An Assessment Framework for eLearning in the Semantic Web. In Proceedings of the LWA 2004: Lernen-Wissensentdeckung-Adaptivität, Berlin, Germany, 4–6 October 2004; pp. 11–16. [Google Scholar]
- Danielson, C. New Trends in Teacher Evaluation. Educ. Leadersh. 2001, 58, 12–15. [Google Scholar]
- Jiang, Y.; Li, B. Exploration on the teaching reform measure for machine learning course system of artificial intelligence specialty. Sci. Program. 2021, 2021, 8971588. [Google Scholar] [CrossRef]
- Martin, F.; Kumar, S. Frameworks for assessing and evaluating e-learning courses and programs. In Leading and Managing E-Learning: What the E-Learning Leader Needs to Know; Springer: Berlin/Heidelberg, Germany, 2018; pp. 271–280. [Google Scholar]
- Martin, F.; Kumar, S. Quality Framework on Contextual Challenges in Online Distance Education for Developing Countries; Computing Society of the Philippines: Quezon City, Philippines, 2018. [Google Scholar]
- The Sloan Consortium Quality Framework and the Five Pillars. Available online: http://www.mit.jyu.fi/OPE/kurssit/TIES462/Materiaalit/Sloan.pdf (accessed on 25 January 2023).
- Kear, K.; Rosewell, J.; Williams, K.; Ossiannilsson, E.; Rodrigo, C.; Sánchez-Elvira Paniagua, Á.; Mellar, H. Quality Assessment for E-Learning: A Benchmarking Approach; European Association of Distance Teaching Universities: Maastricht, The Netherlands, 2016. [Google Scholar]
- Phipps, R.; Merisotis, J. Quality on the Line: Benchmarks for Success in Internet-Based Distance Education; Institute for Higher Education Policy: Washington, DC, USA, 2000. [Google Scholar]
- Benchmarking Guide. Available online: https://www.acode.edu.au/pluginfile.php/550/mod_resource/content/8/TEL_Benchmarks.pdf (accessed on 26 January 2023).
- The Nadeosa Quality Criteria for Distance Education in South Africa. Available online: https://www.nadeosa.org.za/documents/NADEOSAQCSection2.pdf (accessed on 28 January 2023).
- OSCQR Course Design Review. Available online: https://onlinelearningconsortium.org/consult/oscqr-course-design-review/ (accessed on 25 January 2023).
- Baldwin, S.; Ching, Y.H.; Hsu, Y.C. Online course design in higher education: A review of national and statewide evaluation instruments. TechTrends 2018, 62, 46–57. [Google Scholar] [CrossRef] [Green Version]
- Martin, F.; Ndoye, A.; Wilkins, P. Using learning analytics to enhance student learning in online courses based on quality matters standards. J. Educ. Technol. Syst. 2016, 45, 165–187. [Google Scholar] [CrossRef]
- iNACOL Blended Learning Teacher Competency Framework. Available online: http://files.eric.ed.gov/fulltext/ED561318.pdf (accessed on 26 January 2023).
- Quality Assurance Framework–Asian Association of Open Universities. Available online: https://www.aaou.org/quality-assurance-framework/ (accessed on 29 January 2023).
- Taylor, E.S.; Tyler, J.H. Can teacher evaluation improve teaching. Educ. Next 2012, 12, 78–84. [Google Scholar]
- Kumar Basak, S.; Wotto, M.; Belanger, P. E-learning, M-learning and D-learning: Conceptual definition and comparative analysis. E-Learn. Digit. Media 2018, 15, 191–216. [Google Scholar] [CrossRef]
- Oducado, R.M.F.; Soriano, G.P. Shifting the education paradigm amid the COVID-19 pandemic: Nursing students’ attitude to E learning. Afr. J. Nurs. Midwifery 2021, 23, 1–14. [Google Scholar]
- Soffer, T.; Kahan, T.; Nachmias, R. Patterns of students’ utilization of flexibility in online academic courses and their relation to course achievement. Int. Rev. Res. Open Distrib. Learn. 2019, 20, 202–220. [Google Scholar] [CrossRef]
- Saleem, F.; AlNasrallah, W.; Malik, M.I.; Rehman, S.U. Factors affecting the quality of online learning during COVID-19: Evidence from a developing economy. Front. Educ. 2022, 7, 13. [Google Scholar] [CrossRef]
- Ambrose, S.A.; Bridges, M.W.; DiPietro, M.; Lovett, M.C.; Norman, M.K. Bridging Learning Research and Teaching Practice. In How Learning Works: Seven Research-Based Principles for Smart Teaching; Ambrose, S.A., Bridges, M.W., DiPietro, M., Lovett, M.C., Norman, M.K., Richard, Eds.; Jossey-Bass, A Wiley Imprint: San Francisco, CA, USA, 2010. [Google Scholar]
- Teaching and Learning Frameworks. Available online: https://poorvucenter.yale.edu/BackwardDesign (accessed on 21 December 2022).
- An Overview of Key Criteria. Available online: https://nap.nationalacademies.org/read/10707/chapter/5 (accessed on 17 December 2022).
- Evaluating E-Learning: A Framework for Quality Assurance. 2006. Available online: https://www.elearningguild.com/articles/articles-detail.cfm?articleid=158 (accessed on 3 December 2022).
- Evaluating the Quality of Online Courses. 2022. Available online: https://www.onlinelearningconsortium.org/read/ (accessed on 11 December 2022).
- Gandellini, G. Evaluating e-learning: An analytical framework and research directions for management education. J. Abbr. 2006, 16, 428–433. [Google Scholar] [CrossRef]
- Te Pas, E.; Waard, M.W.D.; Blok, B.S.; Pouw, H.; van Dijk, N. Didactic and technical considerations when developing e-learning and CME. Educ. Inf. Technol. 2016, 21, 991–1005. [Google Scholar] [CrossRef]
- Barkhuus, L.; Dey, A.K.T. E-learning evaluation: A framework for evaluating the effectiveness of e-learning systems. Int. J. Hum.-Comput. Stud. 2006, 64, 1159–1178. [Google Scholar]
- Brown, J.D.; Green, T.D. Evaluating e-learning effectiveness: A review of the literature. J. Comput. High. Educ. 2008, 20, 31–44. [Google Scholar]
- Du, Y.; Wagner, E.D. An empirical investigation of the factors influencing e-learning effectiveness. Educ. Technol. Soc. 2007, 10, 71–83. [Google Scholar]
- Muniasamy, A.; Alasiry, A. Deep learning: The impact on future elearning. Int. J. Emerg. Technol. Learn. 2020, 15, 188. [Google Scholar] [CrossRef] [Green Version]
- El Naqa, I.; Murphy, M.J. What is machine learning? In Machine Learning in Radiation Oncology; Springer: Cham, Switzerland, 2015; pp. 3–11. [Google Scholar]
- Jović, J.; Milić, M.; Cvetanović, S.; Chandra, K. Implementation of machine learning based methods in elearning systems. In Proceedings of the 10th International Conference on Elearning (eLearning-2019), Belgrade, Serbia, 26–27 September 2019. [Google Scholar]
- Kularbphettong, K.; Waraporn, P.; Tongsiri, C.T. Analysis of student motivation behavior on e-learning based on association rule mining. Int. J. Inf. Commun. Eng. 2012, 6, 794–797. [Google Scholar]
- Idris, N.; Hashim, S.Z.M.; Samsudin, R.; Ahmad, N.B.H. Intelligent learning model based on significant weight of domain knowledge concept for adaptive e-learning. Int. J. Adv. Sci. Eng. Inf. Technol. 2017, 7, 1486–1491. [Google Scholar] [CrossRef] [Green Version]
- Ayodele, T.O. Types of machine learning algorithms. In New Advances in Machine Learning; Zhang, Y., Ed.; InTech: Rijeka, Croatia, 2007; pp. 19–48. [Google Scholar]
- Maulud, D.; Abdulazeez, A.M.T. A review on linear regression comprehensive in machine learning. J. Appl. Sci. Technol. Trends 2020, 1, 140–147. [Google Scholar] [CrossRef]
- Draper, N.; Smith, H.L. Fitting a Straight Line by Least Squares. In Applied Regression Analysis, 3rd ed.; John Wiley & Sons: Toronto, ON, Canada, 1998; pp. 20–27. [Google Scholar]
- Glantz, S.A.; Slinker, B.K.; Neilands, T.B. Chapter Two: The First Step: Understanding Simple Linear Regression. In Primer of Applied Regression and Analysis of Variance; McGraw-Hill: New York, NY, USA, 1990. [Google Scholar]
- Waskom, M.L. Seaborn: Statistical data visualization. J. Open Source Softw. 2021, 6, 3021. [Google Scholar] [CrossRef]
- Hall, M.A. Correlation-Based Feature Selection for Machine Learning. Ph.D. Thesis, The University of Waikato, Hamilton, New Zealand, April 1999. [Google Scholar]
- McKinney, W. pandas: A foundational Python library for data analysis and statistics. Python High Perform. Sci. Comput. 2011, 14, 1–9. [Google Scholar]
- Pedregosa, F.; Varoquaux, G.; Gramfort, A.; Michel, V.; Thirion, B.; Grisel, O.; Duchesnay, E. Scikit-learn: Machine learning in Python. J. Mach. Learn. Res. 2011, 12, 2825–2830. [Google Scholar]
- Vallejo, W.; Díaz-Uribe, C.; Fajardo, C. Google Colab and Virtual Simulations: Practical e-Learning Tools to Support the Teaching of Thermodynamics and to Introduce Coding to Students. ACS Omega 2022, 7, 7421–7429. [Google Scholar] [CrossRef] [PubMed]
- Correlation vs. Causation. Available online: https://www.jmp.com/en_au/statistics-knowledge-portal/what-is-correlation/correlation-vs-causation.html (accessed on 2 January 2023).
Benchmarks/Framework | Parameters | Key Components Example
---|---|---
Council For Higher Education Accreditation [8] | 7 | Institutional Goals; Structure of the Organization; Resources for Institutions; Instructional Materials; Faculty and Learner Support
European Association of Distance Teaching Universities (EADTU) [9] | 6 | Management; Curriculum & Course Design; Course Delivery; Staff and Student Support
Quality on the Line: Benchmarks for Success in Internet-Based Distance Education [10] | 24 | Instructional Support; Course Structure and Development; Student & Faculty Support; Evaluation & Assessment Benchmarks
Australasian Council on Open, Distance and e-Learning (ACODE) [11] | 8 | Information Technology Systems, Services & Support; Staff Professional Development & Student Training; Development; Staff and Student Support
National Association for Distance Education and Open Learning in South Africa [12] | 13 | Course, Curriculum Design and Development; Evaluation and Assessment
Online Learning Consortium Quality Scorecard [13] | 9 | Institutional & Technology Support; Instructional Design; Course Structure; Social & Student Engagement; Faculty & Student Support; Evaluations & Assessment
Blackboard Exemplary Rubric [14] | 17 | Course Design; Interaction & Collaboration; Assessment; Learner Support
Quality Matters—QM [15] | 8 | Course Overview and Introduction; Educational Goals; Evaluation & Measurement; Instructional Content; Learner Interaction & Course Activities; Course Technology; Learner Support; Usability & Accessibility
iNACOL [16] | 11 | Course Design & Management; Online Course Content; Instructional Design; Technology for Distance Education; Assessment for Students
Asian Association of Open Universities (AAOU) [17] | 10 | Policy, Planning & Management; Students & Faculty Profiles; Technology Infrastructure; Learning Contents & Media; Assessment & Evaluation; Research & Social Services; Learner Support; Design & Curriculum/Course Development
Category | Serial Nos. | Criteria Nos. | Criteria Description | Maximum Points
---|---|---|---|---
Academic | 1 | AC-1 | Course Information and Instructions | 4
 | 2 | AC-2 | Purpose and Structure of Course | 4
 | 3 | AC-3 | Course Learning Objectives | 3
 | 4 | AC-4 | Learner Evaluation Criteria | 2
 | 5 | AC-5 | Modern Educational Materials | 2
 | 6 | AC-6 | Variety of Educational Materials | 2
 | 7 | AC-7 | Use of Educational and Explanatory Aids Whenever Necessary | 4
 | 8 | AC-8 | Managing Lectures and Student Interactions | 4
 | 9 | AC-9 | Clear, Understandable, Explainable Voice during Lectures | 3
 | 10 | AC-10 | Utilization of Lecture Time | 3
 | 11 | AC-11 | Extent of Adherence to Lecture Times | 3
 | 12 | AC-12 | Plan for the Educational Activities of the Course | 3
 | 13 | AC-13 | Extent of Commitment to Implementing the Educational Activities of the Course | 3
Category | Serial Nos. | Criteria Nos. | Criteria Description | Maximum Points
---|---|---|---|---
Technical | 1 | TC-1 | Clear Display of the Scientific Material | 3
 | 2 | TC-2 | Links to the Privacy Policy and Required External Tools | 3
 | 3 | TC-3 | Computer Skills | 3
 | 4 | TC-4 | Presentation of the Learning Objectives in the LMS | 3
 | 5 | TC-5 | Design and Ease of Use of the Course | 3
 | 6 | TC-6 | Course Organization | 3
 | 7 | TC-7 | Use of Interactive Tools/Materials during Class | 2
 | 8 | TC-8 | Media Lectures | 3
 | 9 | TC-9 | Information Availability about Tools Used in the Course | 3
 | 10 | TC-10 | Educational Materials and Resources | 3
 | 11 | TC-11 | Display of Educational Material during Class | 2
 | 12 | TC-12 | Forums and Discussion | 3
 | 13 | TC-13 | Course Activities Advertisement/Announcement | 2
 | 14 | TC-14 | Virtual Office Hours | 2
 | 15 | TC-15 | Links to the University's Services, etc. | 2
Category | Serial Nos. | Criteria Nos. | Criteria Description | Maximum Points
---|---|---|---|---
Examination | 1 | EC-1 | Evaluation Measures Specific Learning Objectives | 3
 | 2 | EC-2 | Clarity of the Policy for Distributing, Monitoring and Evaluating Course Grades | 2
 | 3 | EC-3 | Extent of Adherence to Exam Dates | 2
 | 4 | EC-4 | Accuracy in Formulating Assessment Tasks | 3
 | 5 | EC-5 | Comprehensive Examination | 2
 | 6 | EC-6 | Coverage of Learning Outcomes in Assessment Tasks | 3
 | 7 | EC-7 | Appropriate Tests to Measure Learning Outcomes | 3
 | 8 | EC-8 | Accuracy in Awarding Grades | 2
Sr. No. | Parameter | Final Score Value
---|---|---
1 | Records Count | 400
2 | Mean | 67.26
3 | Standard Deviation | 17.52
4 | Minimum Score | 38.50
5 | 25th Percentile | 47.57
6 | 50th Percentile (Median) | 68.70
7 | 75th Percentile | 84.50
8 | Maximum Score | 91.00
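The summary statistics above match the rows produced by pandas `describe()`. A sketch on hypothetical stand-in scores (the study's 400 real records are not reproduced here, so the values will differ from the table):

```python
import numpy as np
import pandas as pd

# Hypothetical stand-in for the 400 final evaluation scores, drawn
# uniformly over the observed min/max range for illustration only.
rng = np.random.default_rng(1)
scores = pd.Series(rng.uniform(38.5, 91.0, size=400), name="final_score")

# describe() emits exactly the rows of the summary table:
# count, mean, std, min, 25%, 50%, 75%, max.
summary = scores.describe()
print(summary)
```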
Criteria Category | Criteria Number | Correlation Type
---|---|---
Academic | AC-6, AC-7, AC-12 | High
Technical | TC-4, TC-9, TC-15 | High
Examination | EC-4, EC-5, EC-7 | High
Academic | AC-9, AC-10, AC-13 | Low
Technical | TC-2, TC-3, TC-13 | Low
Examination | EC-1, EC-6, EC-8 | Low
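Criterion-to-score correlations of this kind can be computed with pandas. The sketch below uses hypothetical data and an assumed |r| ≥ 0.5 cutoff for "high" correlation, since the paper's exact threshold is not restated here:

```python
import numpy as np
import pandas as pd

# Hypothetical criterion scores named after QTEF criteria. AC-6 is
# constructed to track the final score; TC-3 is constructed to be
# unrelated. The 0.5 cutoff below is an illustrative assumption.
rng = np.random.default_rng(2)
n = 400
final = rng.uniform(38.5, 91.0, size=n)
df = pd.DataFrame({
    "AC-6": 0.02 * final + rng.normal(0, 0.1, n),
    "TC-3": rng.uniform(0, 3, n),
    "final_score": final,
})

# Pearson correlation of each criterion with the final score.
corr = df.corr()["final_score"].drop("final_score")
high = corr[corr.abs() >= 0.5].index.tolist()
low = corr[corr.abs() < 0.5].index.tolist()
print("high:", high, "low:", low)
```

Seaborn's `heatmap` over `df.corr()` gives the usual visual counterpart of this selection step.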
Correlation Type | Selected Criteria | R² Value | Mean Squared Error
---|---|---|---
Highly Correlated | AC-6, AC-7, AC-12, TC-4, TC-9, TC-15, EC-4, EC-5, EC-7 | 0.9322 | 20.12
Adjusted Variables—Scenario (1) | AC-6, AC-7, AC-12, TC-4, TC-9, TC-14, EC-3, EC-4, EC-7 | 0.9535 | 13.78
Adjusted Variables—Scenario (2) | AC-6, AC-7, AC-12, TC-7, TC-9, TC-14, EC-1, EC-4, EC-7 | 0.9577 | 12.56
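The R² and MSE values reported above can be obtained for an MLR model with `scikit-learn`'s metrics. The sketch below uses synthetic stand-ins for the nine selected criteria, so its metrics will not match the table's values:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for nine selected criterion scores across 400
# records; the weights and noise level are illustrative assumptions.
rng = np.random.default_rng(3)
X = rng.uniform(0, 4, size=(400, 9))
true_w = rng.uniform(1.0, 4.0, size=9)
y = X @ true_w + rng.normal(0, 2.0, size=400)

# Hold out 20% of records, fit MLR on the rest, and score the holdout.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = LinearRegression().fit(X_tr, y_tr)
pred = model.predict(X_te)

print(f"R2  = {r2_score(y_te, pred):.4f}")
print(f"MSE = {mean_squared_error(y_te, pred):.2f}")
```

Swapping one criterion set for another, as in the adjusted-variable scenarios, amounts to changing which columns of the criteria matrix are passed to `fit`.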
Almufarreh, A.; Noaman, K.M.; Saeed, M.N. Academic Teaching Quality Framework and Performance Evaluation Using Machine Learning. Appl. Sci. 2023, 13, 3121. https://doi.org/10.3390/app13053121