Self-Assessment Guide to Quality in Accessible Virtual Education: An Expert Validation
Abstract
1. Introduction
- Are the structural elements of the proposed model sufficient and do they effectively facilitate its application in accessible and equitable virtual environments in higher education?
- Are the scientific basis and content of the proposal sufficiently robust to support its practical application and adaptation in the context of accessible and equitable virtual higher education?
2. Background
2.1. Quality in Virtual Education
2.2. Accessibility as a Key Element of Quality
2.3. Models and Empirical Methodologies in Educational Evaluation
2.4. Self-Assessment Guide
- Systematic literature review (SLR): An exhaustive review of the literature was conducted using the Multivocal Literature Review (MLR) methodology [62] to identify models and criteria for quality assessment in e-learning. The review also covered procedures for self-assessment, evaluation, and/or accreditation in educational quality-assurance contexts.
- Identification of dimensions and criteria: Based on the SLR results, 134 dimensions and 504 criteria related to accessibility were identified. A thematic analysis was then conducted, resulting in 18 grouped dimensions and 40 phases or sub-processes applicable to self-assessment.
- Model construction: Based on the dimensions and criteria identified, focus groups were conducted with a panel of experts in accessibility and educational accreditation. A model was proposed consisting of four main dimensions (Organisation, Student Body, Teaching, and Infrastructure), with 16 standards, 48 requirements, and 63 pieces of evidence, as shown in Figure 1.
- Construction of the self-assessment methodology: From the identified phases, a methodology was proposed for implementing the model. Figure 2 shows the proposed methodology, which comprises five phases (Planning, Model Tuning, Evaluation, Results, and Continuous Improvement) with 24 clearly defined activities.
- Design of the guide and preparation of the final document: The guide was structured into two key components: (a) the self-assessment model and (b) the self-assessment methodology. The final guide includes a conceptual framework, the self-assessment model, and a detailed methodology for its implementation.
3. Materials and Methods
- (a) Preliminary validation by a small group of high-level experts, with the aim of identifying critical areas for improvement in the structure, clarity, and relevance of the guide’s content.
- (b) Extended validation based on the improved version of the guide, with the aim of confirming and further exploring the internal consistency and criterion concordance among experts. The result was detailed feedback, drawn from a larger and more diverse group of experts, for making further improvements to the guide.
- (c) Implementation validation through a pilot study of the final version of the guide in four educational institutions. The objective was to assess the guide’s adaptability to real institutional environments and to identify any adjustments needed for its implementation in a broader range of educational institutions.
3.1. Experts Involved
- Proven experience in quality evaluation of educational programmes.
- Knowledge and skills in web accessibility and universal design for learning to ensure that the guide adequately addresses these aspects.
- Publications or participation in research projects related to accessibility and/or quality evaluation in education.
- Proven experience in virtual education.
- Proven experience in quality evaluation for educational programmes and/or knowledge and skills in web accessibility and universal design for learning to ensure that the guide adequately addresses these aspects.
3.2. Preliminary Validation
3.3. Extended Validation
- (1) To validate the proposed self-assessment model: evaluators were asked to rate the sufficiency of each of the four dimensions and the coherence, relevance, and clarity of each of the standards (16 in total), with the option to provide constructive feedback for each standard.
- (2) To validate the proposed self-assessment methodology: experts were asked to assess the sufficiency of each of the five methodological phases and to evaluate the coherence, relevance, and clarity of each activity within a phase (24 in total). For each activity, experts were also given the opportunity to provide constructive feedback (a sketch of how such ratings can be aggregated appears after this list).
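To make these rating tasks concrete, the sketch below shows one way the resulting 1–4 scores could be aggregated into the per-item summaries (mean, median, SD, min, max) reported in Section 4. It is a minimal illustration with invented ratings, not the study’s actual responses or analysis scripts.

```python
import pandas as pd

# Hypothetical expert ratings on the 1-4 scale: rows = experts, columns = rated items.
ratings = pd.DataFrame({
    "Standard 1.1": [3, 4, 4, 3, 4],
    "Standard 1.2": [4, 4, 3, 4, 4],
    "Standard 1.3": [3, 3, 4, 4, 4],
})

# Per-item descriptive statistics, mirroring the columns of the results tables.
summary = ratings.agg(["mean", "median", "std", "min", "max"]).T
summary.columns = ["Mean", "Median", "SD", "Min.", "Max."]
print(summary.round(2))
```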
3.4. Pilot Study
- (a) General information: Records basic data about the participating institution, such as the institution name, level of education (undergraduate, postgraduate), scope of application (institutional, degree or programme, course), mode of study (online, hybrid), date or period of evaluation, and names of the evaluator(s) and informant(s).
- (b) Self-assessment model: This section details the model, structured by dimensions, with a description and objective for each dimension and its corresponding standards. For each standard, the name, objective, rating scale, and each of the standard’s requirements are included. For each requirement, a description, evidence or sources of information, and examples of aspects that could justify the requirement’s fulfilment are provided. These examples serve as guidelines for determining the rating at the time of evaluation.
- (c) Rating for each standard: This section provides a detailed description of each standard, along with its corresponding requirements. Each requirement is assigned a weight or score according to institutional relevance (critical, important, or desirable); its applicability is determined; and its fulfilment rating is recorded. A list of possible evidence to justify the fulfilment of the standard is also included, with checkboxes to indicate whether the evidence exists and whether it effectively contributes to meeting the requirement, along with a space for comments on the characteristics of the evidence. Additionally, the tool automatically calculates the final rating for the standard, taking into account the assigned weights and the scores obtained for each requirement (a minimal sketch of such a weighted calculation appears after this list). A qualitative rating and a field to describe the overall perception of the standard within the institutional context are also included, along with a space for proposing improvements for each evaluated standard.
- (d) Weighting for scoring: This section summarises the weightings assigned by the institution (evaluator and informants) to each requirement and standard, and allows those weightings to be adjusted so that the guide adapts to the specific characteristics of each institution.
- (e) Results: This section provides a summary of the scores obtained and their equivalence in terms of meeting or not meeting each standard. It is complemented by “radar” charts, which give a clear visualisation of the institution’s status across the evaluated dimensions, facilitating the interpretation of the results and the identification of priority areas for continuous improvement.
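As anticipated in item (c), the following is a minimal sketch of the kind of weighted aggregation such a tool could perform. The numeric weights (critical = 3, important = 2, desirable = 1), the [0, 1] fulfilment scale, and the function name are illustrative assumptions, not the guide’s published formula.

```python
# Illustrative weights for the three relevance levels; the guide's actual
# numeric weighting scheme is not reproduced here.
WEIGHTS = {"critical": 3, "important": 2, "desirable": 1}

def standard_rating(requirements):
    """requirements: list of (relevance_label, fulfilment in [0, 1]) pairs;
    requirements marked 'not applicable' are excluded before the call."""
    total_weight = sum(WEIGHTS[label] for label, _ in requirements)
    achieved = sum(WEIGHTS[label] * fulfilment for label, fulfilment in requirements)
    return achieved / total_weight  # weighted mean in [0, 1]

# Example: a standard with three requirements.
print(standard_rating([("critical", 1.0), ("important", 0.5), ("desirable", 0.0)]))
# -> 0.666...
```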
4. Results
4.1. Preliminary Validation
“The relative evaluation of each dimension, standard, and requirement (weighting) is suggested to be analysed and determined according to the level of importance conceived by the self-assessment team, based on the institutional reality and the level or scope of application of the evaluation model”.
4.2. Extended Validation
4.2.1. Internal Consistency
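Internal consistency was assessed with Cronbach’s alpha [183] and McDonald’s omega [184] (see the reliability tables below). As a minimal sketch, alpha can be computed directly from its definition; omega requires a fitted factor model and is therefore left to dedicated packages. The score matrix here is hypothetical.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha; items: rows = respondents, columns = items."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of total scores
    return (k / (k - 1)) * (1 - item_var / total_var)

# Five respondents x four items on the 1-4 scale (hypothetical).
scores = np.array([
    [4, 4, 3, 4],
    [3, 3, 3, 3],
    [4, 4, 4, 4],
    [2, 3, 2, 3],
    [4, 3, 4, 4],
])
print(round(cronbach_alpha(scores), 3))  # -> 0.889 for this toy matrix
```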
4.2.2. Concordance Between Evaluators
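Concordance between evaluators is typically quantified with intraclass correlation coefficients [185]. The sketch below implements ICC(2,1) (two-way random effects, absolute agreement, single rater) from its ANOVA definition and reproduces the worked example in Shrout and Fleiss (1979); it is illustrative and not necessarily the exact ICC variant reported in this study.

```python
import numpy as np

def icc_2_1(x: np.ndarray) -> float:
    """ICC(2,1): two-way random effects, absolute agreement, single rater.
    x: n targets (rows) x k raters (columns)."""
    n, k = x.shape
    grand = x.mean()
    sst = ((x - grand) ** 2).sum()
    ssr = k * ((x.mean(axis=1) - grand) ** 2).sum()   # between-target
    ssc = n * ((x.mean(axis=0) - grand) ** 2).sum()   # between-rater
    msr = ssr / (n - 1)
    msc = ssc / (k - 1)
    mse = (sst - ssr - ssc) / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Worked example from Shrout and Fleiss (1979): 6 targets rated by 4 raters.
ratings = np.array([
    [9, 2, 5, 8],
    [6, 1, 3, 2],
    [8, 4, 6, 8],
    [7, 1, 2, 6],
    [10, 5, 6, 9],
    [6, 2, 4, 7],
], dtype=float)
print(round(icc_2_1(ratings), 2))  # -> 0.29
```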
4.2.3. Content Validity
4.2.4. Comparison of Evaluations
4.3. Pilot Study
5. Discussion
6. Conclusions
6.1. Preliminary Validation
6.2. Extended Validation
6.3. Pilot Study
6.4. Limitations and Future Work
- (a) Sample size: Although the number of experts involved in the validation was relatively small, this limitation reflects the difficulty of recruiting experts with relevant experience. The inclusion criteria required participants to have experience in institution-level quality assessment processes, which restricted the availability of experts but ensured the quality of the assessments. Even so, it limits the ability to generalise the results to a broader spectrum of institutional realities. Future studies could expand the sample to include experts with different levels of experience, allowing for a more diverse and generalisable perspective.
- (b) Scope of the pilot study: The implementation of the model was relatively short, covering only the phase of identifying strengths and weaknesses that require attention. This prevented observation of its long-term impact on institutions’ internal quality processes. Future work could conduct longitudinal studies to analyse how institutions integrate the guide into their continuous improvement processes.
- (c) Survey fatigue: Despite the measures implemented to mitigate respondent fatigue caused by the length of the extended validation questionnaire (165 possible responses: 126 quantitative and 39 qualitative), the impact of survey fatigue cannot be completely ruled out. This limitation may have influenced the accuracy or consistency of some responses.
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A
| | Eval 1 | Eval 2 | Eval 3 | Eval 4 | Aiken’s V |
|---|---|---|---|---|---|
| Question 17 on the structure of the guide | | | | | |
| Do you agree with the weighting and scoring proposed in the self-assessment guide for each of the dimensions and their standards? | 3 | 4 | 3 | 3 | 0.56 |
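For reference, Aiken’s V [181] for a single item is V = Σ(rᵢ − l) / (n(h − l)), where l and h are the endpoints of the response scale and n is the number of raters. The helper below reproduces the 0.56 of the row above if a 1–5 response scale is assumed; that scale is an inference from the reported value, not something stated in this excerpt.

```python
def aikens_v(ratings, lo=1, hi=5):
    """Aiken's V for one item: proportion of the maximum possible agreement."""
    return sum(r - lo for r in ratings) / (len(ratings) * (hi - lo))

print(round(aikens_v([3, 4, 3, 3]), 2))  # -> 0.56, matching the row above
```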
| Dimensions | Mean | Median | SD | Min. | Max. | Aiken’s V | Shapiro–Wilk W | Shapiro–Wilk p |
|---|---|---|---|---|---|---|---|---|
| Organisation | 3.50 | 3.63 | 0.419 | 2.81 | 4.00 | 0.83 | 0.882 | 0.019 |
| Student Body | 3.79 | 3.90 | 0.216 | 3.20 | 4.00 | 0.93 | 0.797 | <0.001 |
| Teaching | 3.69 | 3.69 | 0.279 | 3.00 | 4.00 | 0.90 | 0.915 | 0.079 |
| Infrastructure | 3.79 | 3.90 | 0.334 | 2.70 | 4.00 | 0.93 | 0.688 | <0.001 |
| Criteria | | | | | | | | |
| Sufficiency | 3.71 | 3.75 | 0.317 | 3.00 | 4.00 | | 0.827 | 0.002 |
| Coherence | 3.70 | 3.81 | 0.280 | 3.06 | 4.00 | | 0.884 | 0.021 |
| Relevance | 3.76 | 3.84 | 0.303 | 2.88 | 4.00 | | 0.808 | 0.001 |
| Clarity | 3.53 | 3.53 | 0.347 | 2.75 | 4.00 | | 0.943 | 0.268 |
| Phases | Mean | Median | SD | Min. | Max. | Aiken’s V | Shapiro–Wilk W | Shapiro–Wilk p |
|---|---|---|---|---|---|---|---|---|
| Planning | 3.83 | 3.96 | 0.297 | 2.85 | 4.00 | 0.94 | 0.627 | <0.001 |
| Model Tuning | 3.78 | 3.96 | 0.307 | 3.00 | 4.00 | 0.93 | 0.766 | <0.001 |
| Evaluation | 3.83 | 4.00 | 0.351 | 2.69 | 4.00 | 0.94 | 0.566 | <0.001 |
| Results | 3.76 | 3.87 | 0.287 | 3.00 | 4.00 | 0.92 | 0.814 | 0.001 |
| Continuous Improvement | 3.88 | 4.00 | 0.200 | 3.23 | 4.00 | 0.96 | 0.666 | <0.001 |
| Criteria | | | | | | | | |
| Sufficiency | 3.88 | 4.00 | 0.164 | 3.40 | 4.00 | | 0.724 | <0.001 |
| Coherence | 3.81 | 3.94 | 0.284 | 3.04 | 4.00 | | 0.716 | <0.001 |
| Relevance | 3.84 | 3.96 | 0.232 | 3.30 | 4.00 | | 0.738 | <0.001 |
| Clarity | 3.77 | 3.87 | 0.261 | 3.09 | 4.00 | | 0.806 | 0.001 |
| | Description or Statement |
|---|---|
| Requirement | 4.2.2. The e-learning platform ensures universal availability and accessibility of the programme or course for both teachers and students. |
| Description of the requirement a | This requirement aims to make the learning platform universally accessible and available to all users, ensuring that both students and teachers can access course content and related services at any time and from anywhere. |
| Evidence of the requirement a | Documents evidencing the availability and accessibility of the platform, records of accessibility audits, incident reports, and their solutions. |
| Examples of actions a | |
| Standard to which it belongs | 4.2 Learning management platform: the institution has an accessible IT platform to support the virtual training process and administrative management. |
| Objective of the standard a | Ensure that the institution provides a learning management platform that is accessible, reliable, and effective, facilitating the e-learning process and academic and administrative management in a way that meets the needs of all users, including those with disabilities. |
| Rating scale of the standard a | Satisfactory compliance: The learning management platform is fully accessible and continuously available to the entire university community, supporting both teaching–learning and academic and administrative management. All requirements are met without limitations, allowing for a seamless and barrier-free user experience. Partial compliance: The platform is functional and meets most requirements, but there may be areas where accessibility or availability requires improvement. Some functions may not be fully accessible to all users, or availability may have been occasionally interrupted. Poor compliance: The platform has significant shortcomings in terms of accessibility or availability, which affects users’ ability to access courses and administrative formalities. These limitations can lead to difficulties in the teaching–learning process. Non-compliance: The platform is not suitable or accessible for the e-learning process. The lack of availability or accessibility creates significant barriers for users, compromising the effectiveness of e-learning. |
References
- Crespo, B.; Míguez-Álvarez, C.; Arce, M.E.; Cuevas, M.; Míguez, J.L. The Sustainable Development Goals: An Experience on Higher Education. Sustainability 2017, 9, 1353. [Google Scholar] [CrossRef]
- Prieto-Jiménez, E.; López-Catalán, L.; López-Catalán, B.; Domínguez-Fernández, G. Sustainable Development Goals and Education: A Bibliometric Mapping Analysis. Sustainability 2021, 13, 2126. [Google Scholar] [CrossRef]
- Arredondo-Trapero, F.G.; Guerra-Leal, E.M.; Kim, J.; Vázquez-Parra, J.C. Competitiveness, Quality Education and Universities: The Shift to the Post-Pandemic World. J. Appl. Res. High. Educ. 2024. ahead-of-print. [Google Scholar] [CrossRef]
- Kestin, T.; Van Den Belt, M.; Denby, L.; Ross, K.; Thwaites, J.; Hawkes, M. Getting Started with the SDGs in Universities: A Guide for Universities, Higher Education Institutions, and the Academic Sector; Australia, New Zealand & Pacific Edition. 2017. Available online: https://www.unsdsn.org/resources/getting-started-with-the-sdgs-in-universities-australia-new-zealand-and-pacific-edition/?gad_source=1&gclid=CjwKCAiA3Na5BhAZEiwAzrfagJgIbQZe9CRYyDmkSIOWR4SkJUIc3GFyeC7uev4Eph3LJVADZTm-yBoC12YQAvD_BwE (accessed on 6 October 2024).
- UN General Assembly. Transforming Our World: The 2030 Agenda for Sustainable Development; UN General Assembly: New York, NY, USA, 2015. [Google Scholar]
- Comisión Económica para América Latina y el Caribe. Fondo de las Naciones Unidas para la Infancia. In La educación en tiempos de la pandemia de COVID-19; CEPAL-UNESCO: Santiago, Chile, 2020. [Google Scholar]
- Balalle, H.; Weerasinghe, T. Re-Design the Classroom for Online Learning. Int. J. Innov. Res. Sci. Eng. Technol. 2021, 10, 252–257. [Google Scholar] [CrossRef]
- Córdova-Morán, J.; Pacheco-Mendoza, S.; Mayorga-Albán, A. Oportunidades en la Educación Superior a partir de la nueva realidad educativa. Cienc. Soc. Y Económicas 2021, 5, 13–25. [Google Scholar] [CrossRef]
- Alvarez, S.M. El desafío de repensar la universidad en la era digital. Cuad. Univ. 2020, 13, 9–26. [Google Scholar] [CrossRef]
- Cabellos, N.G.M.; Sánchez, A.I.A. Prácticas inclusivas de estudiantes en formación docente. Educ. Física Y Cienc. 2020, 22, e109. [Google Scholar] [CrossRef]
- Marciniak, R.; Sallán, J.G. Dimensiones de evaluación de calidad de educación virtual: Revisión de modelos referentes. RIED. Rev. Iberoam. De Educ. A Distancia 2018, 21, 217–238. [Google Scholar] [CrossRef]
- Ossiannilsson, E.; Williams, K.; Camilleri, A.F.; Brown, M. Quality Models in Online and Open Education around the Globe: State of the Art and Recommendations; International Council for Open and Distance Education: Oslo, Norway, 2015. [Google Scholar]
- Duque Oliva, E.J.; Gómez, Y.D. Evolución conceptual de los modelos de medición de la percepción de calidad del servicio: Una mirada desde la educación superior. Suma De Neg. 2014, 5, 180–191. [Google Scholar] [CrossRef]
- Dilan, R.; Fernandez, P. Quality Framework on Contextual Challenges in Online Distance Education for Developing Countries; Computing Society of the Philippines: Tuguegarao, Philippines, 2015. [Google Scholar]
- Hidalgo, E.H.; Solà, R.R.; Ivanova, M.; Rozeva, A.; Durcheva, M. Internal Quality Assurance Procedures Applicable to eAssessment: Use Case of the TeSLA Project. In Proceedings of the 2018 17th International Conference on Information Technology Based Higher Education and Training (ITHET), Olhao, Portugal, 26–28 April 2018; pp. 1–6. [Google Scholar]
- Almenara, J.C.; Cejudo, M.d.C.L.; Lozano, J.A.M. Evaluación del desempeño docente en la formación virtual: Ideas para la configuración de un modelo. RIED-Rev. Iberoam. De Educ. A Distancia 2018, 21, 261–279. [Google Scholar] [CrossRef]
- Valencia, M.; Giraldo, C. Scientific Rigor in Qualitative Research. Investig. Y Educ. En Enfermería 2011, 29, 500–514. [Google Scholar] [CrossRef]
- Timbi-Sisalima, C.; Sánchez-Gordón, M.; Hilera-Gonzalez, J.R.; Otón-Tortosa, S. Quality Assurance in E-Learning: A Proposal from Accessibility to Sustainability. Sustainability 2022, 14, 3052. [Google Scholar] [CrossRef]
- Escobar-Pérez, J.; Cuervo-Martínez, Á. Validez de contenido y juicio de expertos: Una aproximación a su utilización. Av. En Medición 2008, 6, 27–36. [Google Scholar]
- Prasetya, F.H.; Widiantoro, A.D.; Harnadi, B. Effects of Quality Factor and Satisfaction on The Acceptance of Online Learning (Adoption Learning Technology). In Proceedings of the 2023 7th International Conference on Information Technology (InCIT), Chiang Rai, Thailand, 16–17 November 2023; pp. 98–103. [Google Scholar]
- Barreto, R.; Herrera, R. Six Sigma-Oriented Methodology for Multivariate Capability Analysis to Measure Service Quality in Virtual Education Environments. Int. J. Product. Qual. Manag. 2023, 39, 431–463. [Google Scholar] [CrossRef]
- Khari, S.; Salimi Akinabadi, A.; Zarmehrparirouy, M.; Pazokian, M.; NakhostinMoghadam, A.; Ashtari, M.; Abdali, M.; Hashemi, K.; Ashtari, S. Student views on course quality and virtual education during COVID-19: Impact on academic performance. International Journal of Health Promotion and Education 2024, 1, 1–14. [Google Scholar] [CrossRef]
- Flores-Cueto, J.J.; Garay-Argandoña, R.; Hernández, R.M. Modelo de calidad educativa de programas virtuales: Caso de la Universidad de San Martín de Porres. Rev. Venez. De Gerenc. 2021, 26, 697–710. [Google Scholar] [CrossRef]
- Marciniak, R.; Gairín-Sallán, J. Un modelo para la autoevaluación de la calidad de programas de educación universitaria virtual. Revista de Educación a Distancia (RED) 2017, 54, 1–30. [Google Scholar] [CrossRef]
- Cesteros, A.M.F.-P.; Sarasa-Cabezuelo, A. A Proposal to Measure the Quality of Virtual Teaching. In Proceedings of the Fourth International Conference on Technological Ecosystems for Enhancing Multiculturality, Association for Computing Machinery, New York, NY, USA, 2 November 2016; pp. 155–162. [Google Scholar]
- Troussas, C.; Krouska, A.; Sgouropoulou, C. Towards a Reference Model to Ensure the Quality of Massive Open Online Courses and E-Learning. Lect. Notes Comput. Sci. (Incl. Subser. Lect. Notes Artif. Intell. Lect. Notes Bioinform.) 2020, 12462 LNAI, 169–175. [Google Scholar] [CrossRef]
- Dehghan, S.; Andarvazh, M.R.; Hosseinnataj, A.; Sajedi, M.; Mortezazadeh, F. Comparison of Face-to-Face and Virtual Education: The Perspective of Students at Mazandaran University of Medical Sciences. J. Maz. Univ. Med. Sci. 2023, 33, 162–173. [Google Scholar]
- Alshammari, S.H.; Alshammari, R.A. An Integration of Expectation Confirmation Model and Information Systems Success Model to Explore the Factors Affecting the Continuous Intention to Utilise Virtual Classrooms. Sci. Rep. 2024, 14, 18491. [Google Scholar] [CrossRef]
- Paniagua, M.A.B.; Gómez, D.R. Docencia virtual y aprendizaje autónomo: Algunas contribuciones al Espacio Europeo de Educación Superior. RIED-Rev. Iberoam. De Educ. A Distancia 2008, 11, 157–182. [Google Scholar] [CrossRef]
- Martínez De Miguel López, S.; Bernárdez-Gómez, A.; Salmerón Aroca, J.A. Retrospective Analysis for the Perception of Educational Tools for the Development of Collaborative Activities in Virtual Environments. RIED-Rev. Iberoam. De Educ. A Distancia 2024, 27, 35–55. [Google Scholar] [CrossRef]
- Tejada, H.S.R.; García, N.M.O.; Ríos, E.d.S.G. Estrategias didácticas de la educación virtual universitaria: Revisión sistemática. Edutec Rev. Electrónica De Tecnol. Educ. 2023, 83, 120–134. [Google Scholar] [CrossRef]
- Yadav, S. Digital Skills of Teachers: A Necessity for the Technology-Enabled Human Resource of the Education Sector in the Fast Transform. In Handbook of Research on Establishing Digital Competencies in the Pursuit of Online Learning; IGI Global: Hershey, PA, USA, 2023; pp. 187–207. [Google Scholar]
- Rodríguez, R.A.D.; Estay-Niculcar, C. Formación en buenas prácticas docentes para la educación virtual. RIED-Rev. Iberoam. De Educ. A Distancia 2016, 19, 209–232. [Google Scholar] [CrossRef]
- Mayol, J.J.; Perales, F.; Negre-Bennasar, F.; Nadal, G.F. El diseño web y material didáctico accesible en la enseñanza universitaria. Rev. De Educ. A Distancia (RED) 2019, 19, 1–19. [Google Scholar] [CrossRef]
- Jiménez, A.M.; Manzano, A.P. Usabilidad de tablets para el acceso a campus virtuales universitarios por alumnos con discapacidad. Hist. Y Comun. Soc. 2014, 19, 805–818. [Google Scholar] [CrossRef]
- Ștefan, I.A.; Baalsrud Hauge, J.; Sallinen, N.; Ștefan, A.; Gheorghe, A.F. Accessibility and Education: Are We Fulfilling State of the Art Requirements? ADL Romania: Bucharest, Romania, 2021; Volume 1, pp. 579–587. [Google Scholar]
- Iniesto, F.; Rodrigo, C. The Use of WCAG and Automatic Tools by Computer Science Students: A Case Study Evaluating MOOC Accessibility. J. Univers. Comput. Sci. 2024, 30, 85–105. [Google Scholar] [CrossRef]
- Bansal, R.; Bansal, T.; Pruthi, N. Demystifying the Benefits and Challenges of Virtual Learning for Differently Abled Students. In Virtual Lifelong Learning: Educating Society with Modern Communication Technologies; Bentham Science: Sharjah, United Arab Emirates, 2024; pp. 70–84. [Google Scholar]
- Cabrero, B.G.; Ortega, V.J.P. Evaluar la docencia en línea: Retos y complejidades. RIED-Rev. Iberoam. De Educ. A Distancia 2011, 14, 63–76. [Google Scholar] [CrossRef]
- Domínguez-Chica, M.B.; Esteves-Fajardo, Z.I. El Diseño Universal del Aprendizaje: Referente teórico práctico para la educación inclusiva. Cienciamatria 2023, 9, 370–385. [Google Scholar] [CrossRef]
- Amado-Salvatierra, H.R.; Hilera González, J.R.; Otón Tortosa, S. Formalization of a Methodological Framework towards the Implementation of an Accessible Virtual Educational Project. Educ. XX1 2018, 21, 349–371. [Google Scholar] [CrossRef]
- Hernández Otálora, S.J.; Quejada Durán, O.M.; Díaz, G.M. Methodological Guide for Development of Accessible Educational Virtual Environments: A Systematic Approach. Digit. Educ. Rev. 2016, 29, 166–180. [Google Scholar]
- Mendoza-González, R.; Timbi-Sisalima, C.; Sánchez-Gordón, M.; Otón-Tortosa, S. A Framework to Foster Accessibility in Post-Pandemic Virtual Higher Education. Heliyon 2024, 10, e34273. [Google Scholar] [CrossRef] [PubMed]
- Mejía-Madrid, G.; Molina-Carmona, R. Model for Quality Evaluation and Improvement of Higher Distance Education Based on Information Technology. Available online: https://www.researchgate.net/publication/311508177_Model_for_quality_evaluation_and_improvement_of_higher_distance_education_based_on_information_technology (accessed on 13 September 2020).
- Luna, E.; Ponce, S.; Cordero, G.; Cisneros-Cohernour, E. Marco para evaluar las condiciones institucionales de la enseñanza en línea. Rev. Electrónica De Investig. Educ. 2018, 20, 1–14. [Google Scholar] [CrossRef]
- Marshall, S. A Quality Framework for Continuous Improvement of E-Learning: The E-Learning Maturity Model. Available online: http://www.ijede.ca/index.php/jde/article/view/606 (accessed on 14 September 2020).
- ESVIAL Modelo de Acreditación de Accesibilidad En La Educación Virtual. Available online: http://www.esvial.org/guia/wp-content/uploads/2015/02/Elaboraci%C3%B3n-de-un-modelo-de-acreditaci%C3%B3n-de-accesibilidad-en-la-educaci%C3%B3n-virtual.pdf (accessed on 25 September 2020).
- Benítez, J.G.G.; Gamboa, L.A.A. Evaluación estandarizada de los aprendizajes: Una revisión sistemática de la literatura. CPU-E Rev. De Investig. Educ. 2022, 34, 321–351. [Google Scholar] [CrossRef]
- Rodríguez-Rodríguez, J.; Reguant-Álvarez, M. Calcular la fiabilidad de un cuestionario o escala mediante el SPSS: El coeficiente alfa de Cronbach. REIRE Rev. D’innovació I Recer. En Educ. 2020, 13, 1–13. [Google Scholar] [CrossRef]
- Perales, R.G. Diseño y construcción de un instrumento de evaluación de la competencia matemática: Aplicabilidad práctica de un juicio de expertos. Ens. Aval. Pol. Públ. Educ. 2018, 26, 347–372. [Google Scholar] [CrossRef]
- Hernández López, V.; Juárez Hernández, L.G.J. Diseño y validación de contenido por juicio de expertos de un instrumento para evaluar la acción tutorial en el nivel superior. UPIICSA. Investig. Interdiscip. 2020, 6, 1–12. [Google Scholar]
- Ličen, S.; Cassar, M.; Filomeno, L.; Yeratziotis, A.; Prosen, M. Development and Validation of an Evaluation Toolkit to Appraise eLearning Courses in Higher Education: A Pilot Study. Sustainability 2023, 15, 6361. [Google Scholar] [CrossRef]
- Hadullo, K.; Oboko, R.; Omwenga, E. International Journal of Education and Development Using ICT; Open Campus, The University of the West Indies: West Indies, North America, 2017; pp. 185–204. [Google Scholar]
- Farid, S.; Ahmad, R.; Alam, M.; Akbar, A.; Chang, V. A Sustainable Quality Assessment Model for the Information Delivery in E-Learning Systems. Inf. Discov. Deliv. 2018, 46, 1–25. [Google Scholar] [CrossRef]
- Ortiz-López, A.; Olmos-Migueláñez, S.; Sánchez-Prieto, J.C. E-Learning Quality Assessment in Higher Education: A Mapping Study. In Proceedings of the Eighth International Conference on Technological Ecosystems for Enhancing Multiculturality, Association for Computing Machinery, New York, NY, USA, 22 January 2021; pp. 833–838. [Google Scholar]
- Al-kfairy, M.; Alzaabi, M.; Snoh, B.; Almarzooqi, H.; Alnaqbi, W. Metaverse-Based Classroom: The Good and the Bad. In Proceedings of the 2024 IEEE Global Engineering Education Conference (EDUCON), Kos Island, Greece, 8–11 May 2024; pp. 1–7. [Google Scholar]
- Garlinska, M.; Osial, M.; Proniewska, K.; Pregowska, A. The Influence of Emerging Technologies on Distance Education. Electronics 2023, 12, 1550. [Google Scholar] [CrossRef]
- Chávez, D.A.Q.; Castillo, M.A.S. Revisión sistemática de la literatura sobre integración tecnológica en el aprendizaje en línea. Cienc. Lat. Rev. Científica Multidiscip. 2023, 7, 580–599. [Google Scholar] [CrossRef]
- Nunes, K.; Passos, A.; Santos, J.; Costa, Y.; Durand, J.; Mesquita, M.; Trindade, P.; Bernardes, E.; Silveira, R.; Oliveira, A.; et al. Developing a Checklist for Evaluating Virtual Learning Environments Through the Analysis of Evaluation Reports from an Educational Organization. In Proceedings of the HCI International 2022—Late Breaking Papers. Interaction in New Media, Learning and Games; Meiselwitz, G., Moallem, A., Zaphiris, P., Ioannou, A., Sottilare, R.A., Schwarz, J., Fang, X., Eds.; Springer Nature: Cham, Switzerland, 2022; pp. 364–376. [Google Scholar]
- Luís, C.; Rocha, Á.; Marcelino, M.J. Accessibility in Virtual Learning Environments. RISTI—Rev. Iber. De Sist. E Tecnol. De Inf. 2017, 2017, 54–65. [Google Scholar] [CrossRef]
- Ingavelez-Guerra, P.; Robles-Bykbaev, V.E.; Perez-Munoz, A.; Hilera-Gonzalez, J.; Oton-Tortosa, S.; Campo-Montalvo, E. RALO: Accessible Learning Objects Assessment Ecosystem Based on Metadata Analysis, Inter-Rater Agreement, and Borda Voting Schemes. IEEE Access 2023, 11, 8223–8239. [Google Scholar] [CrossRef]
- Garousi, V.; Felderer, M.; Mäntylä, M.V. Guidelines for Including Grey Literature and Conducting Multivocal Literature Reviews in Software Engineering. Inf. Softw. Technol. 2019, 106, 101–121. [Google Scholar] [CrossRef]
- Cabrera-Tenecela, P. New organization of research designs. South Am. Res. J. 2023, 3, 37–51. [Google Scholar]
- RStudio, New Open-Source IDE for R. Available online: https://www.rstudio.com/blog/rstudio-new-open-source-ide-for-r/ (accessed on 3 November 2022).
- The Jamovi Project Jamovi (Versión 2.4.8). Available online: https://www.jamovi.org/ (accessed on 28 September 2021).
- Aiken, L.R. Three Coefficients for Analyzing the Reliability and Validity of Ratings. Educ. Psychol. Meas. 1985, 45, 131–142. [Google Scholar] [CrossRef]
- Braun, V.; Clarke, V. Thematic Analysis: A Practical Guide; SAGE: London, UK; Thousand Oaks, CA, USA, 2021; ISBN 978-1-4739-5324-6. [Google Scholar]
- Cronbach, L.J. Coefficient Alpha and the Internal Structure of Tests. Psychometrika 1951, 16, 297–334. [Google Scholar] [CrossRef]
- McDonald, R.P. Test Theory: A Unified Treatment; Lawrence Erlbaum Associates Publishers: Mahwah, NJ, USA, 1999; ISBN 978-0-8058-3075-0. [Google Scholar]
- Shrout, P.E.; Fleiss, J.L. Intraclass Correlations: Uses in Assessing Rater Reliability. Psychol. Bull. 1979, 86, 420–428. [Google Scholar] [CrossRef]
- Tukey, J.W. Exploratory Data Analysis; Addison-Wesley: Boston, MA, USA, 1977. [Google Scholar]
- Oviedo, H.C.; Campo-Arias, A. Aproximación al uso del coeficiente alfa de Cronbach. Rev. Colomb. De Psiquiatr. 2005, 34, 572–580. [Google Scholar]
- Roco-Videla, Á.; Aguilera-Eguía, R.; Olguin-Barraza, M.; Roco-Videla, Á.; Aguilera-Eguía, R.; Olguin-Barraza, M. Ventajas Del Uso Del Coeficiente de Omega de McDonald Frente al Alfa de Cronbach. Nutr. Hosp. 2024, 41, 262–263. [Google Scholar] [CrossRef]
| Variable | Category | Initial Validation | Extended Validation |
|---|---|---|---|
| Gender | Female | 1 | 10 |
| | Male | 3 | 10 |
| Age | 31–39 years | | 5 |
| | 40–49 years | 1 | 8 |
| | 50–59 years | 3 | 7 |
| Country | Cuba | | 1 |
| | Ecuador | 1 | 12 |
| | Spain | 2 | 2 |
| | Mexico | | 3 |
| | Portugal | 1 | 2 |
| Highest level of education | Master’s | 1 | 7 |
| | PhD | 3 | 13 |
| Profession | University lecturer | 4 | 20 |
| | Other | 0 | 0 |
| Area in which you work | Social | | 1 |
| | Education | 1 | 4 |
| | Engineering | 1 | 3 |
| | Computing | 2 | 9 |
| | Administration | | 2 |
| | Health | | 1 |
| Experience in educational quality evaluation | No | 0 | 2 |
| | Yes | 4 | 18 |
| Role in the evaluation process | Assistant evaluator | | 1 |
| | Internal evaluator | | 2 |
| | Internal and external evaluator | 3 | 11 |
| | External evaluator | 1 | 1 |
| | Coordination | 1 | 3 |
| Years of accreditation and assessment experience | 0 years | | 2 |
| | 1–5 years | | 8 |
| | 6–10 years | | 6 |
| | 11–15 years | 1 | 3 |
| | 16–20 years | | 0 |
| | 21–25 years | 2 | |
| | 26–30 years | 1 | 1 |
| Virtual or LMS-mediated training time | 1–5 years | | 6 |
| | 6–10 years | | 9 |
| | 11–15 years | 1 | 2 |
| | 16–20 years | | 2 |
| | 21–25 years | 3 | 3 |
| Years of experience in accessibility | 0 years | | 4 |
| | 1–5 years | | 6 |
| | 6–10 years | | 5 |
| | 11–15 years | | 3 |
| | 16–20 years | 4 | 1 |
Criterion | Description | Rating Scale |
---|---|---|
Sufficiency | The items (standard/activity) that belong to the same dimension/phase are sufficient to measure it. | 1–4 a |
Clarity | The item (standard/activity) is easily understood, i.e., its syntax and semantics are adequate. | 1–4 a |
Coherence | The item (standard/activity) is logically related to the dimension/phase or indicator it is measuring. | 1–4 a |
Relevance | The item (standard/activity) is essential or important, i.e., it must be included. | 1–4 a |
Aspect Assessed | Number of Questions | Aiken’s V a |
---|---|---|
Practical value of the guide | 5 | 0.86 |
Contents of the guide | 2 | 0.81 |
Structure of the guide | 17 | 0.88 |
Guide configuration | 3 | 0.94 |
Total | 27 | 0.88 |
Dimension | Items | α | Ω | Interpretation |
---|---|---|---|---|
Organisation | 16 | 0.902 | 0.904 | Excellent reliability |
Student Body | 10-1 a | 0.720 | 0.764 | Acceptable reliability |
Teaching | 16 | 0.841 | 0.857 | Good reliability |
Infrastructure | 10 | 0.885 | 0.911 | Excellent reliability |
General | 52-1 a | 0.943 | 0.948 | Excellent reliability |
Phases | Items | α | Ω | Interpretation |
---|---|---|---|---|
Planning | 13 | 0.923 | 0.945 | Excellent reliability |
Model Tuning | 13 | 0.903 | 0.905 | Excellent reliability |
Evaluation | 16 | 0.963 | 0.967 | Excellent reliability |
Results | 19 | 0.909 | 0.919 | Excellent reliability |
Continuous Improvement | 13-1 a | 0.872 | 0.895 | Very good reliability |
General (All phases) | 74-1 a | 0.973 | 0.976 | Excellent reliability |
| Grouping | Items (Dimensions) | α | Ω | Items (Phases) | α | Ω | Items (Total) | α | Ω | Reliability |
|---|---|---|---|---|---|---|---|---|---|---|
| Sufficiency | 4 | 0.742 | 0.767 | 5 | 0.392 | 0.500 | 9 | 0.660 | 0.663 | Acceptable |
| Coherence | 16-1 a | 0.846 | 0.871 | 23 | 0.938 | 0.942 | 38-1 a | 0.926 | 0.938 | Excellent |
| Relevance | 16 | 0.864 | 0.879 | 23-1 a | 0.913 | 0.925 | 38-1 a | 0.916 | 0.933 | Excellent |
| Clarity | 16 | 0.869 | 0.876 | 23 | 0.904 | 0.914 | 39 | 0.869 | 0.876 | Very good |
| Variable | Criterion | Mean | Median | SD | Min. | Max. | Aiken’s V |
|---|---|---|---|---|---|---|---|
| Dimension 1 Organisation | Sufficiency | 3.40 | 3.00 | 0.50 | 3 | 4 | 0.80 |
| Standard 1.1 Organisation | Coherence | 3.55 | 4.00 | 0.51 | 3 | 4 | 0.85 |
| Standard 1.1 Organisation | Relevance | 3.65 | 4.00 | 0.67 | 2 | 4 | 0.88 |
| Standard 1.1 Organisation | Clarity | 3.50 | 3.50 | 0.51 | 3 | 4 | 0.83 |
| Standard 1.2 Course or academic programme information | Coherence | 3.60 | 4.00 | 0.60 | 2 | 4 | 0.87 |
| Standard 1.2 Course or academic programme information | Relevance | 3.65 | 4.00 | 0.67 | 2 | 4 | 0.88 |
| Standard 1.2 Course or academic programme information | Clarity | 3.40 | 3.50 | 0.68 | 2 | 4 | 0.80 |
| Standard 1.3 Economics and technology finance | Coherence | 3.65 | 4.00 | 0.49 | 3 | 4 | 0.88 |
| Standard 1.3 Economics and technology finance | Relevance | 3.65 | 4.00 | 0.59 | 2 | 4 | 0.88 |
| Standard 1.3 Economics and technology finance | Clarity | 3.35 | 3.00 | 0.67 | 2 | 4 | 0.78 |
| Standard 1.4 Knowledge management | Coherence | 3.45 | 4.00 | 0.76 | 2 | 4 | 0.82 |
| Standard 1.4 Knowledge management | Relevance | 3.45 | 4.00 | 0.83 | 2 | 4 | 0.82 |
| Standard 1.4 Knowledge management | Clarity | 3.30 | 3.50 | 0.80 | 2 | 4 | 0.77 |
| Standard 1.5 Research and innovation | Coherence | 3.40 | 4.00 | 0.75 | 2 | 4 | 0.80 |
| Standard 1.5 Research and innovation | Relevance | 3.60 | 4.00 | 0.68 | 2 | 4 | 0.87 |
| Standard 1.5 Research and innovation | Clarity | 3.40 | 3.50 | 0.68 | 2 | 4 | 0.80 |
| Total | | 3.50 | 3.63 | 0.42 | 2.81 | 4 | 0.83 |
| Criteria | Mean | Median | SD | Min. | Max. | Shapiro–Wilk W | Shapiro–Wilk p |
|---|---|---|---|---|---|---|---|
| Sufficiency | 3.81 | 3.89 | 0.193 | 3.33 | 4.00 | 0.873 | 0.013 |
| Coherence | 3.77 | 3.80 | 0.239 | 3.05 | 4.00 | 0.850 | 0.005 |
| Relevance | 3.80 | 3.87 | 0.222 | 3.13 | 4.00 | 0.829 | 0.002 |
| Clarity | 3.67 | 3.67 | 0.228 | 3.10 | 3.97 | 0.944 | 0.289 |
| Standards | Institution 1 | Institution 2 | Institution 3 | Institution 4 |
|---|---|---|---|---|
| 1. Organisation | | | | |
| 1.1 Organisation | Critical | Critical | Critical | Important |
| 1.2 Course or academic programme information | Critical | Important | Critical | Important |
| 1.3 Economics and technology financing | Critical | Critical | Critical | Critical |
| 1.4 Knowledge management | Important | Important | Critical | Critical |
| 1.5 Research and innovation | Important | Important | Important | Critical |
| 2. Students | | | | |
| 2.1 Student support | Critical | Critical | Critical | Important |
| 2.2 Admission | Critical | Critical | Critical | Critical |
| 2.3 Diversity and inclusion | Critical | Critical | Critical | Critical |
| 3. Teaching | | | | |
| 3.1 Teacher profile | Critical | Critical | Critical | Critical |
| 3.2 Support for teachers | Important | Important | Critical | Critical |
| 3.3 Learning content and resources | Critical | Critical | Critical | Critical |
| 3.4 Learning strategies | Critical | Critical | Critical | Critical |
| 3.5 Electronic evaluation | Critical | Critical | Critical | Critical |
| 4. Infrastructure | | | | |
| 4.1 Technological infrastructure and equipment | Critical | Critical | Critical | Critical |
| 4.2 Learning management platform | Critical | Critical | Critical | Critical |
| 4.3 Technical assistance and support | Critical | Important | Critical | Critical |
| Level (number of requirements) | Institution 1 | Institution 2 | Institution 3 | Institution 4 |
|---|---|---|---|---|
| Critical | 39 | 33 | 45 | 8 |
| Important | 11 | 14 | 6 | 44 |
| Desirable | 2 | 3 | 1 | 1 |
| Not applicable | 1 | 3 | 1 | 0 |
| Standards | Institution 1 | Institution 2 | Institution 3 | Institution 4 |
|---|---|---|---|---|
| 1. Organisation | 0.74 | 0.52 | 0.89 | 0.90 |
| 1.1 Organisation | 0.72 | 0.57 | 0.94 | 0.94 |
| 1.2 Course or academic programme information | 0.68 | 0.76 | 0.75 | 0.83 |
| 1.3 Economics and technology financing | 1.00 | 0.35 | 1.00 | 1.00 |
| 1.4 Knowledge management | 0.47 | 0.51 | 0.90 | 0.82 |
| 1.5 Research and innovation | 0.74 | 0.44 | 0.87 | 0.90 |
| 2. Students | 0.51 | 0.52 | 0.87 | 0.77 |
| 2.1 Student support | 0.49 | 0.72 | 1.00 | 0.86 |
| 2.2 Admission | 0.35 | 0.54 | 0.90 | 0.77 |
| 2.3 Diversity and inclusion | 0.70 | 0.30 | 0.70 | 0.70 |
| 3. Teaching | 0.23 | 0.40 | 0.55 | 0.83 |
| 3.1 Teacher profile | 0.15 | 0.41 | 0.60 | 0.85 |
| 3.2 Support for teachers | 0.43 | 0.58 | 0.90 | 0.80 |
| 3.3 Learning content and resources | 0.11 | 0.24 | 0.12 | 0.68 |
| 3.4 Learning strategies | 0.30 | 0.70 | 0.65 | 0.85 |
| 3.5 Electronic evaluation | 0.23 | 0.12 | 0.49 | 0.93 |
| 4. Infrastructure | 0.74 | 0.61 | 0.53 | 0.80 |
| 4.1 Technological infrastructure and equipment | 1.00 | 0.57 | 1.00 | 1.00 |
| 4.2 Learning management platform | 0.72 | 0.85 | 0.30 | 0.70 |
| 4.3 Technical assistance and support | 0.50 | 0.30 | 0.30 | 0.70 |
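For illustration, the dimension-level rows of the table above (1. Organisation, 2. Students, 3. Teaching, 4. Infrastructure) can be drawn as the radar charts mentioned in the description of the Results sheet. The matplotlib sketch below is one possible rendering of those published values, not the tool’s actual output.

```python
import numpy as np
import matplotlib.pyplot as plt

dimensions = ["Organisation", "Students", "Teaching", "Infrastructure"]
scores = {  # dimension-level values from the table above
    "Institution 1": [0.74, 0.51, 0.23, 0.74],
    "Institution 2": [0.52, 0.52, 0.40, 0.61],
    "Institution 3": [0.89, 0.87, 0.55, 0.53],
    "Institution 4": [0.90, 0.77, 0.83, 0.80],
}

angles = np.linspace(0, 2 * np.pi, len(dimensions), endpoint=False).tolist()
angles += angles[:1]  # repeat the first angle to close each polygon

ax = plt.subplot(polar=True)
for name, values in scores.items():
    ax.plot(angles, values + values[:1], label=name)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(dimensions)
ax.set_ylim(0, 1)
ax.legend(loc="lower right", fontsize="small")
plt.show()
```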