Hierarchy-Based Competency Structure and Its Application in E-Evaluation
Abstract
1. Introduction
2. Related Works: Competency-Based Education and E-Evaluation
3. Proposed Hierarchy-Based Competency Structure and Its Application in E-Evaluation
3.1. Methodology for Competency Tree Design
- Identification of the highest-level competency, which becomes the root of the competency tree.
- Each competency is analyzed and divided into smaller ones. Two rules have to be applied for the division:
  - 2.1. The parent-level competency must contain a property of subject context (artifact) that can be divided into smaller parts.
  - 2.2. Child-level competencies must be unambiguous and must not overlap with other nodes. Each child-level competency should be a partial solution of the parent-level competency, and together the children should cover the whole parent-level competency (except for the integration competency). The children are divided by the principle that each child node gets a different value of the parent-level property of subject context.
- Step 2 is repeated iteratively for each parent-child competency pair.
- The tree design is finished when all leaf competencies are atomic and cannot be divided into smaller ones.
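The design steps above can be sketched as a recursive data structure. This is a minimal illustration, not the authors' implementation; the class name, the `divide` method, and the sample property and values are all hypothetical.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Competency:
    """A node in the competency tree (illustrative sketch only)."""
    name: str
    context_property: Optional[str] = None  # subject-context property (artifact) used for division
    children: List["Competency"] = field(default_factory=list)

    def is_atomic(self) -> bool:
        # Step 4: tree design is finished when a competency cannot be divided further.
        return not self.children

    def divide(self, property_name: str, values: List[str]) -> None:
        # Rule 2.1: the parent must expose a divisible subject-context property.
        # Rule 2.2: each child takes a different value of that property, so the
        # children are unambiguous and do not overlap.
        self.context_property = property_name
        self.children = [Competency(f"{self.name} / {v}") for v in values]

# Hypothetical fragment of an OOP-course tree (property and values are examples only):
root = Competency("Object-oriented programming")
root.divide("class relationship", ["inheritance", "composition", "aggregation"])
```

Applying `divide` repeatedly to each non-atomic child (step 3) would grow the tree until every leaf satisfies `is_atomic`.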
3.2. Competency Tree Application in E-Evaluation
- Evaluate the competencies from the course competency tree and select those that need to be evaluated in the task (if a competency is selected, its child nodes do not need to be selected, as all sub-levels are included automatically).
- Each competency is analyzed to define which competencies will be evaluated in the task and which are pre-required:
  - 2.1. If a competency will be evaluated, it is included in the test's competency list. If one child competency of a node is evaluated, all child competencies of that node must be included for evaluation too.
  - 2.2. If a competency will not be evaluated, the student must already have it. This competency, all its sibling competencies, and its child competencies are included in the list of pre-required competencies.
  - 2.3. Students should be allowed to take the test only when all pre-required competencies have been achieved. Therefore, additional analysis of the task's competency tree selection is required:
    - 2.3.1. Previously evaluated tasks have to be taken into account to trace the sequence of competency achievement. If some competencies from the pre-required list were not covered in previous tasks, the task cannot be assigned to students; the task must be modified, or an additional task needs to be added before it.
    - 2.3.2. Each student must have the competencies needed to do the task. If students with missing pre-required competencies exist, the system or tutor should advise them to do additional or earlier tasks to achieve the needed competencies.
- All tasks are arranged from the highest score to the lowest and stored as a list of possible tasks.
- The task with the highest score is placed in the student's list of tasks for solving (if multiple tasks with score 1 exist, one can be selected randomly to generate different conditions for different students).
- The student reads the task from the list of tasks for solving and tries to provide the correct answer:
  - 3.1. If the answer is incorrect, the task is removed from the list, and new tasks with lower-level competencies are placed in the list. The tasks are selected as follows:
    - 3.1.1. If the failed task is mapped to multiple competencies, a single-competency task is added for each competency mapped to it.
    - 3.1.2. If the failed task is mapped to one competency only, tasks covering the lower-level competencies of that competency are added to the list.
  - 3.2. If the answer is correct, the task is moved from the list of tasks for solving to the list of solved tasks, and new tasks with a higher competency level are added. The new task should be one competency level up, or a task with multiple competencies of the same level (the choice can be offered to the student).
- Step 3 is repeated until the student correctly solves a task with score 1, or the time allotted for the competency evaluation runs out.
- The final mark is calculated from the list of solved tasks, based on the set of achieved competencies.
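The adaptive part of the procedure can be sketched as a simple loop. This is a minimal sketch under stated assumptions: `Task`, `evaluate`, the `single_task` and `lower_level` mappings, and the `is_correct` oracle are illustrative names, not the paper's system; ascending after a correct answer (step 3.2) is only noted in a comment.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Set

@dataclass(frozen=True)
class Task:
    name: str
    score: float                 # 1.0 denotes a full-score task
    competencies: frozenset      # competencies the task is mapped to

def evaluate(tasks: List[Task],
             single_task: Dict[str, Task],        # one-competency task per competency
             lower_level: Dict[str, List[Task]],  # child-competency tasks per competency
             is_correct: Callable[[Task], bool]) -> Set[str]:
    """Adaptive loop: start from the highest-score task, descend on failure,
    finish on a full-score success; return the achieved competencies."""
    to_solve = sorted(tasks, key=lambda t: t.score, reverse=True)[:1]
    solved: List[Task] = []
    while to_solve:
        task = to_solve.pop(0)
        if is_correct(task):
            solved.append(task)
            if task.score == 1.0:
                break  # top task solved: evaluation is finished
            # step 3.2: here the real system would add a higher-level task
        else:
            if len(task.competencies) > 1:
                # step 3.1.1: one single-competency task per mapped competency
                to_solve += [single_task[c] for c in task.competencies if c in single_task]
            else:
                # step 3.1.2: tasks for the child competencies of the failed one
                (c,) = task.competencies
                to_solve += lower_level.get(c, [])
    achieved: Set[str] = set()
    for t in solved:
        achieved |= t.competencies
    return achieved  # the final mark would be derived from this set
```

For example, if a hypothetical two-competency task fails, the loop falls back to one task per competency, and solving both still earns the student those competencies toward the final mark.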
4. Case Analysis
4.1. Design of Competency Tree for Object-Oriented Programming Course and One of Its Tests
4.2. Evaluation of Students’ Competencies with the Help of Competency Tree
5. Conclusions and Future Works
Author Contributions
Funding
Conflicts of Interest
| | Traditional Evaluation | Proposed Evaluation |
|---|---|---|
| Number of tasks in the exam | 1 | 8 |
| Level of evaluated competencies in one task | 1 | [1; 5] |
| Grade range | {0, 100} | [0; 100] |
| Number of students (who attempted the task) | 23 | 25 |
| Average number of task submissions for evaluation | 3.17 | 4.96 |
| Average score in e-evaluation | 47.83 | 78.80 |
| Standard deviation of the score in e-evaluation | 51.08 | 27.79 |
| Score step in e-evaluation | 100 | 1 |
| Average score in tutor evaluation | 72.17 | 72.40 |
| Standard deviation of the score in tutor evaluation | 34.32 | 28.33 |
| Score step in tutor evaluation | 10 | 10 |
| Range of the difference between system and tutor scores | [−80; 30] | [−30; 40] |
| Average difference between system and tutor scores | −24.35 | 6.4 |
| Standard deviation of the difference between system and tutor scores | 35.14 | 15.78 |
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Ramanauskaitė, S.; Slotkienė, A. Hierarchy-Based Competency Structure and Its Application in E-Evaluation. Appl. Sci. 2019, 9, 3478. https://doi.org/10.3390/app9173478