A Predictive System Informed by Students’ Similar Behaviour
Abstract
1. Introduction
1.1. Data Analysis and Learning Analytics
1.2. At-Risk Learners
2. Case Study Contextual Description
2.1. The Technological Context
2.2. The Educational Context
3. Research Areas and Related Research Questions
3.1. Perceived Usefulness
- [RQ1] What is the perceived usefulness of u-Tutor?
- [RQ2] How effective is u-Tutor in prompting tutor actions that would not occur without the tool?
3.2. Usability
- [RQ3] Do the tutors understand and feel comfortable using the visual information and the interface options?
3.3. Success Rate of Classification
- [RQ4] To what extent do the classifications match the actual results?
- [RQ5] To what extent do the classifications match the tutors’ beliefs?
4. Methodology
4.1. Settings of the Case Study
4.2. Data Capture Methods
4.3. Analysis Methods
5. Results
5.1. Perceived Usefulness
5.2. Usability
5.3. Success Rate of Classification
6. Limitations of the Study
7. Conclusions and Future Work
Funding
Conflicts of Interest
Appendix A
Nr | Question | Type of Question | Possible Answers |
---|---|---|---|
1 | How often did you use the tool? | Multiple-choice | |
2 | About the information given by u-Tutor | Multiple-choice | |
3 | When you used the tool, what was your purpose? | Multiple-choice | |
4 | Did you decide to actively support any student due to u-Tutor information? | Multiple-choice | |
5 | If your previous answer was ‘yes’, explain what type of support. | Open question | - |
6 | Choose the reason for your support action. | Multiple-choice | |
7 | For what task did u-Tutor support you? | Open question | - |
8 | Did you integrate u-Tutor into your daily workflow? | Open question | - |
9 | Would you like to use u-Tutor in future courses? | Open question | - |
References
- Siemens, G.; Baker, R.S.J.d. Learning analytics and educational data mining. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge—LAK ’12, Vancouver, BC, Canada, 29 April–2 May 2012; ACM Press: New York, NY, USA, 2012; p. 252.
- Vieira, C.; Parsons, P.; Byrd, V. Visual learning analytics of educational data: A systematic literature review and research agenda. Comput. Educ. 2018, 122, 119–135.
- Papadakis, S.; Kalogiannakis, M.; Sifaki, E.; Vidakis, N. Access Moodle Using Smart Mobile Phones. A Case Study in a Greek University. In Interactivity, Game Creation, Design, Learning, and Innovation; ArtsIT 2017, DLI 2017; Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering; Brooks, A., Brooks, E., Vidakis, N., Eds.; Springer: Cham, Switzerland, 2018; Volume 229, pp. 376–385.
- Leony, D.; Crespo, R.M.; Pérez-Sanagustín, M.; de la Fuente Valentín, L.; Pardo, A. Coverage metrics for learning-event datasets based on client-side monitoring. In Proceedings of the 2012 IEEE 12th International Conference on Advanced Learning Technologies, Rome, Italy, 4–6 July 2012.
- Romero-Zaldivar, V.-A.; Pardo, A.; Burgos, D.; Delgado Kloos, C. Monitoring student progress using virtual appliances: A case study. Comput. Educ. 2012, 58, 1058–1067.
- Tobarra, L.; Ros, S.; Hernández, R.; Robles-Gómez, A.; Caminero, A.C.; Pastor, R. Integration of multiple data sources for predicting the engagement of students in practical activities. Int. J. Interact. Multimed. Artif. Intell. 2014, 2, 53–62.
- Dunn, K.E.; Rakes, G.C.; Rakes, T.A. Influence of academic self-regulation, critical thinking, and age on online graduate students’ academic help-seeking. Distance Educ. 2014, 35, 75–89.
- Greene, J.A.; Azevedo, R. A theoretical review of Winne and Hadwin’s model of self-regulated learning: New perspectives and directions. Rev. Educ. Res. 2007, 77, 334–372.
- Mangaroska, K.; Giannakos, M.N. Learning analytics for learning design: A systematic literature review of analytics-driven design to enhance learning. IEEE Trans. Learn. Technol. 2018, 12, 516–534.
- Jayaprakash, S.M.; Moody, E.W.; Lauría, E.J.M.; Regan, J.R.; Baron, J.D. Early alert of academically at-risk students: An open source analytics initiative. J. Learn. Anal. 2014, 1, 6–47.
- Bainbridge, J.; Melitski, J.; Zahradnik, A.; Lauría, E.J.M.; Jayaprakash, S.; Baron, J. Using learning analytics to predict at-risk students in online graduate public affairs and administration education. J. Public Aff. Educ. 2015, 21, 247–262.
- Cambruzzi, W.; Rigo, S.J.; Barbosa, J.L.V. Dropout prediction and reduction in distance education courses with the learning analytics multitrail approach. J. Univers. Comput. Sci. 2015, 21, 23–47.
- Papamitsiou, Z.; Economides, A.A. Learning analytics and educational data mining in practice: A systematic literature review of empirical evidence. J. Educ. Technol. Soc. 2014, 17, 49–64.
- Prieto, L.P.; Rodríguez-Triana, M.J.; Martínez-Maldonado, R.; Dimitriadis, Y.A.; Gašević, D. Orchestrating learning analytics (OrLA): Supporting inter-stakeholder communication about adoption of learning analytics at the classroom level. Australas. J. Educ. Technol. 2019, 35, 14–33.
- Lukarov, V.; Chatti, M.A.; Schroeder, U. Learning analytics evaluation—Beyond usability. In Proceedings of the DeLFI Workshops; Rathmayer, S., Pongratz, H., Eds.; CEUR Workshop Proceedings: Aachen, Germany, 2015; pp. 123–131.
- Dawson, S.; Gašević, D.; Siemens, G.; Joksimovic, S. Current state and future trends: A citation network analysis of the learning analytics field. In Proceedings of the Fourth International Conference on Learning Analytics and Knowledge—LAK ’14, Indianapolis, IN, USA, 24–28 March 2014; ACM Press: New York, NY, USA, 2014; pp. 231–240.
- De-la-Fuente-Valentín, L.; Burgos, D. Am I doing well? A4Learning as a self-awareness tool to integrate in Learning Management Systems. Campus Virtuales 2014, 3, 32–40.
- Shneiderman, B. The eyes have it: A task by data type taxonomy for information visualizations. In Proceedings of the 1996 IEEE Symposium on Visual Languages, Boulder, CO, USA, 3–6 September 1996.
- De-la-Fuente-Valentín, L.; Burgos, D. A4Learning: Un enfoque metodológico iterativo para apoyar mejor el aprendizaje y la enseñanza [A4Learning: An iterative methodological approach to better support learning and teaching]. IEEE Lat. Am. Trans. 2015, 13, 477–484.
- Drachsler, H.; Greller, W. The pulse of learning analytics understandings and expectations from the stakeholders. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, Vancouver, BC, Canada, 29 April–2 May 2012; ACM: New York, NY, USA, 2012; pp. 120–129.
- Stake, R.E. The Art of Case Study Research; Sage: Thousand Oaks, CA, USA, 1995.
- Brooke, J. SUS—A quick and dirty usability scale. In Usability Evaluation in Industry; Taylor & Francis: London, UK, 1996; p. 189.
Dimension | Values |
---|---|
Stakeholders | Data subjects: the students. Data clients: the tutors. |
Objective | Reflection: the system captures similarities among students to inform the tutor about the marks obtained by those who, in previous courses, behaved similarly to a given student. |
Data | Protected dataset: students’ interactions within the LMS. Time scale: the interactions were analysed within a frame of 3 weeks. |
Instruments | Algorithms: similarity measurements as described by de-la-Fuente-Valentín and colleagues [18]. Visualization: a graphical solution designed to support this tool. |
External limitations | Ethics: what are the dangers of misinterpreting the data? Data protection: the students have the legal right not to be analysed. |
Internal limitations | Required competences: will the visualization be eloquent enough to be easily understood by the tutors? |
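
The instruments row cites the similarity measurements of de-la-Fuente-Valentín and colleagues [18] but does not reproduce them. The sketch below is a minimal, illustrative reading only, assuming each student is represented by a vector of LMS interaction counts and that marks of the most similar past students are averaged; the function names, the cosine measure, and k = 5 are assumptions, not the article's actual algorithm.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two interaction-count vectors."""
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

def estimate_mark(current: np.ndarray,
                  past_profiles: np.ndarray,
                  past_marks: np.ndarray,
                  k: int = 5) -> float:
    """Estimate a student's mark as the mean mark of the k most
    similar students from previous courses (illustrative only)."""
    sims = np.array([cosine_similarity(current, p) for p in past_profiles])
    top_k = sims.argsort()[-k:]  # indices of the k most similar students
    return float(past_marks[top_k].mean())
```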
Date | Single Students | Grouped Students |
---|---|---|
Last day–0 | 12 | 13 |
Last day–1 | 19 | 15 |
Last day–2 | 19 | 41 |
Last day–3 | 15 | 19 |
Last day–4 | 26 | 116 |
Date/Estimations Made | Severe Risk | Risk | Pass | Outstanding |
---|---|---|---|---|
Last day–0 | 9 | 2 | 3 | 1 |
Last day–1 | 15 | 3 | 4 | 2 |
Last day–2 | 7 | 9 | 5 | 7 |
Last day–3 | 3 | 10 | 4 | 3 |
Last day–4 | 18 | 16 | 15 | 17 |
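
The four labels above (Severe Risk, Risk, Pass, Outstanding) imply a banding of the estimated marks. The cut-offs in this sketch are hypothetical, chosen only to make the mapping concrete; the article's actual thresholds are not given on this page. It assumes the 0–10 mark scale visible in the score table further below.

```python
def classify(estimated_mark: float) -> str:
    """Map an estimated mark on a 0-10 scale to one of the four
    categories. The thresholds here are illustrative assumptions."""
    if estimated_mark < 3.5:
        return "Severe Risk"
    if estimated_mark < 5.0:
        return "Risk"
    if estimated_mark < 9.0:
        return "Pass"
    return "Outstanding"
```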
Nr | Question | Selected Answers |
---|---|---|
1 | How often did you use the tool? | Most of the time I worked in the supported courses. (Multiple-choice) |
2 | About the information given by u-Tutor | I could get the information by myself, but u-Tutor makes the task more agile. (Multiple-choice) |
3 | When you used the tool, what was your purpose? | Obtain information on the students. (Multiple-choice) |
4 | Did you decide to actively support any student due to u-Tutor information? | Yes, some of the students. (Multiple-choice) |
5 | If your previous answer was ‘yes’, explain what type of support. | It was easy to find inactive students. I called them to understand what was happening. (Open question) |
6 | Choose the reason for your support action. | I supported the student because u-Tutor warned me about a situation I would not have found by myself. (Multiple-choice) |
7 | For what task did u-Tutor support you? | To find students with low participation. (Open question) |
8 | Did you integrate u-Tutor into your daily workflow? | No, I did not./Yes, I have tried to integrate the tool. (Open question) |
9 | Would you like to use u-Tutor in future courses? | No, because in this case, all the marked activities are delivered at the end of the course, and I do not know if the activity is enough to classify students. It would be preferable to use it in courses with continuous submissions./ Yes, u-Tutor gives me an outstanding view of what is going on with my groups. I need to understand better how to use it more efficiently, but I think that the early results look promising and will help me in improving my support to the students. (Open question) |
To what extent do you agree with the following assertions? (Likert scale: 1 = strongly disagree, 5 = strongly agree)

Nr | Assertion | Answer |
---|---|---|
10 | u-Tutor, without any contextualization, is quite often successful in classifying students. | 4 |
11 | After contextualizing information from u-Tutor, I often succeed in classifying students. | 4 |
12 | u-Tutor estimations match my estimations. | 4 |
Actual Score | Tutor Estimated Interval | Error in Estimation |
---|---|---|
6.92 | [7,8] | 0.33 |
6.42 | [4,6] | 0.92 |
9.3 | [6,9] | 1.05 |
8.04 | [5,7] | 1.54 |
8.14 | [4,6] | 2.64 |
0 | [3,5] | 3.5 |
8.34 | [4,5] | 3.59 |
Average error in human estimation if no success (student failure) | | 2.39 |
Average error in human estimation if no success (student failure), discarding dropouts | | 1.42 |
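
The error column compares each actual score with the tutor's estimated interval, but the exact formula is not reproduced on this page, and the visible rows do not pin it down. One natural definition, shown here purely as an assumption, is the distance from the actual score to the nearest bound of the interval:

```python
def interval_error(actual: float, low: float, high: float) -> float:
    """Distance from the actual score to the estimated interval
    [low, high]; 0.0 if the score falls inside it. This definition
    is an assumption: the reported errors above (e.g. 0.33 for
    6.92 vs. [7, 8]) suggest the article uses a different formula."""
    if actual < low:
        return low - actual
    if actual > high:
        return actual - high
    return 0.0

print(interval_error(8.14, 4.0, 6.0))  # 2.14 under this definition
```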
© 2020 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).