Measuring Cognition Load Using Eye-Tracking Parameters Based on Algorithm Description Tools
Abstract
1. Introduction
2. Theoretical Background
3. Materials and Methods
3.1. Test Subjects
3.2. Test Conditions and Steps of the Research
3.3. Applied Algorithms
Algorithm 1: Possible Implementation of Decision

1: data: T: input array; N: length of T; findable: value to look for
2: procedure Decision(T, N, findable)
3:   i ← 1
4:   while i ≤ N and T[i] ≠ findable do
5:     i ← i + 1
6:   end while
7:   if i ≤ N then
8:     output: “Found”
9:   else
10:    output: “Not Found”
11:  end if
12: end procedure
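The listing above is a linear search. As a minimal runnable sketch, it could be written in Python as follows (the function name `decision` and the 0-based indexing are illustrative choices, not from the paper):

```python
def decision(t, findable):
    """Linear search: scan t until findable is met or the array ends."""
    i = 0
    while i < len(t) and t[i] != findable:
        i += 1
    return "Found" if i < len(t) else "Not Found"
```

Note that `len(t)` replaces the explicit length parameter N of the pseudocode.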
Algorithm 2: Possible Implementation of Intersection

1: data: A, B, C: input arrays; N: length of A; M: length of B
2: procedure Intersection(A, B, C, N, M)
3:   k ← 1
4:   for i ← 1 to N do
5:     j ← 1
6:     while j ≤ M and B[j] ≠ A[i] do
7:       j ← j + 1
8:     end while
9:     if j ≤ M then
10:      C[k] ← A[i]
11:      k ← k + 1
12:    end if
13:  end for
14: end procedure
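The same intersection logic can be sketched in Python (an illustrative translation that keeps the inner linear search of the pseudocode rather than using Python's set type; the function name is an assumption):

```python
def intersection(a, b):
    """Collect every element of a that also occurs somewhere in b."""
    c = []
    for x in a:
        j = 0
        while j < len(b) and b[j] != x:  # inner linear search, as in the pseudocode
            j += 1
        if j < len(b):                   # a match was found before the end of b
            c.append(x)
    return c
```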
Algorithm 3: Possible Implementation of Union

1: data: A, B, C: input arrays; N: length of A; M: length of B
2: procedure Union(A, B, C, N, M)
3:   for i ← 1 to N do
4:     C[i] ← A[i]
5:   end for
6:   k ← N + 1
7:   for j ← 1 to M do
8:     i ← 1
9:     while i ≤ N and A[i] ≠ B[j] do
10:      i ← i + 1
11:    end while
12:    if i > N then
13:      C[k] ← B[j]
14:      k ← k + 1
15:    end if
16:  end for
17: end procedure
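The union operation described above can be sketched in Python (illustrative only; a separate loop variable is used for the inner search so the outer iteration is not disturbed):

```python
def union(a, b):
    """Copy a, then append each element of b that does not occur in a."""
    c = list(a)
    for y in b:
        i = 0
        while i < len(a) and a[i] != y:  # linear search for y in a
            i += 1
        if i >= len(a):                  # y is not in a, so it belongs in the union
            c.append(y)
    return c
```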
4. Results
4.1. Fixation Duration Mean Based on AOIs
4.2. Number of Fixations Based on AOIs
4.3. Average of Pupil Diameter Based on AOIs
5. Discussion
6. Conclusions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Charntaweekhun, K.; Wangsiripitak, S. Visual Programming Using Flowchart. In Proceedings of the 2006 International Symposium on Communications and Information Technologies, Bangkok, Thailand, 18–20 October 2006; IEEE: Piscataway, NJ, USA, 2006; pp. 1062–1065.
- Kovari, A. Study of Algorithmic Problem-Solving and Executive Function. Acta Polytech. Hung. 2020, 17, 241–256.
- Francisti, J.; Balogh, Z.; Reichel, J.; Magdin, M.; Koprda, S.; Molnár, G. Application Experiences Using IoT Devices in Education. Appl. Sci. 2020, 10, 7286.
- Kovari, A.; Rajcsányi-Molnár, M. Mathability and Creative Problem Solving in the MaTech Math Competition. Acta Polytech. Hung. 2020, 17, 147–161.
- Xinogalos, S. Using Flowchart-Based Programming Environments for Simplifying Programming and Software Engineering Processes. In Proceedings of the 2013 IEEE Global Engineering Education Conference (EDUCON), Berlin, Germany, 13–15 March 2013; IEEE: Piscataway, NJ, USA, 2013; pp. 1313–1322.
- Cabo, C. Effectiveness of Flowcharting as a Scaffolding Tool to Learn Python. In Proceedings of the 2018 IEEE Frontiers in Education Conference (FIE), San Jose, CA, USA, 3–6 October 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 1–7.
- Hooshyar, D.; Ahmad, R.B.; Nasir, M.H.N.M.; Mun, W.C. Flowchart-Based Approach to Aid Novice Programmers: A Novel Framework. In Proceedings of the 2014 International Conference on Computer and Information Sciences (ICCOINS), Kuala Lumpur, Malaysia, 3–5 June 2014; IEEE: Piscataway, NJ, USA, 2014; pp. 1–5.
- Shafeek, N.; Karunarathne, D.D. A Prototype Compiler to Convert Source-Code to Flowchart. In Proceedings of the 2018 18th International Conference on Advances in ICT for Emerging Regions (ICTer), Colombo, Sri Lanka, 26–29 September 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 157–167.
- Ying, M.; Feng, Y. A Flowchart Language for Quantum Programming. IEEE Trans. Softw. Eng. 2010, 37, 466–485.
- Kovari, A. The synergy of digital society and digital education. Civ. Szle. 2020, 17, 69–72.
- Charlton, S.; O’Brien, T. Measurement of Cognitive States in Testing and Evaluation. In Handbook of Human Factors Testing and Evaluation; CRC Press: Boca Raton, FL, USA, 2002; pp. 97–126.
- Guzsvinecz, T.; Orbán-Mihálykó, É.; Perge, E.; Sik-Lányi, C. Analyzing the spatial skills of university students with a Virtual Reality application using a desktop display and the Gear VR. Acta Polytech. Hung. 2020, 17, 35–56.
- Guzsvinecz, T.; Sik-Lanyi, C.; Orban-Mihalyko, E.; Perge, E. The Influence of Display Parameters and Display Devices over Spatial Ability Test Answers in Virtual Reality Environments. Appl. Sci. 2020, 10, 526.
- Guzsvinecz, T.; Orbán-Mihálykó, É.; Sik-Lányi, C.; Perge, E. Investigation of spatial ability test completion times in virtual reality using a desktop display and the Gear VR. Virtual Real. 2021, 1–14.
- Hart, S.G.; Staveland, L.E. Development of NASA-TLX (Task Load Index): Results of Empirical and Theoretical Research. In Advances in Psychology; Elsevier: Amsterdam, The Netherlands, 1988; Volume 52, pp. 139–183.
- Card, S.; Moran, T.; Newell, A. The Model Human Processor: An Engineering Model of Human Performance. In Handbook of Perception and Human Performance; Wiley-Interscience: Hoboken, NJ, USA, 1986; p. 2.
- Magdin, M.; Balogh, Z.; Reichel, J.; Koprda, Š.; György, M. Automatic detection and classification of emotional states in virtual reality and standard environments (LCD): Comparing valence and arousal of induced emotions. Virtual Real. 2021, 25, 1029–1041.
- Kovari, A.; Katona, J.; Costescu, C. Quantitative Analysis of Relationship between Visual Attention and Eye-Hand Coordination. Acta Polytech. Hung. 2020, 17, 77–95.
- Kovari, A.; Katona, J.; Costescu, C. Evaluation of Eye-Movement Metrics in a Software Debugging Task using GP3 Eye Tracker. Acta Polytech. Hung. 2020, 17, 57–76.
- Holmqvist, K.; Nyström, M.; Andersson, R.; Dewhurst, R.; Jarodzka, H.; Van de Weijer, J. Eye Tracking: A Comprehensive Guide to Methods and Measures; OUP Oxford: Oxford, UK, 2011.
- Derick, L.-R.; Gabriel, G.-S.; Máximo, L.-S.; Olivia, F.-D.; Noé, C.-S.; Juan, O.-R. Study of the User’s Eye Tracking to Analyze the Blinking Behavior While Playing a Video Game to Identify Cognitive Load Levels. In Proceedings of the 2020 IEEE International Autumn Meeting on Power, Electronics and Computing (ROPEC), Ixtapa, Mexico, 4–6 November 2020; IEEE: Piscataway, NJ, USA, 2020; Volume 4, pp. 1–5.
- Tsai, M.-J.; Hou, H.-T.; Lai, M.-L.; Liu, W.-Y.; Yang, F.-Y. Visual Attention for Solving Multiple-Choice Science Problem: An Eye-Tracking Analysis. Comput. Educ. 2012, 58, 375–385.
- Evinger, C.; Manning, K.A.; Sibony, P.A. Eyelid Movements. Mechanisms and Normal Data. Investig. Ophthalmol. Vis. Sci. 1991, 32, 387–400.
- Orchard, L.N.; Stern, J.A. Blinks as an Index of Cognitive Activity during Reading. Integr. Physiol. Behav. Sci. 1991, 26, 108–116.
- Rakoczi, G.; Pohl, M. Visualisation and Analysis of Multiuser Gaze Data: Eye Tracking Usability Studies in the Special Context of e-Learning. In Proceedings of the 2012 IEEE 12th International Conference on Advanced Learning Technologies, Rome, Italy, 4–6 July 2012; IEEE: Piscataway, NJ, USA, 2012; pp. 738–739.
- Just, M.A.; Carpenter, P.A. A Theory of Reading: From Eye Fixations to Comprehension. Psychol. Rev. 1980, 87, 329.
- Just, M.A.; Carpenter, P.A. Eye Fixations and Cognitive Processes. Cogn. Psychol. 1976, 8, 441–480.
- Loftus, G.R.; Mackworth, N.H. Cognitive Determinants of Fixation Location during Picture Viewing. J. Exp. Psychol. Hum. Percept. Perform. 1978, 4, 565.
- Goldwater, B.C. Psychological Significance of Pupillary Movements. Psychol. Bull. 1972, 77, 340.
- Suzuki, Y.; Hirayama, K.; Shimomura, T.; Uchiyama, M.; Fujii, H.; Mori, E.; Nishio, Y.; Iizuka, O.; Inoue, R.; Otsuki, M.; et al. Changes in Pupil Diameter Are Correlated with the Occurrence of Pareidolias in Patients with Dementia with Lewy Bodies. NeuroReport 2017, 28, 187.
- Morad, Y.; Lemberg, H.; Yofe, N.; Dagan, Y. Pupillography as an Objective Indicator of Fatigue. Curr. Eye Res. 2000, 21, 535–542.
- Tsai, Y.-F.; Viirre, E.; Strychacz, C.; Chase, B.; Jung, T.-P. Task Performance and Eye Activity: Predicting Behavior Relating to Cognitive Workload. Aviat. Space Environ. Med. 2007, 78, B176–B185.
- Allsop, J.; Gray, R.; Bulthoff, H.H.; Chuang, L. Effects of Anxiety and Cognitive Load on Instrument Scanning Behavior in a Flight Simulation. In Proceedings of the 2016 IEEE Second Workshop on Eye Tracking and Visualization (ETVIS), Baltimore, MD, USA, 23 October 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 55–59.
- Voßkühler, A. OGAMA Description (for Version 2.5). Available online: http://www.ogama.net/sites/default/files/pdf/OGAMA-DescriptionV25.pdf (accessed on 14 January 2022).
- Shasteen, J.R.; Sasson, N.J.; Pinkham, A.E. Eye Tracking the Face in the Crowd Task: Why Are Angry Faces Found More Quickly? PLoS ONE 2014, 9, e93914.
Fixation duration mean based on AOIs (milliseconds):

| AOI | Min | Max | Mean | SD |
|---|---|---|---|---|
| AOI1 FCh | 348 | 769 | 569.2 | 117.8 |
| AOI1 N-S | 364 | 646 | 516.69 | 86.27 |
| AOI2 FCh-based implementation | 331 | 596 | 456.24 | 80.46 |
| AOI2 N-S-based implementation | 328 | 505 | 414.67 | 53.27 |
Number of fixations based on AOIs (count):

| AOI | Min | Max | Mean | SD |
|---|---|---|---|---|
| AOI1 FCh | 126 | 280 | 209.7 | 44.74 |
| AOI1 N-S | 62 | 199 | 130.29 | 38.08 |
| AOI2 FCh-based implementation | 163 | 498 | 339 | 88.35 |
| AOI2 N-S-based implementation | 89 | 299 | 188.9 | 65.69 |
Average pupil diameter based on AOIs (pixels):

| AOI | Min | Max | Mean | SD |
|---|---|---|---|---|
| AOI1 FCh | 39.28 | 54.84 | 46.98 | 4.71 |
| AOI1 N-S | 34.61 | 52.65 | 42.35 | 5.41 |
| AOI2 FCh-based implementation | 38.28 | 55.85 | 46.88 | 5.52 |
| AOI2 N-S-based implementation | 31.61 | 58.97 | 42.75 | 8.62 |
© 2022 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Katona, J. Measuring Cognition Load Using Eye-Tracking Parameters Based on Algorithm Description Tools. Sensors 2022, 22, 912. https://doi.org/10.3390/s22030912