A Fusion Algorithm Based on a Constant Velocity Model for Improving the Measurement of Saccade Parameters with Electrooculography
Abstract
1. Introduction
2. Materials and Methods
2.1. Theoretical Models
2.1.1. Saccadic Eye Movements and EOG Signals
2.1.2. Noise Model
2.1.3. Corneo-Retinal Potential Models for Saccades
Constant Velocity Model
Kalman Filter Model
2.1.4. Other State Estimator Models
Brownian Motion
Constant Acceleration Model
Linear Reciprocal Model
Bandpass Filtering
Sensor Fusion
2.2. Experimental Description
2.2.1. Participants
2.2.2. Apparatus
2.2.3. Procedure
2.2.4. Calibration
2.2.5. Kalman Filter
2.2.6. Study Parameters
3. Results and Discussion
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Appendix A
References
Study Parameter | Symbol | Definition
---|---|---
Amplitude | | Angle of the landing position of the eye relative to the center
Peak velocity | | Average of the maximum velocity peaks at fixation
Latency | | Time taken to initiate the saccade after receiving the cue
Abs. relative error | | Absolute difference between the EOG and EyeLink (EL) magnitudes
Abs. EyeLink error | | Absolute difference between the EL and physical magnitudes
Abs. relative error EOG | | Absolute difference between the EOG and physical magnitudes
Normalized error | | Normalized absolute relative error
Computation time | | Time taken to process the data between markers
Signal-to-noise ratio | | Ratio of the signal power to the noise power
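To make the parameter definitions above concrete, the following is a minimal Python sketch, not the authors' implementation, of how amplitude, peak velocity, and latency could be extracted from a single calibrated gaze-angle trace. The 250 Hz sampling rate, the 30 deg/s onset threshold, and the function and variable names are illustrative assumptions.

```python
import numpy as np

FS = 250.0            # assumed sampling rate (Hz); the study hardware may differ
VEL_THRESHOLD = 30.0  # assumed saccade-onset threshold (deg/s), illustrative only

def saccade_parameters(angle_deg, cue_index):
    """Estimate amplitude, peak velocity, and latency for one saccade trial.

    angle_deg : 1-D array of calibrated horizontal gaze angle (degrees), centered at 0
    cue_index : sample index at which the visual cue was presented
    """
    t = np.arange(len(angle_deg)) / FS
    velocity = np.gradient(angle_deg, t)              # angular velocity (deg/s)

    # Saccade onset: first post-cue sample where speed exceeds the threshold.
    post_cue = np.abs(velocity[cue_index:]) > VEL_THRESHOLD
    if not post_cue.any():
        return None                                   # no saccade detected on this trial
    onset = cue_index + int(np.argmax(post_cue))

    amplitude = float(angle_deg[-1])                  # landing angle relative to center (deg)
    peak_velocity = float(np.max(np.abs(velocity)))   # largest velocity peak (deg/s)
    latency = (onset - cue_index) / FS                # time from cue to saccade onset (s)
    return {"amplitude_deg": amplitude,
            "peak_velocity_deg_s": peak_velocity,
            "latency_s": latency}
```

The same isolated trace segments could also feed the error and SNR measures listed above, provided the thresholds and filtering choices match the paper's processing pipeline.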
PID | BP Out. % | BM Out. % | BM Q | CVM Out. % | CVM Q | CAM Out. % | CAM Q | CAM R | LR * Q | LR * R | LR * Out. %
---|---|---|---|---|---|---|---|---|---|---|---
P20 | 14.08 | 10.06 | 0.01 | 8.66 | 0.5 | 11.57 | 0.5 | 10.53 | 0.5 | 11 | 13
P22 | 9.95 | 7.52 | 0.01 | 8.76 | 0.5 | 7.19 | 0.5 | 15.56 | 0.5 | 16 | 11.1
P23 | 15.25 | 15.16 | 0.01 | 15.70 | 0.5 | 16.19 | 0.5 | 8.06 | 0.5 | 8 | 12.8
P24 | 6.31 | 8.29 | 0.01 | 4.71 | 0.5 | 6.58 | 0.5 | 15.55 | 0.5 | 16 | 7.3
P25 | 14.13 | 17.88 | 0.01 | 17.49 | 0.5 | 13.59 | 0.5 | 16.64 | 0.5 | 17 | 18
P26 | 21.24 | 9.26 | 0.01 | 8.50 | 0.5 | 11.09 | 0.5 | 5.55 | 0.5 | 6 | 10
P27 | 12.52 | 11.54 | 0.01 | 9.84 | 0.5 | 9.15 | 0.5 | 150.40 | 0.5 | 150 | 13.9
P28 | 16.03 | 14.15 | 0.01 | 14.49 | 0.5 | 14.51 | 0.5 | 10.14 | 0.5 | 10 | 14.8
P29 | 14.08 | 11.37 | 0.01 | 15.72 | 0.5 | 14.42 | 0.5 | 21.71 | 0.5 | 22 | 12.1
P32 | 17.10 | 19.24 | 0.01 | 19.20 | 0.5 | 17.43 | 0.5 | 22.85 | 0.5 | 13 | 15
P33 | 7.63 | 6.88 | 0.01 | 9.42 | 0.5 | 10.49 | 0.5 | 173.31 | 0.5 | 173 | 6.4
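As context for the Q and R columns above, the sketch below gives a generic one-dimensional constant-velocity Kalman filter of the kind named in Section 2.1.3, with Q acting as the process-noise intensity and R as the measurement-noise variance. This is a textbook formulation, not the authors' code; the white-acceleration form of the process-noise matrix and the zero initial velocity are assumptions.

```python
import numpy as np

def cv_kalman_filter(z, dt, q, r):
    """Filter a 1-D EOG-derived angle sequence with a constant-velocity model.

    z  : measured eye angle at each sample
    dt : sampling interval (s)
    q  : process-noise intensity (the Q column above)
    r  : measurement-noise variance (the R column above)
    """
    F = np.array([[1.0, dt],
                  [0.0, 1.0]])            # constant-velocity state transition
    H = np.array([[1.0, 0.0]])            # only the angle is measured
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])   # assumed white-acceleration process noise
    R = np.array([[r]])

    x = np.array([[z[0]], [0.0]])         # initial state: measured angle, zero velocity
    P = np.eye(2)
    estimates = []
    for zk in z:
        # Predict
        x = F @ x
        P = F @ P @ F.T + Q
        # Update
        y = np.array([[zk]]) - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        estimates.append(float(x[0, 0]))
    return np.array(estimates)
```

The BM and CAM variants compared in the tables use different state vectors and transition matrices, but the same predict–update cycle applies.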
PID | BP | BM | CAM | LR * | EL | CVM |
---|---|---|---|---|---|---|
P20 | 0.25 | 0.28 | 0.29 | 0.03 | 0.07 | 0.25 |
P22 | 0.21 | 0.23 | 0.20 | 0.01 | 0.08 | 0.18 |
P23 | 0.48 | 0.40 | 0.52 | 0.06 | 0.07 | 0.47 |
P24 | 0.29 | 0.47 | 0.41 | 0.02 | 0.07 | 0.31 |
P25 | 0.12 | 0.08 | 0.09 | 0.03 | 0.08 | 0.09 |
P26 | 0.15 | 0.14 | 0.49 | 0.03 | 0.07 | 0.36 |
P27 | 0.50 | 0.99 | 0.57 | 0.04 | 0.07 | 0.42 |
P28 | 0.23 | 0.24 | 0.30 | 0.04 | 0.07 | 0.27 |
P29 | 0.27 | 1.11 | 0.42 | 0.02 | 0.08 | 0.37 |
P32 | 0.20 | 0.19 | 0.26 | 0.03 | 0.07 | 0.24 |
P33 | 0.17 | 0.37 | 0.25 | 0.02 | 0.08 | 0.25 |
Feature | BP | BM | CAM | LR * | CVM |
---|---|---|---|---|---|
Amplitude | 0.991 | 0.999 | 0.999 | 0.997 | 0.999 |
Error | 0.654 | 0.139 | 0.540 | 0.732 | 0.671 |
Peak Velocity | 0.976 | 0.935 | 0.157 | 0.485 | 0.967 |
Latency | 0.999 | 0.649 | 0.576 | 0.336 | 0.583 |
Improvement (%) | – | 8 | 8 | 7.5 | 8 |
Measurement | BP | BM | CAM | LR | CVM |
---|---|---|---|---|---|
SNR | 10.72 | 12.2 | 11.46 | 9.01 | 11.48 |
 | 0.48 | 0.22 | 0.94 | 0.72 | 0.98
% improvement with respect to BP | − | −13.47 | 95.36 | 49.19 | 104.43
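For reference, the SNR row follows the power-ratio definition in the study-parameter table, and the improvement row is consistent with a relative change against the BP baseline applied to the unlabeled second row. The decibel scaling below is an assumption, since the table does not state units.

```latex
\mathrm{SNR}_{\mathrm{dB}} = 10\,\log_{10}\!\left(\frac{P_{\mathrm{signal}}}{P_{\mathrm{noise}}}\right),
\qquad
\Delta(\%) = \frac{x_{\mathrm{model}} - x_{\mathrm{BP}}}{x_{\mathrm{BP}}} \times 100
```

For example, the CVM entry gives (0.98 − 0.48)/0.48 × 100 ≈ 104%, close to the tabulated 104.43%; the small difference is consistent with rounding of the displayed values.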
Parameters | BP | BM | CAM | CVM | LR *
---|---|---|---|---|---
Mean | 2.85 | 2.85 | 2.85 | 2.85 | 2.85
SD | 0.42 | 0.42 | 0.42 | 0.42 | 0.42
Mean | 6.80 | 5.49 | 4.93 | 4.85 | 6.75
SD | 1.04 | 0.95 | 0.85 | 0.84 | 1.23
Improv. % | - | 19.30 | 27.50 | 28.70 | 1
Mean | 7.57 | 6.76 | 6.18 | 5.88 | 7.83
SD | 1.13 | 1.16 | 0.97 | 0.93 | 1.28
Improv. % | - | 10.70 | 18.40 | 22.30 | 3.43
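The improvement percentages in this last table run in the opposite direction, a reduction in the error means relative to BP; a quick check against the second block of rows supports that reading:

```latex
\text{Improv.}(\%) = \frac{\bar{e}_{\mathrm{BP}} - \bar{e}_{\mathrm{model}}}{\bar{e}_{\mathrm{BP}}} \times 100,
\qquad
\frac{6.80 - 4.85}{6.80} \times 100 \approx 28.7\%
```

which matches the tabulated 28.70% for CVM.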