Design of a Wearable Eye-Movement Detection System Based on Electrooculography Signals and Its Experimental Validation
Abstract
1. Introduction
2. Materials and Methods
2.1. System Overview
2.2. The EOG Signal Acquisition Device
2.2.1. Front-End Filter Circuit
2.2.2. Preamplifier
2.2.3. High-Pass Filter
2.2.4. Microcontroller Unit and Firmware Design
2.2.5. Timer Interrupt
2.2.6. Analog-to-Digital Converter
2.2.7. Moving Average
2.2.8. UART Interface
2.2.9. Power Management
2.2.10. Power Supply Circuit
2.2.11. Charging Circuit
2.2.12. Wireless Transmission
2.3. Software Algorithm Design
2.3.1. Target Types of Eye-Movement Detection
2.3.2. Variation Problem and Electrode Placement
2.3.3. Algorithm
Individual Parameter Calibration
Signal Preprocessing
Feature Extraction
Classification
Details of the Detection Units
2.4. Human–Computer Interface
2.5. Experimental Goals
2.6. Experimental Procedure
2.6.1. Experiment 1
2.6.2. Experiment 2 (Phone-Dialing Experiment)
3. Results
3.1. Experiment 1
3.1.1. Sessions Involving Four Types of Eye Movement with Ag/AgCl Electrodes
3.1.2. Sessions Involving Six Types of Eye Movement with Ag/AgCl Electrodes
3.1.3. Sessions Involving Ten Types of Eye Movement with Ag/AgCl Electrodes
3.1.4. Sessions Involving Ten Types of Eye Movement with Foam Sensors
3.2. Experiment 2 (Phone-Dialing Experiment)
4. Discussion
4.1. Oblique-Direction Error
4.2. High Classification Accuracy
4.3. Robustness in the Vertical Direction
4.4. Errors between Up Saccades and Other Saccades
4.5. High Classification Performance with Wearable Goggles and Foam Sensors
4.6. The Practicality of the Proposed HCI System
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
| Eye Movement | S1 | S2 | S3 | S4 | S5 | S6 | Total |
|---|---|---|---|---|---|---|---|
| Up | 100% | 100% | 100% | 100% | 100% | 100% | 100% |
| Down | 100% | 100% | 90% | 90% | 100% | 100% | 96.7% |
| Left | 80% | 100% | 100% | 80% | 100% | 100% | 95.0% |
| Right | 90% | 90% | 100% | 70% | 90% | 90% | 86.7% |
| Total | 92.5% | 97.5% | 97.5% | 85.0% | 97.5% | 97.5% | 94.6% |
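As a sanity check, the summary figures in the table above can be reproduced from the per-subject cells: each subject's total is the mean over the four movements, and the overall accuracy is the mean over all 24 session scores. A minimal Python sketch (values transcribed from the table; variable names are ours, not the paper's):

```python
# Per-subject classification accuracies (%) for the four-movement
# Ag/AgCl sessions, transcribed from the table above.
results = {
    "Up":    [100, 100, 100, 100, 100, 100],
    "Down":  [100, 100,  90,  90, 100, 100],
    "Left":  [ 80, 100, 100,  80, 100, 100],
    "Right": [ 90,  90, 100,  70,  90,  90],
}

# Column totals: mean accuracy per subject across the four movements.
n_subjects = 6
subject_totals = [
    sum(results[m][s] for m in results) / len(results) for s in range(n_subjects)
]

# Overall accuracy: mean over all 24 session scores.
all_scores = [a for accs in results.values() for a in accs]
overall = sum(all_scores) / len(all_scores)

print(subject_totals)     # [92.5, 97.5, 97.5, 85.0, 97.5, 97.5]
print(round(overall, 1))  # 94.6
```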
| Actual Type \ Classification Type | Up | Down | Left | Right | Unexpected |
|---|---|---|---|---|---|
| Up | 100% | 0% | 0% | 0% | 0% |
| Down | 3.33% | 96.67% | 0% | 0% | 0% |
| Left | 1.67% | 0% | 95% | 0% | 3.33% |
| Right | 5% | 0% | 0% | 86.67% | 8.33% |
| Eye Movement | S1 | S2 | S3 | S4 | S5 | S6 | Average |
|---|---|---|---|---|---|---|---|
| Up | 100% | 70% | 100% | 100% | 100% | 100% | 93.3% |
| Down | 80% | 90% | 70% | 90% | 50% | 100% | 85.0% |
| Left | 100% | 70% | 70% | 80% | 100% | 80% | 88.3% |
| Right | 80% | 80% | 80% | 90% | 90% | 100% | 88.3% |
| Farther Left | 100% | 100% | 70% | 70% | 90% | 90% | 88.3% |
| Farther Right | 90% | 90% | 90% | 80% | 70% | 100% | 86.7% |
| Total | 92% | 83% | 92% | 85% | 83% | 95% | 88.3% |
| Actual Type \ Classification Type | Up | Down | Left | Right | Much Left | Much Right | Unexpected |
|---|---|---|---|---|---|---|---|
| Up | 93.3% | 6.7% | 0.0% | 0.0% | 0.0% | 0.0% | 0.0% |
| Down | 10.0% | 85.0% | 0.0% | 0.0% | 0.0% | 0.0% | 5.0% |
| Left | 10.0% | 1.7% | 88.3% | 0.0% | 0.0% | 0.0% | 0.0% |
| Right | 11.7% | 0.0% | 0.0% | 88.3% | 0.0% | 0.0% | 0.0% |
| Much Left | 3.3% | 3.3% | 5.0% | 0.0% | 88.3% | 0.0% | 0.0% |
| Much Right | 11.7% | 1.7% | 0.0% | 0.0% | 0.0% | 86.7% | 0.0% |
| Eye Movement | S1 | S2 | S3 | S4 | S5 | S6 | Total |
|---|---|---|---|---|---|---|---|
| Up | 100% | 100% | 100% | 100% | 90% | 90% | 96.67% |
| Down | 100% | 100% | 100% | 100% | 100% | 80% | 96.67% |
| Left | 70% | 90% | 100% | 90% | 100% | 100% | 91.67% |
| Right | 90% | 100% | 90% | 100% | 90% | 80% | 91.67% |
| Much Left | 90% | 70% | 100% | 80% | 100% | 90% | 88.33% |
| Much Right | 80% | 40% | 100% | 90% | 90% | 100% | 83.33% |
| Up-Right | 60% | 90% | 90% | 80% | 70% | 100% | 81.67% |
| Up-Left | 80% | 90% | 100% | 70% | 60% | 100% | 83.33% |
| Down-Right | 70% | 100% | 80% | 50% | 80% | 90% | 78.33% |
| Down-Left | 80% | 100% | 100% | 70% | 90% | 70% | 85.00% |
| Total | 82% | 88% | 96% | 83% | 87% | 90% | 87.67% |
| Actual Type \ Classification Type | Up | Down | Left | Right | Much Left | Much Right | Up-Right | Up-Left | Down-Right | Down-Left |
|---|---|---|---|---|---|---|---|---|---|---|
| Up | 96.67% | 3.33% | 0% | 0% | 0% | 0% | 0% | 0% | 0% | 0% |
| Down | 3.33% | 96.67% | 0% | 0% | 0% | 0% | 0% | 0% | 0% | 0% |
| Left | 6.67% | 1.67% | 91.67% | 0% | 0% | 0% | 0% | 0% | 0% | 0% |
| Right | 3.33% | 0% | 1.67% | 91.67% | 0% | 3.33% | 0% | 0% | 0% | 0% |
| Much Left | 3.33% | 0% | 8.33% | 0% | 88.33% | 0% | 0% | 0% | 0% | 0% |
| Much Right | 1.67% | 0% | 0% | 13.33% | 0% | 83.33% | 0% | 0% | 0% | 1.67% |
| Up-Right | 13.33% | 1.67% | 0% | 1.67% | 0% | 1.67% | 81.67% | 0% | 0% | 0% |
| Up-Left | 16.67% | 0% | 0% | 0% | 0% | 0% | 0% | 83.33% | 0% | 0% |
| Down-Right | 1.67% | 8.33% | 0% | 10% | 0% | 0% | 0% | 0% | 78.33% | 1.67% |
| Down-Left | 3.33% | 6.67% | 5% | 0% | 0% | 0% | 0% | 0% | 0% | 85% |
| Eye Movement | S1 | S2 | S3 | S4 | S5 | Total |
|---|---|---|---|---|---|---|
| Up | 90% | 100% | 100% | 100% | 90% | 96.00% |
| Down | 100% | 100% | 90% | 90% | 90% | 94.00% |
| Left | 80% | 100% | 90% | 100% | 90% | 92.00% |
| Right | 90% | 100% | 80% | 90% | 90% | 90.00% |
| Much Left | 100% | 100% | 90% | 100% | 100% | 98.00% |
| Much Right | 100% | 90% | 90% | 100% | 100% | 96.00% |
| Up-Right | 100% | 60% | 80% | 100% | 90% | 86.00% |
| Up-Left | 100% | 90% | 70% | 100% | 70% | 86.00% |
| Down-Right | 90% | 80% | 60% | 40% | 60% | 66.00% |
| Down-Left | 80% | 90% | 80% | 70% | 90% | 82.00% |
| Total | 93% | 91% | 83% | 89% | 87% | 88.60% |
| Actual Type \ Classification Type | Up | Down | Left | Right | Much Left | Much Right | Up-Right | Up-Left | Down-Right | Down-Left |
|---|---|---|---|---|---|---|---|---|---|---|
| Up | 96% | 2% | 2% | 0% | 0% | 0% | 0% | 0% | 0% | 0% |
| Down | 4% | 94% | 0% | 0% | 2% | 0% | 0% | 0% | 0% | 0% |
| Left | 2% | 2% | 92% | 0% | 2% | 0% | 0% | 0% | 0% | 2% |
| Right | 4% | 4% | 0% | 90% | 2% | 0% | 0% | 0% | 0% | 0% |
| Much Left | 2% | 0% | 0% | 0% | 98% | 0% | 0% | 0% | 0% | 0% |
| Much Right | 4% | 0% | 0% | 0% | 0% | 96% | 0% | 0% | 0% | 0% |
| Up-Right | 6% | 0% | 0% | 8% | 0% | 0% | 86% | 0% | 0% | 0% |
| Up-Left | 6% | 0% | 6% | 0% | 2% | 0% | 0% | 86% | 0% | 0% |
| Down-Right | 0% | 2% | 0% | 32% | 0% | 0% | 0% | 0% | 66% | 0% |
| Down-Left | 4% | 0% | 12% | 0% | 2% | 0% | 0% | 0% | 0% | 82% |
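The confusion matrix above can be cross-checked against the per-subject foam-sensor table: the diagonal entries are the per-class accuracies (e.g. Down-right at 66%), each row should account for 100% of that movement's trials, and the mean of the diagonal reproduces the 88.60% overall figure. A minimal Python sketch (data transcribed from the table; names are ours):

```python
# Ten-movement foam-sensor confusion matrix (%), rows = actual movement,
# columns = classified movement, transcribed from the table above.
labels = ["Up", "Down", "Left", "Right", "Much Left", "Much Right",
          "Up-Right", "Up-Left", "Down-Right", "Down-Left"]
confusion = [
    [96,  2,  2,  0, 0, 0,  0,  0,  0, 0],
    [ 4, 94,  0,  0, 2, 0,  0,  0,  0, 0],
    [ 2,  2, 92,  0, 2, 0,  0,  0,  0, 2],
    [ 4,  4,  0, 90, 2, 0,  0,  0,  0, 0],
    [ 2,  0,  0,  0, 98, 0, 0,  0,  0, 0],
    [ 4,  0,  0,  0, 0, 96, 0,  0,  0, 0],
    [ 6,  0,  0,  8, 0, 0, 86,  0,  0, 0],
    [ 6,  0,  6,  0, 2, 0,  0, 86,  0, 0],
    [ 0,  2,  0, 32, 0, 0,  0,  0, 66, 0],
    [ 4,  0, 12,  0, 2, 0,  0,  0,  0, 82],
]

# Per-class accuracy is the diagonal of the confusion matrix.
per_class = {labels[i]: confusion[i][i] for i in range(len(labels))}

# Each row should sum to 100% of that movement's trials.
row_sums = [sum(row) for row in confusion]

# Mean diagonal accuracy, matching the table's overall total.
mean_acc = sum(per_class.values()) / len(per_class)

print(per_class["Down-Right"])  # 66
print(round(mean_acc, 1))       # 88.6
```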
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Lin, C.-T.; Jiang, W.-L.; Chen, S.-F.; Huang, K.-C.; Liao, L.-D. Design of a Wearable Eye-Movement Detection System Based on Electrooculography Signals and Its Experimental Validation. Biosensors 2021, 11, 343. https://doi.org/10.3390/bios11090343