An Intelligent and Low-Cost Eye-Tracking System for Motorized Wheelchair Control
Abstract
1. Introduction
2. Background and Related Works
2.1. State-of-the-Art Eye Tracking Methods
2.2. Existing Eye-Controlled Wheelchair Systems
2.3. Convolutional Neural Networks (CNNs) for Eye Tracking
3. Methodology
3.1. Image Acquisition Frame Design
3.2. Gaze Estimation Algorithm
- Network Architecture Selection
- Data Preprocessing
- Loss Function
- Training Algorithm Selection
- Hyperparameter Settings
- Neurons in the same filter are only connected to local patches of the image to preserve spatial structure.
- Their weights are shared to reduce the total number of the model’s parameters.
- Convolution layer to learn features.
- Pooling (subsampling) layer to reduce the dimensionality of the activation maps.
- Fully-connected layer to equip the network with classification capabilities. The architecture overview is illustrated in Figure 9.
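As a concrete illustration of these three layer types, the sketch below runs a toy forward pass in NumPy: one convolution with shared 3×3 weights applied only to local patches, 4×4 max pooling, and a fully-connected layer. The input size, filter count, and random weights are illustrative assumptions, not the trained network described in this paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(x, kernels):
    """Valid convolution, stride 1: each output neuron sees only a local
    3x3 patch, and the same kernel weights are shared across positions."""
    n_k, kh, kw = kernels.shape
    h, w = x.shape
    out = np.zeros((n_k, h - kh + 1, w - kw + 1))
    for k in range(n_k):
        for i in range(h - kh + 1):
            for j in range(w - kw + 1):
                out[k, i, j] = np.sum(x[i:i + kh, j:j + kw] * kernels[k])
    return out

def max_pool(x, f):
    """Subsample each activation map by taking the max over f x f blocks."""
    n_k, h, w = x.shape
    return x[:, :h // f * f, :w // f * f].reshape(
        n_k, h // f, f, w // f, f).max(axis=(2, 4))

# Toy 16x16 grayscale eye patch (stand-in input; sizes are illustrative).
img = rng.random((16, 16))
feat = np.tanh(conv2d(img, rng.standard_normal((2, 3, 3))))  # 2 maps, 14x14
feat = max_pool(feat, 4)                                     # 2 maps, 3x3
flat = feat.reshape(-1)                                      # 18 features
w_fc = rng.standard_normal((4, flat.size))                   # 4 gaze classes
logits = w_fc @ flat
probs = np.exp(logits) / np.exp(logits).sum()                # softmax
print(probs.shape, float(probs.sum()))
```

Weight sharing is what keeps the parameter count small here: the convolution stage has only 2 × 3 × 3 = 18 weights regardless of image size, while a dense layer over the same input would need thousands.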
3.3. Building a Database for Training and Testing
- Some databases include variations in face position, which would require additional processing stages to localize the eye region; by contrast, the setup in this project assumes a single fixed head pose, regardless of gaze direction.
- Other databases comprise only one gaze direction, which rules out using them to test gaze tracking.
- The dataset is not labeled, and the time and effort needed to label it is comparatively higher than building a similar new dataset.
- One important feature missing from all of the available datasets is the transition between one gaze direction and another. This transition time should be known a priori so that it can be compared with the time needed by each of the proposed algorithms.
- Furthermore, all these datasets lack variations in lighting conditions.
3.4. Safety System—Ultrasonic Sensors
3.5. Modifying the Wheelchair Controller
Joystick Control Mechanism
4. Implementation
4.1. Frame Implementation
4.2. Gaze Estimation Algorithm
4.2.1. Collecting Training Dataset for the CNN—Calibration Phase
4.2.2. Training the CNN
4.3. Modifying the Joystick Controller
4.4. Safety System Implementation
5. Results and Discussion
5.1. Computation Complexity Analysis
5.2. Real-Time Performance
5.3. Classification Results
6. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
Table: CNN architecture parameters.

| Parameter | Value |
|---|---|
| Convolutional layers | 2 |
| Filter size | 3 |
| Stride | 1 |
| Zero padding | 0 |
| Subsampling (pooling) layers | 2 |
| Pooling type | max |
| Subsampling factor (x, y) | 4 |
| Fully-connected layers | 2 |
| Activation function | tanh |
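Given these parameters (3×3 filters, stride 1, no zero padding, subsampling factor 4), the spatial size of the activation maps after each stage can be traced with the standard output-size formula. The 100×100 input resolution below is a hypothetical assumption for illustration only.

```python
def conv_out(n, f=3, s=1, p=0):
    """Spatial size after a valid convolution: floor((n + 2p - f) / s) + 1."""
    return (n + 2 * p - f) // s + 1

def pool_out(n, f=4):
    """Spatial size after subsampling by a factor of f."""
    return n // f

# Hypothetical 100x100 input patch; two conv + max-pool stages per the table.
n = 100
for stage in range(2):
    n = pool_out(conv_out(n))   # 100 -> 98 -> 24 -> 22 -> 5
print(n)  # 5: side length of the final activation maps
```

Each pooling stage shrinks the maps by roughly a factor of 4 per axis, which is what keeps the subsequent fully-connected layers small.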
Table: Confusion matrix of the gaze classifier (%). Rows give the predicted class, columns the actual class (classes ordered Right, Left, Forward, Closed).

| Predicted \ Actual | Right | Left | Forward | Closed |
|---|---|---|---|---|
| Right | 98.75 | 0 | 1.25 | 0 |
| Left | 1.56 | 98.44 | 0 | 0 |
| Forward | 0 | 0 | 100 | 0 |
| Closed | 0 | 0 | 0 | 100 |
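As a quick sanity check, averaging the diagonal of the confusion matrix (the per-class correct-classification rates, under the assumption of balanced test classes and with classes ordered Right, Left, Forward, Closed) reproduces the reported overall accuracy:

```python
import numpy as np

# Rows: predicted class; columns: actual class (values in %).
conf = np.array([
    [98.75,  0.00,   1.25,   0.00],  # Right
    [ 1.56, 98.44,   0.00,   0.00],  # Left
    [ 0.00,  0.00, 100.00,   0.00],  # Forward
    [ 0.00,  0.00,   0.00, 100.00],  # Closed
])
per_class = np.diag(conf)   # correct-classification rate per class
overall = per_class.mean()  # unweighted mean, assuming balanced classes
print(round(float(overall), 1))  # 99.3
```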
Table: Overall performance of the proposed system.

| Metric | Value |
|---|---|
| Accuracy (%) | 99.3 |
| Frame rate (frames/s) | 99 |
| Maximum training time (min) | 1.97 |
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Dahmani, M.; Chowdhury, M.E.H.; Khandakar, A.; Rahman, T.; Al-Jayyousi, K.; Hefny, A.; Kiranyaz, S. An Intelligent and Low-Cost Eye-Tracking System for Motorized Wheelchair Control. Sensors 2020, 20, 3936. https://doi.org/10.3390/s20143936