Remote Eye Gaze Tracking Research: A Comparative Evaluation on Past and Recent Progress
Abstract
1. Introduction
- Collecting more literature on the evolution of REGT systems (the past period, before 2015, and the recent period, after 2015).
- Comprehensively comparing the key concepts and changes recorded in the evolutionary periods of REGT’s hardware setups, software processes, and applications.
- Presenting current issues in REGT systems’ research for future attempts.
2. Hardware Setup
2.1. Interface
2.2. Illumination
2.3. Camera
2.4. Subject
3. Software Process
3.1. Feature-Based Versus Model-Based Methods
3.1.1. Image Acquisition and Pre-Processing
3.1.2. Feature Detection
3.1.3. Feature Extraction
3.1.4. Gaze Calibration and Mapping
- Five-point linear polynomial: Linear polynomial calibration is the simplest approach. The method presents five marker points on a screen for the subject to look at. By looking at and clicking on these points, the mapping between the screen coordinates and the extracted feature parameters is established using the following equation derived in [152]:
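With $(x_e, y_e)$ denoting the extracted feature vector (e.g., the pupil-glint vector) and $(s_x, s_y)$ the screen coordinates, the commonly used linear form is:

$$
s_x = a_0 + a_1 x_e + a_2 y_e, \qquad s_y = b_0 + b_1 x_e + b_2 y_e,
$$

where the six coefficients $a_i$, $b_i$ are fitted to the five calibration pairs by least squares.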
- Nine- or 25-point second-order polynomial: By fitting higher-order polynomials, second-order calibration has been shown to be more accurate than linear calibration [8]. A second-order polynomial calibration function was used with a set of nine calibration points in [25,101] and 25 calibration points in [152]. The polynomial is defined as:
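In its commonly used form, the second-order mapping extends the linear one with cross and squared terms:

$$
\begin{aligned}
s_x &= a_0 + a_1 x_e + a_2 y_e + a_3 x_e y_e + a_4 x_e^2 + a_5 y_e^2,\\
s_y &= b_0 + b_1 x_e + b_2 y_e + b_3 x_e y_e + b_4 x_e^2 + b_5 y_e^2,
\end{aligned}
$$

whose twelve coefficients are overdetermined by the nine (or 25) calibration pairs and again recovered by least squares.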
- Homography matrix: Under homography, the calibration routine captures screen points $\mathbf{s} = (s_x, s_y, 1)^\top$ and their corresponding feature points $\mathbf{e} = (e_x, e_y, 1)^\top$ in homogeneous coordinates. The transformation between the feature points and the screen points is given by:
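In its standard form (with $\simeq$ denoting equality up to scale, since homogeneous coordinates are scale-invariant):

$$
\mathbf{s} \simeq H\,\mathbf{e}, \qquad
H = \begin{pmatrix} h_{11} & h_{12} & h_{13} \\ h_{21} & h_{22} & h_{23} \\ h_{31} & h_{32} & h_{33} \end{pmatrix},
$$

where the 3×3 homography $H$ has eight degrees of freedom and can therefore be estimated from four or more calibration correspondences.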
- Interpolation: The authors of [140] had the subject look at several points on a screen and recorded the corresponding eye feature points and positions; these served as the calibration points. They then computed the gaze coordinates by interpolation (a 2D linear mapping from the eye feature to the gaze on screen). The mapping function is as follows:
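A 2D linear interpolation of this kind commonly takes the per-axis form (the notation is assumed here: $x_e^{\min}, x_e^{\max}$ are the feature coordinates recorded at the extreme calibration points, and $s_x^{\min}, s_x^{\max}$ the corresponding screen coordinates):

$$
s_x = s_x^{\min} + \frac{x_e - x_e^{\min}}{x_e^{\max} - x_e^{\min}}\,\bigl(s_x^{\max} - s_x^{\min}\bigr), \qquad
s_y = s_y^{\min} + \frac{y_e - y_e^{\min}}{y_e^{\max} - y_e^{\min}}\,\bigl(s_y^{\max} - s_y^{\min}\bigr).
$$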
3.1.5. Calibration Error Calculation
3.2. Appearance-Based Methods
3.2.1. Image Acquisition and Pre-Processing
3.2.2. Model Training
- Convolutional Neural Networks
Deep Network Classification | Literature | Year | Input | Output
---|---|---|---|---
Single-region CNN | [114] | 2017 | Full face | Point of gaze (2D)
Multi-region CNN | [17] | 2016 | Right & left eye, face, and face grid | Point of gaze (2D)
Multi-region CNN | [188] | 2017 | Head pose and eye | Point of gaze (2D)
Multi-region CNN | [189] | 2020 | Right & left eye, full face, and face depth | Point of gaze (2D)
Single-region CNN | [53] | 2015 | Double eye and head pose | Gaze angle (3D)
Single-region CNN | [38] | 2019 | Double eye and head pose | Gaze angle (3D)
Single-region CNN | [190] | 2020 | Full face | Gaze angle (3D)
Multi-region CNN | [191] | 2016 | Right & left eye | Gaze angle (3D)
Multi-region CNN | [111] | 2018 | Right & left eye and head pose | Gaze angle (3D)
Multi-region CNN | [112] | 2018 | Right & left eye and face | Gaze angle (3D)
Multi-region CNN | [173] | 2019 | Right & left eye | Gaze angle (3D)
CNN with RNN fusion | [177] | 2018 | Full face, eye region, and facial landmarks | Gaze angle (3D)
CNN with RNN fusion | [176] | 2019 | Left & right eye and face | Gaze angle (3D)
CNN with RNN fusion | [20] | 2019 | Full face | Gaze angle (3D)
CNN with RNN fusion | [192] | 2020 | Right & left eye | Gaze angle (3D)
GAN | [193] | 2020 | Full face | Gaze angle (3D)
GAN | [194] | 2021 | Eye region and head pose | Gaze angle (3D)
CNN with GAN fusion | [195] | 2017 | Eye | Gaze angle (3D)
CNN with GAN fusion | [178,179] | 2020 | Eye | Gaze angle (3D)
- Recurrent Neural Networks
- Generative Adversarial Networks
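To make the single-region versus multi-region distinction in the table above concrete, the following is a minimal PyTorch sketch in the spirit of the multi-region design of [17]; the stream architecture, layer sizes, and fusion by concatenation are illustrative assumptions rather than the published model:

```python
import torch
import torch.nn as nn

class RegionStream(nn.Module):
    """One convolutional stream; each input region gets its own copy."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),
        )

    def forward(self, x):
        return self.features(x).flatten(1)  # (N, 64*4*4)

class MultiRegionGazeNet(nn.Module):
    """Separate streams for left eye, right eye, and face, fused by concatenation."""
    def __init__(self):
        super().__init__()
        self.left, self.right, self.face = RegionStream(), RegionStream(), RegionStream()
        self.head = nn.Sequential(
            nn.Linear(3 * 64 * 4 * 4, 128), nn.ReLU(),
            nn.Linear(128, 2),  # 2D point of gaze (x, y) on the screen
        )

    def forward(self, left, right, face):
        fused = torch.cat([self.left(left), self.right(right), self.face(face)], dim=1)
        return self.head(fused)

# Example: a batch of 8 RGB crops, 64x64 pixels per region.
net = MultiRegionGazeNet()
left = torch.randn(8, 3, 64, 64)
right = torch.randn(8, 3, 64, 64)
face = torch.randn(8, 3, 64, 64)
print(net(left, right, face).shape)  # torch.Size([8, 2])
```

Giving each region its own stream lets every sub-network see its input at full resolution before fusion, which is the motivation reported for multi-region designs.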
3.3. Evaluation and Performance Metrics for REGTs
3.3.1. Precision Evaluation of REGT Systems
3.3.2. Accuracy Evaluation of REGT Systems
Methods | Literature | Accuracy | Hardware Setup | Software Process
---|---|---|---|---
Active light feature-based | [203] | <1° | Desktop, stereo infrared camera, 3 NIR | Purkinje Image, 1 point
Active light feature-based | [24] | 0.9° | Desktop, 1 infrared camera, 2 NIR | PCCR, Multiple points
Active light feature-based | [204] | 96.71% | Desktop, 2 infrared cameras, 4 NIR | PCCR, Multiple points
Active light feature-based | [205] | 10.3 mm | 1 infrared camera, 4 NIR | PCCR, Multiple points
Active light model-based | [206] | <1° | Desktop, 1 infrared camera, 4 NIR | Eye model, 1 point
Active light model-based | [207] | 1° | Desktop, stereo camera, pan-tilt infrared camera, 1 NIR | Eye model, 2 points
Active light model-based | [98] | <1° | Desktop, 1 infrared camera, 2 NIR | Eye model, Multiple points
Passive light feature-based | [208,209] | 1.6° | Desktop, 1 web camera | PC-EC, GP, Grid
Passive light feature-based | [61,101,128] | 1.2°–2.5° | Desktop, 1 web camera | PC-EC, PI, Grid
Passive light feature-based | [210] | >3° | Desktop, 1 web camera | EC, LI
Passive light model-based | [26] | 2.42° | Desktop, 1 web camera | ES-IC, Grid
Passive light model-based | [70] | <1° | Desktop, 1 web camera | ES, Grid
Passive light model-based | [41] | ~500 Hz | Desktop, 2 web cameras | Eye model, Grid
Passive light appearance-based with machine learning | [118] | 1.53° | Desktop, 1 web camera | RF, 25 points
Passive light appearance-based with machine learning | [124] | 2° | Desktop/Handheld, 1 web camera | GP, Grid
Passive light appearance-based with machine learning | [120,121,122] | 2.2°–2.5° | Desktop, 1 web camera | LLI, Grid
Passive light appearance-based with machine learning | [110] | <3.68° | Desktop, 1 web camera | ANN, 50 points
Passive light appearance-based with machine learning | [119,123] | 3.5°–4.3° | Desktop, 1 web camera | LLI, Saliency
Passive light appearance-based with machine learning | [108] | 4.8°–7.5° | Desktop, 1 web camera | KNN, Calibration free
Passive light appearance-based with deep learning | [211] | 7.74° | Handheld, 1 web camera | CNN, Calibration free
Passive light appearance-based with deep learning | [191] | 81.37% | Desktop, 1 web camera | CNN, Calibration free
Passive light appearance-based with deep learning | [17] | 1.71 cm and 2.53 cm | Handheld, 1 web camera | CNN, Calibration free
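Most entries above report accuracy in degrees of visual angle. The following is a minimal sketch (function names and the synthetic data are assumptions) of how angular accuracy (Section 3.3.2) and sample-to-sample precision (Section 3.3.1) are commonly computed from 3D gaze direction vectors:

```python
import numpy as np

def angular_error_deg(g_est, g_true):
    """Per-sample angle (degrees) between estimated and true 3D gaze vectors."""
    g_est = g_est / np.linalg.norm(g_est, axis=1, keepdims=True)
    g_true = g_true / np.linalg.norm(g_true, axis=1, keepdims=True)
    cos = np.clip(np.sum(g_est * g_true, axis=1), -1.0, 1.0)
    return np.degrees(np.arccos(cos))

def accuracy_deg(g_est, g_true):
    """Accuracy: mean angular offset from the ground-truth gaze direction."""
    return angular_error_deg(g_est, g_true).mean()

def precision_rms_deg(g_est):
    """Precision: RMS of the angular distance between successive samples
    recorded while the subject fixates a single target."""
    successive = angular_error_deg(g_est[:-1], g_est[1:])
    return np.sqrt(np.mean(successive ** 2))

# Example with synthetic fixation data (100 samples of a 3D unit gaze vector).
rng = np.random.default_rng(0)
g_true = np.tile([0.0, 0.0, 1.0], (100, 1))
g_est = g_true + 0.01 * rng.standard_normal((100, 3))
print(accuracy_deg(g_est, g_true), precision_rms_deg(g_est))
```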
3.4. Dataset
Benchmarks for Evaluating REGT Performance
- Cross-dataset Evaluation
- Within-dataset Evaluation
- Subject-specific Evaluation
- Cross-device Evaluation
- Robustness Evaluation
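For concreteness, the following is a minimal sketch (subject and dataset identifiers are placeholders) of how the first three protocols differ in the way they split data:

```python
import numpy as np

def within_dataset_split(subject_ids, test_frac=0.2, seed=0):
    """Within-dataset: train and test on disjoint subjects of one dataset."""
    ids = list(np.random.default_rng(seed).permutation(subject_ids))
    n_test = max(1, int(len(ids) * test_frac))
    return ids[n_test:], ids[:n_test]            # train subjects, test subjects

def cross_dataset_split(dataset_a_subjects, dataset_b_subjects):
    """Cross-dataset: train on every subject of dataset A, test on dataset B."""
    return list(dataset_a_subjects), list(dataset_b_subjects)

def subject_specific_split(samples_of_one_subject, test_frac=0.2, seed=0):
    """Subject-specific: a personalized model is trained and tested on
    disjoint samples of the same subject."""
    idx = np.random.default_rng(seed).permutation(len(samples_of_one_subject))
    n_test = max(1, int(len(idx) * test_frac))
    train = [samples_of_one_subject[i] for i in idx[n_test:]]
    test = [samples_of_one_subject[i] for i in idx[:n_test]]
    return train, test

# Example with placeholder subject identifiers.
print(within_dataset_split([f"subj{i:02d}" for i in range(15)]))
```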
4. Applications
5. Summary
- Feature detection and extraction: The accuracy of gaze estimation depends largely on effective feature detection and extraction. Recent learning-based feature detection and extraction have brought remarkable progress for appearance-based gaze estimation, as researchers have continued to demonstrate its effectiveness [170,171,172]. However, the issue of deviations in location and degree of eye openness remains a challenge for this method.
- Variability issues in data collection: Collection of relevant data of varied characteristics has been a challenge for appearance-based gaze estimation. Recent attempts by researchers to create and make available robust datasets have been presented in [20,112,190]. However, the reported accuracies on these datasets are still not satisfactory, and thus require more attention.
- Subject calibration: Gaze estimation methods require a subject calibration process for different individuals. Appearance-based methods have demonstrated less stringent calibration, where only a few samples of calibration data are required before they work for other individuals [185,232]. Even with this development, an appearance-based method that makes the calibration process less stringent, or removes it entirely, does so at the expense of estimation accuracy. Thus, developing truly calibration-free appearance-based methods with high estimation accuracy remains a challenge.
- Head fixation: Although several appearance-based gaze estimation methods have been able to perform well with considerable accuracies, without requiring fixed head poses [93,108,111,113,211], most of them still can only handle small head movements to achieve high accuracy [110,116]. As such, more robust methods that freely allow for head movement are still sought.
- Model input: Researchers are trying to determine whether the choice of model input affects model performance. For instance, does a model that uses both eyes estimate gaze more accurately than one that uses a single eye? There have been several attempts with models that use one (single) eye [188,191], both (double) eyes [17,53,111,112], or full face images [114,190], augmented with other inputs such as head pose and face grid information. At the moment, there is no clear understanding of what constitutes the best or standard model input for deep learning gaze estimation methods; the choice is left to the discretion of researchers and remains one of convenience for the models proposed.
- Resolution of input images: Does training and testing a model with different input image sizes improve performance? In [38,53], a model trained with images of a single size achieved much worse results than one trained on images of multiple sizes. On the other hand, Zhang et al. [190] demonstrated improved performance when training and testing with a single image size at high resolution. Based on this, researchers are proposing methods to handle training with cross-resolution input images.
- Model augmentation: Which step-up techniques have been employed in recent years to reduce gaze estimation error on public datasets? Early attempts used a single-region CNN model for gaze estimation, as demonstrated in [53,114], but recent step-up attempts opt for multi-region CNNs [17,112,173,188], where each input is processed through a separate network, giving each network a higher-resolution view of its input and improving the processing capability of the overall model. Another step-up technique addresses early CNNs' poor generalization under appearance and head pose variations using adversarial learning approaches. As recently attempted in [179,180,181,182], the basic idea is to improve the generalization of a traditional CNN-based gaze estimator by incorporating adversarial nets with the ConvNet; the adversarial nets are commonly used to refine the input image fed into the ConvNet that estimates gaze.
- Data annotation: Training a deep learning gaze estimation model under supervision requires annotated data from which the model learns the task. Grid-based annotation has been widely adopted for this purpose, but how can its effectiveness be judged? It divides the screen into $n_h$ horizontal and $n_v$ vertical divisions, producing $n_h \times n_v$ grid cells as labels. This usually entails more calculation and affects the accuracy of the model's output. Considering these drawbacks, bin-based annotation was proposed in [174] to control the number of labels on the gaze image: it divides the screen into horizontal and vertical bins of a fixed size, producing far fewer labels. Grid-based annotation thus yields more annotation data, whereas bin-based annotation yields less annotation data but requires more processing steps. Owing to the lack of sufficient reports on alternative annotation techniques, it is difficult to judge the effectiveness of grid-based annotation. A sketch contrasting the two schemes is given below.
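A minimal sketch of the two annotation schemes (screen dimensions, division counts, and bin size are assumed parameters; the exact scheme of [174] may differ):

```python
def grid_label(gx, gy, screen_w, screen_h, n_h, n_v):
    """Grid-based annotation: map a gaze point (gx, gy) in pixels to one of
    n_h * n_v cell labels."""
    col = min(int(gx / screen_w * n_h), n_h - 1)
    row = min(int(gy / screen_h * n_v), n_v - 1)
    return row * n_h + col

def bin_label(gx, gy, bin_px):
    """Bin-based annotation (in the spirit of [174]): quantize each axis with
    bins of bin_px pixels, yielding far fewer labels."""
    return int(gx // bin_px), int(gy // bin_px)

# Example: 1920x1080 screen, 32x18 grid (576 labels) vs. 120 px bins (16x9 = 144 labels).
print(grid_label(1000.0, 500.0, 1920, 1080, 32, 18))
print(bin_label(1000.0, 500.0, 120))
```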
6. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
- What Eye-Tracking Can and Can’t Tell You about Attention. Available online: https://www.nmsba.com/buying-neuromarketing/neuromarketing-techniques/what-eye-tracking-can-and-cant-tell-you-about-attention (accessed on 7 October 2019).
- Judd, C.H.; McAllister, C.N.; Steele, W.M. General introduction to a series of studies of eye movements by means of kinetoscopic photographs. Psychol. Rev. Monogr. 1905, 7, 1–16. [Google Scholar]
- Duchowski, A. Eye Tracking Methodology: Theory and Practice, 2nd ed.; Springer: London, UK, 2007. [Google Scholar] [CrossRef]
- Mowrer, O.H.; Theodore, C.R.; Miller, N.E. The corneo-retinal potential difference as the basis of the galvanometric method of recording eye movements. Am. J. Physiol. Leg. Content 1935, 114, 423–428. [Google Scholar] [CrossRef]
- Marg, E. Development of electro-oculography; standing potential of the eye in registration of eye movement. AMA Arch. Ophthalmol. 1951, 45, 169–185. [Google Scholar] [CrossRef]
- Glenstrup, A.; Engell-Nielsen, T. Eye Controlled Media: Present and Future State. Master’s Thesis, University of Copenhagen, Copenhagen, Denmark, 1995. [Google Scholar]
- Yoo, D.H.; Chung, M.J. Non-intrusive eye gaze estimation without knowledge of eye pose. In Proceedings of the 6th IEEE International Conference on Automatic Face and Gesture Recognition, Seoul, Korea, 19 May 2004; pp. 785–790. [Google Scholar] [CrossRef]
- Morimoto, C.H.; Mimica, M.R. Eye gaze tracking techniques for interactive applications. Comput. Vis. Image Underst. 2005, 98, 4–24. [Google Scholar] [CrossRef]
- Rayner, K. Eye movements in reading and information processing: 20 Years of research. Psychol. Bull. 1998, 124, 372–422. [Google Scholar] [CrossRef] [PubMed]
- Young, L.R.; Sheena, D. Survey of eye movement recording methods. Behav. Res. Methods Instrum. 1975, 7, 397–429. [Google Scholar] [CrossRef]
- Eggert, T. Eye movement recordings: Methods. Dev. Ophthalmol. 2007, 40, 15–34. [Google Scholar] [CrossRef]
- Joyce, C.A.; Gorodnitsky, I.F.; King, J.W.; Kutas, M. Tracking eye fixations with electroocular and electroencephalographic recordings. Psychophysiology 2002, 39, 607–618. [Google Scholar] [CrossRef]
- Oeltermann, A.; Ku, S.; Logothetis, N.K. A novel functional magnetic resonance imaging compatible search-coil eye-tracking system. Magn. Reson. Imaging 2007, 25, 913–922. [Google Scholar] [CrossRef]
- Domdei, N.; Linden, M.; Reiniger, J.L.; Holz, F.G.; Harmening, W.M. Eye tracking-based estimation and compensation of chromatic offsets for multi-wavelength retinal microstimulation with foveal cone precision. Biomed. Opt. Express 2019, 10, 4126–4141. [Google Scholar] [CrossRef]
- Reingold, E.M. Eye Tracking Research and Technology: Towards Objective Measurement of Data Quality. Vis. Cogn. 2014, 22, 635–652. [Google Scholar] [CrossRef]
- Huang, Q.; Veeraraghavan, A.; Sabharwal, A. TabletGaze: Dataset and analysis for unconstrained appearance based gaze estimation in mobile tablets. Mach. Vis. Appl. 2017, 28, 445–461. [Google Scholar] [CrossRef]
- Krafka, K.; Khosla, A.; Kellnhofer, P.; Kannan, H.; Bhandarkar, S.; Matusik, W.; Torralba, A. Eye Tracking for Everyone. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 2176–2184. [Google Scholar] [CrossRef]
- Carlin, J.D.; Calder, A.J. The neural basis of eye gaze processing. Curr. Opin. Neurobiol. 2013, 23, 450–455. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Liu, G.; Yu, Y.; Funes-Mora, K.A.; Odobez, J. A Differential Approach for Gaze Estimation. IEEE Trans. Pattern Anal. Mach. Intell. 2019, 43, 1092–1099. [Google Scholar] [CrossRef] [Green Version]
- Kellnhofer, P.; Recasens, A.; Stent, S.; Matusik, W.; Torralba, A. Gaze360: Physically Unconstrained Gaze Estimation in the Wild. In Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), Seoul, Korea, 27 October–2 November 2019; pp. 6911–6920. [Google Scholar] [CrossRef] [Green Version]
- Hoshino, K.; Shimanoe, S.; Nakai, Y.; Noguchi, Y.; Nakamura, M. Estimation of the Line of Sight from Eye Images with Eyelashes. In Proceedings of the 5th International Conference on Intelligent Information Technology (ICIIT 2020), Hanoi, Vietnam, 19–22 February 2020; pp. 116–120. [Google Scholar] [CrossRef]
- Strupczewski, A. Commodity Camera Eye Gaze Tracking. Ph.D. Dissertation, Warsaw University of Technology, Warsaw, Poland, 2016. [Google Scholar]
- Wang, J.; Sung, E. Study on eye gaze estimation. IEEE Trans. Syst. Man Cybern. 2002, 32, 332–350. [Google Scholar] [CrossRef]
- Guestrin, E.D.; Eizenman, M. General theory of remote gaze estimation using the pupil center and corneal reflections. IEEE Trans. Biomed. Eng. 2006, 53, 1124–1133. [Google Scholar] [CrossRef]
- Morimoto, C.H.; Koons, D.; Amir, A.; Flickner, M. Pupil detection and tracking using multiple light sources. Image Vis. Comput. 2000, 18, 331–335. [Google Scholar] [CrossRef]
- Baek, S.; Choi, K.; Ma, C.; Kim, Y.; Ko, S. Eyeball model-based iris center localization for visible image-based eye-gaze tracking systems. IEEE Trans. Consum. Electron. 2013, 59, 415–421. [Google Scholar] [CrossRef]
- Lee, J.W.; Cho, C.W.; Shin, K.Y.; Lee, E.C.; Park, K.R. 3D gaze tracking method using Purkinje images on eye optical model and pupil. Opt. Lasers Eng. 2012, 50, 736–751. [Google Scholar] [CrossRef]
- Sigut, J.; Sidha, S. Iris Center Corneal Reflection Method for Gaze Tracking Using Visible Light. IEEE Trans. Biomed. Eng. 2011, 58, 411–419. [Google Scholar] [CrossRef] [PubMed]
- Murphy-Chutorian, E.; Doshi, A.; Trivedi, M.M. Head Pose Estimation for Driver Assistance Systems: A Robust Algorithm and Experimental Evaluation. In Proceedings of the 2007 IEEE Intelligent Transportation Systems Conference, Washington, DC, USA, 30 September–3 October 2007; pp. 709–714. [Google Scholar] [CrossRef] [Green Version]
- Fu, X.; Guan, X.; Peli, E.; Liu, H.; Luo, G. Automatic Calibration Method for Driver’s Head Orientation in Natural Driving Environment. IEEE Trans. Intell. Transp. Syst. 2013, 14, 303–312. [Google Scholar] [CrossRef]
- Lee, S.J.; Jo, J.; Jung, H.G.; Park, K.R.; Kim, J. Real-Time Gaze Estimator Based on Driver’s Head Orientation for Forward Collision Warning System. IEEE Trans. Intell. Transp. Syst. 2011, 12, 254–267. [Google Scholar] [CrossRef]
- Wang, Y.; Yuan, G.; Mi, Z.; Peng, J.; Ding, X.; Liang, Z.; Fu, X. Continuous Driver’s Gaze Zone Estimation Using RGB-D Camera. Sensors 2019, 19, 1287. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Kaminski, J.Y.; Knaan, D.; Shavit, A. Single image face orientation and gaze detection. Mach. Vis. Appl. 2008, 21, 85. [Google Scholar] [CrossRef]
- Smith, P.; Shah, M.; Lobo, N. Determining driver visual attention with one camera. IEEE Trans. Intell. Transp. Syst. 2003, 4, 205–218. [Google Scholar] [CrossRef] [Green Version]
- Valenti, R.; Sebe, N.; Gevers, T. Combining Head Pose and Eye Location Information for Gaze Estimation. IEEE Trans. Image Process. 2012, 21, 802–815. [Google Scholar] [CrossRef] [Green Version]
- Zhu, Z.; Ji, Q. Eye and gaze tracking for interactive graphic display. Mach. Vis. Appl. 2004, 15, 139–148. [Google Scholar] [CrossRef]
- Lu, F.; Sugano, Y.; Okabe, T.; Sato, Y. Adaptive Linear Regression for Appearance-Based Gaze Estimation. IEEE Trans. Pattern Anal. Mach. Intell. 2014, 36, 2033–2046. [Google Scholar] [CrossRef]
- Zhang, X.; Sugano, Y.; Fritz, M.; Bulling, A. MPIIGaze: Real-World Dataset and Deep Appearance-Based Gaze Estimation. IEEE Trans. Pattern Anal. Mach. Intell. 2019, 41, 162–175. [Google Scholar] [CrossRef] [Green Version]
- Shehu, I.S.; Wang, Y.; Athuman, A.M.; Fu, X. Paradigm Shift in Remote Eye Gaze Tracking Research: Highlights on Past and Recent Progress. In Proceedings of the Future Technologies Conference (FTC 2020), Vancouver, BC, Canada, 5–6 November 2021; Volume 1. [Google Scholar] [CrossRef]
- Zhang, X.; Huang, M.X.; Sugano, Y.; Bulling, A. Training Person-Specific Gaze Estimators from User Interactions with Multiple Devices. In Proceedings of the Conference on Human Factors in Computing Systems (CHI 2018), Montreal, QC, Canada, 21–26 April 2018. [Google Scholar] [CrossRef]
- Zhu, Z.; Ji, Q. Novel Eye Gaze Tracking Techniques under Natural Head Movement. IEEE Trans. Biomed. Eng. 2007, 54, 2246–2260. [Google Scholar] [CrossRef] [PubMed]
- Sticky by Tobii Pro. Available online: https://www.tobiipro.com/product-listing/sticky-by-tobii-pro/ (accessed on 10 October 2019).
- Wood, E.; Bulling, A. EyeTab: Model-based gaze estimation on unmodified tablet computers. In Proceedings of the Symposium on Eye Tracking Research and Applications (ETRA 2014), Safety Harbor, FL, USA, 26–28 March 2014; pp. 207–210. [Google Scholar] [CrossRef]
- Zhang, Y.; Chong, M.K.; Müller, J.; Bulling, A.; Gellersen, H. Eye tracking for public displays in the wild. Pers. Ubiquitous Comput. 2015, 19, 967–981. [Google Scholar] [CrossRef]
- The iMotions Screen-Based Eye Tracking Module. Available online: https://imotions.com/blog/screen-based-eye-tracking-module/ (accessed on 5 February 2020).
- Matsuno, S.; Sorao, S.; Susumu, C.; Akehi, K.; Itakura, N.; Mizuno, T.; Mito, K. Eye-movement measurement for operating a smart device: A small-screen line-of-sight input system. In Proceedings of the 2016 IEEE Region 10 Conference (TENCON 2016), Singapore, 22–25 November 2016; pp. 3798–3800. [Google Scholar] [CrossRef]
- How to Get a Good Calibration. Available online: https://www.tobiidynavox.com/supporttraining/eye-tracker-calibration/how-to-get-a-good-calibration/ (accessed on 16 September 2019).
- Drewes, H.; De Luca, A.; Schmidt, A. Eye-Gaze Interaction for Mobile Phones. In Proceedings of the 4th International Conference on Mobile Technology, Applications, and Systems and the 1st International Symposium on Computer Human Interaction in Mobile Technology, Singapore, 10–12 September 2007; pp. 364–371. [Google Scholar] [CrossRef] [Green Version]
- Cheng, H.; Liu, Y.; Fu, W.; Ji, Y.; Yang, L.; Zhao, Y.; Yang, J. Gazing Point Dependent Eye Gaze Estimation. Pattern Recognit. 2017, 71, 36–44. [Google Scholar] [CrossRef]
- Gaze Tracking Technology: The Possibilities and Future. Available online: http://journal.jp.fujitsu.com/en/2014/09/09/01/ (accessed on 17 September 2019).
- Cho, D.; Kim, W. Long-Range Gaze Tracking System for Large Movements. IEEE Trans. Biomed. Eng. 2013, 60, 3432–3440. [Google Scholar] [CrossRef]
- Zhang, X.; Sugano, Y.; Bulling, A. Evaluation of Appearance-Based Methods and Implications for Gaze-Based Applications. In Proceedings of the Conference on Human Factors in Computing Systems (CHI 2019), Glasgow, UK, 4–9 May 2019. [Google Scholar] [CrossRef] [Green Version]
- Zhang, X.; Sugano, Y.; Fritz, M.; Bulling, A. Appearance-based gaze estimation in the wild. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2015), Boston, MA, USA, 7–12 June 2015; pp. 4511–4520. [Google Scholar] [CrossRef] [Green Version]
- Ramanauskas, N. Calibration of Video-Oculographical Eye Tracking System. Electron. Electr. Eng. 2006, 8, 65–68. [Google Scholar]
- Kotus, J.; Kunka, B.; Czyzewski, A.; Szczuko, P.; Dalka, P.; Rybacki, R. Gaze-tracking and Acoustic Vector Sensors Technologies for PTZ Camera Steering and Acoustic Event Detection. In Proceedings of the 2010 Workshops on Database and Expert Systems Applications, Bilbao, Spain, 30 August–3 September 2010; pp. 276–280. [Google Scholar] [CrossRef]
- Ohno, T.; Mukawa, N.; Yoshikawa, A. FreeGaze: A gaze tracking system for everyday gaze interaction. In Proceedings of the Eye Tracking Research & Application Symposium (ETRA 2002), New Orleans, LA, USA, 25–27 March 2002; pp. 125–132. [Google Scholar] [CrossRef]
- Ebisawa, Y.; Satoh, S. Effectiveness of pupil area detection technique using two light sources and image difference method. In Proceedings of the 15th IEEE Engineering Conference in Medicine and Biology Society, San Diego, CA, USA, 31 October 1993; pp. 1268–1269. [Google Scholar] [CrossRef]
- Morimoto, C.H.; Amir, A.; Flickner, M. Detecting eye position and gaze from a single camera and 2 light sources. In Proceedings of the International Conference on Pattern Recognition, Quebec City, QC, Canada, 11–15 August 2002; pp. 314–317. [Google Scholar] [CrossRef]
- Tomono, A.; Iida, M.; Kobayashi, Y. A TV Camera System Which Extracts Feature Points for Non-Contact Eye Movement Detection. In Optics, Illumination, and Image Sensing for Machine Vision IV; Svetkoff, D.J., Ed.; SPIE: Bellingham, WA, USA, 1990; Volume 1194. [Google Scholar] [CrossRef]
- Coutinho, F.L.; Morimoto, C.H. Free head motion eye gaze tracking using a single camera and multiple light sources. In Proceedings of the 19th Brazilian Symposium on Computer Graphics and Image Processing, Amazonas, Brazil, 8–11 October 2006; pp. 171–178. [Google Scholar] [CrossRef]
- Cheung, Y.; Peng, Q. Eye Gaze Tracking with a Web Camera in a Desktop Environment. IEEE Trans. Hum. Mach. Syst. 2015, 45, 419–430. [Google Scholar] [CrossRef]
- Accuracy and Precision Test Method for Remote Eye Trackers: Test Specification (Version: 2.1.1). Available online: https://www.tobiipro.com/siteassets/tobii-pro/learn-and-support/use/what-affects-the-performance-of-an-eye-tracker/tobii-test-specifications-accuracy-and-precision-test-method.pdf/?v=2.1.1 (accessed on 10 February 2011).
- Lupu, R.G.; Ungureanu, F. A survey of eye tracking methods and applications. Bul. Inst. Politeh. Iasi 2013, 3, 72–86. [Google Scholar]
- Kim, S.M.; Sked, M.; Ji, Q. Non-intrusive eye gaze tracking under natural head movements. In Proceedings of the 26th IEEE Engineering Conference in Medicine and Biology Society, San Francisco, CA, USA, 1–4 September 2004; Volume 1, pp. 2271–2274. [Google Scholar] [CrossRef]
- Hennessey, C.; Fiset, J. Long range eye tracking: Bringing eye tracking into the living room. In Proceedings of the Symposium on Eye Tracking Research and Applications (ETRA 2012), Santa Barbara, CA, USA, 28–30 March 2012; pp. 249–252. [Google Scholar] [CrossRef]
- Jafari, R.; Ziou, D. Gaze estimation using Kinect/PTZ camera. In Proceedings of the IEEE International Symposium on Robotic and Sensors Environments, Magdeburg, Germany, 16–18 November 2012; pp. 13–18. [Google Scholar] [CrossRef]
- Lee, H.C.; Lee, W.O.; Cho, C.W.; Gwon, S.Y.; Park, K.R.; Lee, H.; Cha, J. Remote gaze tracking system on a large display. Sensors 2013, 13, 13439–13463. [Google Scholar] [CrossRef]
- Kar, A.; Corcoran, P. A Review and Analysis of Eye-Gaze Estimation Systems, Algorithms and Performance Evaluation Methods in Consumer Platforms. IEEE Access 2017, 5, 16495–16519. [Google Scholar] [CrossRef]
- Mansouryar, M.; Steil, J.; Sugano, Y.; Bulling, A. 3D Gaze Estimation from 2D Pupil Positions on Monocular Head-Mounted Eye Trackers. In Proceedings of the 9th ACM International Symposium on Eye Tracking Research & Applications (ETRA 2016), Charleston, SC, USA, 14–17 March 2016; pp. 197–200. [Google Scholar] [CrossRef] [Green Version]
- Venkateswarlu, R. Eye gaze estimation from a single image of one eye. In Proceedings of the 9th IEEE International Conference on Computer Vision, Nice, France, 13–16 October 2003; Volume 1, pp. 136–143. [Google Scholar] [CrossRef] [Green Version]
- Ferhat, O.; Vilariño, F. Low Cost Eye Tracking: The Current Panorama. Comput. Intell. Neurosci. 2016, 2016, 8680541. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Wang, X.; Liu, K.; Qian, X. A Survey on Gaze Estimation. In Proceedings of the 10th International Conference on Intelligent Systems and Knowledge Engineering (ISKE 2015), Taipei, Taiwan, 24–27 November 2015; pp. 260–267. [Google Scholar] [CrossRef]
- Ki, J.; Kwon, Y.M. 3D Gaze Estimation and Interaction. In Proceedings of the IEEE 3DTV Conference: The True Vision—Capture, Transmission and Display of 3D Video, Istanbul, Turkey, 28–30 May 2008; pp. 373–376. [Google Scholar] [CrossRef]
- Model, D.; Eizenman, M. User-calibration-free remote eye-gaze tracking system with extended tracking range. In Proceedings of the 24th Canadian Conference on Electrical and Computer Engineering (CCECE 2011), Niagara Falls, ON, Canada, 8–11 May 2011; pp. 1268–1271. [Google Scholar] [CrossRef]
- Pichitwong, W.; Chamnongthai, K. 3-D gaze estimation by stereo gaze direction. In Proceedings of the 13th International Conference on Electrical Engineering/Electronics, Computer, Telecommunications and Information Technology (ECTI-CON 2016), Chiang Mai, Thailand, 28 June–1 July 2016; pp. 1–4. [Google Scholar] [CrossRef]
- Zhu, Z.; Ji, Q. Eye gaze tracking under natural head movements. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR 2005), San Diego, CA, USA, 20–25 June 2005; Volume 1, pp. 918–923. [Google Scholar] [CrossRef]
- Wen, Q.; Bradley, D.; Beeler, T.; Park, S.; Hilliges, O.; Yong, J.; Xu, F. Accurate real-time 3D Gaze Tracking Using a Lightweight Eyeball Calibration. Comput. Graph. Forum 2020, 39, 475–485. [Google Scholar] [CrossRef]
- Wang, K.; Ji, Q. Real Time Eye Gaze Tracking with 3D Deformable Eye-Face Model. In Proceedings of the IEEE International Conference on Computer Vision, Venice, Italy, 22–29 October 2017; pp. 1003–1011. [Google Scholar] [CrossRef]
- Funes Mora, K.A.; Odobez, J.M. Geometric Generative Gaze Estimation (G3E) for Remote RGB-D Cameras. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Columbus, OH, USA, 23–28 June 2014; pp. 1773–1780. [Google Scholar] [CrossRef] [Green Version]
- Li, Y.; Monaghan, D.S.; O’Connor, N.E. Real-Time Gaze Estimation Using a Kinect and a HD Webcam. In Proceedings of the International Conference on Multimedia Modeling, Dublin, Ireland, 6–10 January 2014; pp. 506–517. [Google Scholar] [CrossRef] [Green Version]
- Chen, J.; Ji, Q. 3D gaze estimation with a single camera without IR illumination. In Proceedings of the 19th International Conference on Pattern Recognition, Tampa, FL, USA, 8–11 December 2008; pp. 1–4. [Google Scholar] [CrossRef]
- Sun, L.; Liu, Z.; Sun, M. Real time gaze estimation with a consumer depth camera. Inf. Sci. 2015, 320, 346–360. [Google Scholar] [CrossRef]
- Xiong, X.; Cai, Q.; Liu, Z.; Zhang, Z. Eye Gaze Tracking Using an RGBD Camera: A Comparison with an RGB Solution. In Proceedings of the ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp 2014), Seattle, WA, USA, 13–17 September 2014; pp. 1113–1121. [Google Scholar] [CrossRef]
- Pieszala, J.; Diaz, G.; Pelz, J.; Speir, J.; Bailey, R. 3D Gaze Point Localization and Visualization Using LiDAR-based 3D reconstructions. In Proceedings of the ACM Symposium on Eye Tracking Research & Applications (ETRA 2016), Charleston, SC, USA, 14–17 March 2016; pp. 201–204. [Google Scholar] [CrossRef] [Green Version]
- Wang, H.; Pi, J.; Qin, T.; Shen, S.; Shi, B.E. SLAM-based localization of 3D gaze using a mobile eye tracker. In Proceedings of the ACM Symposium on Eye Tracking Research & Applications (ETRA 2018), Warsaw, Poland, 14–17 June 2018; pp. 1–5. [Google Scholar] [CrossRef]
- How to Position Participants and the Eye Tracker. Available online: https://www.tobiipro.com/learnand-support/learn/steps-in-an-eye-tracking-study/run/how-to-position-the-participant-and-the-eye-tracker/ (accessed on 26 December 2019).
- Hansen, D.W.; Ji, Q. In the Eye of the Beholder: A Survey of Models for Eyes and Gaze. IEEE Trans. Pattern Anal. Mach. Intell. 2010, 32, 478–500. [Google Scholar] [CrossRef]
- Sireesha, M.V.; Vijaya, P.A.; Chellamma, K. A Survey on Gaze Estimation Techniques. In Proceedings of the International Conference on VLSI, Communication, Advanced Devices, Signals & Systems and Networking (VCASAN-2013), Bangalore, India, 17–19 June 2013; pp. 353–361. [Google Scholar] [CrossRef]
- Jiang, J.; Zhou, X.; Chan, S.; Chen, S. Appearance-Based Gaze Tracking: A Brief Review. In Proceedings of the International Conference on Intelligent Robotics and Applications, Shenyang, China, 8–11 August 2019; pp. 629–640. [Google Scholar] [CrossRef]
- Lindén, E.; Sjöstrand, J.; Proutiere, A. Learning to Personalize in Appearance-Based Gaze Tracking. In Proceedings of the IEEE/CVF International Conference on Computer Vision Workshop (ICCVW 2019), Seoul, Korea, 27–28 October 2019; pp. 1140–1148. [Google Scholar] [CrossRef] [Green Version]
- Al-Rahayfeh, A.; Faezipour, M. Eye Tracking and Head Movement Detection: A State-of-Art Survey. IEEE J. Transl. Eng. Health Med. 2013, 1, 2100212. [Google Scholar] [CrossRef]
- Tonsen, M.; Steil, J.; Sugano, Y.; Bulling, A. InvisibleEye: Mobile Eye Tracking Using Multiple Low-Resolution Cameras and Learning-Based Gaze Estimation. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2017, 1, 1–21. [Google Scholar] [CrossRef]
- Wood, E.; Baltrušaitis, T.; Morency, L.P.; Robinson, P.; Bulling, A. Learning an Appearance Based Gaze Estimator from One Million Synthesised Images. In Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications (ETRA 2016), Charleston, SC, USA, 14–17 March 2016; pp. 131–138. [Google Scholar] [CrossRef] [Green Version]
- Blignaut, P. Mapping the Pupil-Glint Vector to Gaze Coordinates in a Simple Video-Based Eye Tracker. J. Eye Mov. Res. 2014, 7, 1–11. [Google Scholar] [CrossRef]
- Cerrolaza, J.; Villanueva, A.; Cabeza, R. Taxonomic Study of Polynomial Regressions Applied to the Calibration of Video-Oculographic Systems. In Proceedings of the Eye Tracking Research and Applications Symposium (ETRA 2008), Savannah, GA, USA, 26–28 March 2008; pp. 259–266. [Google Scholar] [CrossRef]
- Cherif, Z.R.; Nait-Ali, A.; Motsch, J.F.; Krebs, M.O. An adaptive calibration of an infrared light device used for gaze tracking. In Proceedings of the 19th IEEE Instrumentation and Measurement Technology Conference (IEEE Cat. No.00CH37276), Anchorage, AK, USA, 21–23 May 2002; Volume 2, pp. 1029–1033. [Google Scholar] [CrossRef]
- Jian-nan, C.; Chuang, Z.; Yan-tao, Y.; Yang, L.; Han, Z. Eye Gaze Calculation Based on Nonlinear Polynomial and Generalized Regression Neural Network. In Proceedings of the Fifth International Conference on Natural Computation, Tianjian, China, 14–16 August 2009; Volume 3, pp. 617–623. [Google Scholar] [CrossRef]
- Hennessey, C.; Noureddin, B.; Lawrence, P. A Single Camera Eye-Gaze Tracking System with Free Head Motion. In Proceedings of the Symposium on Eye Tracking Research & Applications (ETRA 2006), San Diego, CA, USA, 27–29 March 2006; pp. 87–94. [Google Scholar] [CrossRef]
- Meyer, A.; Böhme, M.; Martinetz, T.; Barth, E. A Single-Camera Remote Eye Tracker. In Proceedings of the International Tutorial and Research Workshop on Perception and Interactive Technologies for Speech-Based Systems, Kloster Irsee, Germany, 19–21 June 2006; pp. 208–211. [Google Scholar] [CrossRef]
- Jian-nan, C.; Peng-yi, Z.; Si-yi, Z.; Chuang, Z.; Ying, H. Key Techniques of Eye Gaze Tracking Based on Pupil Corneal Reflection. In Proceedings of the WRI Global Congress on Intelligent Systems, Xiamen, China, 19–21 May 2009; Volume 2, pp. 133–138. [Google Scholar] [CrossRef]
- Cai, H.; Yu, H.; Zhou, X.; Liu, H. Robust Gaze Estimation via Normalized Iris Center-Eye Corner Vector. In Proceedings of the International Conference on Intelligent Robotics and Applications, Tokyo, Japan, 22–24 August 2016; Volume 9834, pp. 300–309. [Google Scholar] [CrossRef] [Green Version]
- Wu, H.; Chen, Q.; Wada, T. Conic-based algorithm for visual line estimation from one image. In Proceedings of the Sixth IEEE International Conference on Automatic Face and Gesture Recognition, Seoul, Korea, 19 May 2004; pp. 260–265. [Google Scholar] [CrossRef]
- Hansen, D.W.; Pece, A. Eye typing off the shelf. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Washington, DC, USA, 27 June–2 July 2004; Volume 2, p. II. [Google Scholar] [CrossRef]
- Yamazoe, H.; Utsumi, A.; Yonezawa, T.; Abe, S. Remote and head-motion-free gaze tracking for real environments with automated head-eye model calibrations. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, Anchorage, AK, USA, 23–28 June 2008; pp. 1–6. [Google Scholar] [CrossRef] [Green Version]
- Huang, S.; Wu, Y.; Hung, W.; Tang, C. Point-of-Regard Measurement via Iris Contour with One Eye from Single Image. In Proceedings of the IEEE International Symposium on Multimedia, Taichung, Taiwan, 13–15 December 2010; pp. 336–341. [Google Scholar] [CrossRef]
- Ohno, T.; Mukawa, N.; Kawato, S. Just Blink Your Eyes: A Head-Free Gaze Tracking System. In Proceedings of the CHI ’03 Extended Abstracts on Human Factors in Computing Systems, Ft. Lauderdale, FL, USA, 5–10 April 2003; pp. 950–957. [Google Scholar] [CrossRef]
- Wu, H.; Kitagawa, Y.; Wada, T.; Kato, T.; Chen, Q. Tracking Iris Contour with a 3D Eye-Model for Gaze Estimation. In Proceedings of the Asian Conference on Computer Vision, Tokyo, Japan, 18–22 November 2007; pp. 688–697. [Google Scholar] [CrossRef]
- Wang, Y.; Zhao, T.; Ding, X.; Peng, J.; Bian, J.; Fu, X. Learning a gaze estimator with neighbor selection from large-scale synthetic eye images. Knowl.-Based Syst. 2017, 139, 41–49. [Google Scholar] [CrossRef]
- Baluja, S.; Pomerleau, D. Non-Intrusive Gaze Tracking Using Artificial Neural Networks. Tech. Rep. 1994, 1–16. Available online: https://www.aaai.org/Papers/Symposia/Fall/1993/FS-93-04/FS93-04-032.pdf (accessed on 23 August 2021).
- Sewell, W.; Komogortsev, O. Real-Time Eye Gaze Tracking with an Unmodified Commodity Webcam Employing a Neural Network. In Proceedings of the CHI ’10 Extended Abstracts on Human Factors in Computing Systems, Atlanta, GA, USA, 10–15 April 2010; pp. 3739–3744. [Google Scholar] [CrossRef]
- Cheng, Y.; Lu, F.; Zhang, X. Appearance-Based Gaze Estimation via Evaluation-Guided Asymmetric Regression. In Computer Vision—ECCV; Ferrari, V., Hebert, M., Sminchisescu, C., Weiss, Y., Eds.; Springer International Publishing: Cham, Switzerland, 2018; pp. 105–121. [Google Scholar] [CrossRef]
- Fischer, T.; Chang, H.J.; Demiris, Y. RT-GENE: Real-Time Eye Gaze Estimation in Natural Environments. In Computer Vision—ECCV; Ferrari, V., Hebert, M., Sminchisescu, C., Weiss, Y., Eds.; Springer International Publishing: Cham, Switzerland, 2018; pp. 339–357. [Google Scholar] [CrossRef] [Green Version]
- Yu, Y.; Liu, G.; Odobez, J.M. Deep Multitask Gaze Estimation with a Constrained Landmark-Gaze Model. In Computer Vision—ECCV; Ferrari, V., Hebert, M., Sminchisescu, C., Weiss, Y., Eds.; Springer International Publishing: Cham, Switzerland, 2019; pp. 456–474. [Google Scholar] [CrossRef]
- Zhang, X.; Sugano, Y.; Fritz, M.; Bulling, A. It’s Written All Over Your Face: Full-Face Appearance-Based Gaze Estimation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), Honolulu, HI, USA, 21–26 July 2017; pp. 2299–2308. [Google Scholar] [CrossRef] [Green Version]
- Huang, Y.; Dong, X.; Hao, M. Eye gaze calibration based on support vector regression machine. In Proceedings of the 9th World Congress on Intelligent Control and Automation, Taipei, Taiwan, 21–25 June 2011; pp. 454–456. [Google Scholar] [CrossRef]
- Zhu, Z.; Ji, Q.; Bennett, K.P. Nonlinear Eye Gaze Mapping Function Estimation via Support Vector Regression. In Proceedings of the 18th International Conference on Pattern Recognition (ICPR’06), Hong Kong, China, 20–24 August 2006; Volume 1, pp. 1132–1135. [Google Scholar] [CrossRef]
- Sugano, Y.; Matsushita, Y.; Sato, Y. Learning-by-Synthesis for Appearance-Based 3D Gaze Estimation. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Columbus, OH, USA, 23–28 June 2014; pp. 1821–1828. [Google Scholar] [CrossRef]
- Wang, Y.; Shen, T.; Yuan, G.; Bian, J.; Fu, X. Appearance-based Gaze Estimation using Deep Features and Random Forest Regression. Knowl.-Based Syst. 2016, 110, 293–301. [Google Scholar] [CrossRef]
- Alnajar, F.; Gevers, T.; Valenti, R.; Ghebreab, S. Calibration-Free Gaze Estimation Using Human Gaze Patterns. In Proceedings of the IEEE International Conference on Computer Vision, Sydney, Australia, 1–8 December 2013; pp. 137–144. [Google Scholar]
- Lu, F.; Okabe, T.; Sugano, Y.; Sato, Y. Learning gaze biases with head motion for head pose-free gaze estimation. Image Vis. Comput. 2014, 32, 169–179. [Google Scholar] [CrossRef]
- Lu, F.; Sugano, Y.; Okabe, T.; Sato, Y. Head pose-free appearance-based gaze sensing via eye image synthesis. In Proceedings of the 21st International Conference on Pattern Recognition, Tsukuba, Japan, 11–15 November 2012; pp. 1008–1011. [Google Scholar]
- Lu, F.; Sugano, Y.; Okabe, T.; Sato, T. Gaze Estimation from Eye Appearance: A Head Pose-Free Method via Eye Image Synthesis. IEEE Trans. Image Process. 2015, 24, 3680–3693. [Google Scholar] [CrossRef] [PubMed]
- Sugano, Y.; Matsushita, Y.; Sato, Y. Appearance-Based Gaze Estimation Using Visual Saliency. IEEE Trans. Pattern Anal. Mach. Intell. 2013, 35, 329–341. [Google Scholar] [CrossRef]
- Ferhat, O.; Vilariño, F.; Sánchez, F.J. A cheap portable eye-tracker solution for common setups. J. Eye Mov. Res. 2014, 7, 1–10. [Google Scholar] [CrossRef]
- Williams, O.; Blake, A.; Cipolla, R. Sparse and semi-supervised visual mapping with the S^3GP. In Proceedings of the 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR’06), New York, NY, USA, 17–22 June 2006; Volume 1, pp. 230–237. [Google Scholar] [CrossRef]
- Sesma-Sanchez, L.; Villanueva, A.; Cabeza, R. Gaze Estimation Interpolation Methods Based on Binocular Data. IEEE Trans. Biomed. Eng. 2012, 59, 2235–2243. [Google Scholar] [CrossRef]
- Shih, S.W.; Wu, Y.T.; Liu, J. A calibration-free gaze tracking technique. In Proceedings of the 15th International Conference on Pattern Recognition, Barcelona, Spain, 3–7 September 2000; Volume 4, pp. 201–204. [Google Scholar] [CrossRef]
- Sesma, L.; Villanueva, A.; Cabeza, R. Evaluation of Pupil Center-Eye Corner Vector for Gaze Estimation Using a Web Cam. In Proceedings of the Symposium on Eye Tracking Research and Applications, Santa Barbara, CA, USA, 28–30 March 2012; pp. 217–220. [Google Scholar] [CrossRef]
- Guo, Z.; Zhou, Q.; Liu, Z. Appearance-based gaze estimation under slight head motion. Multimed. Tools Appl. 2016, 76, 2203–2222. [Google Scholar] [CrossRef]
- Tan, K.H.; Kriegman, D.J.; Ahuja, N. Appearance-based eye gaze estimation. In Proceedings of the 6th IEEE Workshop on Applications of Computer Vision, Orlando, FL, USA, 4 December 2002; pp. 191–195. [Google Scholar] [CrossRef]
- Lukander, K. Measuring Gaze Point on Handheld Mobile Devices. In Proceedings of the CHI ’04 Extended Abstracts on Human Factors in Computing Systems, Vienna, Austria, 24–29 April 2004; p. 1556. [Google Scholar] [CrossRef]
- Martinez, F.; Carbone, A.; Pissaloux, E. Gaze estimation using local features and non-linear regression. In Proceedings of the IEEE International Conference on Image Processing, Orlando, FL, USA, 30 September–3 October 2012; pp. 1961–1964. [Google Scholar] [CrossRef]
- Majaranta, P.; Räihä, K.J. Twenty years of eye typing: Systems and design issues. In Proceedings of the Eye Tracking Research and Applications Symposium, New Orleans, LA, USA, 25–27 March 2002; pp. 15–22. [Google Scholar] [CrossRef]
- Kawato, S.; Tetsutani, N. Detection and tracking of eyes for gaze-camera control. Image Vis. Comput. 2004, 22, 1031–1038. [Google Scholar] [CrossRef]
- Long, X.; Tonguz, O.K.; Kiderman, A. A High Speed Eye Tracking System with Robust Pupil Center Estimation Algorithm. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Lyon, France, 22–26 August 2007; pp. 3331–3334. [Google Scholar] [CrossRef]
- Alioua, N.; Amine, A.; Rziza, M.; Aboutajdine, D. Eye state analysis using iris detection based on Circular Hough Transform. In Proceedings of the International Conference on Multimedia Computing and Systems, Ouarzazate, Morocco, 7–9 April 2011; pp. 1–5. [Google Scholar] [CrossRef]
- Juhong, A.; Treebupachatsakul, T.; Pintavirooj, C. Smart eye-tracking system. In Proceedings of the International Workshop on Advanced Image Technology, Chiang Mai, Thailand, 7–9 January 2018; pp. 1–4. [Google Scholar] [CrossRef]
- Söylemez, Ö.F.; Ergen, B. Circular hough transform based eye state detection in human face images. In Proceedings of the Signal Processing and Communications Applications Conference, Haspolat, Turkey, 24–26 April 2013; pp. 1–4. [Google Scholar] [CrossRef]
- Kocejko, T.; Bujnowski, A.; Wtorek, J. Eye mouse for disabled. In Proceedings of the Conference on Human System Interactions, Krakow, Poland, 25–27 May 2009; pp. 199–202. [Google Scholar] [CrossRef]
- Zhu, J.; Yang, J. Subpixel Eye Gaze Tracking. In Proceedings of the Fifth IEEE International Conference on Automatic Face and Gesture Recognition, Washington, DC, USA, 21 May 2002; pp. 131–136. [Google Scholar] [CrossRef]
- Shubhangi, T.; Meshram, P.M.; Rahangdale, C.; Shivhare, P.; Jindal, L. Eye Gaze Detection Technique to Interact with Computer. Int. J. Eng. Res. Comput. Sci. Eng. 2015, 2, 92–96. [Google Scholar]
- Viola, P.; Jones, M. Rapid object detection using a boosted cascade of simple features. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Kauai, HI, USA, 8–14 December 2001; Volume 1, p. 1. [Google Scholar] [CrossRef]
- Villanueva, A.; Cerrolaza, J.J.; Cabeza, R. Geometry Issues of Gaze Estimation. In Advances in Human Computer Interaction; Pinder, S., Ed.; InTechOpen: London, UK, 2008. [Google Scholar]
- Świrski, L.; Bulling, A.; Dodgson, N. Robust real-time pupil tracking in highly off-axis images. In Proceedings of the Eye Tracking Research and Applications Symposium (ETRA), Santa Barbara, CA, USA, 28–30 March 2012; pp. 173–176. [Google Scholar] [CrossRef] [Green Version]
- Li, D.; Winfield, D.; Parkhurst, D.J. Starburst: A hybrid algorithm for video-based eye tracking combining feature-based and model-based approaches. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR’05)—Workshops, San Diego, CA, USA, 21–23 September 2005; p. 79. [Google Scholar] [CrossRef]
- Santini, T.; Fuhl, W.; Kasneci, E. PuRe: Robust pupil detection for real-time pervasive eye tracking. Comput. Vis. Image Underst. 2018, 170, 40–50. [Google Scholar] [CrossRef] [Green Version]
- Fuhl, W.; Santini, T.C.; Kübler, T.; Kasneci, E. ElSe: Ellipse selection for robust pupil detection in real-world environments. In Proceedings of the ETRA ‘16: 2016 Symposium on Eye Tracking Research and Applications, Charleston, SC, USA, 14–17 March 2016; pp. 123–130. [Google Scholar] [CrossRef]
- Kassner, M.; Patera, W.; Bulling, A. Pupil: An open source platform for pervasive eye tracking and mobile gaze-based interaction. In Proceedings of the UbiComp ‘14: The 2014 ACM Conference on Ubiquitous Computing, Seattle, WA, USA, 13–17 September 2014; pp. 1151–1160. [Google Scholar] [CrossRef]
- Fuhl, W.; Kübler, T.; Sippel, K.; Rosenstiel, W.; Kasneci, E. Excuse: Robust pupil detection in real-world scenarios. In Proceedings of the International Conference on Computer Analysis of Images and Patterns, Valletta, Malta, 2–4 September 2015; pp. 39–51. [Google Scholar] [CrossRef]
- Fitzgibbon, A.; Pilu, M.; Fisher, R.B. Direct least square fitting of ellipses. IEEE Trans. Pattern Anal. Mach. Intell. 1999, 21, 476–480. [Google Scholar] [CrossRef] [Green Version]
- Fischler, M.A.; Bolles, R.C. Random Sample Consensus: A Paradigm for Model Fitting with applications to image analysis and automated cartography. Commun. ACM 1981, 24, 381–395. [Google Scholar] [CrossRef]
- Ramanauskas, N.; Daunys, G.; Dervinis, D. Investigation of Calibration Techniques in Video Based Eye Tracking System. In Proceedings of the 11th international conference on Computers Helping People with Special Needs, Linz, Austria, 9–11 July 2008; pp. 1208–1215. [Google Scholar] [CrossRef]
- Hansen, J.P.; Mardanbegi, D.; Biermann, F.; Bækgaard, P. A gaze interactive assembly instruction with pupillometric recording. Behav. Res. Methods 2018, 50, 1723–1733. [Google Scholar] [CrossRef]
- Hansen, D.W.; Hammoud, R.I. An improved likelihood model for eye tracking. Comput. Vis. Image Underst. 2007, 106, 220–230. [Google Scholar] [CrossRef]
- Lemley, J.; Kar, A.; Drimbarean, A.; Corcoran, P. Convolutional Neural Network Implementation for Eye-Gaze Estimation on Low-Quality Consumer Imaging Systems. IEEE Trans. Consum. Electron. 2019, 65, 179–187. [Google Scholar] [CrossRef] [Green Version]
- Arar, N.M.; Gao, H.; Thiran, J.P. A Regression-Based User Calibration Framework for Real-Time Gaze Estimation. IEEE Trans. Circuits Syst. Video Technol. 2016, 27, 2623–2638. [Google Scholar] [CrossRef] [Green Version]
- Dubey, N.; Ghosh, S.; Dhall, A. Unsupervised learning of eye gaze representation from the web. In Proceedings of the 2019 International Joint Conference on Neural Networks, Budapest, Hungary, 14–19 July 2019; pp. 1–7. [Google Scholar] [CrossRef] [Green Version]
- Yu, Y.; Odobez, J.M. Unsupervised representation learning for gaze estimation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Columbus, OH, USA, 23–28 June 2020; pp. 7314–7324. [Google Scholar]
- Chen, Z.; Deng, D.; Pi, J.; Shi, B.E. Unsupervised Outlier Detection in Appearance-Based Gaze Estimation. In Proceedings of the International Conference on Computer Vision Workshop (ICCVW), Seoul, Korea, 27–28 October 2019; pp. 1088–1097. [Google Scholar] [CrossRef]
- Akashi, T.; Wakasa, Y.; Tanaka, K.; Karungaru, S.; Fukumi, M. Using Genetic Algorithm for Eye Detection and Tracking in Video Sequence. J. Syst. Cybern. Inform. 2007, 5, 72–78. [Google Scholar]
- Amarnag, S.; Kumaran, R.S.; Gowdy, J.N. Real time eye tracking for human computer interfaces. In Proceedings of the International Conference on Multimedia and Expo. ICME ’03, Baltimore, MD, USA, 6–9 July 2003; Volume 3, p. III-557. [Google Scholar] [CrossRef]
- Haro, A.; Flickner, M.; Essa, I. Detecting and tracking eyes by using their physiological properties, dynamics, and appearance. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Hilton Head, SC, USA, 15 June 2000; Volume 1, pp. 163–168. [Google Scholar] [CrossRef] [Green Version]
- Coetzer, R.C.; Hancke, G.P. Eye detection for a real-time vehicle driver fatigue monitoring system. In Proceedings of the IEEE Intelligent Vehicles Symposium, Baden-Baden, Germany, 5–9 June 2011; pp. 66–71. [Google Scholar] [CrossRef]
- Park, S.H.; Yoon, H.S.; Park, K.R. Faster R-CNN and Geometric Transformation-Based Detection of Driver’s Eyes Using Multiple Near-Infrared Camera Sensors. Sensors 2019, 19, 197. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Gudi, A.; Li, X.; Gemert, J. Efficiency in Real-time Webcam Gaze Tracking. In Proceedings of the European Conference on Computer Vision, Glasgow, UK, 23–28 August 2020; pp. 529–543. [Google Scholar]
- Schneider, T.; Schauerte, B.; Stiefelhagen, R. Manifold Alignment for Person Independent Appearance-Based Gaze Estimation. In Proceedings of the International Conference on Pattern Recognition, Stockholm, Sweden, 24–28 August 2014; pp. 1167–1172. [Google Scholar] [CrossRef]
- Bäck, D. Neural Network Gaze Tracking Using Web Camera. Master’s Thesis, Linköping University, Linköping, Sweden, 2005. [Google Scholar]
- Wang, J.; Zhang, G.; Shi, J. 2D Gaze Estimation Based on Pupil-Glint Vector Using an Artificial Neural Network. Appl. Sci. 2016, 6, 174. [Google Scholar] [CrossRef] [Green Version]
- Cho, S.W.; Baek, N.R.; Kim, M.C.; Koo, J.H.; Kim, J.H.; Park, K.R. Face Detection in Nighttime Images Using Visible-Light Camera Sensors with Two-Step Faster Region-Based Convolutional Neural Network. Sensors 2018, 18, 2995. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Ren, S.; He, K.; Girshick, R.; Sun, J. Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks. IEEE Trans. Pattern Anal. Mach. Intell. 2017, 39, 1137–1149. [Google Scholar] [CrossRef] [Green Version]
- Redmon, J.; Farhadi, A. YOLO9000: Better, Faster, Stronger. In Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 6517–6525. [Google Scholar] [CrossRef] [Green Version]
- Redmon, J.; Farhadi, A. YOLOv3: An Incremental Improvement. arXiv 2018, arXiv:1804.02767. [Google Scholar]
- Cortacero, K.; Fischer, T.; Demiris, Y. RT-BENE: A Dataset and Baselines for Real-Time Blink Estimation in Natural Environments. In Proceedings of the International Conference on Computer Vision Workshop, Seoul, Korea, 27–28 October 2019; pp. 1159–1168. [Google Scholar] [CrossRef] [Green Version]
- Xia, Y.; Liang, B. Gaze Estimation Based on Deep Learning Method. In Proceedings of the 4th International Conference on Computer Science and Application Engineering, Sanya, China, 20–22 October 2020; pp. 1–6. [Google Scholar] [CrossRef]
- Ansari, M.F.; Kasprowski, P.; Obetkal, M. Gaze Tracking Using an Unmodified Web Camera and Convolutional Neural Network. Appl. Sci. 2021, 11, 9068. [Google Scholar] [CrossRef]
- Zhou, X.; Lin, J.; Jiang, J.; Chen, S. Learning a 3d gaze estimator with improved Itracker combined with bidirectional LSTM. In Proceedings of the IEEE International Conference on Multimedia and Expo, Shanghai, China, 8–12 July 2019; pp. 850–855. [Google Scholar] [CrossRef]
- Palmero, C.; Selva, J.; Bagheri, M.A.; Escalera, S. Recurrent CNN for 3D gaze estimation using appearance and shape cues. In Proceedings of the British Machine Vision Conference (BMVC), Newcastle, UK, 3–6 September 2018. [Google Scholar]
- Kim, J.H.; Jeong, J.W. Gaze Estimation in the Dark with Generative Adversarial Networks. In Proceedings of the ACM Symposium on Eye Tracking Research and Applications (ETRA ‘20 Adjunct). Association for Computing Machinery, Stuttgart, Germany, 2–5 June 2020; Volume 33, pp. 1–3. [Google Scholar] [CrossRef]
- Kim, J.-H.; Jeong, J.W. Gaze in the Dark: Gaze Estimation in a Low-Light Environment with Generative Adversarial Networks. Sensors 2020, 20, 4935. [Google Scholar] [CrossRef] [PubMed]
- Wang, K.; Zhao, R.; Ji, Q. A Hierarchical Generative Model for Eye Image Synthesis and Eye Gaze Estimation. In Proceedings of the Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–23 June 2018; pp. 440–448. [Google Scholar] [CrossRef]
- Wang, K.; Zhao, R.; Su, H.; Ji, Q. Generalizing Eye Tracking with Bayesian Adversarial Learning. In Proceedings of the Conference on Computer Vision and Pattern Recognition, Long Beach, CA, USA, 15–20 June 2019; pp. 11899–11908. [Google Scholar] [CrossRef]
- He, Z.; Spurr, A.; Zhang, X.; Hilliges, O. Photo-Realistic Monocular Gaze Redirection Using Generative Adversarial Networks. In Proceedings of the International Conference on Computer Vision, Seoul, Korea, 27 October–2 November 2019; pp. 6931–6940. [Google Scholar] [CrossRef] [Green Version]
- Wang, K.; Ji, Q. 3D gaze estimation without explicit personal calibration. Pattern Recognit. 2018, 79, 216–227. [Google Scholar] [CrossRef]
- Khan, S.; Rahmani, H.; Shah, S.A.; Bennamoun, M. A Guide to Convolutional Neural Networks for Computer Vision. Synth. Lect. Comput. Vis. 2018, 8, 1–207. [Google Scholar] [CrossRef]
- Park, S.; Mello, S.D.; Molchanov, P.; Iqbal, U.; Hilliges, O.; Kautz, J. Few-Shot Adaptive Gaze Estimation. In Proceedings of the International Conference on Computer Vision, Seoul, Korea, 27 October–2 November 2019; pp. 9367–9376. [Google Scholar] [CrossRef] [Green Version]
- Khan, A.; Sohail, A.; Zahoora, U.; Qureshi, A.S. A survey of the recent architectures of deep convolutional neural networks. Artif. Intell. Rev. 2019, 53, 5455–5516. [Google Scholar] [CrossRef] [Green Version]
- Jia, Y.; Shelhamer, E.; Donahue, J.; Karayev, S.; Long, J.; Girshick, R.; Guadarrama, S.; Darrell, T. Caffe: Convolutional Architecture for Fast Feature Embedding. In Proceedings of the 22nd ACM international conference on Multimedia. Association for Computing Machinery, Orlando, FL, USA, 3–7 November 2014; pp. 675–678. [Google Scholar] [CrossRef]
- Zhu, W.; Deng, H. Monocular Free-Head 3D Gaze Tracking with Deep Learning and Geometry Constraints. In Proceedings of the IEEE International Conference on Computer Vision, Venice, Italy, 22–29 October 2017; pp. 3162–3171. [Google Scholar] [CrossRef]
- Zhang, Z.; Lian, D.; Gao, S. RGB-D-based gaze point estimation via multi-column CNNs and facial landmarks global optimization. Vis. Comput. 2020, 37, 1731–1741. [Google Scholar] [CrossRef]
- Zhang, X.; Park, S.; Beeler, T.; Bradley, D.; Tang, S.; Hilliges, O. ETH-XGaze: A Large Scale Dataset for Gaze Estimation under Extreme Head Pose and Gaze Variation. In Proceedings of the European Conference on Computer Vision, Glasgow, UK, 23–28 August 2020; pp. 365–381. [Google Scholar] [CrossRef]
- George, A.; Routray, A. Real-time eye gaze direction classification using convolutional neural network. In Proceedings of the International Conference on Signal Processing and Communications, Banglaore, India, 12–15 June 2016; pp. 1–5. [Google Scholar] [CrossRef] [Green Version]
- Park, S.; Aksan, E.; Zhang, X.; Hilliges, O. Towards End-to-end Video-based Eye-Tracking. In Proceedings of the European Conference on Computer Vision (ECCV), Glasgow, UK, 23–28 August 2020; pp. 747–763. [Google Scholar] [CrossRef]
- Zheng, Y.; Park, S.; Zhang, X.; De Mello, S.; Hilliges, O. Selflearning transformations for improving gaze and head redirection. arXiv 2020, arXiv:2010.12307. [Google Scholar]
- Chen, J.; Zhang, J.; Sangineto, E.; Chen, T.; Fan, J.; Sebe, N. Coarseto-fine gaze redirection with numerical and pictorial guidance. In Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision, Waikoloa, HI, USA, 3–8 January 2021; pp. 3665–3674. [Google Scholar] [CrossRef]
- Shrivastava, A.; Pfister, T.; Tuzel, O.; Susskind, J.; Wang, W.; Webb, R. Learning from simulated and unsupervised images through adversarial training. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 2107–2116. [Google Scholar]
- Ahmed, M.; Laskar, R.H. Evaluation of accurate iris center and eye corner localization method in a facial image for gaze estimation. Multimed. Syst. 2021, 27, 429–448.
- Min-Allah, N.; Jan, F.; Alrashed, S. Pupil detection schemes in human eye: A review. Multimed. Syst. 2021, 27, 753–777.
- Wang, X.; Zhang, J.; Zhang, H.; Zhao, S.; Liu, H. Vision-based Gaze Estimation: A Review. IEEE Trans. Cogn. Dev. Syst. 2021, 99, 1–19.
- Park, S.; Zhang, X.; Bulling, A.; Hilliges, O. Learning to Find Eye Region Landmarks for Remote Gaze Estimation in Unconstrained Settings. In Proceedings of the ACM Symposium on Eye Tracking Research & Applications, Warsaw, Poland, 14–17 June 2018; Volume 21, pp. 1–10.
- Bayoudh, K.; Knani, R.; Hamdaoui, F.; Mtibaa, A. A survey on deep multimodal learning for computer vision: Advances, trends, applications, and datasets. Vis. Comput. 2021, 1–32.
- Feit, A.M.; Williams, S.; Toledo, A.; Paradiso, A.; Kulkarni, H.; Kane, S.; Morris, M.R. Toward Everyday Gaze Input: Accuracy and Precision of Eye Tracking and Implications for Design. In Proceedings of the CHI ’17: CHI Conference on Human Factors in Computing Systems, Denver, CO, USA, 6–11 May 2017; pp. 1118–1130.
- Eye Tracker Accuracy and Precision. Available online: https://www.tobiipro.com/learn-and-support/learn/eye-tracking-essentials/what-affects-the-accuracy-and-precision-of-an-eye-tracker/ (accessed on 25 July 2021).
- Shih, S.W.; Liu, J. A novel approach to 3-D gaze tracking using stereo cameras. IEEE Trans. Syst. Man Cybern. Part B Cybern. 2004, 34, 234–245.
- Pérez, A.; Córdoba, M.L.; García, A.; Méndez, R.; Muñoz, M.L.; Pedraza, J.L.; Sánchez, F. A Precise Eye-Gaze Detection and Tracking System. In Proceedings of the 11th International Conference in Central Europe on Computer Graphics, Visualization and Computer Vision, Pilsen, Czech Republic, 3–7 February 2003; pp. 105–108.
- Kang, J.J.; Guestrin, E.D.; Maclean, W.J.; Eizenman, M. Simplifying the cross-ratios method of point-of-gaze estimation. CMBES Proc. 2007, 30, 1–4.
- Villanueva, A.; Cabeza, R. A Novel Gaze Estimation System with One Calibration Point. IEEE Trans. Syst. Man Cybern. Part B Cybern. 2008, 38, 1123–1138.
- Ohno, T.; Mukawa, N. A free-head, simple calibration, gaze tracking system that enables gaze-based interaction. In Proceedings of the Symposium on Eye Tracking Research & Applications, San Antonio, TX, USA, 22–24 March 2004; Volume 22, pp. 115–122.
- Hansen, D.W.; Nielsen, M.; Hansen, J.P.; Johansen, A.S.; Stegmann, M.B. Tracking Eyes Using Shape and Appearance. In Proceedings of the IAPR Workshop on Machine Vision Applications, Nara, Japan, 11–13 December 2002; pp. 201–204.
- Hansen, D.W.; Hansen, J.P.; Nielsen, M.; Johansen, A.S.; Stegmann, M.B. Eye typing using Markov and active appearance models. In Proceedings of the 6th IEEE Workshop on Applications of Computer Vision, Orlando, FL, USA, 4 December 2002; pp. 132–136.
- Nguyen, P.; Fleureau, J.; Chamaret, C.; Guillotel, P. Calibration-free gaze tracking using particle filter. In Proceedings of the IEEE International Conference on Multimedia and Expo, San Jose, CA, USA, 15–19 July 2013; pp. 1–6.
- Zhang, C.; Yao, R.; Cai, J. Efficient eye typing with 9-direction gaze estimation. Multimed. Tools Appl. 2018, 77, 19679–19696.
- Kar, A.; Corcoran, P. Performance evaluation strategies for eye gaze estimation systems with quantitative metrics and visualizations. Sensors 2018, 18, 3151.
- Asteriadis, S.; Soufleros, D.; Karpouzis, K.; Kollias, S. A natural head pose and eye gaze dataset. In Proceedings of the International Workshop on Affective-Aware Virtual Agents and Social Robots, Boston, MA, USA, 6 November 2009; pp. 1–4.
- McMurrough, C.D.; Metsis, V.; Kosmopoulos, D.; Maglogiannis, I.; Makedon, F. A dataset for point of gaze detection using head poses and eye images. J. Multimodal User Interfaces 2013, 7, 207–215.
- Ponz, V.; Villanueva, A.; Cabeza, R. Dataset for the evaluation of eye detector for gaze estimation. In Proceedings of the ACM Conference on Ubiquitous Computing, Pittsburgh, PA, USA, 5–8 September 2012; pp. 681–684.
- Smith, B.A.; Yin, Q.; Feiner, S.K.; Nayar, S.K. Gaze locking: Passive eye contact detection for human-object interaction. In Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology, St. Andrews, UK, 8–11 October 2013; pp. 271–280.
- Villanueva, A.; Ponz, V.; Sesma-Sanchez, L.; Ariz, M.; Porta, S.; Cabeza, R. Hybrid method based on topography for robust detection of iris center and eye corners. ACM Trans. Multimed. Comput. Commun. Appl. (TOMM) 2013, 9, 1–20.
- Weidenbacher, U.; Layher, G.; Strauss, P.M.; Neumann, H. A comprehensive head pose and gaze database. In Proceedings of the 3rd IET International Conference on Intelligent Environments, Ulm, Germany, 24–25 September 2007; pp. 455–458.
- He, Q.; Hong, X.; Chai, X.; Holappa, J.; Zhao, G.; Chen, X.; Pietikäinen, M. OMEG: Oulu multi-pose eye gaze dataset. In Proceedings of the Scandinavian Conference on Image Analysis, Copenhagen, Denmark, 15–17 June 2015; pp. 418–427.
- Schöning, J.; Faion, P.; Heidemann, G.; Krumnack, U. Providing video annotations in multimedia containers for visualization and research. In Proceedings of the IEEE Winter Conference on Applications of Computer Vision, Santa Rosa, CA, USA, 24–31 March 2017; pp. 650–659.
- Wood, E.; Baltrusaitis, T.; Zhang, X.; Sugano, Y.; Robinson, P.; Bulling, A. Rendering of eyes for eye-shape registration and gaze estimation. In Proceedings of the IEEE International Conference on Computer Vision, Santiago, Chile, 7–13 December 2015; pp. 3756–3764.
- Cheng, Y.; Zhang, X.; Lu, F.; Sato, Y. Gaze estimation by exploring two-eye asymmetry. IEEE Trans. Image Process. 2020, 29, 5259–5272.
- Funes Mora, K.A.; Monay, F.; Odobez, J.M. EyeDiap: A database for the development and evaluation of gaze estimation algorithms from RGB and RGB-D cameras. In Proceedings of the Symposium on Eye Tracking Research and Applications, Safety Harbor, FL, USA, 26–28 March 2014; Volume 26, pp. 255–258.
- Cheng, Y.; Huang, S.; Wang, F.; Qian, C.; Lu, F. A coarse-to-fine adaptive network for appearance-based gaze estimation. In Proceedings of the AAAI Conference on Artificial Intelligence, New York, NY, USA, 7–12 February 2020; Volume 34, pp. 10623–10630.
- Zhao, T.; Yan, Y.; Shehu, I.S.; Fu, X. Image purification networks: Real-time style transfer with semantics through feed-forward synthesis. In Proceedings of the International Joint Conference on Neural Networks, Rio de Janeiro, Brazil, 8–13 July 2018; pp. 1–7.
- Gatys, L.A.; Ecker, A.S.; Bethge, M. A neural algorithm of artistic style. arXiv 2015, arXiv:1508.06576.
- Johnson, J.; Alahi, A.; Fei-Fei, L. Perceptual losses for real-time style transfer and super-resolution. In Proceedings of the European Conference on Computer Vision, Amsterdam, The Netherlands, 11–14 October 2016; pp. 694–711.
- Selim, A.; Elgharib, M.; Doyle, L. Painting style transfer for head portraits using convolutional neural networks. ACM Trans. Graph. 2016, 35, 1–18.
- Zhao, T.; Yan, Y.; Shehu, I.S.; Fu, X.; Wang, H. Purifying naturalistic images through a real-time style transfer semantics network. Eng. Appl. Artif. Intell. 2019, 81, 428–436.
- Zhao, T.; Yan, Y.; Shehu, I.S.; Wei, H.; Fu, X. Image purification through controllable neural style transfer. In Proceedings of the International Conference on Information and Communication Technology Convergence, Jeju, Korea, 17–19 October 2018; pp. 466–471.
- Xiong, Y.; Kim, H.J.; Singh, V. Mixed effects neural networks (MeNets) with applications to gaze estimation. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Long Beach, CA, USA, 15–20 June 2019; pp. 7743–7752.
- Yu, Y.; Liu, G.; Odobez, J.M. Improving few-shot user-specific gaze adaptation via gaze redirection synthesis. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Long Beach, CA, USA, 15–20 June 2019; pp. 11937–11946.
- Duchowski, A. A breadth-first survey of eye-tracking applications. Behav. Res. Methods Instrum. Comput. 2002, 34, 455–470.
- Armstrong, T.; Olatunji, B.O. Eye tracking of attention in the affective disorders: A meta-analytic review and synthesis. Clin. Psychol. Rev. 2012, 32, 704–723.
- Kanowski, M.; Rieger, J.W.; Noesselt, T.; Tempelmann, C.; Hinrichs, H. Endoscopic eye tracking system for fMRI. J. Neurosci. Methods 2007, 160, 10–15.
- Papageorgiou, E.; Hardiess, G.; Mallot, H.A.; Schiefer, U. Gaze patterns predicting successful collision avoidance in patients with homonymous visual field defects. Vis. Res. 2012, 65, 25–37.
- Fu, B.; Yang, R. Display control based on eye gaze estimation. In Proceedings of the 4th International Congress on Image and Signal Processing, Shanghai, China, 15–17 October 2011; Volume 1, pp. 399–403.
- Heidenburg, B.; Lenisa, M.; Wentzel, D.; Malinowski, A. Data mining for gaze tracking system. In Proceedings of the Conference on Human System Interactions, Krakow, Poland, 25–27 May 2008; pp. 680–683.
- Top 8 Eye Tracking Applications in Research. Available online: https://imotions.com/blog/top-8-applications-eye-tracking-research/ (accessed on 16 February 2020).
- Chen, M.; Chen, Y.; Yao, Z.; Chen, W.; Lu, Y. Research on eye-gaze tracking network generated by augmented reality application. In Proceedings of the Second International Workshop on Knowledge Discovery and Data Mining, Moscow, Russia, 23–25 January 2009; pp. 594–597.
- Danforth, R.; Duchowski, A.; Geist, R.; McAliley, E. A platform for gaze-contingent virtual environments. In Smart Graphics (Papers from the 2000 AAAI Spring Symposium, Technical Report SS-00-04); American Association for Artificial Intelligence: Palo Alto, CA, USA, 2000; pp. 66–70.
- Nilsson, S. Interaction without gesture or speech—A gaze controlled AR system. In Proceedings of the 17th International Conference on Artificial Reality and Telexistence, Esbjerg, Denmark, 28–30 November 2007; pp. 280–281.
- Roy, D.; Ghitza, Y.; Bartelma, J.; Kehoe, C. Visual memory augmentation: Using eye gaze as an attention filter. In Proceedings of the 8th International Symposium on Wearable Computers, Arlington, VA, USA, 31 October–3 November 2004; Volume 1, pp. 128–131.
- Tateno, K.; Takemura, M.; Ohta, Y. Enhanced eyes for better gaze-awareness in collaborative mixed reality. In Proceedings of the Fourth IEEE and ACM International Symposium on Mixed and Augmented Reality, Vienna, Austria, 5–8 October 2005; pp. 100–103.
- Calvi, C.; Porta, M.; Sacchi, D. e5Learning, an e-learning environment based on eye tracking. In Proceedings of the Eighth IEEE International Conference on Advanced Learning Technologies, Santander, Spain, 1–5 July 2008; pp. 376–380.
- Georgiou, T.; Demiris, Y. Adaptive user modelling in car racing games using behavioural and physiological data. User Model. User-Adapt. Interact. 2017, 27, 267–311.
- Porta, M.; Ricotti, S.; Perez, C.J. Emotional e-learning through eye tracking. In Proceedings of the IEEE Global Engineering Education Conference, Marrakech, Morocco, 17–20 April 2012; pp. 1–6.
- Rajashekar, U.; Van Der Linde, I.; Bovik, A.C.; Cormack, L.K. GAFFE: A gaze-attentive fixation finding engine. IEEE Trans. Image Process. 2008, 17, 564–573.
- Rasouli, A.; Kotseruba, I.; Tsotsos, J.K. Agreeing to cross: How drivers and pedestrians communicate. In Proceedings of the IEEE Intelligent Vehicles Symposium, Los Angeles, CA, USA, 11–14 June 2017; pp. 264–269.
- Chen, J.; Luo, N.; Liu, Y.; Liu, L.; Zhang, K.; Kolodziej, J. A hybrid intelligence-aided approach to affect-sensitive e-learning. Computing 2016, 98, 215–233.
- De Luca, A.; Denzel, M.; Hussmann, H. Look into my Eyes! Can you guess my Password? In Proceedings of the 5th Symposium on Usable Privacy and Security, Mountain View, CA, USA, 15–17 July 2009; pp. 1–12.
- De Luca, A.; Weiss, R.; Drewes, H. Evaluation of eye-gaze interaction methods for security enhanced PIN-entry. In Proceedings of the 19th Australasian Conference on Computer-Human Interaction: Entertaining User Interfaces, Adelaide, Australia, 28–30 November 2007; pp. 199–202.
- Fookes, C.; Maeder, A.; Sridharan, S.; Mamic, G. Gaze based personal identification. In Behavioral Biometrics for Human Identification: Intelligent Applications; Wang, L., Geng, X., Eds.; IGI Global: Hershey, PA, USA, 2010; pp. 237–263.
- Kumar, M.; Garfinkel, T.; Boneh, D.; Winograd, T. Reducing shoulder-surfing by using gaze-based password entry. In Proceedings of the 3rd Symposium on Usable Privacy and Security, Pittsburgh, PA, USA, 18–20 July 2007; pp. 13–19.
- Weaver, J.; Mock, K.; Hoanca, B. Gaze-based password authentication through automatic clustering of gaze points. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics, Anchorage, AK, USA, 9–12 October 2011; pp. 2749–2754.
- Klaib, A.F.; Alsrehin, N.O.; Melhem, W.Y.; Bashtawi, H.O. IoT Smart Home Using Eye Tracking and Voice Interfaces for Elderly and Special Needs People. J. Commun. 2019, 14, 614–621.
- Wu, M.; Louw, T.; Lahijanian, M.; Ruan, W.; Huang, X.; Merat, N.; Kwiatkowska, M. Gaze-based intention anticipation over driving manoeuvres in semi-autonomous vehicles. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Macau, China, 3–8 November 2019; pp. 6210–6216.
- Subramanian, M.; Songur, N.; Adjei, D.; Orlov, P.; Faisal, A.A. A.Eye Drive: Gaze-based semi-autonomous wheelchair interface. In Proceedings of the 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Berlin, Germany, 23–27 July 2019; pp. 5967–5970.
- Kamp, J.; Sundstedt, V. Gaze and Voice controlled drawing. In Proceedings of the 1st Conference on Novel Gaze-Controlled Applications (NGCA ’11), Karlskrona, Sweden, 26–27 May 2011; Volume 9, pp. 1–8.
- Scalera, L.; Seriani, S.; Gallina, P.; Lentini, M.; Gasparetto, A. Human–Robot Interaction through Eye Tracking for Artistic Drawing. Robotics 2021, 10, 54.
- Santella, A.; DeCarlo, D. Abstracted painterly renderings using eye-tracking data. In Proceedings of the 2nd International Symposium on Non-Photorealistic Animation and Rendering (NPAR ’02), Annecy, France, 3–5 June 2002; p. 75.
- Scalera, L.; Seriani, S.; Gasparetto, A.; Gallina, P. A Novel Robotic System for Painting with Eyes. In Advances in Italian Mechanism Science. IFToMM ITALY 2020. Mechanisms and Machine Science; Niola, V., Gasparetto, A., Eds.; Springer: Cham, Switzerland, 2020; Volume 91.
- Lallé, S.; Conati, C.; Carenini, G. Predicting Confusion in Information Visualization from Eye Tracking and Interaction Data. In Proceedings of the 25th International Joint Conference on Artificial Intelligence (IJCAI-16), New York, NY, USA, 9–15 July 2016; pp. 2529–2535.
- Salminen, J.; Jansen, B.J.; An, J.; Jung, S.G.; Nielsen, L.; Kwak, H. Fixation and Confusion: Investigating Eye-tracking Participants’ Exposure to Information in Personas. In Proceedings of the 2018 Conference on Human Information Interaction & Retrieval, New Brunswick, NJ, USA, 11–15 March 2018; pp. 110–119.
- Sims, S.D.; Putnam, V.; Conati, C. Predicting confusion from eye-tracking data with recurrent neural networks. arXiv 2019, arXiv:1906.11211.
- Hayhoe, M.M.; Matthis, J.S. Control of gaze in natural environments: Effects of rewards and costs, uncertainty and memory in target selection. Interface Focus 2018, 8, 1–7.
- Jording, M.; Engemann, D.; Eckert, H.; Bente, G.; Vogeley, K. Distinguishing Social from Private Intentions Through the Passive Observation of Gaze Cues. Front. Hum. Neurosci. 2019, 13, 442.
- Uma, S.; Eswari, R. Accident prevention and safety assistance using IOT and machine learning. J. Reliab. Intell. Environ. 2021, 1–25.
- Shimauchi, T.; Sakurai, K.; Tate, L.; Tamura, H. Gaze-Based Vehicle Driving Evaluation of System with an Actual Vehicle at an Intersection with a Traffic Light. Electronics 2020, 9, 1408.
- Ledezma, A.; Zamora, V.; Sipele, Ó.; Sesmero, M.P.; Sanchis, A. Implementing a Gaze Tracking Algorithm for Improving Advanced Driver Assistance Systems. Electronics 2021, 10, 1480.
- Berkovsky, S.; Taib, R.; Koprinska, I.; Wang, E.; Zeng, Y.; Li, J.; Kleitman, S. Detecting Personality Traits Using Eye-Tracking Data. In Proceedings of the CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; Volume 221, p. 12.
- Brunyé, T.T.; Drew, T.; Weaver, D.L.; Elmore, J.G. A review of eye tracking for understanding and improving diagnostic interpretation. Cogn. Res. Princ. Implic. 2019, 4, 7.
- Maurage, P.; Masson, N.; Bollen, Z.; D’Hondt, F. Eye tracking correlates of acute alcohol consumption: A systematic and critical review. Neurosci. Biobehav. Rev. 2019, 108, 400–422.
- Iannizzotto, G.; Nucita, A.; Fabio, R.A.; Caprì, T.; Lo Bello, L. Remote Eye-Tracking for Cognitive Telerehabilitation and Interactive School Tasks in Times of COVID-19. Information 2020, 11, 296.
- Jin, N.; Mavromatis, S.; Sequeira, J.; Curcio, S. A Robust Method of Eye Torsion Measurement for Medical Applications. Information 2020, 11, 408.
- Maimon-Mor, R.O.; Fernandez-Quesada, J.; Zito, G.A.; Konnaris, C.; Dziemian, S.; Faisal, A.A. Towards free 3D end-point control for robotic-assisted human reaching using binocular eye tracking. In Proceedings of the International Conference on Rehabilitation Robotics (ICORR), London, UK, 17–20 July 2017; pp. 1049–1054.
- Palinko, O.; Sciutti, A.; Wakita, Y.; Matsumoto, Y.; Sandini, G. If looks could kill: Humanoid robots play a gaze-based social game with humans. In Proceedings of the 2016 IEEE-RAS 16th International Conference on Humanoid Robots (Humanoids), Cancun, Mexico, 15–17 November 2016; pp. 905–910.
- Schwab, D.; Fejza, A.; Vial, L.; Robert, Y. The GazePlay Project: Open and Free Eye-Trackers Games and a Community for People with Multiple Disabilities. In Proceedings of the ICCHP: International Conference on Computers Helping People with Special Needs, Linz, Austria, 11–13 July 2018; pp. 254–261.
- Wöhle, L.; Gebhard, M. Towards Robust Robot Control in Cartesian Space Using an Infrastructureless Head- and Eye-Gaze Interface. Sensors 2021, 21, 1798.
- Bozkir, E.; Günlü, O.; Fuhl, W.; Schaefer, R.F.; Kasneci, E. Differential Privacy for Eye Tracking with Temporal Correlations. PLoS ONE 2021, 16, e0255979.
- Bozkir, E.; Ünal, A.B.; Akgün, M.; Kasneci, E.; Pfeifer, N. Privacy Preserving Gaze Estimation using Synthetic Images via a Randomized Encoding Based Framework. In Proceedings of the ACM Symposium on Eye Tracking Research and Applications (ETRA 2020), Stuttgart, Germany, 2–5 June 2020; Volume 21, pp. 1–5.
- Liu, A.; Xia, L.; Duchowski, A.; Bailey, R.; Holmqvist, K.; Jain, E. Differential Privacy for Eye-Tracking Data. In Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications (ETRA 2019), Denver, CO, USA, 25–28 June 2019; Volume 28, p. 10.
- Steil, J.; Hagestedt, I.; Huang, M.X.; Bulling, A. Privacy-Aware Eye Tracking Using Differential Privacy. In Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications (ETRA 2019), Denver, CO, USA, 25–28 June 2019; Volume 27, pp. 1–9.
- Abdrabou, Y.; Khamis, M.; Eisa, R.M.; Ismail, S.; Elmougy, A. Just Gaze and Wave: Exploring the Use of Gaze and Gestures for Shoulder-Surfing Resilient Authentication. In Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications, Denver, CO, USA, 25–28 June 2019; Volume 29, p. 10.
- Khamis, M.; Alt, F.; Hassib, M.; Zezschwitz, E.V.; Hasholzner, R.; Bulling, A. GazeTouchPass: Multimodal Authentication Using Gaze and Touch on Mobile Devices. In Proceedings of the CHI Conference Extended Abstracts on Human Factors in Computing Systems, San Jose, CA, USA, 7–12 May 2016; pp. 2156–2164.
- Khamis, M.; Hasholzner, R.; Bulling, A.; Alt, F. GTmoPass: Two-Factor Authentication on Public Displays Using Gaze-Touch Passwords and Personal Mobile Devices. In Proceedings of the 6th ACM International Symposium on Pervasive Displays, Lugano, Switzerland, 7–9 June 2017; Volume 8, p. 9.
- Khamis, M.; Hassib, M.; Zezschwitz, E.V.; Bulling, A.; Alt, F. GazeTouchPIN: Protecting Sensitive Data on Mobile Devices Using Secure Multimodal Authentication. In Proceedings of the 19th ACM International Conference on Multimodal Interaction, Glasgow, UK, 13–17 November 2017; pp. 446–450.
- Mathis, F.; Vaniea, K.; Williamson, J.; Khamis, M. RubikAuth: Fast and Secure Authentication in Virtual Reality. In Proceedings of the Extended Abstracts of the CHI Conference on Human Factors in Computing Systems (CHI 2020), Honolulu, HI, USA, 25–30 April 2020; pp. 1–9.
- Top 12 Eye Tracking Hardware Companies. Available online: https://imotions.com/blog/top-eyetracking-hardware-companies/ (accessed on 23 August 2021).
- Tobii. Available online: https://www.tobii.com/ (accessed on 23 August 2021).
- SensoMotoric. Available online: http://www.smivision.com/ (accessed on 23 August 2021).
- EyeLink. Available online: http://www.eyelinkinfo.com/ (accessed on 23 August 2021).
- NNET. Available online: https://userweb.cs.txstate.edu/~ok11/nnet.html (accessed on 23 August 2021).
- EyeTab. Available online: https://github.com/errollw/EyeTab (accessed on 23 August 2021).
- Opengazer. Available online: http://www.inference.phy.cam.ac.uk/opengazer/ (accessed on 23 August 2021).
- TurkerGaze. Available online: https://github.com/PrincetonVision/TurkerGaze (accessed on 23 August 2021).
- Camgaze. Available online: https://github.com/wallarelvo/camgaze (accessed on 23 August 2021).
- ITU. Available online: https://github.com/devinbarry/GazeTracker (accessed on 23 August 2021).
- CVC ET. Available online: https://github.com/tiendan/ (accessed on 23 August 2021).
- Xlabs. Available online: https://xlabsgaze.com/ (accessed on 23 August 2021).
- Gazepointer. Available online: https://sourceforge.net/projects/gazepointer/ (accessed on 23 August 2021).
- MyEye. Available online: https://myeye.jimdofree.com/ (accessed on 23 August 2021).
- NetGazer. Available online: http://sourceforge.net/projects/netgazer/ (accessed on 23 August 2021).
- OpenEyes. Available online: http://thirtysixthspan.com/openEyes/software.html (accessed on 23 August 2021).
- Ogama. Available online: http://www.ogama.net/ (accessed on 23 August 2021).
- GazeParser. Available online: http://gazeparser.sourceforge.net/ (accessed on 23 August 2021).
- Pygaze. Available online: http://www.pygaze.org/ (accessed on 23 August 2021).
- Papers with Code. Available online: https://www.paperswithcode.com/task/gaze-estimation?page=2 (accessed on 23 August 2021).
| Methods | Ex. Gaze Mapping Techniques | Data | Light |
|---|---|---|---|
| Feature-based | 2D Regression [94,95,96], Neural Network [36,97], Cross Ratio [24,98,99] | Vectors, anchor points | Active, passive |
| Model-based | 2D/3D Geometry Fitting [43,70,102,103,104,105,106,107] | Eye model | Active, passive |
| Appearance-based | K-Nearest Neighbors [93,108], Artificial Neural Networks [109,110], Convolutional Neural Networks [111,112,113,114], Support Vector Machines [115,116], Random Forest [16,117,118], Local Linear Interpolation [119,120,121,122,123], Gaussian Process [124,125] | Pixel intensity, texture difference | Passive |
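To make the 2D regression entry in the table above concrete, the following minimal sketch fits a second-order polynomial mapping from eye feature vectors (e.g., pupil-glint vectors) to screen coordinates with ordinary least squares. It is an illustration only, not taken from any of the cited implementations; the nine feature/target pairs are synthetic placeholders standing in for a calibration grid.

```python
import numpy as np

def poly2_features(fx, fy):
    # Second-order polynomial terms of a feature vector (fx, fy):
    # [1, x, y, xy, x^2, y^2].
    return np.stack([np.ones_like(fx), fx, fy, fx * fy, fx**2, fy**2], axis=1)

def fit_mapping(feat_xy, screen_xy):
    # Least-squares fit of screen coordinates against the polynomial terms;
    # returns a (6, 2) coefficient matrix, one column per screen axis.
    A = poly2_features(feat_xy[:, 0], feat_xy[:, 1])
    coef, *_ = np.linalg.lstsq(A, screen_xy, rcond=None)
    return coef

def map_gaze(coef, feat_xy):
    # Apply the fitted mapping to new feature vectors.
    return poly2_features(feat_xy[:, 0], feat_xy[:, 1]) @ coef

# Hypothetical nine-point calibration: normalized feature vectors and the
# on-screen targets (pixels) the subject fixated while each was shown.
feat = np.array([[-1, -1], [0, -1], [1, -1],
                 [-1,  0], [0,  0], [1,  0],
                 [-1,  1], [0,  1], [1,  1]], dtype=float)
targets = np.array([[x, y] for y in (100, 540, 980)
                    for x in (160, 960, 1760)], dtype=float)

coef = fit_mapping(feat, targets)
print(map_gaze(coef, np.array([[0.5, 0.5]])))  # estimated point of gaze
```

With real data, the fit quality depends on how well the calibration points cover the screen, which is why denser grids tend to help higher-order polynomials.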
| Sample | Dataset | Year | Mode (Annotation) | # Samples | Resolution (Pixels) | Ex. Accuracy for Training |
|---|---|---|---|---|---|---|
| Image | MPIIGaze [53] | 2015 | RGB eye, 2D & 3D gaze | 213,659 | 1280 × 720 | 7.74° [211], 4.3° [112], 4.8° [114] |
| | GazeCapture [17] | 2016 | RGB full face, 2D gaze | 2,445,504 | 640 × 480 | 3.18° [185] |
| | RT-GENE [112] | 2018 | RGB-D full face, 3D gaze | 277,286 | 1920 × 1080 | 7.7° [112], 24.2° [20], 8.4° [222] |
| | RT-BENE [173] | 2019 | RGB-D full face, 3D gaze | 210,000 | 1920 × 1080 | 0.71° [173] |
| | Gaze360 [20] | 2019 | RGB full face, 3D gaze | 172,000 | 4096 × 3382 | 2.9° [20] |
| | XGaze [190] | 2020 | RGB full face, 2D & 3D gaze | 1,083,492 | 6000 × 4000 | 4.5° [190] |
| Video | EyeDiap [223] | 2014 | RGB-D full face, 2D & 3D gaze | 94 | 1920 × 1080 | 5.71° [222], 5.84° [176], 5.3° [224] |
| | TabletGaze [16] | 2017 | RGB full face, 2D gaze | 816 | 1280 × 720 | 3.63° [53], 3.17° [16], 2.58° [17] |
| | EVE [192] | 2020 | RGB full face, 2D & 3D gaze | 161 | 1920 × 1080 | 2.49° [192] |
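The accuracy column above reports angular errors in degrees. For reference, the snippet below shows one common way such a figure is computed: the mean angle between predicted and ground-truth 3D gaze vectors. The two vectors are made-up placeholders, not values from any cited dataset.

```python
import numpy as np

def angular_error_deg(pred, gt):
    # Mean angle (degrees) between predicted and ground-truth gaze vectors.
    pred = pred / np.linalg.norm(pred, axis=1, keepdims=True)
    gt = gt / np.linalg.norm(gt, axis=1, keepdims=True)
    cos = np.clip(np.sum(pred * gt, axis=1), -1.0, 1.0)  # guard arccos domain
    return np.degrees(np.arccos(cos)).mean()

# Toy check: a prediction tilted slightly off the true direction.
gt = np.array([[0.0, 0.0, -1.0]])
pred = np.array([[0.05, 0.0, -1.0]])
print(f"{angular_error_deg(pred, gt):.2f} deg")  # ~2.86 deg
```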
| Method | Provider | Language | Description |
|---|---|---|---|
| Passive-light | Itracker [17] | Python, Matlab | A CNN-based eye tracker, which runs in real time (10–15 fps) on a modern mobile device |
| | RecurrentGaze [177] | Python | Based on a fusion of CNN-RNN |
| | NNET [293] | - | ANN-based eye tracker implementation for iPad devices |
| | EyeTab [294] | Python, C++ | Webcam model-based approach for binocular gaze estimation |
| | Opengazer [295] | C++, C | Based on the Viola-Jones face detector, which locates the largest face in the video stream captured from a PC webcam |
| | TurkerGaze [296] | JavaScript, HTML | A webcam-based eye tracking game for collecting large-scale eye tracking data via crowdsourcing |
| | Camgaze [297] | Python | Binocular gaze estimation for webcam |
| | ITU gaze tracker [298] | - | Based on remote webcam setup |
| | CVC ET [299] | C++ | Enhanced Opengazer with a head repositioning feature that allows users to correct their head pose during eye tracker usage in order to improve accuracy |
| | xLabs [300] | - | Webcam-based eye tracker, built as a browser extension for Google Chrome |
| | Gazepointer [301] | C#, HTML | Windows-based web camera gaze estimation |
| | MyEye [302] | - | Gaze-based input designed for use by people with amyotrophic lateral sclerosis (ALS), a neuromuscular disease |
| | NetGazer [303] | C++ | Port of Opengazer for the Windows platform |
| Active-light | OpenEyes [304] | Matlab | Based on infrared illumination |
| | Ogama [305] | C#.NET | Uses infrared-ready webcams |
| | GazeParser [306] | Python | Based on infrared illumination |
| | Pygaze [307] | Python | Wrapper for EyeLink, SMI, and Tobii systems |
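Several of the passive-light trackers above start from standard webcam face and eye detection. As a minimal sketch of that front end, the code below uses OpenCV's bundled Viola-Jones Haar cascades (the detector family Opengazer is described as using) to localize the largest face and its eye regions in a live webcam stream. It assumes the opencv-python package and a default webcam at index 0, and it is an illustration of the first pipeline stage only, not the implementation of any listed tool.

```python
import cv2

# Load OpenCV's bundled Haar cascades (Viola-Jones detectors).
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

cap = cv2.VideoCapture(0)  # default webcam (assumed device index)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Keep only the largest detected face, as in the Opengazer description.
    faces = sorted(face_cascade.detectMultiScale(gray, 1.3, 5),
                   key=lambda f: f[2] * f[3], reverse=True)
    for (x, y, w, h) in faces[:1]:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (255, 0, 0), 2)
        roi = gray[y:y + h, x:x + w]  # search for eyes inside the face only
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(roi):
            cv2.rectangle(frame, (x + ex, y + ey),
                          (x + ex + ew, y + ey + eh), (0, 255, 0), 2)
    cv2.imshow("eye regions", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break
cap.release()
cv2.destroyAllWindows()
```

A full tracker would follow this stage with pupil/iris localization and a calibration mapping such as the polynomial fit sketched earlier.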
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).