High-Accuracy 3D Gaze Estimation with Efficient Recalibration for Head-Mounted Gaze Tracking Systems
Abstract
1. Introduction
- (1)
- A novel hybrid 3D gaze estimation method is proposed to achieve higher accuracy than state-of-the-art methods. The two key parameters, the eyeball center and the camera optical center, are estimated in the head frame with a geometry-based method, so that a mapping relationship between two direction features can be established to calculate the direction of the visual axis. Because the direction features are formulated with accurately estimated parameters, the complexity of the mapping relationship is reduced and a better fitting performance is achieved.
- (2)
- A calibration point sampling strategy is proposed to improve the uniformity of the training set used to fit the polynomial mapping and to prevent appreciable extrapolation errors. By estimating the pose of the eyeball coordinate system, the calibration points are sampled at uniform angular intervals over the user’s field of view.
- (3)
- An efficient recalibration method is proposed to reduce the burden of restoring gaze estimation performance when slippage occurs. A rotation vector is introduced into the algorithm, and an iterative strategy is employed to find the optimal solution for the rotation vector and the new regression parameters. With the updated eyeball coordinate system, a single extra recalibration point is enough for the algorithm to reach gaze estimation accuracy comparable with the primary calibration.
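The hybrid mapping in contribution (1) boils down to regressing one direction feature onto another. As a minimal illustration (not the authors' implementation), the sketch below fits a second-order bivariate polynomial that maps optical-axis yaw/pitch angles to visual-axis angles by ordinary least squares; the polynomial order and the angle parameterization are assumptions.

```python
import numpy as np

def poly_features(yaw, pitch, order=2):
    """Bivariate polynomial terms of the optical-axis angles up to `order`."""
    feats = [np.ones_like(yaw)]
    for i in range(1, order + 1):
        for j in range(i + 1):
            feats.append(yaw ** (i - j) * pitch ** j)
    return np.stack(feats, axis=-1)

def fit_mapping(opt_angles, vis_angles, order=2):
    """Least-squares fit of the mapping optical-axis -> visual-axis angles."""
    X = poly_features(opt_angles[:, 0], opt_angles[:, 1], order)
    coeffs, *_ = np.linalg.lstsq(X, vis_angles, rcond=None)
    return coeffs

def predict(coeffs, opt_angles, order=2):
    """Visual-axis angles predicted from optical-axis angles."""
    X = poly_features(opt_angles[:, 0], opt_angles[:, 1], order)
    return X @ coeffs
```

Because the direction features are already expressed with accurately estimated parameters, a low polynomial order like this suffices; a mapping between rawer image features would typically need higher-order terms.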
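Contribution (2)'s sampling strategy can be pictured as laying out targets at uniform angular steps in the eyeball coordinate system and projecting them onto a display plane. The field-of-view extents, grid size, and plane depth below are illustrative placeholders, not values from the paper.

```python
import numpy as np

def calibration_targets(h_fov_deg=40.0, v_fov_deg=30.0, n_h=5, n_v=4, depth=1.0):
    """Calibration target positions spaced at uniform angular intervals.

    Gaze directions are sampled uniformly in yaw and pitch over the field
    of view, then each ray from the eyeball center is intersected with a
    frontal plane at `depth` so the targets can be shown on a screen.
    """
    yaws = np.deg2rad(np.linspace(-h_fov_deg / 2, h_fov_deg / 2, n_h))
    pitches = np.deg2rad(np.linspace(-v_fov_deg / 2, v_fov_deg / 2, n_v))
    targets = []
    for p in pitches:
        for y in yaws:
            # Ray along (yaw, pitch) hits the plane z = depth at (tan(y), tan(p)) * depth.
            targets.append((depth * np.tan(y), depth * np.tan(p), depth))
    return np.array(targets)
```

Note that targets uniform in angle are deliberately non-uniform in screen position (spacing grows toward the edges), which is what keeps the angular training set uniform for the polynomial fit.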
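For contribution (3), one simplified way to use a rotation vector for recalibration is to parameterize the slippage with Rodrigues' formula and search for the slip angle that best realigns a single recalibration observation. The paper iterates jointly over the rotation vector and the regression parameters; this sketch shows only the rotation part, with an assumed slip axis and a 1-D grid search standing in for the full optimization.

```python
import numpy as np

def rotvec_to_mat(rvec):
    """Rodrigues formula: axis-angle rotation vector -> 3x3 rotation matrix."""
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        return np.eye(3)
    k = rvec / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def estimate_slippage(axis, measured_dir, target_dir, angles):
    """Slip angle about `axis` that best maps the measured gaze direction
    from the single recalibration point onto its known target direction."""
    best_err, best_angle = np.inf, 0.0
    for a in angles:
        R = rotvec_to_mat(axis * a)
        # Angular error between the rotated measurement and the target.
        err = np.arccos(np.clip((R @ measured_dir) @ target_dir, -1.0, 1.0))
        if err < best_err:
            best_err, best_angle = err, a
    return best_angle
```

Once the slip rotation is found, it updates the eyeball coordinate system and the regression is refit on the rotated direction features, which is why one extra point can restore the primary-calibration accuracy.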
2. Materials and Methods
2.1. Model Formulation
2.2. Estimation of the Transformation Matrix
2.3. Estimation of the Eyeball Center
- (1)
- Reducing , the inner diameter of the small hole;
- (2)
- Reducing , i.e., increasing the distance between the small hole and the gaze point while decreasing the distance between the small hole and the user (see Figure 4b);
- (3)
- Setting the angle between the two collected visual axes, , to balance the contradictory relation between region width and depth.
2.4. Regression Model Fitting
2.5. Sampling and Denoising of Calibration Points
2.5.1. Sampling Strategy of Calibration Points
2.5.2. Denoising Strategy of Calibration Points
- Denoising in Data Collection
- Removing Outliers after Data Collection
2.6. Recalibration Strategy
3. Experiment and Results
3.1. Evaluation of Primary Calibration Method
3.2. Evaluation of Recalibration Method
3.3. Comparison with Other Methods
- Nonlinear optimization
- Two mapping surfaces
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Xia, Y.; Liang, J.; Li, Q.; Xin, P.; Zhang, N. High-Accuracy 3D Gaze Estimation with Efficient Recalibration for Head-Mounted Gaze Tracking Systems. Sensors 2022, 22, 4357. https://doi.org/10.3390/s22124357