Wearable Biosensor Smart Glasses Based on Augmented Reality and Eye Tracking
Abstract
1. Introduction
2. Related Work
3. Materials and Methods
3.1. Overall Structural Design
3.2. Introduction to the Dataset
3.3. Eye-Pupil-Based Tracking Recognition
3.4. Calculation of Gaze Direction and Gaze Point Combined with Scene Information
4. Results
4.1. Analysis of Pupil Recognition Effect
4.2. Line-of-Sight Estimation: Evaluation Metrics and Results
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Kim, D.; Choi, Y. Applications of Smart Glasses in Applied Sciences: A Systematic Review. Appl. Sci. 2021, 11, 4956.
- Baashar, Y.; Alkawsi, G.; Wan Ahmad, W.N.; Alomari, M.A.; Alhussian, H.; Tiong, S.K. Towards wearable augmented reality in healthcare: A comparative survey and analysis of head-mounted displays. Int. J. Environ. Res. Public Health 2023, 20, 3940.
- Ham, J.; Hong, J.; Jang, Y.; Ko, S.H.; Woo, W. Smart wristband: Touch-and-motion-tracking wearable 3D input device for smart glasses. In Proceedings of the Distributed, Ambient, and Pervasive Interactions: Second International Conference, Heraklion, Crete, Greece, 22–27 June 2014.
- Condino, S.; Montemurro, N.; Cattari, N.; D’Amato, R.; Thomale, U.; Ferrari, V.; Cutolo, F. Evaluation of a wearable AR platform for guiding complex craniotomies in neurosurgery. Ann. Biomed. Eng. 2021, 49, 2590–2605.
- Yutong, Q.; Hang, J.; Chen, J.R.; Ng, P.J. The impact of smart glasses on a new generation of users. Int. J. Biomed. Sci. Appl. 2021, 2, 1–25.
- Koutromanos, G.; Kazakou, G. Augmented reality smart glasses use and acceptance: A literature review. Comp. Exp. Res. 2023, 2, 100028.
- Suh, A.; Prophet, J. The state of immersive technology research: A literature analysis. Comput. Hum. Behav. 2018, 86, 77–90.
- Dewan, M.H.; Godina, R.; Chowdhury, M.R.K.; Noor, C.W.M.; Wan Nik, W.M.N.; Man, M. Immersive and non-immersive simulators for education and training in the maritime domain—A review. J. Mar. Sci. Eng. 2023, 11, 147.
- Chadalavada, R.T.; Andreasson, H.; Schindler, M.; Palm, R.; Lilienthal, A.J. Bi-directional navigation intent communication using spatial augmented reality and eye-tracking glasses for improved safety in human–robot interaction. Robot. Comput.-Integr. Manuf. 2020, 61, 101830.
- Yan, Z.; Wu, Y.; Shan, Y.; Chen, W.; Li, X. A dataset of eye gaze images for calibration-free eye tracking augmented reality headset. Sci. Data 2022, 9, 115.
- Plopski, A.; Hirzle, T.; Norouzi, N.; Qian, L.; Bruder, G.; Langlotz, T. The eye in extended reality: A survey on gaze interaction and eye tracking in head-worn extended reality. ACM Comput. Surv. 2022, 55, 1–39.
- Ribo, M.; Lang, P.; Ganster, H.; Brandner, M.; Stock, C.; Pinz, A. Hybrid tracking for outdoor augmented reality applications. IEEE Comput. Graph. Appl. 2002, 22, 54–63.
- Rigas, I.; Raffle, H.; Komogortsev, O.V. Hybrid PS-V technique: A novel sensor fusion approach for fast mobile eye-tracking with sensor-shift aware correction. IEEE Sens. J. 2017, 17, 8356–8366.
- Syed, T.A.; Siddiqui, M.S.; Abdullah, H.B.; Jan, S.; Namoun, A.; Alzahrani, A.; Nadeem, A.; Alkhodre, A.B. In-depth review of augmented reality: Tracking technologies, development tools, AR displays, collaborative AR, and security concerns. Sensors 2023, 23, 146.
- Kim, S.K.; Kang, S.J.; Choi, Y.J.; Choi, M.H.; Hong, M. Augmented-reality survey: From concept to application. KSII Trans. Internet Inf. Syst. 2017, 11, 982–1004.
- Wang, Y.; Lu, S.; Harter, D. Multi-sensor eye-tracking systems and tools for capturing student attention and understanding engagement in learning: A review. IEEE Sens. J. 2021, 21, 22402–22413.
- Nixon, N.; Thomas, P.B.; Jones, P.R. Feasibility study of an automated Strabismus screening test using augmented reality and eye-tracking (STARE). Eye 2023, 37, 3609–3614.
- Yu, C.Y.; Kim, J.H.; Mostowfi, S.; Wang, F.; Oprean, D.; Seo, K. Developing an augmented reality-based interactive learning system with real-time location and motion tracking. In Proceedings of the International Conference on Human-Computer Interaction, Copenhagen, Denmark, 9 June 2023.
- Wu, S.; Hou, L.; Chen, H.; Zhang, G. Measuring the impact of Augmented Reality warning systems on onsite construction workers using object detection and eye-tracking. In Proceedings of the 28th EG-ICE International Workshop on Intelligent Computing in Engineering, Berlin, Germany, 3 January 2022.
- Lu, S.; Sanchez Perdomo, Y.P.; Jiang, X.; Zheng, B. Integrating eye-tracking to augmented reality system for surgical training. J. Med. Syst. 2020, 44, 192.
- Shahid, M.; Nawaz, T.; Habib, H.A. Eye-gaze and augmented reality framework for driver assistance. Life Sci. 2013, 10, 125.
- Pierdicca, R.; Paolanti, M.; Naspetti, S.; Mandolesi, S.; Zanoli, R.; Frontoni, E. User-centered predictive model for improving cultural heritage augmented reality applications: An HMM-based approach for eye-tracking data. J. Imaging 2018, 4, 101.
- Meißner, M.; Pfeiffer, J.; Pfeiffer, T.; Oppewal, H. Combining virtual reality and mobile eye tracking to provide a naturalistic experimental environment for shopper research. J. Bus. Res. 2019, 100, 445–458.
- Damala, A.; Schuchert, T.; Rodriguez, I.; Moragues, J.; Gilleade, K.; Stojanovic, N. Exploring the affective museum visiting experience: Adaptive augmented reality (A2R) and cultural heritage. Int. J. Herit. Digit. Era 2013, 2, 117–142.
- Jang, C.; Bang, K.; Moon, S.; Kim, J.; Lee, S.; Lee, B. Retinal 3D: Augmented reality near-eye display via pupil-tracked light field projection on retina. ACM Trans. Graph. 2017, 36, 1–13.
- Lo Valvo, A.; Croce, D.; Garlisi, D.; Giuliano, F.; Giarré, L.; Tinnirello, I. A navigation and augmented reality system for visually impaired people. Sensors 2021, 21, 3061.
- Cai, X.; Yang, Z.; Dong, L.; Ma, Q.; Miao, X.; Liu, Z. Multi-user mobile augmented reality with ID-aware visual interaction. ACM Trans. Sens. Netw. 2023, 20, 1–23.
- Tsamis, G.; Chantziaras, G.; Giakoumis, D.; Kostavelis, I.; Kargakos, A.; Tsakiris, A.; Tzovaras, D. Intuitive and safe interaction in multi-user human robot collaboration environments through augmented reality displays. In Proceedings of the 30th IEEE International Conference on Robot & Human Interactive Communication, Vancouver, BC, Canada, 23 August 2021.
- McGill, M.; Gugenheimer, J.; Freeman, E. A quest for co-located mixed reality: Aligning and assessing SLAM tracking for same-space multi-user experiences. In Proceedings of the 26th ACM Symposium on Virtual Reality Software and Technology, New York, NY, USA, 1 November 2020.
- George, A.; Routray, A. Fast and accurate algorithm for eye localisation for gaze tracking in low-resolution images. IET Comput. Vis. 2016, 10, 660–669.
- Yu, M.X.; Lin, Y.Z.; Tang, X.Y.; Xu, J.; Schmidt, D.; Wang, X.Z.; Guo, Y. An easy iris center detection method for eye gaze tracking system. J. Eye Mov. Res. 2015, 8, 1–20.
- Cai, H.; Yu, H.; Zhou, X.; Liu, H. Robust gaze estimation via normalized iris center-eye corner vector. In International Conference on Intelligent Robotics and Applications; Springer: Cham, Switzerland, 2016; Volume 9834, pp. 300–309.
- Wang, J.; Zhang, G.; Shi, J. 2D Gaze Estimation Based on Pupil-Glint Vector Using an Artificial Neural Network. Appl. Sci. 2016, 6, 174.
- Mestre, C.; Gautier, J.; Pujol, J. Robust Eye Tracking Based on Multiple Corneal Reflections for Clinical Applications. J. Biomed. Opt. 2018, 23, 035001.
- Sesma-Sanchez, L.; Villanueva, A.; Cabeza, R. Gaze Estimation Interpolation Methods Based on Binocular Data. IEEE Trans. Biomed. Eng. 2012, 59, 2235–2243.
- Zhang, T.N.; Bai, J.J.; Meng, C.N.; Chang, S.J. Eye-Gaze Tracking Based on One Camera and Two Light Sources. J. Optoelectron. Laser 2012, 23, 1990–1995.
- Zhu, Z.; Ji, Q. Novel Eye Gaze Tracking Techniques Under Natural Head Movement. IEEE Trans. Biomed. Eng. 2007, 54, 2246–2260.
- Arar, N.M.; Gao, H.; Thiran, J.P. A Regression-Based User Calibration Framework for Real-Time Gaze Estimation. IEEE Trans. Circuits Syst. Video Technol. 2017, 27, 2623–2638.
- Sasaki, M.; Nagamatsu, T.; Takemura, K. Cross-Ratio Based Gaze Estimation for Multiple Displays Using a Polarization Camera. In Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology, New York, NY, USA, 17 October 2019; pp. 1–3.
- Sasaki, M.; Nagamatsu, T.; Takemura, K. Screen Corner Detection Using Polarization Camera for Cross-Ratio Based Gaze Estimation. In Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications, New York, NY, USA, 25 June 2019.
- Hansen, D.W.; Agustin, J.S.; Villanueva, A. Homography Normalization for Robust Gaze Estimation in Uncalibrated Setups. In Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications, New York, NY, USA, 22 March 2010; pp. 13–20.
- Morimoto, C.H.; Coutinho, F.L.; Dan, W.H. Screen-Light Decomposition Framework for Point-of-Gaze Estimation Using a Single Uncalibrated Camera and Multiple Light Sources. J. Math. Imaging Vis. 2020, 62, 585–605.
- Choi, K.A.; Ma, C.; Ko, S.J. Improving the Usability of Remote Eye Gaze Tracking for Human-Device Interaction. IEEE Trans. Consum. Electron. 2014, 60, 493–498.
- Lu, F.; Sugano, Y.; Okabe, T.; Sato, Y. Adaptive linear regression for appearance-based gaze estimation. IEEE Trans. Pattern Anal. Mach. Intell. 2014, 36, 2033–2046.
- Wang, Y.; Zhao, T.; Ding, X.; Peng, J.; Bian, J.; Fu, X. Learning a gaze estimator with neighbor selection from large-scale synthetic eye images. Knowl.-Based Syst. 2018, 139, 41–49.
- Kacete, A.; Séguier, R.; Collobert, M.; Royan, J. Unconstrained gaze estimation using random forest regression voting. In Proceedings of the Asian Conference on Computer Vision, Taipei, Taiwan, 20–24 November 2016; Springer: Cham, Switzerland, 2016; pp. 419–432.
- Zhuang, Y.; Zhang, Y.; Zhao, H. Appearance-based gaze estimation using separable convolution neural networks. In Proceedings of the 2021 IEEE 5th Advanced Information Technology, Electronic and Automation Control Conference (IAEAC), Chongqing, China, 12–14 March 2021; Volume 5, pp. 609–612.
- Deng, H.; Zhu, W. Monocular free-head 3D gaze tracking with deep learning and geometry constraints. In Proceedings of the IEEE International Conference on Computer Vision (ICCV), Venice, Italy, 22 October 2017; pp. 3143–3152.
- Wang, Z.; Zhao, J.; Lu, C.; Huang, H.; Yang, F.; Li, L.; Guo, Y. Learning to detect head movement in unconstrained remote gaze estimation in the wild. In Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), Snowmass, CO, USA, 1–5 March 2020; pp. 3443–3452.
- Cheng, Y.; Feng, L.; Zhang, X. Appearance-based gaze estimation via evaluation-guided asymmetric regression. In Proceedings of the European Conference on Computer Vision (ECCV), Munich, Germany, 8–14 September 2018; Springer: Cham, Switzerland, 2018; pp. 100–115.
- Yu, Y.; Liu, G.; Odobez, J.M. Improving few-shot user-specific gaze adaptation via gaze redirection synthesis. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Long Beach, CA, USA, 15 June 2019; pp. 11937–11946.
- Lindén, E.; Sjöstrand, J.; Proutiere, A. Learning to personalize in appearance-based gaze tracking. In Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), Seoul, Republic of Korea, 2 November 2019.
- Guo, Z.; Yuan, Z.; Zhang, C.; Chi, W.; Ling, Y.; Zhang, S. Domain adaptation gaze estimation by embedding with prediction consistency. In Proceedings of the Asian Conference on Computer Vision (ACCV), Kyoto, Japan, 30 November 2020.
- Yu, Y.; Odobez, J.M. Unsupervised representation learning for gaze estimation. In Proceedings of the 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Seattle, WA, USA, 13–19 June 2020; pp. 7314–7324.
- Dubey, N.; Ghosh, S.; Dhall, A. Unsupervised learning of eye gaze representation from the web. In Proceedings of the 2019 International Joint Conference on Neural Networks (IJCNN), Budapest, Hungary, 14–19 July 2019; pp. 1–7.
- Tonsen, M.; Zhang, X.; Sugano, Y.; Bulling, A. Labelled Pupils in the Wild: A Dataset for Studying Pupil Detection in Unconstrained Environments. In Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications, New York, NY, USA, 14 March 2016.
- Fuhl, W.; Kübler, T.; Sippel, K.; Rosenstiel, W.; Kasneci, E. EXCUSE: Robust Pupil Detection in Real-World Scenarios. In Proceedings of the Computer Analysis of Images and Patterns: 16th International Conference, Valletta, Malta, 2–4 September 2015.
Methods | Cameras | Eye Feature | Main Error Factors | References
---|---|---|---|---
PCT/ICT-based | 1 | Pupil/iris center, corner point | Sensitive to head movement; requires fixating multiple calibration points; less practical in mobile applications | [30,31,32]
PCRT/ICRT-based | 1 | Pupil/iris center, glints | Depends on detecting the corneal reflection (glint); with a single light source, accuracy drops sharply once the head deviates from the calibrated position | [33,34,35,36]
PCRT/ICRT-based | ≥2 | Pupil/iris center, glints | Requires multi-point calibration to compensate for the kappa angle (the deviation between the visual and optical axes) | [37]
CR-based | ≥1 | Pupil center, glints | Accuracy under head movement is worse than with a fixed head, and the error grows further when the multiple light sources are not coplanar | [38,39,40]
HN-based | ≥1 | Pupil center, glints | Requires light sources at the four corners of the screen, mapped by a single homography matrix, which places high demands on the accuracy of the light-source positions | [41,42,43]
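To make the homography-normalization (HN) row concrete, the following minimal sketch (Python with NumPy and OpenCV assumed available; all point coordinates are illustrative placeholders, not values from the paper) estimates a homography from the four screen-corner glints and maps a detected pupil center to a point of gaze on the screen.

```python
import numpy as np
import cv2

# Image-plane positions of the four glints produced by LEDs at the screen
# corners (illustrative pixel values in the eye-camera image).
glints_img = np.array([[212.0, 148.0], [418.0, 151.0],
                       [421.0, 309.0], [209.0, 305.0]], dtype=np.float32)

# Known coordinates of the screen corners (display pixels).
screen_corners = np.array([[0.0, 0.0], [1920.0, 0.0],
                           [1920.0, 1080.0], [0.0, 1080.0]], dtype=np.float32)

# Homography mapping the glint quadrilateral onto the screen rectangle.
H, _ = cv2.findHomography(glints_img, screen_corners)

# Detected pupil center in the eye image (illustrative value).
pupil = np.array([[[315.0, 226.0]]], dtype=np.float32)

# Point of gaze on the screen under the homography-normalization model.
gaze_on_screen = cv2.perspectiveTransform(pupil, H)
print(gaze_on_screen)  # screen coordinates corresponding to the pupil center
```

This also illustrates the row's error factor: since one homography carries the whole mapping, small errors in the assumed light-source (corner) positions propagate directly into the estimated gaze point.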
Methods | Algorithms | Main Error Factors | References
---|---|---|---
Conventional machine learning-based | KNN, RF, GP, SVR, ALR | Individual differences, head motion, and environment | [44,45,46]
CNN-based | Supervised learning model | Needs a well-designed architecture and parameter set, many training samples, and long training times; synthetic training images differ from real ones | [47,48,49]
CNN-based | Weakly/semi-/self-supervised learning model | Performance still relies on high-quality labeled data and lags fully supervised methods | [50,51,52]
CNN-based | Unsupervised learning model | Prone to deviating from the ground truth due to individual differences and similar issues | [53,54,55]
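As an illustration of the supervised CNN row above, the sketch below (PyTorch assumed; the architecture, input size, and dummy batch are illustrative placeholders, not the networks used in the cited works) regresses two gaze angles from a grayscale eye patch and runs a single training step.

```python
import torch
import torch.nn as nn

class GazeCNN(nn.Module):
    """Minimal appearance-based gaze regressor: eye patch -> (yaw, pitch)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 4 * 4, 64), nn.ReLU(),
            nn.Linear(64, 2),  # two gaze angles, e.g., in radians
        )

    def forward(self, x):
        return self.head(self.features(x))

# One supervised step on a dummy batch of 36x60 grayscale eye patches.
model, loss_fn = GazeCNN(), nn.MSELoss()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
eyes, gaze = torch.randn(8, 1, 36, 60), torch.randn(8, 2)
opt.zero_grad()
loss = loss_fn(model(eyes), gaze)
loss.backward()
opt.step()
```

Even this toy setup shows why the table flags sample count and training time: the regression target is continuous and person-dependent, so the network only generalizes when trained on many labeled eye images.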
Model Type | Model Complexity | Training Difficulty | Generalization Ability | Computation Speed | Advantages | Disadvantages
---|---|---|---|---|---|---
First-order Polynomial Regression Model | Low | Low | Moderate | Fast | Simple, efficient, and suitable for linear problems | Cannot handle complex nonlinear relationships
Second-order Polynomial Regression Model | Moderate | Moderate | High | Fast | Handles some nonlinear relationships with low model complexity and high computational efficiency | Limited with high-dimensional data and complex nonlinearity
Gaussian Process Regression Model | High | High | Very High | Slow | Excellent generalization ability and handles uncertainty well | High computational complexity; slow training and inference, especially on large datasets
Artificial Neural Network Regression Model | Very High | Very High | Very High | Moderate | Strong ability to handle nonlinear data; suitable for complex datasets and multi-dimensional inputs | Requires large datasets, prone to overfitting, long training times, and hard to interpret
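To ground the second-order polynomial row, the following NumPy sketch (calibration data are synthetic placeholders) fits screen coordinates as a quadratic function of the pupil-glint vector with linear least squares. The fit is a single closed-form solve, which is why such models are fast to train, at the cost of missing nonlinearity beyond second order.

```python
import numpy as np

def quad_features(v):
    """Second-order polynomial features of a pupil-glint vector (vx, vy)."""
    vx, vy = v[:, 0], v[:, 1]
    return np.stack([np.ones_like(vx), vx, vy, vx * vy, vx**2, vy**2], axis=1)

# Calibration: pupil-glint vectors observed while the user fixates known
# screen targets (synthetic placeholder data for a 3x3 calibration grid).
rng = np.random.default_rng(0)
v_calib = rng.uniform(-1, 1, size=(9, 2))
true_W = rng.normal(size=(6, 2))          # hypothetical ground-truth mapping
s_calib = quad_features(v_calib) @ true_W  # known screen targets

# Fit one quadratic mapping per screen axis via linear least squares.
W, *_ = np.linalg.lstsq(quad_features(v_calib), s_calib, rcond=None)

# Predict the point of gaze for a new pupil-glint vector.
v_new = np.array([[0.1, -0.3]])
print(quad_features(v_new) @ W)
```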
Metric | One-Stage Model | Two-Stage Model | One-Stage Model (Standard Deviation) | Two-Stage Model (Standard Deviation)
---|---|---|---|---
Accuracy | 85% | 92% | ±2% | ±1%
Recall | 0.80 | 0.90 | ±0.04 | ±0.02
F1 Score | 0.79 | 0.89 | ±0.03 | ±0.02
RMSE | 1.5° | 1.0° | ±0.2° | ±0.1°
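For reference, the table's metrics can be computed as follows (a minimal NumPy sketch with placeholder detections and angular errors; it does not reproduce the paper's exact evaluation protocol).

```python
import numpy as np

# Placeholder binary detections (e.g., pupil found / not found) and
# angular gaze errors in degrees on a handful of test frames.
y_true = np.array([1, 1, 0, 1, 0, 1, 1, 0])
y_pred = np.array([1, 1, 0, 0, 0, 1, 1, 1])
ang_err_deg = np.array([0.8, 1.2, 0.9, 1.5, 1.1, 0.7])

tp = np.sum((y_pred == 1) & (y_true == 1))
fp = np.sum((y_pred == 1) & (y_true == 0))
fn = np.sum((y_pred == 0) & (y_true == 1))

accuracy = np.mean(y_pred == y_true)
recall = tp / (tp + fn)
precision = tp / (tp + fp)
f1 = 2 * precision * recall / (precision + recall)
rmse = np.sqrt(np.mean(ang_err_deg**2))  # angular RMSE in degrees

print(f"accuracy={accuracy:.2f} recall={recall:.2f} "
      f"f1={f1:.2f} rmse={rmse:.2f} deg")
```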
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Gao, L.; Wang, C.; Wu, G. Wearable Biosensor Smart Glasses Based on Augmented Reality and Eye Tracking. Sensors 2024, 24, 6740. https://doi.org/10.3390/s24206740