Enhancing User Experience of Eye-Controlled Systems: Design Recommendations on the Optimal Size, Distance and Shape of Interactive Components from the Perspective of Peripheral Vision
Abstract
1. Introduction
1.1. The Process of Eye-Controlled Interaction and GUI Elements
1.2. Peripheral Vision in Eye-Controlled Interaction
1.3. Visual Variables in Eye-Controlled Interface
1.4. The Research Content of This Paper
2. Experiment Method
2.1. Pre-Experiment: Screening Icons Based on Shape Classification Criteria
2.1.1. Preparation
2.1.2. Participants and Task Design
- Visual complexity describes how simple or complex an icon’s visual features are; the more visual features an icon contains, the more information the observer has to process.
- Familiarity indicates how familiar people are with an icon; the more often an icon appears in daily life, the more familiar it is.
- Image agreement indicates how closely each icon resembles the subjects’ mental image of the object, i.e., the consistency between the icon and what subjects imagine. Icons with high consistency are concrete, whereas those with low consistency are abstract.
- Name agreement indicates the consistency between an icon’s name and its actual meaning; if subjects can accurately infer the icon’s meaning from its name, the icon has high name agreement. These ratings drive the screening sketch below.
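A minimal, hypothetical sketch of this screening step, assuming each icon is rated on the four dimensions with 5-point scales; the icon names, scores, and the 3.5 cut-off are illustrative and are not the study’s actual data or criteria:

```python
from statistics import mean

# Hypothetical ratings: icon name -> individual "image agreement" scores
# on a 5-point scale (higher = closer to the raters' mental image, i.e., more concrete).
image_agreement = {
    "printer": [4, 5, 4, 4, 5],
    "wifi":    [2, 3, 2, 3, 2],
}

CONCRETE_THRESHOLD = 3.5  # assumed cut-off, not taken from the paper

def classify(scores):
    """Label an icon as concrete or abstract from its mean image-agreement score."""
    return "concrete" if mean(scores) >= CONCRETE_THRESHOLD else "abstract"

for icon, scores in image_agreement.items():
    print(f"{icon}: mean = {mean(scores):.2f} -> {classify(scores)}")
```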
2.1.3. Results and Data Analysis
2.2. Experiment
2.2.1. Experiment Design
- Independent variables:
- Icon shape: concrete shape and abstract shape.
- Icon distances (distance from the center of the interface to the center of the icon): 243 px (5.966°), 351 px (8.609°), 459 px (11.243°).
- Icon sizes (expressed as the side length of the rectangular base): 80 px (1.889°), 100 px (2.389°), 120 px (2.889°), 140 px (3.389°), 160 px (3.889°); see the pixel-to-visual-angle sketch after this list.
- Dependent variable:
- Correct rate: The ratio of the number of correct trials to the total number of trials for each combination of variable levels. The correct rate was used to measure the likelihood of user errors when performing a visual search task in an eye-controlled interface.
- Reaction time: The time taken by the participant from receiving the stimulus to completion of the visual search task.
- Movement time: The time taken by the participant to perform the eye-controlled triggering process on the search object.
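The pixel values above correspond to the listed visual angles only for the display geometry and viewing distance used in the experiment. A minimal conversion sketch follows, assuming an illustrative pixel pitch and viewing distance rather than the study’s exact apparatus settings:

```python
import math

def px_to_degrees(size_px: float, pixel_pitch_mm: float, viewing_distance_mm: float) -> float:
    """Convert an on-screen extent in pixels to a visual angle in degrees.

    Standard formula: theta = 2 * atan(s / (2 * D)), where s is the physical
    extent of the stimulus and D is the viewing distance.
    """
    size_mm = size_px * pixel_pitch_mm
    return math.degrees(2 * math.atan(size_mm / (2 * viewing_distance_mm)))

# Illustrative display geometry only -- not the experiment's apparatus parameters.
PIXEL_PITCH_MM = 0.27        # assumed monitor pixel pitch
VIEWING_DISTANCE_MM = 650.0  # assumed eye-to-screen distance

for size in (80, 100, 120, 140, 160):
    print(f"{size} px -> {px_to_degrees(size, PIXEL_PITCH_MM, VIEWING_DISTANCE_MM):.3f} deg")
```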
2.2.2. Participants, Apparatus, and Experiment Environment
2.2.3. Procedure
2.2.4. Parameter Set
2.3. Questionnaire Evaluation
3. Results and Analysis
3.1. Experiment
3.1.1. Correct Rate
3.1.2. Reaction Time
3.1.3. Movement Time
3.2. Subjective Questionnaire
4. General Discussion
5. Conclusions
- (1) The size of the GUI elements in the eye-controlled interaction interface is very important: an appropriate size benefits both the visual-search and the eye-controlled-triggering processes. Combining the results of the subjective evaluation, we derived recommended sizes of 2.889°, 3.389°, and 3.889° (a sketch converting these angular recommendations into pixels for a given display follows this list).
- (2) Multiple interface components influence each other when they are in close proximity, so they need to be spaced appropriately. According to our previous study [68], 1.032° is a suitable spacing value.
- (3) For eye-controlled modalities that require saccades to complete triggering (e.g., eye-gesture interaction), the distance factor has little impact on triggering performance for saccades within 12°. To ensure an efficient visual search, the recommended saccade distances are 5.966° and 8.609°.
- (4) The shape of eye-controlled GUI components significantly affects user recognition. Abstract-shaped components impose a greater cognitive load than concrete-shaped components during recognition, so designers should simplify the recognition process by selecting materials that fit the user’s semantic perception (e.g., simple concrete icons).
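As a usage example, the angular recommendations above can be translated into pixels for a particular setup. The sketch below inverts the pixel-to-degree conversion shown in Section 2.2.1, again with illustrative display parameters rather than the study’s apparatus:

```python
import math

def degrees_to_px(angle_deg: float, pixel_pitch_mm: float, viewing_distance_mm: float) -> float:
    """Convert a visual angle in degrees to an on-screen extent in pixels."""
    size_mm = 2 * viewing_distance_mm * math.tan(math.radians(angle_deg) / 2)
    return size_mm / pixel_pitch_mm

# Illustrative display geometry only; substitute the target system's pixel pitch
# and expected viewing distance when applying the recommendations.
PIXEL_PITCH_MM = 0.27
VIEWING_DISTANCE_MM = 650.0

for angle in (2.889, 3.389, 3.889):   # recommended icon sizes
    print(f"size     {angle:5.3f} deg -> {degrees_to_px(angle, PIXEL_PITCH_MM, VIEWING_DISTANCE_MM):6.1f} px")
for angle in (5.966, 8.609):          # recommended saccade distances
    print(f"distance {angle:5.3f} deg -> {degrees_to_px(angle, PIXEL_PITCH_MM, VIEWING_DISTANCE_MM):6.1f} px")
```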
6. Study Limitation and Future Work
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Almansouri, A.S. Tracking eye movement using a composite magnet. IEEE Trans. Magn. 2022, 58, 1–5. [Google Scholar] [CrossRef]
- Bulling, A.; Roggen, D.; Tröster, G. Wearable EOG goggles: Eye-based interaction in everyday environments. In Proceedings of the 27th International Conference on Human Factors in Computing Systems, CHI 2009, Boston, MA, USA, 4–9 April 2009; pp. 3259–3264. [Google Scholar]
- Mollenbach, E.; Hansen, J.P.; Lillholm, M. Eye Movements in Gaze Interaction. J. Eye Mov. Res. 2013, 6. [Google Scholar] [CrossRef]
- McSorley, E. Looking and acting: Vision and eye movements in natural behaviour. Perception 2010, 39, 582–583. [Google Scholar]
- Foulds, R.A.; Fincke, R. A computerized line of gaze system for rapid non-vocal individuals. In Proceedings of the National Computer Conference 1979 Personal Computing Proceedings, New York City, NY, USA, 4–7 June 1979; p. 81. [Google Scholar]
- Rinard, G.; Rugg, D. An ocular control device for use by the severely handicapped. In Proceedings of the 1986 Conference on Systems and Devices for the Disabled, Boston, MA, USA, 1 May 1986; pp. 76–79. [Google Scholar]
- Sesin, A.; Adjouadi, M.; Cabrerizo, M.; Ayala, M.; Barreto, A. Adaptive eye-gaze tracking using neural-network-based user profiles to assist people with motor disability. J. Rehabil. Res. Dev. 2008, 45, 801–817. [Google Scholar] [CrossRef]
- Riman, C.F. Implementation of a Low Cost Hands Free Wheelchair Controller. Indep. J. Manag. Prod. 2020, 11, 880–891. [Google Scholar] [CrossRef]
- Zhang, X.; Kulkarni, H.; Morris, M.R. Smartphone-Based Gaze Gesture Communication for People with Motor Disabilities. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, Denver, CO, USA, 6–11 May 2017; pp. 2878–2889. [Google Scholar]
- Tuisku, O.; Majaranta, P.; Isokoski, P.; Räihä, K.-J. Now Dasher! Dash away! longitudinal study of fast text entry by Eye Gaze. In Proceedings of the 2008 Symposium on Eye Tracking Research & Applications, Savannah, GA, USA, 26–28 March 2008; pp. 19–26. [Google Scholar]
- Wobbrock, J.O.; Rubinstein, J.; Sawyer, M.W.; Duchowski, A.T. Longitudinal Evaluation of Discrete Consecutive Gaze Gestures for Text Entry. In Proceedings of the Eye Tracking Research and Applications Symposium, Savannah, GA, USA, 26–28 March 2008; pp. 11–18. [Google Scholar]
- Fujii, K.; Salerno, A.; Sriskandarajah, K.; Kwok, K.-W.; Shetty, K.; Yang, G.-Z. Gaze Contingent Cartesian Control of a Robotic Arm for Laparoscopic Surgery. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Tokyo, Japan, 3–7 November 2013; pp. 3582–3589. [Google Scholar]
- Fujii, K.; Gras, G.; Salerno, A.; Yang, G.-Z. Gaze gesture based human robot interaction for laparoscopic surgery. Med. Image Anal. 2018, 44, 196–214. [Google Scholar] [CrossRef] [PubMed]
- Istance, H.; Vickers, S.; Hyrskykari, A. Gaze-based interaction with massively multiplayer on-line games. In Proceedings of the Conference on Human Factors in Computing Systems, Boston, MA, USA, 4–9 April 2009. [Google Scholar]
- Heikkilä, H. EyeSketch: A Drawing Application for Gaze Control. In Proceedings of the Conference on Eye Tracking South Africa, Cape Town, South Africa, 29–31 August 2013. [Google Scholar]
- Rajanna, V.; Malla, A.H.; Bhagat, R.A.; Hammond, T. DyGazePass: A gaze gesture-based dynamic authentication system to counter shoulder surfing and video analysis attacks. In Proceedings of the IEEE International Conference on Identity, Singapore, 11–12 January 2018; pp. 1–8. [Google Scholar]
- Land, M.F.; Hayhoe, M. In what ways do eye movements contribute to everyday activities? Vis. Res. 2001, 41, 3559–3565. [Google Scholar] [CrossRef]
- Jacob, R.J.K. Eye Tracking in Advanced Interface Design. In Advanced Interface Design & Virtual Environments; Oxford University Press, Inc.: New York, NY, USA, 1995; pp. 258–288. [Google Scholar]
- Drewes, H.; Schmidt, A. Interacting with the Computer Using Gaze Gestures. In Human-Computer Interaction–INTERACT 2007; Springer: Berlin/Heidelberg, Germany, 2007; pp. 475–488. [Google Scholar]
- Isokoski, P. Text input methods for eye trackers using off-screen targets. In Proceedings of the 2000 symposium on Eye tracking research & applications, Palm Beach Gardens, FL, USA, 6–8 November 2000; pp. 15–21. [Google Scholar]
- Huckauf, A.; Urbina, M.H. Object selection in gaze controlled systems: What you don’t look at is what you get. ACM Trans. Appl. Percept. 2011, 8, 1–14. [Google Scholar] [CrossRef]
- Jenke, M.; Huppenbauer, L.; Maier, T. Investigation of continual operation procedures via a user centered gaze control by means of flexible gaze gestures. Z. Für Arb. 2018, 72, 23–34. [Google Scholar] [CrossRef]
- Jenke, M.; Maier, T. What Does the Eye Want? An Investigation of Interface Parameters to Ensure Intuitive Gaze-Controlled Interactions for Multidimensional Inputs; Springer International Publishing: Berlin/Heidelberg, Germany, 2018; pp. 3–14. [Google Scholar]
- Istance, H.; Hyrskykari, A.I. Supporting Making Fixations and the Effect on Gaze Gesture Performance. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, Denver, CO, USA, 6–11 May 2017; pp. 3022–3033. [Google Scholar]
- Porta, M.; Turina, M. Eye-S: A Full-Screen Input Modality for Pure Eye-based Communication. In Proceedings of the 2008 Symposium on Eye Tracking Research & Applications, Savannah, GA, USA, 26–28 March 2008. [Google Scholar]
- Urbina, M.H.; Huckauf, A. Alternatives to single character entry and dwell time selection on eye typing. In Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications, Austin, TX, USA, 22–24 March 2010; pp. 315–322. [Google Scholar]
- Bee, N.; André, E. Writing with Your Eye: A Dwell Time Free Writing System Adapted to the Nature of Human Eye Gaze. In Proceedings of the 4th IEEE tutorial and research workshop on Perception and Interactive Technologies for Speech-Based Systems, Irsee, Germany, 16–18 June 2008; Springer: Berlin/Heidelberg, Germany; pp. 111–122. [Google Scholar]
- Jungwirth, F.; Haslgrübler, M.; Ferscha, A. Contour-guided gaze gestures: Using object contours as visual guidance for triggering interactions. In Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications, Warsaw, Poland, 14–17 June 2018; p. 28. [Google Scholar]
- Niu, Y.; Li, X.; Yang, W.; Xue, C.; Peng, N.; Jin, T. Smooth Pursuit Study on an Eye-Control System for Continuous Variable Adjustment Tasks. Int. J. Hum. Comput. Interact. 2021. [Google Scholar] [CrossRef]
- Hou, W.-J.; Wu, S.-Q.; Chen, X.-L.; Chen, K.-X. Study on Spatiotemporal Characteristics of Gaze Gesture Input. In Human-Computer Interaction. Recognition and Interaction Technologies; Springer International Publishing: Berlin/Heidelberg, Germany, 2019; pp. 283–302. [Google Scholar]
- Menges, R.; Kumar, C.; Müller, D.; Sengupta, K. GazeTheWeb: A Gaze-Controlled Web Browser. In Proceedings of the 14th International Web for All Conference, Perth, Western Australia, Australia, 2–4 April 2017; p. 25. [Google Scholar]
- Porta, M.; Ravelli, A. WeyeB, an eye-controlled Web browser for hands-free navigation. In Proceedings of the 2009 2nd Conference on Human System Interactions, Catania, Italy, 21–23 May 2009. [Google Scholar]
- Menges, R.; Kumar, C.; Sengupta, K.; Staab, S. eyeGUI: A Novel Framework for Eye-Controlled User Interfaces. In Proceedings of the 9th Nordic Conference on Human-Computer Interaction (NordiCHI), Gothenburg, Sweden, 23–27 October 2016. [Google Scholar]
- Desimone, R. Neural Mechanisms of Selective Visual Attention. Annu. Rev. Neurosci. 1995, 18, 193–222. [Google Scholar] [CrossRef] [PubMed]
- Neisser, U. Focal Attention and Figural Synthesis. In Cognitive Psychology; Psychology Press: London, UK, 2014; pp. 82–98. [Google Scholar]
- Bundesen, C. A theory of visual attention. Psychol. Rev. 1990, 97, 523–547. [Google Scholar] [CrossRef] [PubMed]
- Trevarthen, C.B. Two mechanisms of vision in primates. Psychol. Forsch. 1968, 31, 299–337. [Google Scholar] [CrossRef] [PubMed]
- Meng, X.; Wang, Z. A pre-attentive model of biological vision. In Proceedings of the 2009 IEEE International Conference on Intelligent Computing and Intelligent Systems, Shanghai, China, 15 July 2009. [Google Scholar]
- Paillard, J.; Amblard, B. Static versus Kinetic Visual Cues for the Processing of Spatial Relationships. In Brain Mechanisms and Spatial Vision; Springer: Dordrecht, The Netherlands, 1985. [Google Scholar]
- Khan, M.; Lawrence, G.; Franks, I.; Buckolz, E. The utilization of visual feedback from peripheral and central vision in the control of direction. Exp. Brain Res. 2004, 158, 241–251. [Google Scholar] [CrossRef] [PubMed]
- Roux-Sibilon, A.; Trouilloud, A.; Kauffmann, L.; Guyader, N.; Mermillod, M.; Peyrin, C. Influence of peripheral vision on object categorization in central vision. J. Vis. 2019, 19, 7. [Google Scholar] [CrossRef]
- Sharkey, T.J.; Hennessy, R.T. Head-mounted, ambient vision display for helicopter pilotage. In Proceedings of the Conference on Helmet- and Head-Mounted Displays V, Orlando, FL, USA, 24–25 April 2000; pp. 46–57. [Google Scholar]
- Lenneman, J.K.; Backs, R.W. A psychophysiological and driving performance evaluation of focal and ambient visual processing demands in simulated driving. Transp. Res. Part F-Traffic Psychol. Behav. 2018, 57, 84–96. [Google Scholar] [CrossRef]
- Barbut, M. Sémiologie Graphique; Mouton: Paris, France, 1967. [Google Scholar]
- Niu, Y.F.; Liu, J.; Cui, J.Q.; Yang, W.J.; Zuo, H.R.; He, J.X.; Xiao, L.; Wang, J.H.; Ma, G.R.; Han, Z.J. Research on visual representation of icon colour in eye-controlled systems. Adv. Eng. Inform. 2022, 52, 101570. [Google Scholar]
- Weelden, L.V.; Cozijn, R.; Maes, A.; Schilperoord, J. Perceptual similarity in visual metaphor processing. In Proceedings of the Cognitive Shape Processing, Papers from the 2010 AAAI Spring Symposium, Technical Report SS-10-02, Stanford, CA, USA, 22–24 March 2010. [Google Scholar]
- Jansen, B.J. The Graphical User Interface. Acm Sigchi Bull. 1998, 30, 22–26. [Google Scholar] [CrossRef]
- Boyd, L.H.; Others, A. The Graphical User Interface: Crisis, Danger, and Opportunity. J. Vis. Impair. Blind. 1990, 84, 496–502. [Google Scholar] [CrossRef]
- Humphreys, G.W.; Forde, E.M.E. Hierarchies, similarity, and interactivity in object recognition: “Category-specific” neuropsychological deficits. Behav. Brain Sci. 2001, 24, 453–476. [Google Scholar] [CrossRef]
- Biederman, I. Recognition-by-components: A theory of human image understanding. Psychol. Rev. 1987, 94, 115–147. [Google Scholar] [CrossRef] [PubMed]
- Van Weelden, L.; Maes, A.; Schilperoord, J.; Cozijn, R. The Role of Shape in Comparing Objects: How Perceptual Similarity May Affect Visual Metaphor Processing. Metaphor. Symb. 2011, 26, 272–298. [Google Scholar] [CrossRef]
- Rosch, E.; Mervis, C.B.; Gray, W.D.; Johnson, D.M.; Boyes-Braem, P. Basic objects in natural categories. Cogn. Psychol. 1976, 8, 382–439. [Google Scholar] [CrossRef]
- Aldrich, V.C. Visual metaphor. J. Aesthetic Educ. 1968, 2, 73–86. [Google Scholar] [CrossRef]
- Carroll, N. Visual metaphor. In Aspects of Metaphor; Springer: Berlin/Heidelberg, Germany, 1994; pp. 189–218. [Google Scholar]
- Kumar, M. Gaze-Enhanced User Interface Design. Ph.D. Thesis, Stanford University, Stanford, CA, USA, 2007. [Google Scholar]
- Fitts, P.M. The information capacity of the human motor system in controlling the amplitude of movement. J. Exp. Psychol. 1954, 47, 381–391. [Google Scholar] [CrossRef]
- Ware, C.; Mikaelian, H.H. An evaluation of an eye tracker as a device for computer input. In Proceedings of the SIGCHI/GI Conference on Human Factors in Computing Systems and Graphics Interface, Toronto, ON, Canada, 5–9 April 1987. [Google Scholar]
- Miniotas, D. Application of Fitts’ law to eye gaze interaction. In Proceedings of the CHI00: Conference on Human Factors In Computing systems, The Hague, The Netherlands, 1–6 April 2000. [Google Scholar]
- Mackenzie, I.S. Fitts’ Law as a Research and Design Tool in Human-Computer Interaction. Hum. Comput. Interact. 1992, 7, 91–139. [Google Scholar] [CrossRef]
- Vertegaal, R. A Fitts Law comparison of eye tracking and manual input in the selection of visual targets. In Proceedings of the 10th International Conference on Multimodal Interfaces, ICMI 2008, Chania, Crete, Greece, 20–22 October 2008. [Google Scholar]
- Snodgrass, J.G.; Vanderwart, M. A standardized set of 260 pictures: Norms for name agreement, image agreement, familiarity, and visual complexity. J. Exp. Psychol. Hum. Learn. 1980, 6, 174–215. [Google Scholar] [CrossRef]
- Georgopoulos, A. Neural coding of the direction of reaching and a comparison with saccadic eye movements. Cold Spring Harb. Symp. Quant. Biol. 1990, 55, 849–859. [Google Scholar] [CrossRef]
- Faul, F.; Erdfelder, E.; Lang, A.-G.; Buchner, A. G*Power 3: A flexible statistical power analysis program for the social, be-havioral, and biomedical sciences. Behav. Res. Methods 2007, 39, 175–191. [Google Scholar] [CrossRef]
- Land, M.F.; Furneaux, S. The knowledge base of the oculomotor system. Philos. Trans. R. Soc. Lond. Ser. B Biol. Sci. 1997, 352, 1231–1239. [Google Scholar] [CrossRef]
- Jacob, R.J.K. The use of eye movements in human-computer interaction techniques: What you look at is what you get. ACM Trans. Inf. Syst. 1991, 9, 152–169. [Google Scholar] [CrossRef]
- Qvarfordt, P. Conversing with the user based on eye-gaze patterns. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Portland, OR, USA, 2–7 April 2005. [Google Scholar]
- Majaranta, P.; Mackenzie, I.S.; Aula, A.; Räihä, K.-J. Effects of feedback and dwell time on eye typing speed and accuracy. Univers. Access Inf. Soc. 2006, 5, 199–208. [Google Scholar] [CrossRef]
- Niu, Y.F.; Zuo, H.R.; Yang, X.; Xue, C.Q.; Peng, N.Y.; Zhou, L.; Zhou, X.Z.; Jin, T. Improving accuracy of gaze-control tools: Design recommendations for optimum position, sizes, and spacing of interactive objects. Hum. Factors Ergon. Manuf. Serv. Ind. 2021, 31, 249–269. [Google Scholar] [CrossRef]
- Field, A. Discovering Statistics Using SPSS, 2nd ed.; Sage Publications, Inc.: Thousand Oaks, CA, USA, 2005. [Google Scholar]
- Treisman, A.; Gormican, S. Feature analysis in early vision: Evidence from search asymmetries. Psychol. Rev. 1988, 95, 15–48. [Google Scholar] [CrossRef]
- Zelinsky, G.; Sheinberg, D. Why some search tasks take longer than others: Using eye movements to redefine reaction times. Stud. Vis. Inf. Process. 1995, 6, 325–336. [Google Scholar]
- Hooge, I.; Erkelens, C.J. Peripheral vision and oculomotor control during visual search. Vis. Res. 1999, 39, 1567–1575. [Google Scholar] [CrossRef]
- Pelli, D.G. Crowding: A cortical constraint on object recognition. Curr. Opin. Neurobiol. 2008, 18, 445–451. [Google Scholar] [CrossRef]
- Whitney, D.; Levi, D.M. Visual crowding: A fundamental limit on conscious perception and object recognition. Trends Cogn. Sci. 2011, 15, 160–168. [Google Scholar]
- Abrams, R.A.; Meyer, D.E.; Kornblum, S. Speed and Accuracy of Saccadic Eye Movements: Characteristics of Impulse Variability in the Oculomotor System. J. Exp. Psychol. Hum. Percept. Perform. 1989, 15, 529. [Google Scholar] [CrossRef]
- Bahill, A.T.; Adler, D.; Stark, L. Most naturally occurring human saccades have magnitudes of 15 degrees or less. Investig. Ophthalmol. 1975, 14, 468–469. [Google Scholar]
- Bahill, A.T.; Stark, L. Overlapping saccades and glissades are produced by fatigue in the saccadic eye movement system. Exp. Neurol. 1975, 48, 95–106. [Google Scholar] [CrossRef]
- Wu, X.; Yan, H.; Niu, J.; Gu, Z. Study on semantic-entity relevance of industrial icons and generation of metaphor design. J. Soc. Inf. Display 2022, 30, 209–223. [Google Scholar] [CrossRef]
| Icons | | Visual Complexity | Familiarity | Image Agreement | Name Agreement |
|---|---|---|---|---|---|
| Concrete-shape group | Mean | 2.05 | 4.1 | 3.78 | 3.92 |
| | N | 15 | 15 | 15 | 15 |
| | St. Deviation | 0.41 | 0.55 | 0.5 | 0.59 |
| Abstract-shape group | Mean | 1.98 | 4.26 | 3.05 | 3.6 |
| | N | 15 | 15 | 15 | 15 |
| | St. Deviation | 0.26 | 0.41 | 0.42 | 0.38 |
Descriptive statistics by factor level (Size: 80–160 px; Distance: 243–459 px; Shape: abstract/concrete):

| | 80 px | 100 px | 120 px | 140 px | 160 px | 243 px | 351 px | 459 px | Abstract | Concrete |
|---|---|---|---|---|---|---|---|---|---|---|
| N | 120 | 120 | 120 | 120 | 120 | 200 | 200 | 200 | 300 | 300 |
| Mean | 1.461 | 1.276 | 1.293 | 1.334 | 1.304 | 1.306 | 1.291 | 1.404 | 1.422 | 1.245 |
| St. Deviation | 0.437 | 0.354 | 0.379 | 0.414 | 0.374 | 0.378 | 0.365 | 0.437 | 0.407 | 0.366 |

Variance homogeneity (Levene) test and Welch test by factor:

| | Size | Distance | Shape |
|---|---|---|---|
| Levene statistic | 1.776 | 2.136 | 5.097 |
| Levene df1 | 4 | 2 | 1 |
| Levene df2 | 595 | 597 | 598 |
| Levene Sig. | 0.132 | 0.119 | 0.024 |
| Welch statistic | 3.771 | 4.329 | 31.043 |
| Welch df1 | 4 | 2 | 1 |
| Welch df2 | 297.095 | 395.822 | 591.364 |
| Welch Sig. | 0.005 | 0.014 | 0.000 |
Descriptive statistics by factor level (Size: 80–160 px; Distance: 243–459 px; Shape: abstract/concrete):

| | 80 px | 100 px | 120 px | 140 px | 160 px | 243 px | 351 px | 459 px | Abstract | Concrete |
|---|---|---|---|---|---|---|---|---|---|---|
| N | 119 | 119 | 118 | 119 | 117 | 198 | 196 | 198 | 296 | 296 |
| Mean | 0.448 | 0.463 | 0.430 | 0.452 | 0.428 | 0.436 | 0.444 | 0.453 | 0.444 | 0.445 |
| St. Deviation | 0.103 | 0.110 | 0.091 | 0.105 | 0.076 | 0.103 | 0.088 | 0.102 | 0.100 | 0.096 |

Variance homogeneity (Levene) test and Welch test by factor:

| | Size | Distance | Shape |
|---|---|---|---|
| Levene statistic | 2.684 | 2.348 | 0.190 |
| Levene df1 | 4 | 2 | 1 |
| Levene df2 | 587 | 589 | 590 |
| Levene Sig. | 0.031 | 0.096 | 0.663 |
| Welch statistic | 2.866 | 1.349 | 0.053 |
| Welch df1 | 4 | 2 | 1 |
| Welch df2 | 292.339 | 390.868 | 589.100 |
| Welch Sig. | 0.024 | 0.261 | 0.818 |
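The tables above report Levene’s test for homogeneity of variances followed by a Welch test (robust to unequal variances) for each factor. Below is a minimal sketch of that analysis pipeline; the per-trial data are simulated to loosely mimic the distance-level rows of the first table, and the authors’ actual analysis was presumably run in a statistics package, so this is only an illustration:

```python
import numpy as np
from scipy import stats

def welch_anova(groups):
    """One-way Welch ANOVA for a list of 1-D samples; does not assume equal variances."""
    k = len(groups)
    n = np.array([len(g) for g in groups], dtype=float)
    m = np.array([np.mean(g) for g in groups])
    v = np.array([np.var(g, ddof=1) for g in groups])
    w = n / v                                   # precision weights per group
    grand_mean = np.sum(w * m) / np.sum(w)
    between = np.sum(w * (m - grand_mean) ** 2) / (k - 1)
    tmp = np.sum((1 - w / np.sum(w)) ** 2 / (n - 1)) / (k ** 2 - 1)
    f_stat = between / (1 + 2 * (k - 2) * tmp)
    df1, df2 = k - 1, 1 / (3 * tmp)
    p_value = stats.f.sf(f_stat, df1, df2)
    return f_stat, df1, df2, p_value

# Hypothetical per-trial reaction times (s) for the three distance levels.
rng = np.random.default_rng(0)
groups = [rng.normal(mu, sd, 200) for mu, sd in ((1.306, 0.378), (1.291, 0.365), (1.404, 0.437))]

# Step 1: Levene test for homogeneity of variances.
print(stats.levene(*groups))
# Step 2: Welch ANOVA, which remains valid when group variances differ.
print(welch_anova(groups))
```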
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).