Article

Serious Games for Vision Training Exercises with Eye-Tracking Technologies: Lessons from Developing a Prototype

Department of Computer Science, Electrical Engineering and Mathematical Sciences, Western Norway University of Applied Sciences, 5063 Bergen, Norway
* Authors to whom correspondence should be addressed.
Information 2022, 13(12), 569; https://doi.org/10.3390/info13120569
Submission received: 14 October 2022 / Revised: 24 November 2022 / Accepted: 2 December 2022 / Published: 7 December 2022
(This article belongs to the Special Issue Cloud Gamification 2021 & 2022)

Abstract

Eye-tracking technologies (ETs) and serious games (SGs) have emerged as new methods promising better support for vision screening and training. Previous research has shown the practicality of eye-tracking technology for vision screening in health care, but there remains a need for studies showing that the effective utilization of SGs and ETs is beneficial for vision training. This study investigates the feasibility of SGs and ETs for vision training by designing, developing, and evaluating a prototype influenced by commercially available games and based on a battery of exercises previously defined by vision experts. Following a mixed-methods approach, data were collected from five participants, including a vision teacher, through a user experience questionnaire (UEQ) and interviews. Analysis of the UEQ results and the interviews highlighted the current challenges and positive attitudes toward using SGs and ETs for vision training. Together with UEQ indicators such as attractiveness and perspicuity, the stimulation of the vision training battery, as experienced by the users, provided insights into using ETs and further developing SGs to better address different eye movements in vision training.

1. Introduction

Vision problems affect academic performance and cognitive and social well-being [1,2]. Many vision impairments are due to functional vision disorders involving several anomalies in eye movements, including fixations, saccades, and smooth pursuit [3,4]. A frequent functional vision problem is oculomotor dysfunction (OMD), which involves problems with coordinating the muscles of the left and right eyes. The symptoms of such vision impairment include slow reading, headache, fatigue, dizziness, and poor attention span [5]. Vision exercises can effectively strengthen the eye muscles, so that trainees can more easily control their eye movements and improve their stability. While these disorders affect the whole population, the most vulnerable are children, many of whom suffer from undetected problems [6,7]. At the same time, early recognition and training could improve the sight of patients, positively influencing their reading capacity [8,9].
In our previous study, we proposed a suite of serious games (SGs) using eye-tracking technologies (ETs) developed at Western Norway University of Applied Sciences to train the eyes of school-age children with vision impairment [10]. The games were initial prototypes, but they did not acquire relevant data showing the connection between stimuli and eye movements in the games. In this paper, we present an improved solution for vision training by adding a game and illustrating the possibility of storing and visualizing eye movement data; we also discuss future solutions for vision training. The main objectives of this paper were to present a prototype for a training solution, examine this prototype’s usability for end-users and experts, and contribute to a better understanding of critical issues in vision training. We describe the design process of the solution and the required interactions between the different components, such as possible visualization, database connectivity, and integration with ETs.
The rapid advancement in technology enables computerized software incorporating ETs to screen vision problems and provide objective measurements, e.g., for the duration of fixations, the length of saccades, and smooth pursuit while the eyes are performing visual tasks on a screen [11,12]. Available solutions primarily consider main eye movements (e.g., RightEye [13] or C&Look [7]) for vision screening, which is a step in the identification of functional vision problems before training. There are also promising training solutions involving more advanced computer or virtual reality technologies [14,15]. While these methods collect objective measures, the measurements from screening and training are not connected; they do not show the improvements gained during the training. If training activities can improve vision, as many examples from the literature show (e.g., [16,17]), it should be possible to measure changes during the training time. As these problems are common, portable, cheap and easy-to-use technologies are needed, for both identification and possible treatment. For possible treatments, the training procedures should be defined, supported with measurements, and validated, as done with most of the investigations for eyeglasses and surgical interventions [6].
Providing a more comprehensive technical solution for the treatment of some functional eye problems requires significant interdisciplinary work between several different research fields and practical domains. This study does not claim to produce a validated solution for immediate use but rather seeks to discover important aspects for approaching such a solution using ETs and SGs. These aspects relate to adjusting different parameters such as stimulus size, direction of stimulus, and speed and frequency from SGs to influence possible measurements using ET.
While using ETs and SGs for OMD screening has been studied [16], it has not yet been determined whether they are effective in vision therapy. In this study, we take the first steps toward developing a computerized vision therapy technology that can use ETs and SGs in vision therapy. In a previous study [10], through physical exercises supervised by vision experts, six games using ETs and SGs showed potential to complement current vision therapies. However, these games did not have databases storing player data together with eye-tracking data, nor did they offer possibilities to examine performance afterwards by, e.g., replaying the games with eye movements. In this study, we further develop these games by enabling such examination of performance afterward. We also investigate possibilities to add aspects from commercial games (the game “Catch the Fruits”) into the training solution, along with additional aspects influencing user experiences, e.g., sounds and music.

2. Background Literature

The diagnostic methods for functional vision impairments are based on the traditional subjective and computerized objective measurements that vision professionals perform. The traditional tools used for screening OMD are the Developmental Eye Movement (DEM) test, the King-Devick (K-D) test, and the NSUCO oculomotor test (developed by Northeastern State University College of Optometry) [17]. Notably, there is no clear correlation between the results of these tests and specific eye movements. For example, although the saccades for the DEM test “may not correlate directly with eye movement parameters, they do correlate with aspects of reading performance and thus may serve a diagnostic role in clinical practice” [18]. Because there is no clear delimitation between OMD and the many other vision problems under the broad category of vision impairment, this paper does not claim to make such a differentiation.
The use of digital technologies for personalized health services has increased the importance of research in vision therapy. This section discusses current research on (1) vision therapy approaches associated with vision problems, (2) state-of-the-art technologies used for vision therapy, and (3) the current role of eye-tracking technologies for vision therapy.
A common vision problem in early childhood is amblyopia. This condition is associated with abnormal visual acuity in one eye or between the two eyes that disrupts visual function [19]. The treatment of amblyopia is limited by age, and it is recommended to begin amblyopia treatment as early as possible [20]. Typically, vision in children reaches full maturity by age 9 or 10 years, and treatment should begin at the age of 6 years [21]. Dichoptic perceptual stimulation tasks have enabled vision specialists to improve visual acuity in amblyopic children [22]. Dichoptic therapy is an approach that designs tasks using low contrast for the healthy eye and high contrast for the amblyopic eye, or that displays different stimuli to each eye. Hernández et al. [23] reviewed the progress of computer-based and Virtual Reality (VR) devices for amblyopia therapy. In computer-based dichoptic environments, participants wear anaglyph red-green glasses, while VR has given promising results for dichoptic environments because this platform allows the stimulus parameters to be adjusted in both monocular and binocular visual tasks without the use of occlusion [24].
Convergence insufficiency (CI) is another common vision disorder, characterized by an inability to converge the eyes while performing near tasks [25]. Carvelho et al. [15] developed and tested a computer game for home-based therapy to examine the use of computer games in the therapy of convergence insufficiency. The results showed that four of the seven participants achieved improvements in their CI, and all seven participants found the gaming solution motivating for CI therapy. Boon et al. [26] compared a gamified training solution, the Snake Game, in VR with anaglyphs in adults. The results indicated that participants spent more time on VR than on anaglyphs, suggesting that a VR-based gaming solution for CI is feasible and beneficial.
The role of eye-tracking technologies in vision screening was discussed in the previous section. The potential of ETs in vision therapy, however, has not been extensively studied in the past. Donmez et al. [27] designed and developed a vision training solution based on SGs and an ET. The developed solution was tested by one individual with low-vision problems. The results indicated that it is possible to use an ET and SGs to increase a participant’s motivation in training sessions.
Furthermore, a recent study [28] used an ET to propose a vision training solution that did not include SGs. A calibration method was developed to help achieve optimal ET data, and features from saccade and pursuit tests were analyzed. Table 1 shows the technologies used to train various vision problems and indicates the use of gamification elements.

3. Materials and Methods

In this study, the approach was influenced by the “Design and Creation” methodology [31] (see Figure 1). Qualitative and quantitative data from five participants were collected after evaluating a fully functional prototype. The evaluation was performed via observations while playing the game, with interviews and questionnaires administered afterwards; each session lasted approximately one hour per participant. The intention of choosing this method was to gain insight into how the end-users and enablers of the technologies view the applications, to assess usability and user experience problems, and to find improvement opportunities (e.g., [32,33]). The test examined the possibilities of the prototype, including (a) attitudes towards using such games for vision therapy, with a focus on differences from earlier games (developed in the study presented by Heldal and her colleagues [11]), (b) possibilities of developing a battery for vision therapy that can be extended, and (c) lessons for user experiences. Gaining information from different stakeholders is necessary to inform further development and research on handling multiple technologies, but involving more participants was not yet needed at this stage.
As part of our previous study [10], participants’ feedback, observations, and interviews were considered for further application development, such as the database system, together with documentation obtained from an ET manufacturer (Tobii) to determine the optimal viewing angle and distance from the screen for participants.

3.1. Overall Approach to Design and Development of Eye-Tracking Based Technologies for Vision Training

A recent study [28] used a Tobii 4C ET to develop a vision training system and to quantify the reliability of eye-movement data. The training solution proposed in this study for functional vision training was also developed using the Tobii 4C ET and the Tobii Pro SDK in Unity 3D (see Figure 2). Entity Framework Core (EF Core) was used to access the database, and SQLite was used as the database provider in EF Core. This solution stores the database in a local file instead of on a server, which simplifies installation and avoids the need for a separate database program. The Tobii default calibration system for Windows was applied before playing the games.
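To make the local-file storage idea concrete, the following is a minimal sketch of a gaze database. It is illustrative only: the prototype itself uses EF Core with SQLite from Unity/C#, whereas this sketch uses Python’s built-in sqlite3 module, and the table and column names are assumptions rather than the prototype’s actual schema.

```python
# Illustrative sketch only: the prototype uses EF Core with SQLite from Unity/C#,
# but the same local-file storage idea is shown here with Python's sqlite3 module.
# Table and column names are hypothetical.
import sqlite3

def open_gaze_db(path="vision_training.db"):
    """Create (if needed) and open a local SQLite file for gameplay and gaze data."""
    con = sqlite3.connect(path)
    con.executescript("""
        CREATE TABLE IF NOT EXISTS session (
            id      INTEGER PRIMARY KEY,
            player  TEXT NOT NULL,            -- registered user name
            game    TEXT NOT NULL,            -- e.g. 'Catch the Fruits'
            level   INTEGER,
            score   INTEGER,
            started TEXT                      -- ISO-8601 timestamp
        );
        CREATE TABLE IF NOT EXISTS gaze_sample (
            session_id INTEGER REFERENCES session(id),
            t_ms       REAL,                  -- time since session start
            left_x     REAL, left_y  REAL,    -- left-eye gaze point (screen coords)
            right_x    REAL, right_y REAL     -- right-eye gaze point (screen coords)
        );
    """)
    return con

if __name__ == "__main__":
    con = open_gaze_db()
    con.execute("INSERT INTO session (player, game, level, score, started) "
                "VALUES (?, ?, ?, ?, ?)",
                ("P3", "Catch the Fruits", 1, 120, "2022-10-14T10:00:00"))
    con.commit()
```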
All games were designed and developed by following the guidelines (see Table 2) given by vision teachers, which determine which eye movements need to be exercised for specific vision problems [10]. Most currently available training programs utilize such a correspondence table defined by vision experts and translated into several exercises, e.g., employing exercises from CogPack [34], improving saccadic eye movements through RightEye [35], or using C&Look [10]. The new game “Catch the Fruits” incorporates all the gaze movements mentioned in Table 2: the player must move his or her eyes in all directions to scan and catch the fruits.

3.2. System Design for Vision Training

According to users who experienced the previous games (presented in [10]), the current training battery should be simple, reminiscent of the earliest computer games. This is because end-users must follow specific movements on the screen, such as fixations, smooth pursuit, and saccadic movements. These movements often follow eye-catching representations moving at various speeds on the screen. Neither the users nor the experts helping them can know exactly whether the movements are performed correctly and have an effect; they can only make predictions based on the observed performance. Considering the limitations of the earlier games, feedback from users and vision teachers provided the opportunity to amend the existing vision training solution. Figure 2 shows the overall system design for the proposed vision training solution. Defining and integrating a database enables users to store the gameplay, scores, and gaze recordings of both the left and right eyes. The gaze recordings and the visualization of the eye-movement data followed the procedures of the C&Look program [7].
The desired vision training solutions allow users and experts to do the following:
  • Register a new user.
  • Adjust the difficulty level of the game.
  • Play/replay the games.
  • Examine visual graphs of eye movements.
At the same time, the database stores information about gameplay, e.g., scores and time, according to the rules of the game and the needs of individuals. The database needs to store the ET data as (x, y, t) samples, i.e., the (x) and (y) coordinates of the gaze and the associated time (t) for the eye movements. The vision training solution should also visualize the horizontal and vertical eye-movement behaviors of both eyes, both separately and together, for further examination. These eye movements include fixations, saccades, and smooth pursuits.
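As an illustration of how such (x, y, t) samples can be turned into the horizontal and vertical charts described here, the following is a small sketch using Python and matplotlib. The sample layout and function name are assumptions for illustration; the prototype’s own visualization follows the C&Look procedures rather than this code.

```python
# Hedged sketch: plotting horizontal and vertical gaze traces for each eye,
# similar in spirit to the C&Look-style charts described above.
import matplotlib.pyplot as plt

def plot_gaze_traces(samples):
    """samples: list of dicts with keys t_ms, left_x, left_y, right_x, right_y."""
    t = [s["t_ms"] for s in samples]
    fig, (ax_h, ax_v) = plt.subplots(2, 1, sharex=True, figsize=(8, 5))
    ax_h.plot(t, [s["left_x"] for s in samples], label="left eye")
    ax_h.plot(t, [s["right_x"] for s in samples], label="right eye")
    ax_h.set_ylabel("horizontal position (px)")
    ax_h.legend()
    ax_v.plot(t, [s["left_y"] for s in samples], label="left eye")
    ax_v.plot(t, [s["right_y"] for s in samples], label="right eye")
    ax_v.set_ylabel("vertical position (px)")
    ax_v.set_xlabel("time (ms)")
    ax_v.legend()
    return fig

if __name__ == "__main__":
    # Synthetic demo data: a slow horizontal drift of both eyes.
    demo = [{"t_ms": i * 16.7, "left_x": 400 + i, "left_y": 300,
             "right_x": 402 + i, "right_y": 301} for i in range(60)]
    plot_gaze_traces(demo)
    plt.show()
```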
Figure 2. System design of vision training games.

3.3. Catch the Fruits Game

In the Catch the Fruits game, the player must attempt to catch the falling fruits, including cherries, bananas, apples, and oranges, by moving his or her gaze onto the fruits as they drop from the top to the bottom of the screen (see Figure 3). The concept of this game is similar to that of Fruit Ninja [36], where fruits appear on the screen and the player has to slice them by swiping his or her fingers across them. However, in the Catch the Fruits game, the player must be attentive and keep scanning the screen for the fruits, because the speed of the falling fruits increases with time. The game’s interface is similar to that of most games, with the score and the remaining time displayed on the screen. When the player catches a fruit, a buzzer sound plays to indicate a point scored.
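To make the described mechanic concrete, the following is a simplified sketch of the per-frame logic: a fruit counts as caught when the gaze point lies within its bounding circle, and the fall speed increases with elapsed time. This is not the prototype’s Unity/C# code; the radii, speeds, and scoring rule are illustrative assumptions.

```python
# Simplified sketch of the gaze-catch logic described above; values are illustrative.
from dataclasses import dataclass
import math

@dataclass
class Fruit:
    x: float
    y: float
    radius: float = 30.0   # pixels (assumed catch radius)

def fall_speed(elapsed_s: float, base: float = 120.0, ramp: float = 8.0) -> float:
    """Pixels per second; grows linearly so the player must keep scanning."""
    return base + ramp * elapsed_s

def update_frame(fruits, gaze_x, gaze_y, elapsed_s, dt, screen_h, score):
    """Advance fruits one frame and return (remaining_fruits, new_score)."""
    remaining = []
    for f in fruits:
        f.y += fall_speed(elapsed_s) * dt
        caught = math.hypot(gaze_x - f.x, gaze_y - f.y) <= f.radius
        if caught:
            score += 1            # buzzer/point feedback would be triggered here
        elif f.y < screen_h:      # keep fruits that were neither caught nor missed
            remaining.append(f)
    return remaining, score

if __name__ == "__main__":
    fruits = [Fruit(200, 0), Fruit(600, 50)]
    fruits, score = update_frame(fruits, gaze_x=205, gaze_y=5,
                                 elapsed_s=10, dt=1 / 60, screen_h=1080, score=0)
    print(score, len(fruits))    # -> 1 caught, 1 still falling
```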
At the end of the remaining time, the game stops and displays an interface showing the player’s score and buttons to replay, play again, or return to the menu (see Figure 4). Game replays and scores are saved in the user profile database and can be viewed at any time, allowing players and experts to see their progress throughout the game.
In addition to the game, visualization of the recorded eye movements is displayed on graphs, which can allow vision experts to easily understand various types of eye movements from the charts, including glissade, fixation, regression and saccade [37]. This analysis can help vision experts identify the problematic areas for the player during gameplay and dynamically adjust training plans and game settings to maximize the potential vision training. Figure 5 shows exemplary graphs for horizontal and vertical eye-movements in the recorded data.

3.4. Eye-Tracker Setup

It is imperative to calibrate an eye-tracker device before using it with a computer screen to achieve highly accurate and efficient gaze measurements. The calibration method is used to map eye movements to image coordinates on a computer screen in real time [38]. A calibration set usually contains 2, 5, or 9 target points. Tobii recommends using at least 5 points for good performance; for this study, 7 points were chosen. The eye-tracking device was mounted underneath the laptop screen, as shown in Figure 6. The participant’s height and preferences were accommodated by adjusting the lid of the laptop screen.
Eye-trackers can only give optimal gaze measurements if participants stay within the designated tracking range on the screen so that the gaze angle (α) remains correct (see Figure 7a). The eye-tracking device will not be able to collect data if α is exceeded, e.g., at the top and bottom corners of the screen. The distance between the participant’s head and the eye-tracker was approximately 600–660 mm, and the gaze angle did not exceed 36 degrees. Figure 7b illustrates the participant’s position relative to the eye-tracker.
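The geometry behind this constraint can be sketched with simple trigonometry: for a viewer at distance d from the tracker, a point offset h from the tracker’s line of sight subtends a gaze angle of atan(h/d). The example below is a rough back-of-the-envelope check under the assumption that the 36-degree limit is measured from the tracker’s line of sight; it is not part of the prototype.

```python
# Rough sketch of the Figure 7a geometry; the reference point and limit are assumptions.
import math

def gaze_angle_deg(offset_mm: float, distance_mm: float) -> float:
    """Angle between the line of sight to the tracker and to an on-screen point."""
    return math.degrees(math.atan2(offset_mm, distance_mm))

def within_tracking_range(offset_mm, distance_mm, max_angle_deg=36.0):
    return gaze_angle_deg(offset_mm, distance_mm) <= max_angle_deg

if __name__ == "__main__":
    for d in (600, 660):                       # the reported seating distances (mm)
        max_offset = d * math.tan(math.radians(36.0))
        print(f"at {d} mm the tracker covers offsets up to ~{max_offset:.0f} mm")
```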

3.5. User Experience Questionnaire (UEQ)

Developing a new and innovative solution incorporating many SGs for health purposes requires several quantitative and qualitative approaches [39]. This study was inspired by participatory research involving different stakeholders interested in a good end-product, including game researchers, vision teachers, and children as end-users. The user experience was evaluated using the User Experience Questionnaire (UEQ) provided by Schrepp et al. [40] and interviews with the participants.
The UEQ enables a rapid and direct evaluation of the user’s experience. The main goal of the questionnaire is to measure the users’ feelings, attitudes and impressions based on their spontaneous initial experience with the product [41].
The UEQ comprises a total of 26 items, each consisting of a pair of terms with opposite meanings. Each item is rated on a 7-point scale ranging from 1 to 7 (see Table 3), and the order of the positive and negative terms is randomized. The 26 items are grouped into six scales: attractiveness (overall impression of the product), perspicuity (how easy the application is to use and become familiar with), efficiency (speed of solving tasks), dependability (the user’s feeling of control over the interaction), stimulation (how exciting and motivating the product is), and novelty (how innovative the product is).
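For reference, the standard UEQ analysis maps each 1–7 answer onto a -3 to +3 range (reversing items whose positive term appears first) and then averages the items of each scale. The sketch below illustrates that step; the item-to-scale grouping in the example is abbreviated and hypothetical, and the official UEQ data-analysis sheet defines the full mapping.

```python
# Sketch of the usual UEQ scoring step; the example item grouping is hypothetical.
from statistics import mean

def to_signed(answer: int, positive_first: bool) -> int:
    """Map a 1..7 answer onto -3..+3, reversing items whose positive term is listed first."""
    value = answer - 4
    return -value if positive_first else value

def scale_means(answers, item_polarity, scales):
    """answers: {item_no: 1..7}; item_polarity: {item_no: positive_first};
    scales: {scale_name: [item_no, ...]} -> {scale_name: mean score}."""
    signed = {i: to_signed(a, item_polarity[i]) for i, a in answers.items()}
    return {name: mean(signed[i] for i in items) for name, items in scales.items()}

if __name__ == "__main__":
    # Tiny worked example with two hypothetical items of one scale.
    answers = {1: 6, 14: 2}
    polarity = {1: False, 14: False}            # negative term listed first
    print(scale_means(answers, polarity, {"attractiveness": [1, 14]}))
```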

3.6. The Participants

Five participants, including three graduate students, a professor, and a vision teacher (VT), volunteered to test the vision therapy games. Fan et al. [42] argued that using different groups of participants can help gather usability feedback from various perspectives, and in the early stages of a prototype, results from a few, but representative, users can be important [43]. Because vision experts (here, a vision teacher) enable technology use, and students can be considered end-users, we examined the attitudes and usability from the perspectives of the two main stakeholder groups. The demographics of the participants are shown in Table 4.

4. Results

4.1. UEQ Results

Playing all games, filling in questionnaires, and conducting interviews took approximately one hour per participant, except for a more thorough discussion with the vision expert (a practicing vision teacher with extensive experience with vision problems for children and adults). The interview with the vision teacher took approximately two hours.
The UEQ [40] results for the full cohort are shown in Figure 8. These results illustrate the scores for the pragmatic quality aspects of perspicuity, efficiency, and dependability as excellent, good and above average, respectively. The scales describing hedonic aspects (stimulation and novelty) were found to be excellent and above average, respectively, while the scale measuring attractiveness was considered excellent (see Table 5).
Schrepp et al. [40] noted that the three UX aspects of perspicuity, originality, and stimulation must be excellent, while the other UEQ scales must be at least above average. In our analysis, we found that perspicuity and stimulation were excellent, and originality/novelty was above average.

4.2. Open-Ended Questionnaire Results

To evaluate their impressions, all participants were asked to fill out an open-ended questionnaire providing further information regarding improvements to the game mechanics. The questionnaire was designed with the following questions.
What is your first impression of the game? Do you think it can be:
  (a) Used by children/patients alone?
  (b) Used with help from somebody (parents/experts, etc.)? If help is needed, who can help?
  (c) Used every day, if needed?
  (d) Does it complement the other training? If yes, how?
This vision training program received positive responses from the participants. For question (a), one participant believed children may need assistance using the games, while the vision teacher agreed that the game has a clear purpose and can easily be used by children; another participant suggested that tutorials should be provided for children when playing the game. The objective of the game was generally agreed upon by most participants (question (b)); four of the responses indicated that anyone could assist children in playing the game, and the vision teacher believed that most children would be able to understand the game if it is not too difficult or complicated. For question (c), there were mixed responses among the participants: two participants noted that goal setting is an important element of vision training; one participant believed the game can be used every day, while two other responses suggested adding more variety to keep it interesting and motivating. Regarding whether the game complements other training (question (d)), the vision teacher wrote that it could complement fixation training; one participant believed the game will be easy to play and understand, while another believed it was more like an actual game, and the fifth participant felt the game should clearly define what it is meant to complement. Word clouds were created from all the answers given by the participants to the above questions (see Figure 9a–d).
According to the quantification of word use (see Figure 10), “easy to understand/use” was the most common phrase (n = 8) used by the participants in the interviews. Answers to the posed questions were given a code summarizing the idea behind each response. The code “Easy to understand/use” indicates that the game could be used by anyone regardless of their background with technology or vision, while “Requires supervision” and “Needs a little supervision” indicated varying needs for specialized assistance. “More supporting features” described possible improvements to the game, and “Needs to be more focused” expressed an interest in having the game serve a more singular purpose. Finally, “Can help experts” represented responses highlighting how the game can be used for vision therapy with proper guidance from an expert.
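The quantification step itself is straightforward; the sketch below shows one way to count coded responses and word frequencies before rendering a bar chart or word cloud. The answers and codes in the example are invented for illustration and are not the study’s data.

```python
# Minimal sketch of the coding/quantification step; example answers and codes are invented.
from collections import Counter

coded_answers = [
    ("P1", "The game has a clear purpose", "Easy to understand/use"),
    ("P2", "Children may need assistance", "Requires supervision"),
    ("P3", "Add a tutorial for children", "More supporting features"),
    ("P4", "Easy to play and understand", "Easy to understand/use"),
]

code_counts = Counter(code for _, _, code in coded_answers)
word_counts = Counter(word.lower().strip(",.")
                      for _, answer, _ in coded_answers
                      for word in answer.split())

print(code_counts.most_common())
print(word_counts.most_common(5))
```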
Figure 9. Word clouds generated from the answers to each of the four questions.

4.3. Interview Results

After the UEQ, we conducted interviews to gather qualitative feedback from participants to reflect on their use of the ET and “Catch the Fruits” game. The themes for the interview questions were divided into four parts: experience in using the ET, interaction, performance, and suggestions for improvements.

4.3.1. Using the Eye-Tracker

All the participants followed the calibration process before playing the game. Afterward, they played a fruit game twice (Level 1 and Level 2) using eye movements. In the interview, all participants explicitly stated that interaction with the ET was “good and intuitive”. The ET also performed effectively with participants wearing spectacles during calibration and gameplay. None of the participants experienced problems using the ET.

4.3.2. Interaction with the Game and Objects

While interaction with objects was considered by participants to be easy and intuitive, most participants expressed concern about the size and quantity of text on the instruction screen. The remarks were, e.g., “the text should be bigger”, “it was difficult to read lots of text”, or “maybe start with a tutorial [instead of having written comments]”. Further, participants expressed their wish to be able to change fruit objects and adjust their size according to the difficulty level in the game.

4.3.3. Performance of the Game

All the participants were satisfied with the performance of the game. However, at the beginning of the test, four out of five participants were concerned that the eye-tracking was not working on the left side of the screen because the ET needed proper cleaning. Once the dust was cleaned off and the ET was reconnected, it functioned correctly for the fifth participant, whose remark was “working properly, no problem”.

4.3.4. Suggested Improvements

The overall experience of the participants was good. However, some suggestions were made for further improvements, e.g., some participants wanted to see a tutorial of the gameplay at the beginning, with less text used for explaining selection and difficulty levels on the screen. Participants also suggested having more animation, e.g., randomized background images and music associated with the movements in the game. One participant also expressed that “I want to see my gaze get registered on the fruit”.

5. Discussion and Future Work

Implementing technology support that can be used by vision professionals in the healthcare sector, and eventually by other non-professionals, to help children with their vision problems cannot be a short-term project. In future work, several further testing steps are needed to increase the precision of the technology support, tailor the software to the stakeholders, and make it more user-friendly. This paper illustrated serious games that can support eye training, taking inspiration for ease of use from commercial games and demonstrating the games’ feasibility for end-users.
As mentioned in the background literature section, this work considered the design principles for developing games with ET for children with low vision problems, as suggested by Donmez et al. [27]. While this study focused on a slightly different situation related to supporting children with functional vision problems, especially those with OMD, we mostly respected these guidelines. Here are some comments and lessons related to this study:
  • Clear and short instructions should guide the gameplay. These instructions were defined and implemented in the flow of the game.
  • Short tutorials should be provided before playing the game with an illustration about the game. However, these tutorials must be tailored to the relevant problems and stakeholders.
  • A complete goal should be defined when starting with the training solution—for example, adjusting parameters of games such as frequency, speed, and time for each session.
  • For children who have difficulty understanding the games, some help should be given by someone, e.g., parents. How people can help, depending on their role, should also be defined. At this stage, this study only included help from the developers based on initial guidelines from vision teachers [10].
  • Games should have different levels and backgrounds to overcome the boredom that can arise with repetition. This boredom can be mitigated by, e.g., including different challenges and levels in the game. While for these short tests, the games were appreciated as challenging and able to support high-quality user experiences, these issues should be considered for real use, such as when users need to train several days a week and for several weeks [44].
  • Games should include factors that focus on eye movements, e.g., fixations, saccades, and smooth pursuits. Measurements of these eye movements that would indicate possible problems have not yet been described.
By analyzing the UEQ score for this type of application, we can identify specific goals for designing future SGs and issues to be considered in these SGs. To improve efficiency and dependability, user suggestions will have to be implemented. The most common suggestions that participants mentioned during the interview sessions in this study included larger text sizes, tutorials at the beginning of the game, and less text.
An additional problem we need to address in identifying eye problems (whether general or functional) is improving the connection between screening and the interpretation of the results. Considering screening at the right time during vision rehabilitation is a necessary step for better training. Additionally, as shown in this paper, including elements from current, commercially available games would be beneficial.
Because children with vision problems must be advised by eye professionals to use training, and the training needs to be suggested to their guardians, the games need to be incorporated into a specific, relevant training battery. Such a battery includes several physical exercises with the eyes in addition to, for example, balance training [45]. Therefore, it is not enough for the game to be tested only by children and vision experts; the game should also be accepted by groups from different domains who support the children’s training. A limitation in defining such games is the lack of access to expert competencies. An essential next step for the future is to define a better platform incorporating screening and training that can be used by children, experts, and non-professionals to help children’s training. For this purpose, one of the most critical issues is ensuring continuous contact with vision experts during the entirety of game development. Analyzing the eye data at this stage of game development also has limitations. Today, these data are not accurate enough, even though gaze movements can be recorded and visualized using charts (see Figure 2). Further development is needed to adjust and visualize other eye movements such as saccades and smooth pursuit. Eye-tracking data can also provide valuable information by comparing gaze behavior with on-screen stimuli. Such measurements were inspired by the data presented in Ciman et al. [46], e.g., the first response time upon arrival of a fruit, the shift of gaze towards the fruit object, and the time taken to catch the fruit. Charts can be plotted to examine the movements of the stimuli together with the gaze movements, enabling vision experts to study gaze movement patterns relative to stimulus movement patterns.
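As a concrete example of one such stimulus-gaze measure, the sketch below estimates the first response time after a fruit appears by finding the first gaze sample that has shifted toward the stimulus by more than a small threshold. The thresholds, data layout, and function name are assumptions for illustration, not the prototype’s implementation.

```python
# Illustrative first-response-time estimate; thresholds and data layout are assumptions.
def first_response_time(gaze, onset_ms, stimulus_x, shift_px=20.0):
    """gaze: list of (t_ms, x) samples sorted by time; returns latency in ms or None."""
    baseline = None
    for t, x in gaze:
        if t < onset_ms:
            baseline = x                       # last gaze position before onset
            continue
        if baseline is None:
            baseline = x
        moved = abs(x - baseline)
        toward = abs(stimulus_x - x) < abs(stimulus_x - baseline)
        if moved >= shift_px and toward:
            return t - onset_ms
    return None

if __name__ == "__main__":
    trace = [(0, 400), (50, 401), (100, 430), (150, 520), (200, 600)]
    print(first_response_time(trace, onset_ms=60, stimulus_x=600))   # -> 40
```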
Furthermore, by using tools such as immersive HMDs, there are additional possibilities to test accommodation and convergence, which are also essential for functional vision. While different eye-training [47] and immersive technologies have limitations today, many research projects are focusing on developing more supportive immersive VR for therapy in amblyopia and strabismus, as well as rehabilitation of cognitive and functional performance [48]. We also need to determine the economic feasibility of this training support solution.
The next step for this study is to integrate training analytics and results to give all stakeholders an overview of the training. Currently, eye-tracking data are represented in charts, which only vision experts can interpret to understand eye movements. A quantitative analysis to extract eye events such as fixations and saccades should be implemented [49]. With this future work, the novelty of the present vision training solution would increase significantly.
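One common way to implement such an extraction is a velocity-threshold (I-VT) classifier in the spirit of Salvucci and Goldberg [49]: samples whose point-to-point velocity exceeds a threshold are labelled saccadic, and the remaining samples are grouped into fixations. The sketch below is a minimal illustration; the threshold value and the pixel coordinate space are assumptions, and a deployed version would work in degrees of visual angle and handle noise, blinks, and merging of nearby fixations.

```python
# Minimal I-VT sketch (velocity threshold); parameters are illustrative assumptions.
import math

def classify_ivt(samples, velocity_threshold=1000.0):
    """samples: list of (t_s, x, y); returns a per-sample list of 'fixation'/'saccade'."""
    labels = ["fixation"]                      # first sample has no velocity estimate
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        dt = max(t1 - t0, 1e-6)
        velocity = math.hypot(x1 - x0, y1 - y0) / dt   # pixels per second
        labels.append("saccade" if velocity > velocity_threshold else "fixation")
    return labels

if __name__ == "__main__":
    demo = [(0.00, 400, 300), (0.02, 402, 301),    # small drift: fixation
            (0.04, 520, 310), (0.06, 640, 320),    # fast jumps: saccade
            (0.08, 641, 321)]                      # settled again: fixation
    print(classify_ivt(demo))
```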
We will further develop and design gamification elements suggested by the participating vision teacher. One of the most critical needs for future research is up-to-date data that take different types of functional and cognitive impairments into account. On the technology readiness level scale, the proposed game achieves a score of 3, “Analytical and experimental critical function and/or characteristic proof-of-concept” [50]. We investigated the feasibility of the vision training solution in the laboratory with only five participants. This limited evaluation still provided useful feedback, with many suggestions for improving the game. After implementing the above-mentioned suggestions, the next step should be validation in both the laboratory and a real environment with children who have vision problems, to move closer to technology readiness level 5 or 6.
While OMD and other functional vision problems are common in the general population, this study focused on school-aged children. Therefore, at present, we can only suggest that this product is suitable for children. However, we also wish to examine our ideas among a more general population, including others with OMD problems, such as elderly stroke patients [51] or people with dementia [52].

6. Conclusions

This paper presented a method for using ETs and SGs for vision training and examined the method’s feasibility. The proposed solution was designed for vision professionals, laypersons, parents, and helpers of users with different vision impairments. In the examples presented and evaluated in this study, we illustrated the benefits of using eye-tracking to play games for the training of OMD and visualize the performance of eye movements while playing a game. However, the statistical connection between eye-tracking data and the existing visualized charts remains to be further developed. The user experiences of the evaluated game were, in many respects, considered excellent.
There are several suggestions that could improve and validate the solution. As a first step, researchers should investigate how different types of stakeholders can use the application to reach their goals through examination and training support. Their requirements may differ from those of children or other end-users suffering from functional vision problems. Superimposing the eye movements on the stimuli is necessary to inform end-users and other stakeholders about the actual vision status; however, this alone does not indicate problems or progress. Future work should explore how problems and progress can be visualized, and how the different examinations associated with vision performance can be used to support vision experts or laypersons helping end-users.

Author Contributions

Conceptualization, Q.A. and I.H.; methodology, C.H. and I.H.; formal analysis, Q.A. and A.D.; investigation, Q.A. and I.H.; data curation, Q.A.; writing—original draft preparation, Q.A. and I.H.; writing—review and editing, Q.A., I.H., A.D. and C.G.H.; visualization, Q.A.; supervision, I.H.; project administration, I.H. All authors have read and agreed to the published version of the manuscript.

Funding

This paper was financed by the Vision 2030 project, Securing Education for Children in Tanzania, funded by N.F.R., project no. 267524/H30.

Data Availability Statement

All the data used in this research can be provided upon request.

Acknowledgments

The authors wish to thank the students at HVL for their help in the development of games. We thank Eva Bjånes, Tor André Larsen, Øystein Vikane Knutsen, Gudsteinn Arnarson, Kristian Eliassen, and Stian Grønås [53].

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Alvarez-Peregrina, C.; Sánchez-Tena, M.Á.; Andreu-Vázquez, C.; Villa-Collar, C. Visual Health and Academic Performance in School-Aged Children. Int. J. Environ. Res. Public Health 2020, 17, 2346. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  2. Zaba, J.N. Children’s Vision Care in the 21st Century & Its Impact on Education Literacy, Social Issues, & The Workplace: A Call to Action. J. Behav. Optom. 2011, 22, 39–41. [Google Scholar]
  3. Iyer, J.; Taub, M.B. The VisionPrint System: A new tool in the diagnosis of ocular motor dysfunction. Optom. Vis. Dev. 2011, 42, 17–24. [Google Scholar]
  4. Tanke, N.; Barsingerhorn, A.D.; Boonstra, F.N.; Goossens, J. Visual fixations rather than saccades dominate the developmental eye movement test. Sci. Rep. 2021, 11, 1162. [Google Scholar] [CrossRef] [PubMed]
  5. Bonilla-Warford, N.; Allison, C. A Review of the Efficacy of Oculomotor Vision Therapy in Improving Reading Skills. J. Optom. Vis. Dev. 2004, 35, 108–115. [Google Scholar]
  6. Ali, Q.; Heldal, I.; Helgesen, C.G.; Costescu, C.; Kovari, A.; Katona, J.; Thill, S. Eye-tracking Technologies Supporting Vision Screening In Children. In Proceedings of the 2020 11th IEEE International Conference on Cognitive Infocommunications (CogInfoCom), Mariehamn, Finland, 23–25 September 2020; pp. 471–478. [Google Scholar]
  7. Eide, M.G.; Heldal, I.; Helgesen, C.G.; Wilhelmsen, G.B.; Watanabe, R.; Geitung, A.; Soleim, H.; Costescu, C. Eye tracking complementing manual vision screening for detecting oculomotor dysfunction. In Proceedings of the 2019 E-Health and Bioengineering Conference (EHB), Iasi, Romania, 21–23 November 2019; pp. 1–5. [Google Scholar]
  8. Thiagarajan, P.; Ciuffreda, K.J.; Capo-Aponte, J.E.; Ludlam, D.P.; Kapoor, N. Oculomotor neurorehabilitation for reading in mild traumatic brain injury (mTBI): An integrative approach. NeuroRehabilitation 2014, 34, 129–146. [Google Scholar] [CrossRef]
  9. Wilhelmsen, G.B.; Felder, M. Structured Visual Learning and Stimulation in School: An Intervention Study. Create. Educ. 2021, 12, 757–779. [Google Scholar] [CrossRef]
  10. Heldal, I.; Helgesen, C.; Ali, Q.; Patel, D.; Geitung, A.B.; Pettersen, H. Supporting School Aged Children to Train Their Vision by Using Serious Games. Computers 2021, 10, 53. [Google Scholar] [CrossRef]
  11. Ujbanyi, T.; Katona, J.; Sziladi, G.; Kovari, A. Eye-tracking analysis of computer networks exam question besides different skilled groups. In Proceedings of the 2016 7th IEEE International Conference on Cognitive Infocommunications (CogInfoCom), Wroclaw, Poland, 16–18 October 2016; pp. 277–282. [Google Scholar]
  12. Hirota, M.; Kato, K.; Fukushima, M.; Ikeda, Y.; Hayashi, T.; Mizota, A. Analysis of smooth pursuit eye movements in a clinical context by tracking the target and eyes. Sci. Rep. 2022, 12, 8501. [Google Scholar] [CrossRef]
  13. Murray, N.; Kubitz, K.; Roberts, C.-M.; Hunfalvay, M.; Bolte, T.; Tyagi, A. An examination of the oculomotor behavior metrics within a suite of digitized eye tracking tests. IEEE J. Transl. Eng. Health. Med. 2019, 5, 1–5. [Google Scholar]
  14. Fortenbacher, D.L.; Bartolini, A.; Dornbos, B.; Tran, T. Vision Therapy and Virtual Reality Applications. Adv. Ophthalmol. Optom. 2018, 3, 39–59. [Google Scholar] [CrossRef]
  15. Carvelho, T.; Allison, R.S.; Irving, E.L.; Herriot, C. Computer gaming for vision therapy. In Proceedings of the 2008 Virtual Rehabilitation, Vancouver, BC, Canada, 25–27 August 2008; pp. 198–204. [Google Scholar]
  16. Gaggi, O.; Ciman, M. The use of games to help children eyes testing. Multimed. Tools Appl. 2016, 75, 3453–3478. [Google Scholar] [CrossRef]
  17. Facchin, A. Spotlight on the Developmental Eye Movement (DEM) Test. Clin. Optom. 2021, 13, 73–81. [Google Scholar] [CrossRef] [PubMed]
  18. Ayton, L.N.; Abel, L.A.; Fricke, T.R.; McBrien, N.A. Developmental eye movement test: What is it really measuring? Optom. Vis. Sci. 2009, 86, 722–730. [Google Scholar] [CrossRef] [PubMed]
  19. Sehgal, S.; Satgunam, P. Quantifying Suppression in Anisometropic Amblyopia With VTS4 (Vision Therapy System 4). Transl. Vis. Sci. Technol. 2020, 9, 24. [Google Scholar] [CrossRef]
  20. Sanchez, I.; Ortiz-Toquero, S.; Martin, R.; de Juan, V. Advantages, limitations, and diagnostic accuracy of photoscreeners in early detection of amblyopia: A review. Clin. Ophthalmol. 2016, 10, 1365–1373. [Google Scholar] [CrossRef] [Green Version]
  21. Bortoli, A.D.; Gaggi, O. PlayWithEyes: A new way to test children eyes. In Proceedings of the 2011 IEEE 1st International Conference on Serious Games and Applications for Health (SeGAH), Braga, Portugal, 16–18 November 2011; pp. 1–4. [Google Scholar]
  22. Li, S.L.; Reynaud, A.; Hess, R.F.; Wang, Y.Z.; Jost, R.M.; Morale, S.E.; De La Cruz, A.; Dao, L.; Stager, D., Jr.; Birch, E.E. Dichoptic movie viewing treats childhood amblyopia. J. Am. Assoc. Pediatr. Ophthalmol. Strabismus 2015, 19, 401–405. [Google Scholar] [CrossRef] [Green Version]
  23. Hernández-Rodríguez, C.J.; Piñero, D.P.; Molina-Martín, A.; Morales-Quezada, L.; de Fez, D.; Leal-Vega, L.; Arenillas, J.F.; Coco-Martín, M.B. Stimuli Characteristics and Psychophysical Requirements for Visual Training in Amblyopia: A Narrative Review. J. Clin. Med. 2020, 9, 3985. [Google Scholar] [CrossRef]
  24. Eastgate, R.M.; Griffiths, G.D.; Waddingham, P.E.; Moody, A.D.; Butler, T.K.H.; Cobb, S.V.; Comaish, I.F.; Haworth, S.M.; Gregson, R.M.; Ash, I.M.; et al. Modified virtual reality technology for treatment of amblyopia. Eye 2006, 20, 370–374. [Google Scholar] [CrossRef]
  25. Singh, A.; Saxena, V.; Yadav, S.; Agrawal, A.; Ramawat, A.; Samanta, R.; Panyala, R.; Kumar, B. Comparison of home-based pencil push-up therapy and office-based orthoptic therapy in symptomatic patients of convergence insufficiency: A randomized controlled trial. Int. Ophthalmol. 2021, 41, 1327–1336. [Google Scholar] [CrossRef]
  26. Boon, M.Y.; Asper, L.J.; Chik, P.; Alagiah, P.; Ryan, M. Treatment and compliance with virtual reality and anaglyph-based training programs for convergence insufficiency. Clin. Exp. Optom. 2020, 103, 870–876. [Google Scholar] [CrossRef] [PubMed]
  27. Donmez, M.; Cagiltay, K. Development of eye movement games for students with low vision: Single-subject design research. Educ. Inf. Technol. 2019, 24, 295–305. [Google Scholar] [CrossRef]
  28. Kita, R.; Yamamoto, M.; Kitade, K. Development of a Vision Training System Using an Eye Tracker by Analyzing Users’ Eye Movements. In Proceedings of the 22nd HCI International Conference 2020—Late Breaking Papers: Interaction, Knowledge and Social Media, Copenhagen, Denmark, 19–24 July 2020; pp. 371–382. [Google Scholar]
  29. Handa, T.; Ishikawa, H.; Shoji, N.; Ikeda, T.; Totuka, S.; Goseki, T.; Shimizu, K. Modified iPad for treatment of amblyopia: A preliminary study. J. Am. Assoc. Pediatr. Ophthalmol. Strabismus 2015, 19, 552–554. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  30. Jiménez-Rodríguez, C.; Yélamos-Capel, L.; Salvestrini, P.; Pérez-Fernández, C.; Sánchez-Santed, F.; Nieto-Escámez, F. Rehabilitation of visual functions in adult amblyopic patients with a virtual reality videogame: A case series. Virtual Real. 2021. [Google Scholar] [CrossRef]
  31. Oates, B.J. Researching Information Systems and Computing, 1st ed.; Sage: Thousand Oaks, CA, USA, 2005; p. 360. [Google Scholar]
  32. Hettervik-Frøland, T.; Heldal, I.; Ersvaer, E.; Sjøholt, G. Merging 360°-videos and Game-Based Virtual Environments for Phlebotomy Training: Teachers and Students View. In Proceedings of the 2021 International Conference on e-Health and Bioengineering, Iasi, Romania, 18–19 November 2021; pp. 1–6. [Google Scholar]
  33. Costescu, C.; Rosan, A.; Brigitta, N.; Hathazi, A.; Kovari, A.; Katona, J.; Demeter, R.; Heldal, I.; Helgesen, C.; Thill, S.; et al. Assessing Visual Attention in Children Using GP3 Eye Tracker. In Proceedings of the 2019 10th IEEE International Conference on Cognitive Infocommunications (CogInfoCom), Naples, Italy, 23–25 October 2019; pp. 343–348. [Google Scholar]
  34. Günther, V.; Holzner, B.; Kemmler, G.; Pircher, M.; Trebo, E.; Hinterhuber, H. Computer-assisted cognitive training in psychiatric outpatients. Eur. Neuropsychopharmacol. 1997, 1002, 287–288. [Google Scholar] [CrossRef]
  35. Murray, N.P.; Hunfalvay, M.; Roberts, C.-M.; Tyagi, A.; Whittaker, J.; Noel, C. Oculomotor Training for Poor Saccades Improves Functional Vision Scores and Neurobehavioral Symptoms. Arch. Rehabil. Res. Clin. Transl. 2021, 3, 100126. [Google Scholar] [CrossRef]
  36. Aker, Ç.; Rızvanoğlu, K.; İnal, Y.; Yılmaz, A.S. Analyzing playability in multi-platform games: A case study of the Fruit Ninja Game. In Proceedings of the 5th 2016 International Conference of Design, User Experience, and Usability, Toronto, Canada, 17–22 July 2016; pp. 229–239. [Google Scholar]
  37. Eide, G.M.; Watanabe, R.; Heldal, I.; Helgesen, C.; Geitung, A.; Soleim, H. Detecting oculomotor problems using eye tracking: Comparing EyeX and TX300. In Proceedings of the 10th IEEE Conference on Cognitive Infocommunication (CogInfoCom), Naples, Italy, 23–25 October 2019; pp. 381–388. [Google Scholar]
  38. Ramanauskas, N. Calibration of Video-Oculographical Eye-Tracking System. Elektron. Elektrotech. 2006, 8, 65–68. [Google Scholar]
  39. Timans, R.; Wouters, P.; Heilbron, J. Mixed methods research: What it is and what it could be. Theory Soc. 2019, 48, 193–216. [Google Scholar] [CrossRef] [Green Version]
  40. Schrepp, M.; Hinderks, A.; Thomaschewski, J. Construction of a Benchmark for the User Experience Questionnaire (UEQ). Int. J. Interact. Multi. 2017, 4, 40–44. [Google Scholar] [CrossRef] [Green Version]
  41. Tarantino, E.; De Falco, I.; Scafuri, U. A mobile personalized tourist guide and its user evaluation. Inf. Technol. Tour. 2019, 21, 413–455. [Google Scholar] [CrossRef]
  42. Fan, Z.; Brown, K.; Nistor, S.; Seepaul, K.; Wood, K.; Uribe-Quevedo, A.; Perera, S.; Waller, E.; Lowe, S. Use of Virtual Reality Technology for CANDU 6 Reactor Fuel Channel Operation Training. In Proceedings of the 9th Games and Learning Alliance, Laval, France, 9–10 December 2020; pp. 91–101. [Google Scholar]
  43. Turner, C.W.; Nielsen, J.; Lewis, J.R. Current issues in the determination of usability test sample size: How many users is enough. In Proceedings of the 2002 Usability Professionals Association, Orlando, FL, USA, 8–12 July 2002; pp. 1–5. [Google Scholar]
  44. Ali, Q.; Heldal, I.; Eide, M.G.; Helgesen, C.; Wilhelmsen, G.B. Using Eye-tracking Technologies in Vision Teachers’ Work–a Norwegian Perspective. In Proceedings of the 2020 E-Health and Bioengineering Conference (EHB), Iasi, Romania, 29–30 October 2020; pp. 1–5. [Google Scholar]
  45. Hyldgaard, D.; Schweder, F.J.v.; Ali, Q.; Heldal, I.; Knapstad, M.K.; Aasen, T. Open Source Affordable Balance Testing based on a Nintendo Wii Balance Board. In Proceedings of the 2021 International Conference on e-Health and Bioengineering (EHB), Iasi, Romania, 18–19 November 2021; pp. 1–4. [Google Scholar]
  46. Ciman, M.; Gaggi, O.; Sgaramella, T.M.; Nota, L.; Bortoluzzi, M.; Pinello, L. Serious Games to Support Cognitive Development in Children with Cerebral Visual Impairment. Mob. Netw. Appl. 2018, 23, 1703–1714. [Google Scholar] [CrossRef]
  47. Blignaut, P. Fixation identification: The optimum threshold for a dispersion algorithm. Atten. Percept. Psychophys. 2009, 71, 881–895. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  48. Ali, Q.; Heldal, I.; Helgesen, C.G. A Bibliometric Analysis of Virtual Reality-Aided Vision Therapy. Stud. Health Technol.Inform. 2022, 295, 526–529. [Google Scholar] [PubMed]
  49. Salvucci, D.; Goldberg, J. Identifying fixations and saccades in eye-tracking protocols. In Proceedings of the 2000 Symposium on Eye Tracking Research & Applications (ETRA), Palm Beach Gardens, FL, USA, 6–8 November 2000; pp. 71–78. [Google Scholar]
  50. Mankins, J.C. Technology readiness levels- A White Paper. Business 1995, 6, 1–5. [Google Scholar]
  51. Rowe, F.; Brand, D.; Jackson, C.A.; Price, A.; Walker, L.; Harrison, S.; Eccleston, C.; Scott, C.; Akerman, N.; Dodridge, C.; et al. Visual impairment following stroke: Do stroke patients require vision assessment? Age Ageing 2009, 38, 188–193. [Google Scholar] [CrossRef] [Green Version]
  52. Russell, L.L.; Greaves, C.V.; Convery, R.S.; Bocchetta, M.; Warren, J.D.; Kaski, D.; Rohrer, J.D. Eye movements in frontotemporal dementia: Abnormalities of fixation, saccades and anti-saccades. Alzheimer’s Dement. Transl. Res. Clin. Interv. 2021, 7, e12218. [Google Scholar] [CrossRef]
  53. Arnarson, G.; Eliassen, K.; Grønås, S. Improvement of Games using Eye Tracking for Oculomotor Training. Bachelor’s Thesis, Western Norway University of Applied Sciences, Bergen, Norway, 2021. [Google Scholar]
Figure 1. Design and creation approach [31] utilized for this study.
Figure 3. Screenshot illustrating playing the game “Catch the Fruits”.
Figure 4. The display showing choices and scores after finishing the game.
Figure 5. Eye movement graphs for horizontal and vertical eye movements.
Figure 6. Eye-tracker placement and calibration window. Calibration software shows six points in two parts.
Figure 7. The left side (a) illustrates the gaze angle for measurements of a target on the screen; within this angle, eye-movement data could be collected correctly. The right side (b) shows the participant sitting at the correct distance from the eye-tracker while reading the instructions before the game.
Figure 8. Mean values of UEQ questionnaire indicators.
Figure 10. The frequency of the most common words.
Table 1. Literature for vision therapy supported by technologies with (✓) or without (×) gamification elements.

Authors | Year | Vision Problem | Technology | Gamification Elements
Li et al. [22] | 2015 | Amblyopia | 3D monitor | ×
Handa et al. [29] | 2015 | Amblyopia | iPad | ✓
Eastgate et al. [24] | 2006 | Amblyopia | Virtual Reality | ✓
Jiménez et al. [30] | 2021 | Amblyopia | Virtual Reality | ✓
Carvelho et al. [15] | 2008 | Convergence insufficiency | Computer | ✓
Boon et al. [26] | 2020 | Convergence insufficiency | Virtual Reality | ✓
Donmez et al. [27] | 2019 | Low vision | Eye-tracker | ✓
Kita et al. [28] | 2020 | Eye movements | Eye-tracker | ×
Table 2. Ocular motor activities connected to relevant training exercises.

Challenge | Ocular Motor Activities | Exercises
Field of view | Saccades; Visual attention; Regression | Horizontal movements; Vertical movements; Diagonal movements; Circular movements
Visual acuity | Fixations; Endurance; Saccades; Mini saccades | Searching/Scanning; Find objects in a crowd; Horizontal movements; Vertical movements; Diagonal movements; Circular movements; Smooth pursuit; Find pairs/similarities; Labyrinths point to point
Stereopsis | Accommodations; Convergence; Double vision | Movements and flashes at a distance—different depths; Objects that vary in size
Eye-hand coordination | | Use the mouse or keyboard based on events on the screen, and vice versa
Table 3. All 26 items of the UEQ (each item is rated on a scale from 1 to 7 between the two opposing terms).

Item | 1 (left term) | 7 (right term)
1 | Uncomfortable | Comfortable
2 | Incomprehensible | Comprehensible
3 | Creative | Not creative
4 | Easy to understand | Difficult to understand
5 | Noticeable | Poor
6 | Boring | Fascinating
7 | Insignificant | Interesting
8 | Unpredictable | Predictable
9 | Fast | Slow
10 | Original | Conventional
11 | Obstructive | Of support
12 | Agreeable | Disagreeable
13 | Complicated | Easy
14 | Repellent | Attractive
15 | Usual | Modern
16 | Appreciated | Unpleasant
17 | Sure | Unsure
18 | Stimulating | Soporific
19 | Satisfying | Scant
20 | Inefficient | Efficient
21 | Clear | Messy
22 | Not much practical | Practical
23 | Ordered | Unordered
24 | Attractive | Not attractive
25 | Friendly | Hostile
26 | Conservative | Innovative
Table 4. The participants’ demographics.

Participant | Profession | Uses Spectacles | Age
P1 | Vision Teacher | Yes | ≈55
P2 | Teacher | Yes | ≈60
P3 | Student | Yes | 30
P4 | Student | No | 25
P5 | Student | No | 23
Table 5. Interpretation of the UEQ results from the benchmark.

Indicator | Attractiveness | Perspicuity | Efficiency | Dependability | Stimulation | Novelty
Value | 1.9 | 2.15 | 1.5 | 1.15 | 1.55 | 0.75
Category | Excellent | Excellent | Good | Above Average | Excellent | Above Average
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
