Article

Hand Gestures in Virtual and Augmented 3D Environments for Down Syndrome Users

by Marta Sylvia Del Rio Guerra 1,2, Jorge Martin-Gutierrez 1,*, Renata Acevedo 2 and Sofía Salinas 2
1 Universidad La Laguna, 38200 Tenerife, Spain
2 Universidad de Monterrey, 66238 Monterrey, Mexico
* Author to whom correspondence should be addressed.
Appl. Sci. 2019, 9(13), 2641; https://doi.org/10.3390/app9132641
Submission received: 9 May 2019 / Revised: 21 June 2019 / Accepted: 23 June 2019 / Published: 29 June 2019
(This article belongs to the Special Issue Augmented Reality: Current Trends, Challenges and Prospects)

Featured Application

This work provides the keys to designing interactive apps based on virtual and augmented reality, using suitable Mid-Air gestures selected according to User Experience considerations.

Abstract

Studies have revealed that applications using virtual and augmented reality provide immersion, motivation, fun and engagement. However, to date, few studies have researched how users with Down syndrome interact with these technologies. This research has identified the most commonly used interactive 3D gestures according to the literature and tested eight of these using Oculus, Atheer and Leap Motion technologies. By applying MANOVAs to measurements of the time taken to complete each gesture and the success rate of each gesture when performed by participants with Down syndrome versus neurotypical participants, it was determined that significant difference was not shown for age or gender between these two sample groups. From the results, a difference was only demonstrated for the independent variable Down syndrome when analysed as a group. By using ANOVAs, it was determined that both groups found it easier to perform the gestures Stop, Point, Pan and Grab; thus, it is argued that these gestures should be used when programming software to create more inclusive AR and VR environments. The hardest gestures were Take, Pinch, Tap and Swipe; thus, these should be used to confirm critical actions, such as deleting data or cancelling actions. Lastly, the authors gather and make recommendations on how to develop inclusive 3D interfaces for individuals with Down syndrome.

1. Introduction

The use of technologies by individuals with Down syndrome (DS) is an emerging field of study [1]. Throughout mainstream society, devices such as tablets and smartphones have become increasingly popular. They have been widely adopted across all areas of our lives because of just how useful they have proven to be in many cases. In recent years, there has been a growing trend towards [2] using Augmented Reality (AR) and Virtual Reality (VR) devices, and as such, new applications are now being developed that are of use to society. According to the experts, this is yet another piece of new technology that is probably here to stay [3].
In the literature, there are currently few studies on interactions with VR and AR, and even fewer on users with Down syndrome. Kumin et al. [4] mention that studies do exist demonstrating how adults and children with Down syndrome use the computer keyboard and mouse without problems, although these contradict observations made in other scientific publications that describe difficulties and challenges relating to fine motor skills. What is clear is that further research is needed on 3D gestures and interactions within AR and VR environments to identify whether devices currently being manufactured suit all user needs and adapt to all types of users.
It is important to point out that Shin [5] reports an increase in the number of babies born with Down syndrome in the United States, from a prevalence of 9.0 to 11.8 per 10,000 live births across ten US regions between 1979 and 2003. Meanwhile, in European countries such as Spain, this number has dropped in recent years [6]. These figures differ considerably between countries: in Canada, the number of births per year stands at 330,000, with a DS prevalence of 14.41 per 10,000 births; in the Czech Republic, 110,000 births with a DS prevalence of 21.03; in Finland, 60,000 with a prevalence of 29.94; in Hungary, 100,000 with a prevalence of 17.40; and in Sweden, 100,000–120,000 with a prevalence of 28.91 [7]. In Mexico, the World Health Organisation estimates that on average there is one DS birth for every 1000–1100 live births.
Despite these changing figures, it is nonetheless important to identify those characteristics that prove significant in order to ensure the creation of inclusive technology. Cáliz [8] is of the opinion that research involving usability testing with DS and similar disabilities is on the increase, and will continue to increase in the future.
Owing to their cognitive profile, individuals with Down syndrome struggle with abstract thinking, short-term memory, and sustaining attention [9]. They have stronger social skills, whereas their language and perception skills are typically poorer. Most noticeably, visual memory and spatial memory are much stronger than verbal skills [10]. In terms of technology use, the three main skills that influence computer usage are cognitive skills, motor skills and perceptual skills [11].

2. Related Work

2.1. Use of Technologies by Individuals with Down Syndrome

Few relevant scientific contributions exist relating to technology usability studies that have selected individuals with Down syndrome as users. The recent study by Nacher [12] analyses touch gestures performed on interactive touch screens by children with DS. The study by Feng [13] mentions the main obstacles that parents of children with DS report when their children use computers, these being, among others: frustration when navigating and troubleshooting, lack of patience, and design flaws [9]. Lazar [13] points out the importance of working specifically with individuals who have Down syndrome, and not just with “people with cognitive impairment” in general. Lazar argues that each group has its own unique cognitive traits and thus its own sets of strengths and weaknesses. Cortés et al. [14] analysed the usability of applications used by children; however, the general recommendations focus more on software programming than on the hardware itself. Feng [9] studied the use of computers by adolescents and children, observing that although children encountered problems interacting with them, primarily because the software was not designed for their specific needs, results gathered from expert users show that, with practice, teenagers gain the skills needed to work with word processing, data entry, and communication software [15].
Other works performed with users who have DS have attempted to stimulate the cognitive abilities of children [16] using computer software designed to teach mathematics [17]. Miyauchi [18] designed a system for training the tongue that includes exercises to facilitate movement. Nascimento and Salah [19,20] provide recommendations on serious games that are accessible for individuals with DS. González-Ferreras [21] proposes a video game for improving verbal skills, in particular prosody, focusing on the design and evaluation of the educational video game from the point of view of how appealing it is. Alonso-Virgós et al. [22] propose recommendations for web design based on the experiences of users with DS.
In the case of mobile devices, Mendoza [23] studies how easy it is for first-time users to learn how to use gestures on mobile devices. Del Rio et al. [11] suggest a methodology for evaluating the ease with which touch gestures, body movements and eye movements can be performed by individuals with DS. Bargagna [24] proposes using a robotic kit to promote education and collaborative learning in a play setting. Felix et al. [25] studied how to improve reading and writing skills using a multimedia tool, with significant improvement being found.

2.2. Studies on Fine and Gross Motor Movements

The literature offers several pieces of research on fine and gross motor movements in individuals with DS. Gross motor skills coordinate the movement of the arms, legs, feet, and entire body (when running, crawling, walking, swimming, and any other movements that require the use of the body’s large muscles). Fine motor skills coordinate the movements of the hands, feet, fingers, toes, lips, tongue (e.g., picking up objects using the thumb and index finger, or movements involving any of the body’s small muscles that occur on a daily basis). In this paper, our work centres on those that relate to gestures.
Although children with DS follow the same motor development as children without DS (neurotypical), they take twice as long to acquire the abilities of their neurotypical counterparts [26]. Research exists on hand tracking and gesture recognition [27]. Torres et al. [28] worked on an emotional development assessment tool using the Kinect platform. Alt et al. [29] studied Mid-Air gestures for large interactive displays, and the means by which users receive feedback. Cabreira [30] reviewed the most widely used gestures on three platforms: Kinect, Leap Motion and MYO, and identified a total of fifteen recurrent gestures. Raisbeck and Diekfuss [31] analysed the differences in fine and gross motor skills by measuring performance during task execution. Additionally, the game Beesmart, developed for Kinect, was proposed to improve users’ day-to-day motor skills [32]. Capio [33] studied the fundamental movement skills that show delayed development in children with DS. This development was documented by Winders et al. [34]. To increase the autonomy and independence of individuals with DS, Engler [35] proposes the Poseidon Project, which involves the use of technical assistance. Silva [36] demonstrated that the use of exercises on the Wii could improve physical condition, functional mobility, and motor proficiency in adults with DS. Beerse [37] examined the skill needed by children to hop on one leg in place, and concluded that they would need to be at least seven years old in order to successfully perform the action. Another author, Macías [38], reported the differences that could be observed between neurotypical participants and those with DS. Puspasari [39] found the fine motor skills of children with DS who used Wii consoles to be inferior to those of neurotypical children, whereas Berg [40] found that the use of Wii consoles could help children with DS improve their postural stability, limits of stability, and Bruininks–Oseretsky Test of Motor Proficiency scores.

2.3. Studies on Virtual Reality and Augmented Reality with Users Who Have Down Syndrome

Hovorka et al. [41] ran a study based on VR and AR technologies in which users with DS had to perform everyday tasks. Abdel Rahman [42] states that motor training with virtual reality during therapy sessions has shown encouraging results in this population. Del Ciello [43] provides a review of existing literature on rehabilitation therapies using AR apps.
Ramli et al. [44] propose the usability factors that need to be taken into account when developing applications for users with DS, especially when using AR. In the field of education, a number of different studies have been performed involving AR and VR. McMahon [45], for example, successfully used AR to teach scientific terminology. Lopes [46] analysed brain activity when children with DS entered virtual reality using VR technology. For their part, Salah [47] performed a study on individuals with DS to study the cognitive difference acquired through the use of educational games using a computer and AR, and Martín-Sabarís [48] also studied the use of AR applied to users with DS using the game Pokémon Go.
The research presented in this paper studies the use of interactive gestures in AR and VR in order to identify the most suitable gestures that should be taken into account when designing apps for all users.

3. Materials and Methods

3.1. 3D Gesture Selection

To determine which gestures should be included in this study, the authors reviewed the literature and existing VR/AR applications to establish which gestures are most commonly used or mentioned, although they may not be identical or referred to by these same names. These gestures are presented in Table 1. From this list, eight were selected for this study (Table 2).

3.2. Experimental Study

The purpose of this work is to analyse a set of 3D gestures and evaluate their usability for individuals with DS when used with Augmented Reality and Virtual Reality technologies.
This study includes individuals with DS from different age groups and different genders who also possess different technological competencies. In order to be able to discern whether the motor development of DS participants affects their ability to interact with AR and VR devices or not, we use a control group containing neurotypical participants (referring to participants who do not have any developmental disability), and record any differences that emerge between these two groups.
The activities of both groups were examined to establish whether significant differences exist. Any differences arising as a result of gender and age (as independent variables) were also analysed. The time taken to complete a gesture and the percentage of successful attempts were the dependent variables for each of the tasks.
The hypotheses formulated:
H1A:
The gesture has an effect on the success rate percentage of the task.
H1B:
The gesture itself has an effect on the time taken to complete the task.
H1C:
Gender has an effect on the success rate percentage of the task.
H1D:
Gender has an effect on the time taken to complete the task.
The null hypotheses are defined as the negations: H0i = ¬H1i
The first step involved establishing which gestures to measure, and elaborating the vocabulary of gestures shown. As proposed by Abraham et al. [27] a set of 3D gestures was defined: Stop, Pinch, Grab and drop, Pan, Take, Point, Tap and Swipe (drag with two fingers) (Table 2).

3.3. Tasks

Tasks were created in the form of games for the eight aforementioned gestures. One task/game was designed for each gesture. When designing the tasks, the main prerequisite was to ensure that activities did not require any cognitive workload that could distract from the performance of the task, as suggested by Mendoza et al. [23]. To ensure this, simple games were created that used fewer colours and avoided the use of abstract images and ambiguity. Images were presented on white backgrounds to show objects clearly, accompanied by as little text as possible, and none if possible.
To determine which game would be best suited to a given gesture, we surveyed 100 computer science students and sector professionals, all of whom were familiar with User Experience design. In this survey, respondents were asked to name activities that could relate to the gesture in question; for example, the pinch gesture feedback included activities such as plucking the petals from a flower, moving marbles from one spot to another, or adding a pinch of salt to a plate of food. Once the feedback was collated, a tag cloud was generated and the most mentioned games were selected. The activity ascribed to each gesture is detailed in Table 2 below; the table also lists the device on which a gesture would be tested.
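The selection step described above amounts to a frequency count over the survey answers. A minimal sketch follows; the answer strings are invented placeholders, not the actual survey responses:

```python
from collections import Counter

# Hypothetical survey answers naming activities for the Pinch gesture
answers = [
    "pluck petals", "move marbles", "pinch of salt",
    "pluck petals", "move marbles", "pluck petals",
]

# Count how often each activity was mentioned and pick the most frequent,
# mirroring the tag-cloud step used to choose each gesture's game
counts = Counter(answers)
best_activity, mentions = counts.most_common(1)[0]
```

With these placeholder answers, "pluck petals" wins with three mentions.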
Once the games were designed and developed, a pilot was run to check there were no issues with performing the gestures. This pilot was run using two individuals with DS. Based on the observations made during the pilot, a few minor adjustments were made. The two individuals involved in the pilot did not participate in the study used to measure the success of a task.

3.4. Equipment

The Oculus Rift Development Kit DK2 with a built-in latency tester was used to perform tasks in VR. The Oculus has low-latency positional tracking and a low-persistence OLED display (to eliminate motion blur and judder). The Leap Motion sensor and Atheer Air Android 4.4 goggles were used for tasks in AR. The Leap Motion is a sensor measuring less than seven centimetres that is capable of capturing both the finger movements and hand movements of users. The advantages of this piece of equipment include: it detects hands and fingers rapidly; it integrates with other devices, such as the Oculus Rift and Atheer Air; and it is compatible with Android 4.4.

3.5. Participants

A total of 24 participants performed the augmented reality tasks using Atheer and Leap Motion, and 23 of them completed the virtual reality tasks using the Oculus Rift. The participants’ ages ranged from 14 to 46 years old, with a mean of M = 24.22 and a standard deviation of SD = 7.69. The gender of participants was evenly balanced with 11 men and 12 women. Participants were recruited from a Down syndrome support centre where individuals with Down syndrome receive support helping them to develop motor skills, cognitive skills and problem-solving skills.
A second group consisting of 25 neurotypical participants was established as a control group. Participants in this control group were students recruited from the same university at which the Down syndrome support centre is hosted. This second group contained 17 men and 8 women. Ages ranged from 18 to 29 years old, with a mean of M = 20.28 and a standard deviation of SD = 2.25. All participants gave their informed consent for inclusion before they participated in the study. The study was conducted in accordance with the Declaration of Helsinki, and the protocol was approved by the Ethics Committee of University of Monterrey.

3.6. Procedure

For each task, participants were accompanied by a moderator, an observer, and a caretaker employed by the Down syndrome support centre. The moderator was responsible for demonstrating the gesture to the participant, after which the participant would have to repeat the gesture without assistance. Once the participant understood how to execute the gesture s/he would then have to complete the task designed for said gesture. The app provided feedback in the form of an audible sound when the gesture was completed correctly. The system recorded the start time and end time of each interaction. In the event a gesture was not completed within a given time, it was logged as not completed. Each participant took around 5 min to put on the equipment and understand and complete the tasks on the Oculus with Leap Motion (average task time: DS group 41.08 s, neurotypical group 17.97 s), 5 min to put on the equipment and complete the tasks on the Leap Motion (average task time: DS group 61.3 s, neurotypical group 18.98 s), and between 7–10 min on the Atheer (average task time: DS group 81.35 s, neurotypical group 47.34 s). As these proceedings took place, an observer logged the session and noted down any problems that arose.
The same procedure was used for the control group containing neurotypical participants. The results from both sample groups were then compared and contrasted.
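The trial logging just described (record start and end times, mark a gesture as not completed past a time limit) can be sketched as follows. The timeout value and function names are our own illustrative assumptions; the paper does not state the exact cut-off:

```python
import time

TIMEOUT_S = 120.0  # assumed cut-off; the study does not report the exact limit

def run_trial(perform_gesture, timeout_s=TIMEOUT_S):
    """Time one gesture attempt and flag it incomplete past the timeout.

    `perform_gesture` stands in for the device callback that blocks until
    the gesture is recognised (or the participant gives up), returning
    True on a recognised gesture.
    """
    start = time.monotonic()
    completed = perform_gesture()
    elapsed = time.monotonic() - start
    if elapsed > timeout_s:
        completed = False  # logged as "not completed", as in the study
    return {"completed": completed, "elapsed_s": elapsed}

# Stand-in gesture that succeeds immediately
result = run_trial(lambda: True)
```

A real session would accumulate one such record per gesture and participant for the later statistical comparison.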

4. Results

In order to identify which gestures provide the best execution experience for all users, the gestures proposed in Table 2 were analysed according to success rate and execution time. Table 3 shows the descriptive statistics for each gesture in each experimental group.

4.1. Analysis of Success Rate

The first hypothesis to test is whether the task itself determines the success rate. In other words, we must establish whether there are tasks that are more difficult to perform due to their nature. Firstly, a Kolmogorov–Smirnov test was performed to identify the distribution of the sample regarding the success rate. Both samples show a normal distribution, so a parametric test was applied to detect significant differences between groups. To test hypothesis H1A (the gesture has an effect on the success rate percentage of the task), the success rates of both groups were compared by calculating successes/attempts and analysed using a two-sample Student’s t-test. As Table 4 reveals, it is the tasks involving the gestures Pinch, Grab and drop, Pan and Tap that present significant differences between the two groups. Thus, as they prove to be more difficult for one group to perform than the other, these gestures should be avoided when trying to create inclusive technology. When analysing the means of these activities, it is possible to see that the neurotypical participants had a much higher success rate than individuals with DS. The H1A hypothesis is accepted for the tasks Stop, Take, Point and Swipe, because there is no significant difference in the success of execution between the two experimental groups.
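The per-gesture comparison can be sketched with a two-sample t statistic computed from first principles, here in Welch's unequal-variance form, a close relative of the Student's t-test used in the paper. The per-participant success rates below are invented for illustration, not the study's data:

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Two-sample t statistic allowing unequal variances (Welch's form)."""
    se = math.sqrt(variance(a) / len(a) + variance(b) / len(b))
    return (mean(a) - mean(b)) / se

# Hypothetical per-participant success rates (successes / attempts)
ds_group = [0.5, 0.6, 0.7]
neurotypical = [0.9, 1.0, 0.8]
t = welch_t(ds_group, neurotypical)  # negative: DS group less successful here
```

In practice one would compare the statistic (or its p-value) against the chosen significance level, as Table 4 does for each gesture.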

4.2. Analysis of Tasks by Completion Time

To test the second hypothesis, H1B (the gesture itself has an effect on the time taken to complete the task), the successful gestures of both groups were compared to see which times were faster; these were analysed using a two-sample Student’s t-test. As can be seen in Table 5, there are significant differences for the following gestures: Stop, Pinch, Pan, Point and Tap. These are more difficult for individuals with DS to perform. However, individuals with DS can perform the gestures Grab, Take and Swipe with equal dexterity, which means these gestures should be given preference. Once again, in trying to be inclusive, we recognise that these tasks require more time to complete. The H1B hypothesis is accepted for the tasks Grab and drop, Take and Swipe, because there is no significant difference in the time of execution between the two experimental groups.

4.3. Analysis of Success Rate by Gender

An ANOVA was performed to compare the success rate of gestures in tasks completed by individuals with DS versus neurotypical participants, with the independent variable being gender. The results, presented in Table 6 below, show that there is no difference based on gender in either of the sample groups. As such, the third hypothesis, H1C (gender has an effect on the success rate of the task), has been tested in the current study. The H1C hypothesis is rejected: there is no significant difference by gender in the success of performing any of the gestures/tasks.
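The one-way ANOVA used for the gender comparison reduces to an F statistic, the ratio of between-group to within-group mean squares. A stdlib-only sketch, with invented values:

```python
from statistics import mean

def one_way_f(groups):
    """F statistic for a one-way ANOVA over a list of sample groups."""
    all_values = [x for g in groups for x in g]
    grand = mean(all_values)
    # Between-group sum of squares, df = k - 1
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ms_between = ss_between / (len(groups) - 1)
    # Within-group sum of squares, df = N - k
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    ms_within = ss_within / (len(all_values) - len(groups))
    return ms_between / ms_within

# Hypothetical success scores split by gender; a small F (compared against
# the critical value for the relevant degrees of freedom) suggests no effect
f = one_way_f([[1.0, 2.0, 3.0], [2.0, 3.0, 4.0]])
```

With these toy numbers the statistic comes out at 1.5, well below typical critical values, which is the "no gender effect" pattern Table 6 reports.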

4.4. Analysis of Completion Time by Gender and Age

An ANOVA was performed to compare the completion time of gestures in tasks completed by individuals with DS versus neurotypical participants, with the independent variable being gender. Based on the results presented in Table 7 below, it is possible to conclude that there is no difference in completion times based on gender in either of the sample groups. Following this, a MANOVA was performed using gender, group (Down syndrome or neurotypical) and the covariate age as independent variables. As seen in Table 8 below, age and gender do not affect the performance of a gesture.
The hypothesis H1D is rejected, as gender does not affect the time taken to perform gestures or complete tasks; the null hypothesis (H0D) is accepted.

4.5. Comparing Tasks

Once gestures were analysed, a post hoc test was applied using Tukey’s method. Having done so, it was possible to establish how easy or difficult gestures proved to be, and tasks could be clustered based on gesture difficulty. As can be seen in Table 9, the gestures that proved easiest to perform for both sample groups were Stop and Take, based on their lower average scores. A scale using the letters A to E records the difficulty rating of gestures, from (A) most complicated to perform to (E) least complicated to perform. The most complicated gestures (A), which also have the largest mean scores, are Swipe and Pinch.
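Tukey's method itself requires the studentized range distribution, but the final labelling step, ranking gestures by mean score and bucketing them into the A–E scale, can be sketched as follows. The mean times below are invented placeholders, not the values in Table 9:

```python
# Hypothetical mean completion times (seconds); Table 9's real values differ
mean_times = {
    "Swipe": 80.1, "Pinch": 74.3, "Tap": 60.2, "Take": 21.5,
    "Grab": 45.0, "Pan": 40.2, "Point": 30.8, "Stop": 19.7,
}

# Rank gestures from hardest (largest mean) to easiest, then spread the
# ranks across the letters A-E, echoing the paper's difficulty scale
ranked = sorted(mean_times, key=mean_times.get, reverse=True)
letters = "ABCDE"
labels = {g: letters[min(i * len(letters) // len(ranked), len(letters) - 1)]
          for i, g in enumerate(ranked)}
```

Note this is only the presentation step: in the actual analysis, cluster membership comes from which pairwise differences Tukey's test finds significant, not from rank position alone.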
The comparison of completion times using a test for equal variances yields p = 0.000; as such, there are significant differences in the completion times for all gestures performed by individuals with DS. In Figure 1, it is possible to observe noticeable differences for the gestures Stop and Take when compared against the other gestures.

4.6. Geometric Distribution. Probability of Individual with DS Correctly Performing a Gesture

From the data recorded for ‘number of attempts’ and ‘number of successful gestures’, it is possible to estimate the probability of a user with DS completing a gesture on the first attempt (successes/attempts).
Table 10 shows us that the gesture that is least likely to be successfully completed is Swipe (by joining the index finger and middle finger); whereas the gesture that is most likely to be completed successfully is Stop (by raising the hand).
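Under the geometric model used here, the first-attempt probability and the expected number of attempts follow directly from the raw counts. A short sketch with invented counts (not the study's Table 10 figures):

```python
def geometric_stats(successes, attempts):
    """Per-gesture first-attempt model from raw counts.

    Under a geometric model, p = successes / attempts estimates the chance
    of completing the gesture on any single try, so the expected number of
    tries until the first success is 1 / p.
    """
    p = successes / attempts
    return {"p_first_try": p, "expected_attempts": 1 / p}

# Hypothetical counts: a hard gesture (Swipe-like) vs an easy one (Stop-like)
hard = geometric_stats(successes=6, attempts=24)   # p = 0.25, ~4 tries expected
easy = geometric_stats(successes=20, attempts=25)  # p = 0.80, ~1.25 tries expected
```

The lower p is, the more attempts a user needs on average, which is why low-p gestures are poor candidates for frequently used interactions.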

4.7. Observational Findings. Problems Generated by Physiological Traits

4.7.1. Microcephaly

Participants with microcephaly needed to readjust their glasses on several occasions, and this even led some to ask to stop the tasks (Figure 2a).

4.7.2. Clinodactyly

A common physical trait encountered in users with Down syndrome is different finger angles. Finger angles and curves proved a limitation for some gestures, as the device could not correctly identify the gesture being made (Figure 2b).

4.7.3. Inability to Join Index and Middle Fingers

The majority of users found it difficult to perform the Pan gesture, which required holding up and joining the index finger and middle finger. Performing the gesture correctly proved impossible for many (Figure 2c).

4.7.4. Dizziness When Using Oculus

Eight users complained of feeling dizzy when using the Oculus Rift. However, only three of these asked to remove the headset (Figure 2d). Three said that although they felt dizzy they were ok. One experienced a headache, and another user’s eyes watered.

4.7.5. Atheer Detected Pan Gesture as a Single Finger

There were users who performed the gesture despite experiencing difficulty in doing so. However, the Atheer glasses detected both fingers as a single finger.

5. Discussion

5.1. Usability Issues and Design Considerations

People with DS usually encounter problems with the fine motor skills required to interact with gestural interfaces. From our experiences and studies exploring 2D gestures on touchscreens that have involved individuals with DS [11], we can confirm that some 3D (Mid-Air) gestures prove easier and more straightforward for users with DS to perform than the 2D gestures used on touch screens. This should be taken into account when designing interfaces; otherwise, user performance will diminish.
The easiest gestures for users with DS to perform in AR and VR environments are Stop and Point. Overall, the Swipe gesture was the hardest to perform.
Mendoza [23] reported the Tap gesture as being one of the easiest gestures to perform with touch interfaces; however, in this study the Tap gesture proved to be one of the hardest to perform when using Atheer and Leap Motion in an AR environment.
Having a better understanding of the difficulties encountered by individuals with DS is the first step towards explaining and addressing any problems that arise with these technologies.
In reviewing the results obtained for hypotheses H1A and H1B, detailed in Table 4 and Table 5, no statistically significant differences have been identified between the study groups for the successful execution of the gestures Stop, Take, Point and Swipe, nor for the execution time of the gestures Grab and drop, Take and Swipe. Based on these findings, it appears that when it comes to the design of virtual reality or augmented reality applications, the aforementioned gestures prove to be those best suited for all users. Our recommendation would be to give priority to gestures that provide the greatest guarantee of success; therefore, we would argue that these findings should form part of basic guidelines for developing apps and should be taken into account when establishing the functions required during a task (precision or speed of execution).
When selecting gestures there will be pros and cons involved in selecting one over another. For example: Stop, Take or Point may take longer to complete, but there is a higher guarantee of success for all users; Grab and drop and Take are completed in similar times by all users, but nonetheless prove more difficult to consistently perform correctly; and the Swipe gesture (drag with two fingers) shows no significant group differences in either success or time, although it proved the hardest gesture overall.
In this study, the authors have found that any usability testing for gestures must be conducted for each environment independently.
Below is a list of recommendations that serve as design guidelines for apps that require Mid-Air gestures:
  • Basic recommendations regarding cognitive psychology are very useful, as are those proposed by Mendoza [23].
  • For 3D gestures, try to use gestures that involve the whole hand and avoid gestures that require the use of fingers and precise movements.
  • Regardless of environment, whether it be virtual reality or augmented reality, it is advisable to design apps so that they use gestures similar to those used in real life, since the goal is to simulate within a synthetic environment those actions that are performed in the physical setting.
  • Ensure devices have a greater margin of error for gesture recognition.
  • Avoid touch targets that are too small.
  • Do not assume that gestures for a touch interface should be similar to those in a 3D environment or headset.
  • When designing with 3D gestures, give preference to: Stop, Point, Pan and Grab.
  • Gestures that are much harder to perform can be used to confirm critical tasks, such as deleting data. For critical tasks, such as Delete, use an appropriate gesture for the action that is being performed and a second gesture to confirm the action.
  • Avoid gestures using two fingers.
Specifically, it is important to take into consideration the design of specific elements for users with disabilities, thus we would propose the following:
  • Using voice recognition when developing AR and VR apps for individuals with Down syndrome to compensate for users’ low literacy skills.
  • Prioritising the use of gestures that suit the motor skill development of users with disability over other gestures when selecting suitable gestures for an app.
Table 10 of this paper can inform this selection. That said, however, the true goal is to ensure that all applications are inclusive, not just those developed specifically for users with Down syndrome.
Given the observations made regarding the morphology of individuals with DS, we would argue that AR and VR devices need to be better adapted to all people. This means they should be better at detecting gestures and capable of fitting difference-sized heads. Individuals with DS have clear physical limitations when it comes to wearing VR and AR headsets, as revealed by this research.

5.2. Threats to Validity

The study relied on the support of individuals with DS involved in a labour inclusion program. All these individuals had basic computer skills and used smartphones. It is important to repeat this study with participants who are not familiar with technology and compare the results in order to confirm our findings.

6. Conclusions and Future Research

There is an increasing amount of legislation emerging covering inclusion policy. For some time now, organisations have been founded to design strategies, standards and resources aimed at providing universal access to individuals living with disabilities. One such organisation, the Web Accessibility Initiative (WAI), has developed guidelines that are regarded as the international standards for Web accessibility [60]. Nevertheless, these guidelines focus on physical disabilities (sight, hearing, etc.) and do not include cognitive disabilities such as Autism, Asperger or Down syndrome. We would like to see similar guidelines put in place that include Down syndrome. We propose designing guidelines that will assist hardware manufacturers in selecting more adequate gestures for Down syndrome users when performing tasks related to everyday activities in VR or AR environments. We think that in the future we will probably see the emergence of regulations within this legislation for hardware that includes individuals with motor, cognitive, or physical disabilities and technological handicaps.
In the literature, we came across one recommendation that we did not test ourselves: computer training is beneficial for users with DS. Facial recognition and voice recognition software are also making devices progressively easier to use.
One interesting line of work that requires further study is voice recognition assistive technology for individuals with speech or voice impairments [61]. Verbal instructions might be an appropriate way to interact with machines via software, and as such should be taken into account during UX design, as discussed in this paper.
When designing tasks to evaluate in usability testing, it is important to remember that participants will not always follow the instructions given. As such, one should expect these users to propose new gestures for performing day-to-day tasks, and these should be accepted as valid new gestures.
Research has shown that individuals with Down syndrome can effectively use digital devices [23], and this study demonstrates that AR and VR technology can be part of their daily interactions.

Author Contributions

The contributions to this paper are as follows: M.S., conceptualization, methodology design, testing organisation, writing—original draft preparation; J.M., statistical analysis, writing of the final paper, project supervision; R.A. and S.S., development of the app and collection of participants’ data.

Funding

This research was funded by the research program of Universidad de Monterrey and the research program of Universidad de La Laguna.

Acknowledgments

We would like to acknowledge the collaboration of the Social and Educational Inclusion Program (Programa de Inclusión Social y Educativa-PISYE) of the University of Monterrey. Our thanks also go to Tomás Vargas who so kindly lent us the AR equipment needed to pursue this research.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Kumin, L.; Lazar, J.; Feng, J.H. Expanding job options: Potential computer-related employment for adults with Down syndrome. ACM SIGACCESS Access. Comput. 2012, 103, 14–23. [Google Scholar] [CrossRef]
  2. Martín-Gutiérrez, J.; Mora, C.E.; Añorbe-Díaz, B.; González-Marrero, A. Virtual Technologies Trends in Education. EURASIA J. Math. Sci. Technol. Educ. 2017, 13, 469–486. [Google Scholar] [CrossRef]
  3. Antonioli, M.; Blake, C.; Sparks, K. Augmented Reality Applications in Education. J. Technol. Stud. 2014, 40, 96–107. [Google Scholar] [CrossRef]
  4. Kumin, L.; Lazar, J.; Feng, J.H.; Wentz, B.; Ekedebe, N. A Usability Evaluation of Workplace-related Tasks on a Multi-Touch Tablet Computer by Adults with Down Syndrome. J. Usabil. Stud. 2012, 7, 118–142. [Google Scholar]
  5. Shin, M.; Besser, L.M.; Kucik, J.E.; Lu, C.; Siffel, C.; Correa, A. Congenital Anomaly Multistate Prevalence and Survival Collaborative Prevalence of Down Syndrome Among Children and Adolescents in 10 Regions of the United States. Pediatrics 2009, 124, 1565–1571. [Google Scholar] [CrossRef] [PubMed]
  6. Huete García, A. Demografía e inclusión social de las personas con síndrome de Down. Rev. Síndrome Down 2016, 33, 38–50. [Google Scholar]
  7. Sierra Romero, M.C.; Navarrete Hernández, E.; Canún Serrano, S.; Reyes Pablo, A.E.; Valdés Hernández, J. Prevalence of Down Syndrome Using Certificates of Live Births and Fetal Deaths in México 2008–2011. Bol. Med. Hosp. Infant. Mex. 2014, 71, 292–297. [Google Scholar]
  8. Cáliz, D.; Martinez, L.; Alaman, X.; Teran, C.; Caliz, R. Usability Testing Process with People with Down Syndrome Interacting with Mobile Applications: A Literature Review. Int. J. Comput. Sci. Inf. Technol. 2016, 8, 117–131. [Google Scholar]
  9. Feng, J.; Lazar, J.; Kumin, L. Computer Usage by Children with Down Syndrome: Challenges and Future Research. ACM Trans. Access. Comput. TACCESS 2010, 2, 13. [Google Scholar] [CrossRef]
  10. Brandão, A.; Trevisan, D.G.; Brandão, L.; Moreira, B.; Nascimento, G.; Vasconcelos, C.N.; Clua, E.; Mourão, P. Semiotic Inspection of a Game for Children with Down Syndrome. In Proceedings of the 2010 Brazilian Symposium on Games and Digital Entertainment, Florianopolis, Santa Catarina, Brazil, 8–10 November 2010; pp. 199–210. [Google Scholar]
  11. Del Rio Guerra, M.; Martin Gutierrez, J.; Aceves, L. Design of an interactive gesture measurement system for down syndrome people. In Universal Access in Human-Computer Interaction. Methods, Technologies, and Users, UAHCI 2018; Antona, M., Stephanidis, C., Eds.; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2018; Volume 10907. [Google Scholar]
  12. Nacher, V.; Cáliz, D.; Jaen, J.; Martínez, L. Examining the Usability of Touch Screen Gestures for Children with Down Syndrome. Interact. Comput. 2018, 30, 258–272. [Google Scholar] [CrossRef]
  13. Lazar, J.; Kumin, L.; Feng, J.H. Understanding the computer skills of adult expert users with down syndrome: An explanatory study. In Proceedings of the 13th International ACM SIGACCESS Conference on Computers and Accessibility—ASSETS ’11, Dundee, UK, 24–26 October 2011; pp. 51–58. [Google Scholar]
  14. Cortes, M.Y.; Guerrero, A.; Zapata, J.V.; Villegas, M.L.; Ruiz, A. Study of the usability in applications used by children with Down Syndrome. In Proceedings of the 2013 8th Computing Colombian Conference (8CCC), Armenia, Colombia, 21–23 August 2013; pp. 1–6. [Google Scholar]
  15. Feng, J.; Lazar, J.; Kumin, L.; Ozok, A. Computer usage by young individuals with down syndrome. In Proceedings of the 10th International ACM SIGACCESS Conference on Computers and Accessibility, Halifax, NS, Canada, 13–15 October 2008; pp. 35–42. [Google Scholar]
  16. Brandão, A.; Brandão, L.; Nascimento, G.; Moreira, B.; Vasconcelos, C.N.; Clua, E. JECRIPE: Stimulating cognitive abilities of children with Down Syndrome in pre-scholar age using a game approach. In Proceedings of the 7th International Conference on Advances in Computer Entertainment Technology—ACE ’10, Taipei, Taiwan, 17–19 November 2010; pp. 15–18. [Google Scholar]
  17. Chalarca, D.T. Diseño de una Aplicación para Enseñar las Operaciones Básicas de las Matemáticas a Personas con Síndrome de Down. Master’s Thesis, Universitat Oberta de Catalunya, Barcelona, Spain, 2015. [Google Scholar]
  18. Miyauchi, M.; Kimura, T.; Nojima, T. A tongue training system for children with Down syndrome. In Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology, St. Andrews, UK, 8–11 October 2013; pp. 373–376. [Google Scholar]
  19. Salah, J.; Abdennadher, S.; Sabty, C.; Abdelrahman, Y. Super Alpha: Arabic Alphabet Learning Serious Game for Children with Learning Disabilities. In Serious Games, JCSG 2016; Marsh, T., Ma, M., Oliveira, M., Baalsrud Hauge, J., Göbel, S., Eds.; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2016; Volume 9894, pp. 104–115. [Google Scholar]
  20. Nascimento, L.S.; Martins, L.B.; Villarouco, V.; de Carvalho, W.; Junior, R.L. Recommendations for the Development of Accessible Games for People with Down Syndrome. In Proceedings of the 20th Congress of the International Ergonomics Association—IEA 2018, Florence, Italy, 26–30 August 2018; Springer: Cham, Switzerland, 2018; Volume 824, pp. 1712–1723. [Google Scholar]
  21. González-Ferreras, C.; Escudero-Mancebo, D.; Corrales-Astorgano, M.; Aguilar-Cuevas, L.; Flores-Lucas, V. Engaging Adolescents with Down Syndrome in an Educational Video Game. Int. J. Hum. Comput. Interact. 2017, 33, 693–712. [Google Scholar] [CrossRef] [Green Version]
  22. Alonso-Virgós, L.; Rodríguez Baena, L.; Pascual Espada, J.; González Crespo, R. Web Page Design Recommendations for People with Down Syndrome Based on Users’ Experiences. Sensors 2018, 18, 4047. [Google Scholar] [CrossRef] [PubMed]
  23. Alfredo, M.G.; Francisco, J.A.R.; Ricardo, M.G.; Francisco, A.E.; Jaime, M.A. Analyzing Learnability of Common Mobile Gestures Used by Down Syndrome Users. In Proceedings of the XVI International Conference on Human Computer Interaction, Vilanova i la Geltrú, Spain, 7–9 September 2015. [Google Scholar]
  24. Bargagna, S.; Castro, E.; Cecchi, F.; Cioni, G.; Dario, P.; Dell’Omo, M.; Di Lieto, M.C.; Inguaggiato, E.; Martinelli, A.; Pecini, C.; et al. Educational Robotics in Down Syndrome: A Feasibility Study. Technol. Knowl. Learn. 2018, 24, 315–323. [Google Scholar] [CrossRef]
  25. Felix, V.G.; Mena, L.J.; Ostos, R.; Maestre, G.E. A pilot study of the use of emerging computer technologies to improve the effectiveness of reading and writing therapies in children with Down syndrome. Br. J. Educ. Technol. 2017, 48, 611–624. [Google Scholar] [CrossRef]
  26. Kim, H.I.; Kim, S.W.; Kim, J.; Jeon, H.R.; Jung, D.W. Motor and Cognitive Developmental Profiles in Children with Down Syndrome. Ann. Rehabil. Med. 2017, 41, 97–103. [Google Scholar] [CrossRef] [PubMed]
  27. Abraham, L.; Urru, A.; Normani, N.; Wilk, M.; Walsh, M.; O’Flynn, B.; Abraham, L.; Urru, A.; Normani, N.; Wilk, M.P.; et al. Hand Tracking and Gesture Recognition Using Lensless Smart Sensors. Sensors 2018, 18, 2834. [Google Scholar] [CrossRef] [PubMed]
  28. Torres-Carrión, P.; González-González, C.; Carreño, A.M. Methodology of emotional evaluation in education and rehabilitation activities for people with Down Syndrome. In Proceedings of the XV International Conference on Human Computer Interaction, Puerto de la Cruz, Spain, 10–12 September 2014. [Google Scholar]
  29. Alt, F.; Geiger, S.; Höhl, W. ShapelineGuide: Teaching Mid-Air Gestures for Large Interactive Displays. In Proceedings of the 7th ACM International Symposium on Pervasive Displays—PerDis ’18, Munich, Germany, 6–8 June 2018; pp. 1–8. [Google Scholar]
  30. Cabreira, A.T.; Hwang, F. An Analysis of Mid-Air Gestures Used Across Three Platforms. In Proceedings of the 2015 British HCI Conference—British HCI ’15, Lincoln, UK, 13–17 July 2015; pp. 257–258. [Google Scholar]
  31. Raisbeck, L.D.; Diekfuss, J.A. Fine and gross motor skills: The effects on skill-focused dual-tasks. Hum. Mov. Sci. 2015, 43, 146–154. [Google Scholar] [CrossRef] [PubMed]
  32. Amado Sanchez, V.L.; Islas Cruz, O.I.; Ahumada Solorza, E.A.; Encinas Monroy, I.A.; Caro, K.; Castro, L.A. BeeSmart: A gesture-based videogame to support literacy and eye-hand coordination of children with down syndrome. In Games and Learning Alliance, GALA 2017; Dias, J., Santos, P., Veltkamp, R., Eds.; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2017; Volume 10653, pp. 43–53. [Google Scholar]
  33. Capio, C.M.; Mak, T.C.T.; Tse, M.A.; Masters, R.S.W. Fundamental movement skills and balance of children with Down syndrome. J. Intellect. Disabil. Res. 2018, 62, 225–236. [Google Scholar] [CrossRef] [PubMed]
  34. Winders, P.; Wolter-Warmerdam, K.; Hickey, F. A schedule of gross motor development for children with Down syndrome. J. Intellect. Disabil. Res. 2019, 63, 346–356. [Google Scholar] [CrossRef]
  35. Engler, A.; Zirk, A.; Siebrandt, M.; Schulze, E.; Oesterreich, D. Mobility Competencies of People with Down Syndrome Supported by Technical Assistance. In European Conference on Ambient Intelligence; Braun, A., Wichert, R., Maña, A., Eds.; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2017; Volume 10217, pp. 241–246. [Google Scholar]
  36. Silva, V.; Campos, C.; Sá, A.; Cavadas, M.; Pinto, J.; Simões, P.; Machado, S.; Murillo-Rodríguez, E.; Barbosa-Rocha, N. Wii-based exercise program to improve physical fitness, motor proficiency and functional mobility in adults with Down syndrome. J. Intellect. Disabil. Res. 2017, 61, 755–765. [Google Scholar] [CrossRef]
  37. Beerse, M.; Wu, J. Vertical stiffness and balance control of two-legged hopping in-place in children with and without Down syndrome. Gait Posture 2018, 63, 39–45. [Google Scholar] [CrossRef] [PubMed]
  38. Macias, A.; Caro, K.; Castro, L.A.; Sierra, V.; Ahumada, E.A.; Encinas, I.A. Exergames in Individuals with Down Syndrome: A Performance Comparison Between Children and Adolescents. In Smart Objects and Technologies for Social Good, GOODTECHS 2017; Guidi, B., Ricci, L., Calafate, C., Gaggi, O., Marquez-Barja, J., Eds.; Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering; Springer: Cham, Switzerland, 2017; Volume 233, pp. 92–101. [Google Scholar]
  39. Sinaga, P.; Prananta, M.S.; Fadlyana, E. Score of Fine Motor Skill in Children with Down Syndrome using Nintendo Wii. Althea Med. J. 2016, 3, 371–375. [Google Scholar] [CrossRef] [Green Version]
  40. Berg, P.; Becker, T.; Martian, A.; Danielle, P.K.; Wingen, J. Motor Control Outcomes Following Nintendo Wii Use by a Child With Down Syndrome. Pediatr. Phys. Ther. 2012, 24, 78–84. [Google Scholar] [CrossRef] [PubMed]
  41. Hovorka, R.Μ.; Virji-Babul, N. A preliminary investigation into the efficacy of virtual reality as a tool for rehabilitation for children with Down syndrome. Int. J. Disabil. Hum. Dev. 2006, 5, 351–356. [Google Scholar] [CrossRef]
  42. Rahman, S.A.; Rahman, A. Efficacy of Virtual Reality-Based Therapy on Balance in Children with Down Syndrome. World Appl. Sci. J. 2010, 10, 254–261. [Google Scholar]
  43. De Menezes, L.D.C.; Massetti, T.; Oliveira, F.R.; de Abreu, L.C.; Malheiros, S.R.P.; Trevizan, I.L.; Moriyama, C.H.; de Monteiro, C.B.M. Motor Learning and Virtual Reality in Down Syndrome; a Literature Review. Int. Arch. Med. 2015, 8, 1–11. [Google Scholar] [CrossRef]
  44. Ramli, R.; Zaman, H.B. Designing usability evaluation methodology framework of Augmented Reality basic reading courseware (AR BACA SindD) for Down Syndrome learner. In Proceedings of the 2011 International Conference on Electrical Engineering and Informatics, Bandung, Indonesia, 17–19 July 2011; pp. 1–5. [Google Scholar]
  45. McMahon, D.D.; Cihak, D.F.; Wright, R.E.; Bell, S.M. Augmented reality for teaching science vocabulary to postsecondary education students with intellectual disabilities and autism. J. Res. Technol. Educ. 2016, 48, 38–56. [Google Scholar] [CrossRef]
  46. Lopes, J.; Miziara, I.; Kahani, D.; Lazzari, R.; Guerreiro, L.; Moura, R.; Cordeiro, L.; Naves, E.; Conway, B.; Oliveira, C. P 115—Brain activity after transcranial stimulation combined with virtual reality training in children with Down Syndrome: Case Report. Gait Posture 2018, 65, 426–427. [Google Scholar] [CrossRef]
  47. Salah, J.; Abdennadher, S.; Atef, S. Galaxy Shop: Projection-Based Numeracy Game for Teenagers with Down Syndrome. In Serious Games, JCSG 2017; Alcañiz, M., Göbel, S., Ma, M., Fradinho Oliveira, M., Baalsrud Hauge, J., Marsh, T., Eds.; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2017; Volume 10622, pp. 109–120. [Google Scholar]
  48. Martín-Sabarís, R.M.; Brossy-Scaringi, G. La realidad aumentada aplicada al aprendizaje en personas con Síndrome de Down: Un estudio exploratorio. Rev. Lat. Comun. Soc. 2017, 72, 737–750. [Google Scholar] [CrossRef]
  49. Jang, S.; Elmqvist, N.; Ramani, K. Gesture Analyzer: Visual analytics for pattern analysis of mid-air hand gestures. In Proceedings of the 2nd ACM Symposium on Spatial User Interaction—SUI ’14, Honolulu, HI, USA, 4–5 October 2014; pp. 30–39. [Google Scholar]
  50. Jego, J.-F. Interaction Basée sur des Gestes Définis par L’utilisateur: Application à la Réalité Virtuelle. Master’s Thesis, l’école doctorale Sciences des métiers de l’ingénieur, Paris, France, 2013. [Google Scholar]
  51. Buchmann, V.; Violich, S.; Billinghurst, M.; Cockburn, A. FingARtips: Gesture based direct manipulation in Augmented Reality. In Proceedings of the 2nd International Conference on Computer Graphics and Interactive Techniques in Australasia and South East Asia—GRAPHITE ’04, Singapore, 15–18 June 2004; pp. 212–221. [Google Scholar]
  52. Albertini, N.; Brogni, A.; Olivito, R.; Taccola, E.; Caramiaux, B.; Gillies, M. Designing natural gesture interaction for archaeological data in immersive environments. Virtual Archaeol. Rev. 2017, 8, 12–21. [Google Scholar] [CrossRef] [Green Version]
  53. Manders, C.; Farbiz, F.; Yin, T.K.; Miaolong, Y.; Chong, B.; Guan, C.G. Interacting with 3D objects in a virtual environment using an intuitive gesture system. In Proceedings of the 7th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and Its Applications in Industry—VRCAI ’08, Singapore, 8–9 December 2008; pp. 1–5. [Google Scholar]
  54. Caro, K.; Tentori, M.; Martinez-Garcia, A.I.; Alvelais, M. Using the FroggyBobby exergame to support eye-body coordination development of children with severe autism. Int. J. Hum. Comput. Stud. 2017, 105, 12–27. [Google Scholar] [CrossRef]
  55. Bernardes, J.L., Jr.; Nakamura, R.; Tori, R. Design and Implementation of a Flexible Hand Gesture Command Interface for Games Based on Computer Vision. In Proceedings of the 2009 VIII Brazilian Symposium on Games and Digital Entertainment, Rio de Janeiro, Brazil, 9–10 October 2009; pp. 64–73. [Google Scholar]
  56. Bai, H.; Lee, G.; Billinghurst, M. Using 3D hand gestures and touch input for wearable AR interaction. In Proceedings of the Extended Abstracts of the 32nd Annual ACM Conference on Human Factors in Computing Systems—CHI EA ’14, Toronto, ON, Canada, 26 April–1 May 2014; pp. 1321–1326. [Google Scholar]
  57. Kim, M.; Lee, J.Y. Touch and hand gesture-based interactions for directly manipulating 3D virtual objects in mobile augmented reality. Multimed. Tools Appl. 2016, 75, 16529–16550. [Google Scholar] [CrossRef]
  58. Alkemade, R.; Verbeek, F.J.; Lukosch, S.G. On the Efficiency of a VR Hand Gesture-Based Interface for 3D Object Manipulations in Conceptual Design. Int. J. Hum. Comput. Interact. 2017, 33, 882–901. [Google Scholar] [CrossRef]
  59. Choi, H.-R.; Kim, T. Combined Dynamic Time Warping with Multiple Sensors for 3D Gesture Recognition. Sensors 2017, 17, 1893. [Google Scholar] [CrossRef] [PubMed]
  60. W3C. Web Accessibility Initiative (WAI). Available online: https://www.w3.org/WAI/ (accessed on 5 May 2019).
  61. Corrales-Astorgano, M.; Martínez-Castilla, P.; Escudero-Mancebo, D.; Aguilar, L.; González-Ferreras, C.; Cardeñoso-Payo, V. Automatic Assessment of Prosodic Quality in Down Syndrome: Analysis of the Impact of Speaker Heterogeneity. Appl. Sci. 2019, 9, 1440. [Google Scholar] [CrossRef]
Figure 1. Test for Equal Variances in time.
Figure 2. Problems generated by physiological traits. (a) Image of user whose glasses have slipped. (b) Example of angle of finger joints in users with DS. (c) User attempting Pan gesture. (d) User complaining of dizziness removing Oculus Rift.
Table 1. Gestures identified from bibliography review.
Gesture Name | Jang, Elmqvist, Ramani [49] | Jégo, Paljic and Fuchs [50] | Buchmann et al. [51] | Albertini et al. [52] | Manders et al. [53] | Caro et al. [54] | Bernardes et al. [55] | Bai et al. [56] | Kim et al. [57] | Alkemade et al. [58] | Choi and Kim [59] | This Study
Stop: X XXX X XX
Pinch: XX X
Grab and Drop using fingers: XX
Point: XX XX X XXX
Rotate: X X X
Arm/hand rotation: XX X XX
Take: X
Hold and Swipe: X
Grab and Drop using hand: X X X X
Pan: X X X
Swipe (Move with 2 fingers): XX X
Table 2. Vocabulary of 3D gestures for Leap Motion, Atheer and Oculus Rift.
Gesture | Activity | Description of Activity | Device Used
Stop | Stop the vehicle | A vehicle is driving towards the screen and the user. To stop the vehicle, the user must raise their hand and hold it up with the palm facing outwards towards the screen. The test is completed when the vehicle is stopped. | Leap Motion
Pinch | Place marbles in box | The user must place marbles inside a box. This test does not have an associated gesture; instead, the purpose is to observe which gesture the user chooses most frequently. No specific instructions were given to users on how to perform this task. | Leap Motion
Grab and drop | Move coloured blocks | The user must place blocks on circles of the same colour. | Oculus Rift with Leap Motion
Pan | Push ball towards goal | The user pushes three different coloured balls towards a red line. | Oculus Rift with Leap Motion
Take | Pick apple from tree | The user has to pick an apple from a tree. | Oculus Rift with Leap Motion
Point | Search for cat | The user has to point to the cat that appears in different locations in the game. | Oculus Rift with Leap Motion
Tap | Press button to move character | The user has to press a button to make the character advance on the base. | Atheer
Swipe | Move ball over obstacle | The user has to select the ball using two fingers to move it over an obstacle. | Atheer
Table 3. Gestures: Descriptive statistic.
Gestures | Stop | Pinch | Grab and Drop | Take | Pan | Point | Tap | Swipe
DS group, mean success (St.Dev.) | 0.754 (0.265) | 0.041 (0.085) | 0.073 (0.058) | 0.022 (0.024) | 0.391 (0.367) | 1.957 (1.637) | 0.03 (0.03) | 0.037 (0.109)
DS group, mean time (St.Dev.) | 6.261 (5.328) | 116.348 (12.194) | 89.478 (32.946) | 3.609 (2.888) | 41.043 (30.022) | 30.174 (33.107) | 57.087 (30.716) | 106.727 (30.363)
Neurotypical group, mean success (St.Dev.) | 0.693 (0.258) | 0.240 (0.133) | 0.141 (0.071) | 0.062 (0.026) | 0.820 (0.372) | 1.760 (0.723) | 0.072 (0.056) | 0.080 (0.085)
Neurotypical group, mean time (St.Dev.) | 3.080 (2.943) | 34.880 (15.865) | 48.640 (32.140) | 3.360 (1.497) | 10.760 (6.180) | 9.120 (6.948) | 19.920 (29.508) | 74.760 (57.523)
Table 4. Comparison of tasks by success rate.
Action | Stop the Vehicle | Place Marbles in a Box | Move Coloured Blocks | Pick the Apple from the Tree | Push Ball Towards Goal | Search for the Cat | Press a Button to Move a Character | Move Ball over Obstacle
Gesture | Stop | Pinch | Grab and drop | Take | Pan | Point | Tap | Swipe
Down syndrome vs. Neurotypical | P = 0.485 | P = 0.000 | P = 0.001 | P = 0.322 | P = 0.000 | P = 0.600 | P = 0.002 | P = 0.143
Table 5. Comparison of tasks by completion time.
Action | Stop the Vehicle | Place Marbles in a Box | Move Coloured Blocks | Pick the Apple from the Tree | Push Ball Towards Goal | Search for the Cat | Press a Button to Move a Character | Move Ball over Obstacle
Gesture | Stop | Pinch | Grab and drop | Take | Pan | Point | Tap | Swipe
Down syndrome vs. Neurotypical | P = 0.016 | P = 0.000 | P = 0.702 | P = 0.714 | P = 0.000 | P = 0.007 | P = 0.000 | P = 0.055
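The per-gesture P-values reported in these comparisons come from ANOVAs on two groups. With only two groups, a one-way ANOVA reduces to a pooled-variance t-test (F = t²). The sketch below shows how such an F statistic is computed using only the Python standard library; the completion-time data and variable names are illustrative assumptions, not the study's dataset.

```python
from statistics import mean

def one_way_anova_f(a, b):
    """F statistic of a one-way ANOVA for two independent groups."""
    grand = mean(a + b)
    ma, mb = mean(a), mean(b)
    ss_between = len(a) * (ma - grand) ** 2 + len(b) * (mb - grand) ** 2
    ss_within = sum((x - ma) ** 2 for x in a) + sum((x - mb) ** 2 for x in b)
    df_between, df_within = 1, len(a) + len(b) - 2
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical completion times (seconds) for one gesture; illustrative only.
times_ds = [5.2, 7.1, 6.8, 9.4, 4.9, 8.0]    # Down syndrome group
times_nt = [2.9, 3.5, 2.4, 4.1, 3.0, 2.7]    # neurotypical group

f_stat = one_way_anova_f(times_ds, times_nt)
# The statistic is compared against the critical value of F(1, 10)
# (about 4.96 at alpha = 0.05) to decide whether the mean completion
# times of the two groups differ significantly.
print(f"F(1, {len(times_ds) + len(times_nt) - 2}) = {f_stat:.3f}")
```

In practice a statistics package would also report the exact P-value from the F distribution, but the decision rule against the critical value is equivalent at a fixed significance level.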
Table 6. Comparison of success rate by gender.
Action | Stop the Vehicle | Place Marbles in a Box | Move Coloured Blocks | Pick the Apple from the Tree | Push Ball Towards Goal | Search for the Cat | Press a Button to Move a Character | Move Ball over Obstacle
Gesture | Stop | Pinch | Grab and drop | Take | Pan | Point | Tap | Swipe
Gender and Down syndrome | P = 1.000 | P = 0.879 | P = 0.470 | P = 0.494 | P = 0.943 | P = 0.902 | P = 0.205 | P = 0.452
Gender and Neurotypical | P = 0.941 | P = 0.642 | P = 0.498 | P = 0.414 | P = 0.161 | P = 0.243 | P = 0.087 | P = 0.098
Table 7. Comparison of completion time by gender.
Action | Stop the Vehicle | Place Marbles in a Box | Move Coloured Blocks | Pick the Apple from the Tree | Push Ball Towards Goal | Search for the Cat | Press a Button to Move a Character | Move Ball over Obstacle
Gesture | Stop | Pinch | Grab and drop | Take | Pan | Point | Tap | Swipe
Gender and Down syndrome | P = 0.697 | P = 0.137 | P = 0.476 | P = 0.216 | P = 0.701 | P = 0.989 | P = 0.501 | P = 0.728
Gender and Neurotypical | P = 0.739 | P = 0.478 | P = 0.398 | P = 0.601 | P = 0.42 | P = 0.95 | P = 0.739 | P = 0.468
Table 8. Comparison of age, gender by success rate/number of attempts (between groups).
Age | Gender
P = 0.548 | P = 0.645
Table 9. Groupings of gestures by completion times.
Task | Gesture | Mean | Groups
Place marbles in box | Pinch | 116.348 | A
Move ball over obstacle | Swipe | 102.087 | A
Press button to move character | Tap | 57.087 | B
Move coloured blocks | Grab and drop | 52.957 | BC
Push ball towards goal | Pan | 41.043 | BC
Search for cat | Point | 30.174 | CD
Stop car | Stop | 6.261 | DE
Pick apple | Take | 3.609 | E
Table 10. Geometric distribution of probability of performing gesture on first attempt for individuals with DS.
Gesture | Average N° Attempts | Success Rate of Attempts | Probability of Performing Gesture
Stop | 1.57 | 1 | 63.69%
Point | 2.04 | 1 | 49.02%
Pan | 14 | 2.78 | 19.86%
Grab and drop | 25.35 | 1.39 | 5.48%
Take | 33.09 | 1 | 3.02%
Pinch | 39.57 | 1.04 | 2.63%
Tap | 61.17 | 1 | 1.63%
Swipe | 41.65 | 0.3 | 0.72%
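The probability column of Table 10 can be reproduced numerically. The sketch below assumes, as the reported figures suggest, that the probability of performing a gesture on the first attempt is estimated as the success rate of attempts divided by the average number of attempts, consistent with a geometric model in which the expected number of attempts grows as the per-attempt probability falls. The function name is ours, not the paper's.

```python
# Reproduce the "Probability of Performing Gesture" column of Table 10.
# Assumption: probability = (success rate of attempts) / (average n° attempts).
table10 = {
    # gesture: (average n° attempts, success rate of attempts)
    "Stop":          (1.57, 1.00),
    "Point":         (2.04, 1.00),
    "Pan":           (14.0, 2.78),
    "Grab and drop": (25.35, 1.39),
    "Take":          (33.09, 1.00),
    "Pinch":         (39.57, 1.04),
    "Tap":           (61.17, 1.00),
    "Swipe":         (41.65, 0.30),
}

def first_attempt_probability(avg_attempts, rate):
    """Estimated probability (%) of performing the gesture on the first attempt."""
    return 100.0 * rate / avg_attempts

for gesture, (attempts, rate) in table10.items():
    print(f"{gesture}: {first_attempt_probability(attempts, rate):.2f}%")
```

Under this reading, the ordering of the gestures by probability matches the grouping reported in the paper: Stop and Point are by far the easiest, while Tap and Swipe are the hardest.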
