1. Introduction
Educational robotics is a subdiscipline that allows active learning by favoring the integration of different areas of knowledge. This new way of learning puts students’ skills into practice. Its application in the classroom helps digital literacy, which is particularly relevant today [1], and allows abstract knowledge to be understood more easily, thus achieving a higher functional level of learning and understanding [2]. Educational robotics is a very effective technological educational tool frequently integrated into many learning environments that require interaction, where the robot often acts as an intermediary between the educational objective and the child [3].
The use of robotics as a collaborative and learning element is present in our environment [4,5]. The stages of working with robots can be structured into the phases of designing and constructing the environment, programming the robot, and testing in a test environment. Today, a multitude of different robots are used that are programmed with Scratch 3.0 software or similar environments [6]. For all the above reasons, robotics has proven to be an effective tool in the educational field. Furthermore, in recent times, the confluence of robotics with other cutting-edge transversal technologies, such as artificial intelligence [7] or machine learning [8], has demonstrated the usefulness of these synergies in obtaining innovative solutions.
This paper presents several intermediate works that have led to the definition of optimal design criteria applicable to robots intended to interact with children with ASD through educational tasks, as will be detailed later. The structure of this paper is as follows: Section 1.1 presents a bibliographical study of references, which allows Section 1.2 to summarize some initial design guidelines. Section 2 presents a partially failed first project involving the creation of an initial facilitating robot for children with ASD. Section 3 presents a second project that seeks to establish a more complete list of design criteria for robots for children with ASD, and Section 4 presents a design example based on them. Section 5 presents the TEA-2 robot, which was designed and built according to these guidelines. The paper ends with the Discussion and Conclusions Sections.
1.1. Trends and Considerations in Robotic-Assisted Autism Spectrum Disorder Therapy
Autism spectrum disorder (ASD) in children is characterized by persistent deficits in social communication and social interaction across a variety of contexts, coupled with restrictive and repetitive patterns of behavior, interests, or activities. Symptoms must be present in the early developmental period and cause clinically significant impairment in social, occupational, or other important areas of current functioning [9]. Autism has a broad spectrum, so patients have different levels of communication skills [10].
The hypothesis of this paper is that conventional therapy for children with ASD does not promote certain social behaviors that become possible when robots designed according to a series of criteria (morphology, aesthetics, functionality, etc.) are used. Therefore, it is necessary to define the characteristics that must be included in the design of a robot specifically dedicated to working with children with ASD so that these design guidelines are reproducible.
1.1.1. Robotic-Assisted Autism Research and Treatment
With regard to the use of robots in the treatment of children with ASD, the following characteristics have been identified as common:
Early diagnosis has been shown to prevent functional difficulties in adult life [11]. Among the elements to be considered, we mainly indicate two:
It is necessary to teach children to respect waiting times when communicating with another person. Robots have been introduced to support this ability [13].
Children with ASD often have problems seeing another person as a social element; imitation allows for a greater awareness of their surroundings. Children with ASD communicate better with robots, and if the therapist participates in the movements performed by the robot, a better therapist–patient relationship is observed [14].
Another problem is the difficulty of understanding even simple expressions in their environment, since too much complex information is presented at once [15]. Robots that represent facial expressions convey information more easily, improving children’s imitation and interpretation of their environment. When moments of their lives are acted out with gestures, children can extrapolate this to other situations [16].
The ability to focus on a certain object or person is critical to learning and is used as a measure of receptivity to the robot [17].
Finally, so-called triadic interactions are sought, i.e., interactions between the patient, another person, and the robot. The robot should be a tool to improve communication and not represent an end in itself; the goal is to improve the relationship with the other person [18]. Improvements in self-initiative and joint attention can result from these types of interactions [19].
1.1.2. Use of Humanoid/Non-Humanoid Robots
Within the application of diverse types of robots to ASD cases, a first distinction is the use of humanoid or non-humanoid robots.
As for humanoid robots [19], android-like robots are more generalist but less attractive; more mechanical humanoid robots can also generalize well but sometimes lose the child’s attention. Robots similar to animals/pets [18] are attractive and support good interaction, but at the cost of generalization. Non-humanoid robots can fulfill specific tasks, but they do not imitate humans.
1.2. First Design Guidelines Based on Examples
Once it was demonstrated in the above cases that robots help in the treatment of children with ASD, we could begin to derive some basic design guidelines:
Skills acquired in therapy are often not demonstrated outside the clinic; the situations encountered in therapy must therefore closely resemble the child’s everyday environment.
The greatest potential lies in triadic interactions.
The ideal robot is small and portable, helping children in their relationships with others.
To start treatment, it is better to use simple and abstract robots first; later, complex and realistic robots can be used.
2. Design for the Treatment and Diagnosis of Autism
2.1. A First Attempt: A Robot Facilitator of the Communication of People with Autism Spectrum Disorder
Based on these very first ideas defined in Section 1.2, a design was carried out between 2020 and 2021. The environment of the global pandemic made it difficult to communicate with the ASD specialists with whom we collaborated, belonging to the Esfera Foundation, an NGO that works in Madrid with children with ASD.
These difficulties in communication led to a design with partial success but with some flaws that had to be reconsidered in the future. This first design, the Facilitating Robot TEA-1, was a facilitator for the communication of children with ASD; that is, the robot mediated the communication between the child and another person (Figure 1a). The child, through an app developed for a mobile phone/tablet, expressed text, emoticons about how they felt, etc., which were sent to the robot, and the robot completed the communication with the other interlocutor by means of an array of LEDs (Figure 1b) or text displayed on an LCD screen.
In this project, as indicated, the goal was to design and build a robot for people with autism spectrum disorder (ASD) to facilitate nonverbal communication, fulfilling the following purposes:
Connect a mobile application to the robot via a Bluetooth module to control it;
Display text or audio sent from the mobile application on the robot’s LCD display;
Display emoticons that symbolize feelings on a matrix of LEDs on the robot;
Control the robot’s movements with the mobile application;
Make the mobile application intuitive and easy to use for the people who will use it (Figure 2a).
To create this prototype, a motion robot design and an Arduino board were used to connect several components: an LCD display, an LED array, DC motors, a Bluetooth module, and an ultrasonic sensor to control the robot (Figure 2b). A mobile application was also developed that connected to the robot through the Bluetooth module and was responsible for controlling and communicating with it.
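To make the app-to-robot link more concrete, the following minimal sketch (Python, using pyserial) illustrates the kind of command protocol such a design could use over the Bluetooth serial link. The port name, baud rate, and command bytes are illustrative assumptions, not the actual TEA-1 protocol.

```python
import serial  # pyserial: pip install pyserial

# Hypothetical single-character command protocol for a TEA-1-like robot.
# The port name and baud rate depend on how the Bluetooth module is paired.
PORT = "/dev/rfcomm0"    # assumed Bluetooth serial device (Linux)
BAUD = 9600              # typical default for HC-05/HC-06 modules

CMD_FORWARD = b"F"       # drive forward
CMD_STOP = b"S"          # stop motors
CMD_EMOJI_HAPPY = b"E1"  # show a "happy" emoticon on the LED matrix
TEXT_PREFIX = b"T"       # display the following text on the LCD

def send_text(link: serial.Serial, message: str) -> None:
    """Send a text message to be shown on the robot's LCD screen."""
    link.write(TEXT_PREFIX + message.encode("ascii", errors="replace") + b"\n")

def main() -> None:
    with serial.Serial(PORT, BAUD, timeout=1) as link:
        link.write(CMD_EMOJI_HAPPY)      # child selected the "happy" emoji
        send_text(link, "I feel happy")  # text completes the communication
        link.write(CMD_FORWARD)          # short movement as acknowledgement
        link.write(CMD_STOP)

if __name__ == "__main__":
    main()
```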
Analysis of TEA-1 Results
The main shortcoming of the project was its timing, coinciding with the COVID-19 pandemic. The collaboration with the therapists of the Esfera Foundation could not be close, and this led to decisions that were ultimately wrong. It is worth mentioning, for example, that the track system used for movement was very noisy and not very suitable, and that the appearance of the prototype, with all the electronics in sight, was quite annoying for children with ASD, details that we only discovered when the robot was finally tested at the Foundation. On the other hand, the therapists highlighted the enormous potential of the robot as a facilitator, i.e., a robot that mediates communication between a child with ASD and an interlocutor. A project was defined to redesign this system and to implement a composite system of several robots that could interact wirelessly, without the child knowing whether the interlocutor was another child with ASD, a therapist, etc. Unfortunately, due to the difficulties presented by the TEA-1 system, the Esfera Foundation lost interest in designing a new robot, and we focused on collaboration using existing robots, as can be seen in the references [20,21].
However, with a view to carrying out a future project in this line, and without the support of the Foundation, it was seen as necessary to first carry out a systematic study of the design of a robot for children with ASD in order to establish the optimal design criteria, which are one of the main contributions of this work.
3. Material and Methods: Guidelines for Designing Robots for Children with ASD
The first TEA-1 design suggested that any future design should be preceded by a further analysis of the existing literature to study success stories in the application of robots with children with ASD. These cases should be synthesized into guidelines for designing robots for children with ASD.
3.1. Robot Design Recommendations for Interaction with Children with ASD
From the study of the bibliography on the subject, together with our own experience in previous work [4,6,20,21], we defined a series of recommendations for the correct design of a robot for children with ASD. Conventional therapy has been shown to have better outcomes when supplemented with robotic therapy [22,23], but the characteristics of the robot in each case have not been systematically analyzed.
3.1.1. The Right Size of the Robot Must Be Small
Since it is a tool to improve the quality of play, the robot must be able to be taken anywhere and used whenever it is needed. This can help the patient communicate with their environment and vice versa. We therefore discarded large robots or those that are difficult to carry [24]. Optimal designs can be robots such as the Bee-Bot [20], whose simple design focuses on portability and ease of use.
3.1.2. Robots Focused on Diagnostic Therapies Should Focus on Early Ages
Early diagnosis favors the early treatment of this disorder, and robots provide more precise detection than conventional therapy [11]. Thanks to the addition of a number of sensors, it is possible to act in stages before speech [25]. First, eye tracking tells us where the patient’s gaze is directed. Children with ASD do not engage with sources of nonverbal communication but focus on the source of the sound; therefore, they look at the mouth instead of the eyes. This information is key to knowing whether they are following a normal pattern or have some type of communication alteration. Second, sensors that study movement by monitoring specific parts of the body are also relevant, so tactile and force sensors can be particularly useful [26]. An example of this case is the robot Nao [27]. By designing questionnaires adapted to this robot, such as the Q-CHAT-NAO [28], it has been possible to successfully detect autism in its early stages.
3.1.3. Encouraging Proactivity
The integration of buttons or the capacity to specify actions for the robot fosters children’s initiative, enabling them to express needs or desires [13]. A notable example is the Kibo robot, a programmable device that, due to its accessibility, has been used with children with ASD [29]. Using wooden blocks, students can program a sequence for the robot to execute. Each block is assigned a color and letter to denote its function, and Kibo uses its barcode scanner to interpret the sequence. This approach fosters proactivity in children by enabling them to plan their actions and understand their consequences.
3.1.4. Adapting the Topology of the Robot to the Phase of Therapy
A critical aspect that must be given due consideration is the adaptation of children with ASD to the use of these devices. If the child is still undergoing an adaptation process, the best option is to use simple robotic devices. These robots are abstract in nature, which distinguishes them from human forms; a notable example is the Bee-Bot, discussed in Section 3.1.1. It is important not to overwhelm the child with an excess of information. In subsequent stages, more sophisticated devices can be introduced, whose primary benefit is their ability to more closely resemble reality [15]. A notable example of a robot traversing these phases is Milo [30].
3.1.5. The Robot Should Be the Means but Not the End
The efficacy of robot therapies is maximized when the objective is to enhance communication between the patient and an individual in their environment, such as the therapist. In this approach, the robot is used as an instrument to enhance specific communication and social skills. An example is Nao (Section 3.1.2): research has demonstrated that its use by therapists, through structured gaming activities, can facilitate the enhancement of specific social skills [18].
3.1.6. Imitation as an Improvement in Their Communication Skills
Children diagnosed with ASD are not yet equipped with the cognitive ability to understand that their actions have consequences in the external world; however, their capacity for imitation serves to illuminate this awareness. Moreover, imitating robots is a more straightforward task for them than imitating other humans and serves as an effective method to enhance the emotional expressiveness of the child. This approach is a foundational step in fostering their social development [31].
3.1.7. The Environment in Therapy Must Be Faithful to Its Reality
Since what matters is that advances carry over outside the therapy environment, it is relevant to simulate real-life situations in which the patient uses the robot. The more we work on scenarios from their day-to-day life, the greater the results we will obtain [31]. Social robots such as Nao (Section 3.1.2) and Milo (Section 3.1.4) are designed to facilitate children’s environmental relations by simulating communicative interactions.
3.1.8. Development in “Joint Attention”
Joint attention is the shared focusing of attention on an object, person, or action. Promoting this behavior favors the development of other types of social skills. To do this, we must work on changing the focus of attention, tracking objects, and maintaining attention. Robots make it easier to detect and reinforce the performance of this skill [32].
As an example, the Cozmo robot keeps the child’s attention thanks to its multitude of games, such as memory games (remembering a sequence of cubes and repeating it) or speed games (touching the cubes before the device does) [33].
3.1.9. Promoting Turn-Taking
Maintaining a fluent conversation is often challenging for children diagnosed with ASD. Using robots, a child can learn to wait for a response from the device after an action is performed [13], in a way that encourages turn-based participation: the child must wait for the robot to perform a sequence before they can indicate the next action.
3.1.10. Work on Emotion Recognition
The wealth of information contained within facial expressions poses a significant challenge for children diagnosed with ASD, who often struggle to decipher the intended message conveyed by these expressions [9]. Robots can help show these emotions with less information, avoiding overload. This allows children to know what their environment intends to convey to them and how they can communicate their emotions to others [34], so that their own emotional expressiveness is enhanced. The main benefit of this use is emotional intelligence, as the robot displays emotions and models social skills [30].
In summary, it can be concluded that there are a large number of robots focused on improving the diagnosis and quality of life of children with ASD. These robots exhibit a number of desirable characteristics of an optimal design. However, a robot that encompasses all these guidelines has not yet been developed. It is necessary to promote the design of new devices that consider all the guidelines proposed here; the subsequent sections of this study propose a robot that integrates them.
4. Design: Use Case Applying Proposed Guidelines
The proposed design is aimed at treatment, not diagnosis, so the guideline in Section 3.1.2 is not addressed. The storytelling technique (Figure 3) is used to stage a use case for an initial proposed robot: the efficacy of the robot in stressful situations for children is demonstrated through a narrative. The robot’s assistance is illustrated in a scenario involving a health center with a high volume of activity. This complies with the guideline in Section 3.1.7 by using the robot in everyday places as a work tool. Circumstances like these can lead to a crisis for children with ASD.
The child (Pablo) is able to imitate the robot’s gestures to the rhythm of the music (guideline in Section 3.1.6), abstracting himself from reality. Once the activity is finished, the patient is prompted to report their emotional state, which is subsequently displayed on the robot’s screen. In this way, emotional intelligence and the expression of emotions are developed.
4.1. Conceptual Design
The result of this first design (Figure 4), in the shape of an animal, brings familiarity to the child (Section 3.1.4) and thus favors interaction with it. It has a small design (Section 3.1.1), favoring its portability. The following sections describe its components and the application of the proposed guidelines.
4.2. Design Components
4.2.1. Control Elements
The device is equipped with a set of buttons (Figure 5) that serve two primary functions: first, to express YES/NO and, second, to indicate a series of emotions in the form of emojis.
Following the conclusion of the dance, the robot asks the child about their experience. The child is able to express their feelings using buttons shaped like emotions that are integrated into the design (Section 3.1.8). This functionality complies with the guideline proposed in Section 3.1.10: they learn to recognize their emotions and communicate them to the device. This process fosters emotional intelligence and self-awareness (Section 3.1.5). Furthermore, they establish a connection between the emotions they feel and the expressions on the buttons (Figure 5a).
The patient is then asked whether they wish to continue with the session and can answer with the affirmation and denial buttons. If the answer is yes, the recommendation in Section 3.1.9 applies, as the child waits for the next song to play. If not, the robot simply says goodbye to the patient. Section 3.1.3 is also addressed, as this details the communication between the operator and the robot (Figure 5b).
4.2.2. Output Elements
The design (Figure 4) includes two displays: one that shows the robot’s face and can present different gestures, and another that presents the emotions that the child introduces through the buttons. The design is complemented by integrated speakers for music and voice.
4.3. Defining Robot Behavior
What Is the Child Feeling? How Do They React?
The design of this robot is based on the idea of following a series of dance steps to music. Its coherent and repetitive design makes it well suited for children with ASD. To initiate the activity, the child powers on the device using the on/off button located on the back of the unit. This activates the LED located on the robot’s nose (Figure 4), which lights up, providing feedback to the child that their action has been executed correctly.
The interaction flows through the app: the app is started as shown in Figure 5a. By clicking on the “Melobot” option, the child can enter the app or access the Settings. Once in the app, the music track to be played is displayed. To proceed, the child selects “OK” to play the selected track or navigates to the next one. During playback, the app provides visual and textual guidance illustrating the dance steps (Figure 5b).
When the child turns on the device, they are asked how they feel before starting (Figure 5a). This feature enables the cultivation of emotional intelligence: the child discovers the needs that they have and learns to communicate them to the environment. A series of buttons are associated with the emotions to promote the recognition of each one and increase proactivity.
The robot’s response conveys the selected emotion. This facilitates the child’s identification of the emotion through a simplified facial expression. The robot’s simplified representation, chosen for its ease of use with young children, offers a starting point for developing emotional intelligence. This choice aligns with the therapeutic phase of development.
The main activity the child focuses on is following a few dance steps to the rhythm of the music. Imitation helps them to understand themselves as a social being. Each step is reflected with great simplicity to avoid confusion and adapt it to treating children at an early age. The integration of robot-assisted step tracking ensures that children remain focused on the activity. Music is a great attraction that keeps children’s concentration on the activity, in addition to helping them remain calm in difficult situations. The option to change songs ensures that children can switch to a more engaging track if they lose interest in one.
At the end of the task, the user is asked again how they are feeling. This reinforces the recognition of emotions, and the tutor evaluates whether it helps them feel better. If the activity has not been favorable, the corresponding songs can be removed and the user’s preferences adjusted.
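The behavior described above can be summarized as a simple state machine. The sketch below (Python) is only an illustration of that sequence, with hypothetical state and function names; it does not reproduce the actual Melobot implementation.

```python
from enum import Enum, auto

class State(Enum):
    ASK_EMOTION_BEFORE = auto()
    PLAY_AND_DANCE = auto()
    ASK_EMOTION_AFTER = auto()
    ASK_CONTINUE = auto()
    GOODBYE = auto()

def melobot_session(get_emotion, play_song, wants_to_continue, log):
    """Illustrative interaction loop: emotions are asked before and after each
    dance activity, and turn-taking is enforced because the next input is only
    accepted once the robot has finished its part."""
    state = State.ASK_EMOTION_BEFORE
    while state is not State.GOODBYE:
        if state is State.ASK_EMOTION_BEFORE:
            log("emotion_before", get_emotion())   # emotion buttons
            state = State.PLAY_AND_DANCE
        elif state is State.PLAY_AND_DANCE:
            play_song()                            # music + dance steps to imitate
            state = State.ASK_EMOTION_AFTER
        elif state is State.ASK_EMOTION_AFTER:
            log("emotion_after", get_emotion())
            state = State.ASK_CONTINUE
        elif state is State.ASK_CONTINUE:
            # YES/NO buttons: continue with another song or say goodbye
            state = State.PLAY_AND_DANCE if wants_to_continue() else State.GOODBYE
    log("session", "finished")

if __name__ == "__main__":
    # Minimal console demo with stand-in callbacks.
    melobot_session(
        get_emotion=lambda: "happy",
        play_song=lambda: print("Playing song and showing dance steps..."),
        wants_to_continue=iter([True, False]).__next__,
        log=lambda key, value: print(f"{key}: {value}"),
    )
```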
4.4. Conclusions on the Set of Design Guidelines for a Robot for Children with ASD
Considering the different research conducted with these devices, it is necessary to establish guidelines that promote results outside therapy. There are still no robots available that bring together this series of recommendations; for this reason, we propose an initial design that does apply them. Its application was explained through a specific use case, and all the components necessary for its operation were defined.
Put simply, a robot optimally designed for children with ASD must meet the following general objectives:
As a general rule, the robot is a means, not the end. All design decisions must be conditioned by its use as a therapy tool.
As a general rule, it should have an intuitive, attractive, and user-friendly design.
In addition, these other specific criteria apply:
- 3. Portability and an adequate size so that it can be easily handled;
- 4. Adaptation to the child’s early ages as an aid in early diagnosis;
- 5. Easy adaptation to the therapy phase;
- 6. Encouraging proactivity and communication between the robot and the child;
- 7. Obtaining activity results’ data to analyze outside the consultation;
- 8. Tasks and functionalities that allow imitation as a tool for communication;
- 9. The robot and its use must try to replicate a real scenario for the child;
- 10. Joint attention, with interaction for joint action to be resolved;
- 11. Encouraging turn-taking activities;
- 12. The robot’s ability to recognize the child’s emotions.
5. Results: The TEA-2 Robot—Animatronic Head Design for Interaction with Children with ASD
5.1. Description of Proposed Solution
Following the research conducted in the aforementioned projects, the third project focused on the creation of a novel animatronic head that can help improve the symptoms produced by ASD through interaction with the robot. The primary focus of this project is treatment, rather than diagnosis.
A simple definition of the proposed system includes the robot itself and the associated software, which are briefly explained below.
5.1.1. Animatronic Head
Briefly, the functionality of the head includes simulating a humanoid character that proposes a series of games to children with ASD in conversational mode through a software application (e.g., recognizing colors and checking the response by means of a camera integrated into the forehead of the head); the games must be solved on a computer. The results of the games are recorded in a database associated with the application so they can be easily processed by the therapist.
The animatronic head can reproduce texts in multiple languages for each game and react to attempts to solve it. Furthermore, it is capable of representing emotional states through a combination of neck movements and facial expressions (eyes, eyebrows, mouth). The design of the animatronic face is intentionally uncomplicated, allowing for easy modification with accessories such as wigs, false ears, and other accouterments. This versatility enables the animatronic head to represent a humanoid figure, a pet, or any other character, according to what is seen as most suitable for the child with ASD to interact with.
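As an illustration of how the color-recognition game could check an answer with the forehead camera, the sketch below uses OpenCV to find the dominant game color in a frame. The HSV thresholds, coverage threshold, and camera index are assumptions for illustration, not the actual TEA-2 implementation.

```python
from typing import Optional

import cv2
import numpy as np

# Approximate HSV ranges for a few game colors (illustrative values only).
COLOR_RANGES = {
    "red": ((0, 120, 70), (10, 255, 255)),
    "green": ((40, 70, 70), (80, 255, 255)),
    "blue": ((100, 150, 70), (130, 255, 255)),
}

def dominant_game_color(frame: np.ndarray) -> Optional[str]:
    """Return the game color covering the largest area of the frame, if any."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    best_color, best_area = None, 0
    for name, (low, high) in COLOR_RANGES.items():
        mask = cv2.inRange(hsv, np.array(low, dtype=np.uint8), np.array(high, dtype=np.uint8))
        area = int(cv2.countNonZero(mask))
        if area > best_area:
            best_color, best_area = name, area
    # Only accept the answer if the color covers at least 10% of the image.
    min_area = 0.1 * frame.shape[0] * frame.shape[1]
    return best_color if best_area >= min_area else None

if __name__ == "__main__":
    camera = cv2.VideoCapture(0)  # forehead camera (device index assumed)
    ok, frame = camera.read()
    camera.release()
    if ok:
        print("Detected answer:", dominant_game_color(frame))
```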
The structure of the animatronic integrates four subsystems for better adaptation to future changes:
The Base subsystem holds, protects, and stores all the logical and power components of the system, containing two fans, the PCA9685 servo control module, the Raspberry Pi 4, and the power supply (Figure 6a); an illustrative servo-control sketch is shown after this list.
The Neck subsystem performs movements like those of the human neck, including extension, flexion, left/right rotation, and left/right lateral flexion. It is composed of 20 pieces and three servomotors, whose details can be seen in Figure 6b.
The Helmet subsystem provides support for the speaker, the camera, and the various servomotors that constitute the face. It is composed of 25 mounting parts, in addition to the camera, nine servomotors, and the communication speaker (Figure 7a).
The Face subsystem gives the animatronic a humanoid appearance while reflecting the expressive gestures of the system’s face. It is composed of a coating that integrates eight additional pieces (Figure 7b).
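Since the Base subsystem pairs a Raspberry Pi 4 with the PCA9685 servo driver, the neck and face servomotors can be commanded over I2C. The following minimal sketch uses the Adafruit ServoKit library to pose a hypothetical expression; the channel numbers and angles are assumptions for illustration, not the actual TEA-2 servo mapping.

```python
import time

from adafruit_servokit import ServoKit  # pip install adafruit-circuitpython-servokit

# The PCA9685 drives up to 16 servos over I2C from the Raspberry Pi 4.
kit = ServoKit(channels=16)

# Hypothetical channel mapping (the real TEA-2 wiring may differ).
NECK_ROTATION = 0   # left/right rotation
NECK_FLEXION = 1    # extension/flexion (nodding)
EYEBROW_LEFT = 4
EYEBROW_RIGHT = 5
MOUTH = 8

def neutral_pose():
    """Bring the head back to a neutral, centered posture."""
    for channel in (NECK_ROTATION, NECK_FLEXION, EYEBROW_LEFT, EYEBROW_RIGHT, MOUTH):
        kit.servo[channel].angle = 90

def express_surprise():
    """Example composite expression: raised eyebrows, open mouth, slight nod."""
    kit.servo[EYEBROW_LEFT].angle = 120
    kit.servo[EYEBROW_RIGHT].angle = 120
    kit.servo[MOUTH].angle = 60
    kit.servo[NECK_FLEXION].angle = 75
    time.sleep(1.0)
    neutral_pose()

if __name__ == "__main__":
    neutral_pose()
    express_surprise()
```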
5.1.2. Computer Application
The software developed for the animatronic consists of a desktop application capable of recording, deleting, modifying, and visualizing the data and results of the different tests performed by patients who interact with the animatronic.
In addition, the application includes two types of games based on the different colors that make up the chromatic range (Figure 8a), with which the patient can play and interact with the robot. The results of these two games can be stored in the database.
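As a minimal sketch of how the game results could be recorded in the database associated with the application, the following example uses SQLite; the table and column names are hypothetical, not the actual TEA-2 schema.

```python
import sqlite3
from datetime import datetime

def record_result(db_path: str, patient_id: int, game: str, correct: int, attempts: int) -> None:
    """Store one game session result so the therapist can review it later."""
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            """CREATE TABLE IF NOT EXISTS game_results (
                   id INTEGER PRIMARY KEY AUTOINCREMENT,
                   patient_id INTEGER NOT NULL,
                   game TEXT NOT NULL,
                   correct INTEGER NOT NULL,
                   attempts INTEGER NOT NULL,
                   played_at TEXT NOT NULL
               )"""
        )
        conn.execute(
            "INSERT INTO game_results (patient_id, game, correct, attempts, played_at) "
            "VALUES (?, ?, ?, ?, ?)",
            (patient_id, game, correct, attempts, datetime.now().isoformat(timespec="seconds")),
        )

if __name__ == "__main__":
    # Example: patient 7 answered 4 of 5 rounds of the color game correctly.
    record_result("tea2_results.db", patient_id=7, game="color_recognition", correct=4, attempts=5)
```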
The application also includes voice, movement, color, and vision control rooms (movement control room, Figure 8b), in which the specialist controlling the robot can check the correct functioning of its different parts. The movement control room also serves as an interface, facilitating communication between the specialist and the patient.
6. Discussion: Analysis of Compliance with Proposed Optimal Design Criteria
As described above, the designed robot is an animatronic head representing a humanoid head that interacts with the child. Given the specified conditions, meeting all the criteria may not be feasible, as it could result in a robot that does not adequately meet all the child’s needs, so the chosen design uses only a subset of the criteria.
The analysis of the criteria that are discarded is as follows.
- (3) The appropriate size of the robot must be small: It is impossible to meet this requirement, since if we want the animatronic to make facial expressions, the necessary hardware must be included inside it to perform the movements, and this occupies a large volume.
- (4) Robots focused on diagnostic therapies should focus on early ages: this is ruled out since the animatronic does not focus on diagnosis but on treatment through games.
The analysis of the criteria that are included is as follows:
- (1) The robot must be the means but not the end: As the robot is completely controlled by the specialist, they can decide at any time what to do in the session with the patient. The robot is an interface between the patient and the specialist.
- (2) Intuitive, attractive, and user-friendly design: This is the most subjective criterion of the set and will be analyzed later, when the development of the head is complete. However, for the child, the use is intuitive, since the interaction is conversational and uses a computer.
- (5) Adapting the topology of the robot to the phase of therapy: as the treatment is intended to be used once the patient is accustomed to simpler robots or to the use of technology, it must be used in more advanced phases.
- (6) Encouraging proactivity: the animatronic is designed so that the patient always interacts with the robot when carrying out the different activities it includes, thus encouraging the patient to start taking the initiative.
- (7) The acquisition of data on the results of the activity to analyze outside the consultation: On the one hand, the results of the games for each patient are recorded in the application. On the other hand, a camera positioned on the head can check the answers to the proposed games (Figure 9). A later development of the system will allow the inclusion of a series of sensors that automate the collection of various data, such as whether the child looks at the animatronic head, maintains their gaze, turns their head, etc. Therefore, this criterion is not ruled out in this design, but it is also not fully available at present.
- (8) Imitation as an improvement in their communication skills: One of the applications of animatronics is that they can make facial expressions like those of a human. Thus, the patient, by repeatedly seeing facial expressions, can imitate or understand the meaning of each one, improving their communication skills.
- (9) The environment in therapy must be faithful to reality: the animatronic, being able to express itself like a human through its voice or movements and having a robotic–humanoid appearance with a size similar to that of a person, can simulate as closely as possible a daily life situation such as a conversation with a person.
- (10) Development in joint attention: To interact with the robot, the patient must focus their attention on the animatronic when performing certain activities, such as conversations, requests, or games, which helps improve their social skills.
- (11) Encouraging turn-taking: By being able to talk to the animatronic, the patient learns to wait for the robot’s response, improving social skills such as holding a conversation.
- (12) Work on emotion recognition: As the animatronic has a face with fewer facial features than a human, it may be easier for the patient to assimilate and understand. Consequently, patients may develop a more profound understanding of and ability to interpret their own emotions.
7. Conclusions
This work presents several projects within a line of research focused on creating robots appropriate for interacting with children with ASD. As indicated, a first attempt, the TEA-1 robot, was partly unsuccessful given the circumstances of the pandemic at the time it was created. The insights gained from this development were incorporated as a complement to the Introduction.
The need for a much more in-depth study of the subject was highlighted, with the second project being a bibliographic study of different experiences of applying robotics to children with ASD. The analysis ended with an initial conceptual design of an optimal solution, Melobot, and from this design, a list of the design guidelines for a robot for children with ASD was determined.
The third work crystallized a real application of these guidelines, explaining how each of them was applied in this new design. TEA-2, an animatronic head for ASD treatment and the final result of this work, is presented in this study. Subsequent work will present the results obtained with this robot through a collaborative effort between the research group, another university, and a group of children diagnosed with ASD.
Therefore, the experience accumulated in previously cited work in educational robotics, as well as the study carried out in this paper, gives rise to a series of design guidelines that are directly applicable to any robot intended for children with ASD and also characterizes the suitability of existing robots.
Author Contributions
Conceptualization, L.H. and J.S.M.; methodology, J.S.M.; software, C.T., L.H. and C.B.; validation, C.T. and J.S.M.; formal analysis, J.S.M.; investigation, J.S.M., L.H. and C.T.; resources, C.T., L.H., C.B. and J.S.M.; data curation, C.T. and J.S.M.; writing—original draft preparation, J.S.M.; writing—review and editing, J.S.M.; visualization, J.S.M.; supervision, J.S.M.; project administration, J.S.M.; funding acquisition, J.S.M. All authors have read and agreed to the published version of the manuscript.
Funding
This research received no external funding.
Data Availability Statement
No new data were created.
Acknowledgments
The authors would like to thank the Esfera Foundation of Madrid for its work with children with ASD and Elena Peribáñez for her collaboration in carrying out this work.
Conflicts of Interest
The authors declare no conflict of interest.
References
- González Fernández, M.O.; Flores González, Y.A.; Muñoz López, C. Overview of educational robotics in favor of STEAM learning. Eureka J. Sci. Teach. Dissem. 2021, 18, 101–119. [Google Scholar]
- Nourbakhsh, I.; Crowley, K.; Bhave, A.; Hamner, E.; Hsiu, T.; Perez-Bergquist, A.; Wilkinson, K. The Robotic Autonomy Mobile Robotics Course: Robot Design, Curriculum Design and Educational Assessment. Auton. Robot. 2005, 18, 103–127. [Google Scholar] [CrossRef]
- Sáez López, J.M.; Vivas Fernández, L. Integration of educational robotics in Primary Education. Rev. Latinoam. Tecnol. Educ. 2019, 18, 107–129. [Google Scholar]
- Leoste, J.; Vikk, T.; San Martin, J.; Kangur, M.; Vunder, V.; Mollard, Y.; Oun, T.; Tammo, H.; Paekivi, K. Robots as My Future Colleagues: Changing Attitudes Toward Collaborative Robots by Means of Experience-Based Workshops; Smart Innovation, Systems and Technologies. In Ludic Co-Design and Tools Supporting Smart Learning Ecosystems and Smart Education; Springer: Singapore, 2022; Volume 249. [Google Scholar]
- Di Lieto, M.C.; Castro, E.; Pecini, C.; Inguaggiato, E.; Cecchi, F.; Dario, P.; Cioni, G.; Sgandurra, G. Improving executive functions at school in children with special needs by educational robotics. Front. Psychol. 2020, 10, 2813. [Google Scholar] [CrossRef] [PubMed]
- Leoste, J.; Pastor, L.; San Martín López, J.; Garre, C.; Baltasar, C. Swimming Figures with Robots (Choreography). In Proceedings of the Future Technologies Conference (FTC), Vancouver, BC, Canada, 5–6 November 2020. [Google Scholar]
- Palestra, G.; De Carolis, B.; Esposito, F. Artificial Intelligence for Robot-Assisted Treatment of Autism. WAIAH@AI*IA 2017. Available online: https://ceur-ws.org/Vol-1982/paper3.pdf (accessed on 7 January 2025).
- Di Nuovo, A.; Conti, D.; Trubia, G.; Buono, S.; Di Nuovo, S. Deep learning systems for estimating visual attention in robot-assisted therapy of children with autism and intellectual disability. Robotics 2018, 7, 25. [Google Scholar] [CrossRef]
- American Psychiatric Association. Diagnostic and Statistical Manual of Mental Disorders, 5th ed.; American Psychiatric Association: Washington, DC, USA, 2013. [Google Scholar]
- Conti-Ramsden, G.; Simkin, Z.; Botting, N. The prevalence of autistic spectrum disorders in adolescents with a history of specific language impairment (SLI). J. Child Psychol. Psychiatry 2006, 47, 621–628. [Google Scholar] [CrossRef] [PubMed]
- Scassellati, B. How Social Robots Will Help Us to Diagnose, Treat, and Understand Autism. In Robotics Research; Springer Tracts in Advanced Robotics; Thrun, S., Brooks, R., Durrant-Whyte, H., Eds.; Springer: Berlin/Heidelberg, Germany, 2007; Volume 28, pp. 552–563. [Google Scholar]
- Pennisi, P.; Tonacci, A.; Tartarisco, G.; Billeci, L.; Ruta, L.; Gangemi, S.; Pioggia, G. Autism and Social Robotics: A Systematic Review. Autism Res. 2016, 9, 165–183. [Google Scholar] [CrossRef] [PubMed]
- Dautenhahn, K.; Werry, I. Towards interactive robots in autism therapy. Pragmat. Cogn. 2004, 12, 1–35. [Google Scholar] [CrossRef]
- Yun, S.S.; Choi, J.; Park, S.K.; Bong, G.Y.; Yoo, H. Social Skills Training for Children with Autism Spectrum Disorder Using a Robotic Behavioral Intervention System. Autism Res. 2017, 10, 1306–1323. [Google Scholar] [CrossRef] [PubMed]
- Amorosa, H.; Kiefl, H.; Martinius, J. Recognition of Face Identity and Emotion in Expressive Specific Language Impairment. Folia Phoniatr. Logop. 2012, 64, 73–79. [Google Scholar]
- Blow, M.; Dautenhahn, K.; Appleby, A.; Nehaniv, C.L.; Lee, D. The art of designing robot faces. In Proceedings of the 1st ACM SIGCHI/SIGART Conference on Human-Robot Interaction—HRI ’06, Salt Lake City, UT, USA, 2–3 March 2006. [Google Scholar]
- Duquette, A.; Michaud, F.; Mercier, H. Exploring the use of a mobile robot as an imitation agent with children with low-functioning autism. Auton. Robot. 2007, 24, 147–157. [Google Scholar] [CrossRef]
- Stanton, C.M.; Kahn, P.H., Jr.; Severson, R.L.; Ruckert, J.H.; Gill, B.T. Robotic animals might aid in the social development of children with autism. In Proceedings of the 3rd ACM/IEEE International Conference on Human-Robot Interaction—HRI ’08, Amsterdam, The Netherlands, 12–15 March 2008. [Google Scholar]
- Robins, B.; Dautenhahn, K. The Role of the Experimenter in HRI Research—A Case Study Evaluation of Children with Autism Interacting with a Robotic Toy. In Proceedings of the ROMAN 2006—The 15th IEEE International Symposium on Robot and Human Interactive Communication, Hatfield, UK, 6–8 September 2006. [Google Scholar]
- Leoste, J.; Tammemäe, T.; Eskla, G.; San Martín López, J.; Pastor, L.; Blasco, E.P. Bee-Bot Educational Robot as a Means of Developing Social Skills Among Children with Autism-Spectrum Disorders. In Robotics in Education. RiE 2021; Advances in Intelligent Systems and Computing; Springer: Cham, Switzerland, 2022; Volume 1359. [Google Scholar]
- Peribañez, E.; Bayona, S.; San Martin, J.; Verde, A.; Garre, C.; Leoste, J.; Pastor, L. An Experimental Methodology for Introducing Educational Robotics and Storytelling in Therapeutical Activities for Children with Neurodevelopmental Disorders. Machines 2023, 11, 629. [Google Scholar] [CrossRef]
- Baraka, K.; Melo, F.; Veloso, M. Interactive Robots with Model-based ‘Autism-like’ Behaviors: Assessing Validity and Potential Benefits. Paladyn J. Behav. Robot. 2019, 10, 103–116. [Google Scholar] [CrossRef]
- Pivetti, M.; Di Battista, S.; Agatolio, F.; Simaku, B.; Moro, M.; Menegatti, E. Educational Robotics for Children with Neurodevelopmental Disorders: A Systematic Review. Heliyon 2020, 6, e05160. [Google Scholar] [CrossRef] [PubMed]
- Colton, M.; Ricks, D.; Goodrich, M.; Dariush, B.; Fujimura, K.; Fukiki, M. Toward therapist-in-the-loop assistive robotics for children with autism and specific language impairment. In Proceedings of the AISB Symposium: New Frontiers in Human-Robot Interaction, Edinburgh, UK, 8–9 April 2009. [Google Scholar]
- Kumazaki, H.; Warren, Z.; Swanson, A.; Yoshikawa, Y.; Matsumoto, Y.; Yoshimura, Y.; Shimaya, J.; Ishiguro, H.; Sarkar, N.; Wade, J.; et al. Brief Report: Evaluating the Utility of Varied Technological Agents to Elicit Social Attention from Children with Autism Spectrum Disorders. J. Autism Dev. Disord. 2018, 49, 1700–1708. [Google Scholar] [CrossRef] [PubMed]
- Campolo, D.; Taffoni, F.; Schiavone, G.; Laschi, C.; Keller, F.; Guglielmelli, E. A novel technological approach towards the early diagnosis of neurodevelopmental disorders. In Proceedings of the 2008 30th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Vancouver, BC, Canada, 20–25 August 2008; pp. 4875–4878. [Google Scholar]
- Seo, K. Using NAO: Introduction to Interactive Humanoid Robots; Aldebaran Robotics, 2013. Available online: http://www.kramirez.net/Robotica/Material/Laboratorios/Lab1/Table-of-Content-TEXTBOOK-KI-SUNG-SU-2013-hdef.pdf (accessed on 7 January 2025).
- Romero-García, R.; Martínez-Tomás, R.; Pozo, P.; de la Paz, F.; Sarriá, E. Q-CHAT-NAO: A robotic approach to autism screening in toddlers. J. Biomed. Inform. 2021, 118, 103797. [Google Scholar] [CrossRef] [PubMed]
- Albo-Canals, J.; Martelo, A.B.; Relkin, E.; Hannon, D.; Heerink, M.; Heinemann, M.; Leidl, K.; Bers, M.U. A Pilot Study of the KIBO Robot in Children with Severe ASD. Int. J. Soc. Robot. 2018, 10, 371–383. [Google Scholar] [CrossRef]
- Yousif, M. Humanoid Robot Enhancing Social and Communication Skills of Autistic Children: Review. Artif. Intell. Robot. Dev. J. 2021, 1, 80–92. [Google Scholar] [CrossRef]
- Ricks, D.J.; Colton, M.B. Trends and considerations in robot-assisted autism therapy. In Proceedings of the 2010 IEEE International Conference on Robotics and Automation, Anchorage, AK, USA, 3–7 May 2010; pp. 4354–4359. [Google Scholar]
- Ravindra, P.; De Silva, S.; Tadano, K.; Saito, A.; Lambacher, S.G.; Higashi, M. Therapeutic-assisted robot for children with autism. In Proceedings of the 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, St. Louis, MO, USA, 10–15 October 2009; pp. 3561–3567. [Google Scholar]
- Pelikan, H.R.; Broth, M.; Keevallik, L. Are you sad, Cozmo? How humans make sense of a home robot’s emotion displays. In Proceedings of the 2020 ACM/IEEE International Conference on Human-Robot Interaction, Cambridge, UK, 23–26 March 2020; pp. 461–470. [Google Scholar]
- Pioggia, G.; Igliozzi, R.; Sica, M.L.; Ferro, M.; Muratori, F.; Ahluwalia, A.; De Rossi, D. Exploring emotional and imitational android-based interactions in autistic spectrum disorders. J. Cyber Ther. Rehabil. 2008, 1, 49–61. [Google Scholar]