Article

An Open-Source Social Robot Based on Compliant Soft Robotics for Therapy with Children with ASD

by Diego Casas-Bocanegra 1, Daniel Gomez-Vargas 1, Maria J. Pinto-Bernal 1, Juan Maldonado 1, Marcela Munera 1, Adriana Villa-Moreno 2, Martin F. Stoelen 3, Tony Belpaeme 4,5 and Carlos A. Cifuentes 1,*
1 Department of Biomedical Engineering, Colombian School of Engineering Julio Garavito, Bogotá 111166, Colombia
2 Tejido de Sueños, Medellín 050021, Colombia
3 School of Computing, Electronics, and Mathematics, University of Plymouth, Plymouth PL4 8AA, UK
4 Faculty of Engineering and Architecture, Ghent University, 9000 Ghent, Belgium
5 Centre for Robotics and Neural Systems, University of Plymouth, Plymouth PL4 8AA, UK
* Author to whom correspondence should be addressed.
Actuators 2020, 9(3), 91; https://doi.org/10.3390/act9030091
Submission received: 11 August 2020 / Revised: 7 September 2020 / Accepted: 11 September 2020 / Published: 20 September 2020
(This article belongs to the Special Issue Actuators Technologies for the Next Generation of Robots and Industry)

Abstract

Therapy with robotic tools is a promising way to help improve verbal and nonverbal communication in children. Robotic tools can improve aspects such as eye contact, the ability to follow instructions, and the capacity to empathize with others. This work presents the design methodology, development, and experimental validation of a novel social robot based on CompliAnt SofT Robotics, called the CASTOR robot, which is intended as an open-source platform for the long-term therapy of children with autism spectrum disorder (CwASD). CASTOR integrates the concepts of soft actuators and compliant mechanisms to create a replicable robotic platform aimed at real therapy scenarios involving physical interaction between the children and the robot. The validation shows promising results in terms of robustness and the safety of both the user and the robot. Likewise, mechanical tests assess the robot's response to blocking conditions for two critical modules (i.e., the neck and arm) in interaction scenarios. Future work should focus on validating the robot's effectiveness in the therapy of CwASD.

1. Introduction

Socially assistive robotics (SAR) is an established research area in robotics where robots support therapeutic and healthcare interventions, with promising results in interventions for the elderly and children [1]. SAR has received considerable attention as a potential intervention tool for children with autism spectrum disorder (CwASD). Autism spectrum disorder (ASD) is a neurodevelopmental disorder that affects people, often from birth, and commonly manifests in the early years of life. Worldwide statistics estimate that one in 160 children has ASD and that most children are not diagnosed until after the age of four years [2,3]. CwASD have difficulties with attention and concentration; deficits in social communication, social interaction, and recognition of emotions; impairments in verbal and nonverbal social communication; restricted interests; and atypical behaviour [4,5].
In the context of ASD therapy, SAR has shown great advances and potential benefits in the development and application of therapies for CwASD [6,7,8,9]. Specifically, SAR has been used to assist the diagnostic process and to practice and improve social skills, such as eye contact and joint attention [10,11], emotion recognition [12,13], imitation [14], sharing simple activities, and increasing self-initiated interaction [15], to encourage basic verbal and nonverbal communication [16]. Although the evidence for the efficacy of SAR in ASD therapy is increasing [1,16,17], most robots used with CwASD are off-the-shelf robots (e.g., toy robots and social robots) that are not specifically designed for therapeutic ASD interventions [18,19]. Evidence suggests that such robots can negatively influence the therapy's performance and lead to an unpleasant experience or even a dangerous event [1,20]. Therefore, there is still a lack of consensus on how the interactions should be addressed and which robot morphology might be most effective. Consequently, several design techniques have started to be explored, where participatory design (PD) ensures the acceptability and functionality of the robot [21]. Moreover, PD methods have been adopted to develop interventions for populations with special needs.
PD methods integrate contributions from the different groups (e.g., the stakeholder community) who will be directly affected by the decisions made. This process aims to produce products or services that reflect the real needs, desires, and expectations of the users, designers, and stakeholders [22]. In this sense, all project members are valuable contributors who play a crucial role in the political, social, and ethical considerations of the development. Thus, the target populations and their social environment (i.e., families, society, partner groups, and friends) are no longer seen merely as a source of information and requirements, but rather as experienced partners [23].
Accordingly, the PD methodology has been used in the design of SAR for ASD [24]. Those SAR systems aim to induce tactile interactions to promote social relationships and mediate interactions between CwASD, their peers, and adults [25,26]. However, implementing a PD process with ASD populations can be very challenging and lengthy. Therefore, researchers and designers need to find ways and techniques, including the stakeholders' contributions, to overcome those limitations.
This work presents the development of the CASTOR (CompliAnt SofT Robotics) robot, whose design emerged from a study based on the PD methodology (for further information, refer to our previous research [27]). CASTOR's primary goal is to act as a friend for CwASD and an assistive tool during long-term ASD therapy. To this end, the robot has a robust and replicable mechanical structure, based on series elastic actuators (SEAs) and compliant mechanisms, to withstand physical interaction. Moreover, CASTOR will also be used as a multidisciplinary research platform to improve and explore the possibilities of robot-assisted therapy in a long-term ASD environment.
This manuscript is organized as follows: Section 2 presents a brief background regarding the different robots used in ASD therapy; Section 3 explains the development of the robot and the experimental evaluation of its performance; Section 4 presents the results of the evaluation; finally, Section 5 and Section 6 give the discussion and conclusions.

2. Related Work

Several SAR platforms reported in the literature support therapy or interventions for CwASD. Generally, those systems aim to be an assistive tool in ASD therapy or an agent for active semi-structured behavioural reinforcement. Specifically, to focus on nonverbal communication and encourage physical interaction [28], some studies use pet-like robots with different shapes and morphologies, such as Paro (AIST, Japan) [19,29], Huggable Bear (MIT, Cambridge, MA, USA) [30], and Probo (Vrije Universiteit Brussel, Belgium) [25,31]. Other studies use humanoid robots for this purpose, such as NAO (SoftBank Robotics, Tokyo, Japan) [32,33,34,35,36], ONO (Ghent University, Belgium) [37], KASPAR (University of Hertfordshire, UK) [26,38,39,40,41], Robonova (Hitec Robotics, South Korea) [42], and Zeno (Hanson Robotics, Hong Kong) [33]. Despite the diverse roles these robots play across studies, the main motivation for using the technology is improving social skills and developing perspective-taking skills. Table 1 summarizes the principal characteristics (i.e., actuation type, degrees of freedom (DOFs), materials, sensors, and main functionalities) of those platforms, including the CASTOR robot detailed in the next sections.
On the other hand, several studies have examined how SAR can promote and improve social skills in CwASD through an agent mediator [41,51,52]. Their results agree in identifying physical interaction as an essential aspect for accomplishing those purposes [53]. Additionally, considering that more than 60% of communication is nonverbal [54], other studies recommend face-to-face techniques [13,36] to ensure high performance in therapy. Likewise, given the relevance of facial expressions in human interaction [55], the development of robots should focus on modular and adjustable structures that allow proper visual fixation with the child.
Regarding robot appearance, the review by Scassellati et al. reported that even though robotic platforms vary in behaviour and appearance, they evoke pro-social behaviours, such as joint attention and behaviour imitation, in many children with ASD [56]. However, a more recent review by Cifuentes et al. reported negative impacts on performance and acceptability during therapy depending on the social robot used [1]. Other studies also reported such effects related to the robot's appearance and morphology [11,20]. Therefore, to promote both the SAR's acceptability and the development of social skills in CwASD, the robot's appearance and level of anthropomorphism are essential issues [56,57]. In this context, all the stakeholders (i.e., patients, healthcare professionals, software developers, and caregivers) should be involved in defining the robot's appearance to ensure proper integration of SAR into therapy scenarios [37,58].
Accordingly, this work presents the development and implementation of a compliant soft robot called CASTOR (CompliAnt SofT Robotics), whose design was based on inclusive and participatory design techniques. The CASTOR robot is intended for the next generation of ASD rehabilitation scenarios based on tangible and affordable SAR. In this sense, this work considers the social context of developing countries, providing a low-cost tool suitable for physical child-robot interaction.

3. Materials and Methods

3.1. The CASTOR Robot

The CASTOR project seeks to develop a robotic tool to improve the relationship between the therapist and the CwASD and to support ASD therapy. The following sections present the design methodology, mechanical structure, and electronic system of the robot.

3.1.1. Design Methodology

According to the different motivations and approaches presented in Section 1 and Section 2, this work covers the design, development, and validation of a novel social robot called CASTOR (see Figure 1), with the central premise of promoting physical interaction with CwASD during ASD therapy. CASTOR's design was carried out in two stages: (i) a participatory design to determine the robot requirements and (ii) an ergonomic analysis to define the appropriate robot dimensions.
The first stage consisted of a PD process that involved the community (i.e., CwASD, parents, and therapists) and researchers in determining the needs that promote acceptance. This way, guidelines regarding physical requirements, technical considerations, mechanical and manufacturing features, and the ideal robot intervention were obtained [27]. This phase also aimed to provide all stakeholders with the opportunity to actively participate in the decisions that affect and interest them in CASTOR's design.
The PD methodology consisted of four stages: (i) sensitization; (ii) focus group with stakeholders; (iii) intervention with CwASD; and (iv) validation (for further information, refer to our previous research [27]). Figure 2 illustrates the PD results obtained in each stage. The results provided the considerations and features for the development of the CASTOR robot.
The second stage consisted of selecting the structural dimensions and determining, through an ergonomic analysis, the proper height of the line of sight between the robot and the user. This feature is vital because of the influence of eye contact on child-robot interaction, which is essential in ASD therapy [59,60].
This analysis included two scenarios commonly used in ASD therapy [41,61]. In the first scenario, the social robot was on a table in front of the child (see Figure 3A), whereas, in the second scenario, the robot and the child were on the floor (see Figure 3B). Moreover, the analysis included the anthropometric measurements of 5- and 10-year-old children (U1 and U2, respectively), who fall within the robot's target range. In the figure, $D$ refers to the distance between the child and the robot, $H_{max}$ and $H_{min}$ denote the maximum and minimum heights of the robot at which the child can see all of it, and $H_{ideal}$ represents the height at which the lines of sight of the child and the robot are aligned.
Since the robot dimensions cannot be adjusted for all scenarios and users (see Figure 3C), the analysis prioritized the line of sight of the older child in the second case (i.e., the robot and the child on the floor). However, to align the line of sight of the smaller child (U1) with the robot in the first scenario, an adjustable chair would be desirable.
Table 2 summarizes the robot height ranges for both scenarios. In general terms, the optimal size of CASTOR is between 38 and 49 cm. This range integrates the minimum value of U1 in the first scenario (i.e., $H_{1,min}$) and the maximum value of U2 in the second scenario (i.e., $H_{2,max}$). This way, the robot can keep proper visual contact with children within the ranges included in this analysis, adjusting the chair height for smaller children, as previously mentioned. The ergonomic analysis is relevant for the mechanical design of the CASTOR robot presented in the next section.
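Stated compactly (our formalization of the selection rule above, not an equation from the paper), the chosen height range combines the binding limits of the two scenarios:

$$H_{robot} \in \left[\, H_{1,min},\; H_{2,max} \,\right] = \left[\, 38,\; 49 \,\right]\ \text{cm}.$$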

3.1.2. Development of the Robot

The CASTOR robot is built on the concept of soft actuation for social robots, which enables safe and comfortable human-robot interaction. In this sense, the capability to absorb shock impacts [62] is studied in this work. The robot is therefore configured as a platform able to interact physically and socially with CwASD during ASD therapy. In this context, robots use novel motion mechanisms, such as series elastic actuators (SEAs), to improve the interaction and safeguard the structure [62]. An SEA places an elastic element between the end-effector and the actuator. This element acts as a low-pass filter against shock loads, where the amount of elasticity increases or decreases the absorption tolerance [63]. This way, the robot becomes a robust platform with the potential to accomplish the premises of this project.
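As a minimal formalization of this behaviour (standard SEA notation, not an equation taken from the paper), the torque transmitted to the load is set by the deflection of the elastic element:

$$\tau_{out} = k_s\,(\theta_m - \theta_l),$$

where $k_s$ is the spring stiffness and $\theta_m$ and $\theta_l$ are the motor and load angles. The spring and the load inertia $J_l$ form a second-order system with natural frequency $\omega_n = \sqrt{k_s/J_l}$, so shock components above $\omega_n$ are attenuated, which is the low-pass filtering effect mentioned above; a softer spring lowers $\omega_n$ and increases the absorption tolerance.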
The robot has 14 degrees of freedom (DOFs), divided into 12 active and 2 passive ones (see Figure 4). For the active joints, the robot has 1 DOF for each elbow (i.e., flexo-extension), 2 DOFs per shoulder (i.e., flexo-extension and add-abduction), 1 DOF for head rotation, and 5 DOFs for facial expressions. Besides, to allow deformation of the huggable structure, the robot incorporates 2 passive DOFs. Structurally, the robot uses low-cost 3D-printed pieces of polylactic acid (PLA), with thermoplastic polyurethane (TPU) for the soft parts. The design of the pieces included static simulations and mechanical tests to determine the shape that best balances stiffness and lightness. CASTOR's mechanical structure files are available in a public repository at https://github.com/CastorProject/CASTOR_Robot/wiki.
Taking into account the guidelines (see Figure 2) and the robot dimensions (see Table 2), this section describes the five modules developed for the CASTOR robot: (A) head and neck module, (B) arm module, (C) huggable structure module, (D) perception module, and (E) the electronic system.
A. Head and Neck Module
The head plays a fundamental role in communication: it conveys 44% of the nonverbal dimension (i.e., the transmission of personal information through gesticulation) and, entirely, the verbal dimension (i.e., speech to transmit ideas and thoughts) [64]. Therefore, the CASTOR robot can emulate facial expressions, keep eye contact, and emit sounds and words.
Regarding gesticulation, the robot has 3 DOFs in the mouth, 1 DOF in each eyebrow, and two screens that represent the eyes (see Figure 5). Thus, the CASTOR robot can reproduce facial expressions (see Figure 5A) such as happiness, sadness, anger, and wonder, among others, which are ideal for different therapy scenarios. Furthermore, the device can open and close the mouth to accompany the speech function (see Figure 5B).
In the interaction context, the neck integrates an SEA system based on compliant mechanisms [65]. The elastic element for the energy dissipation involves four aluminium bars attached to two 3D-printed parts (see Figure 6B), which link the head and the motor (see Figure 6A). The system employs the flexibility of the aluminium and exploits the geometry to allow rotational and axial deformations. Likewise, the actuators do not exhibit overload or mechanical locking because the end-effector is decoupled.
B. Arm Module
Another important aspect of nonverbal communication is expressing ideas with the upper limbs (e.g., waving hello or goodbye, shaking hands) [64]. Therefore, including arms in robots increases social interaction [66]. Likewise, SEA-based systems for these limbs encourage natural and safe interaction during those tasks [66]. In the CASTOR robot, the arm implements the SEA principle through a different system than the neck module.
CASTOR's flexible arm stems from the GummiArm project [67], a robotic platform with variable stiffness actuators (i.e., the spring value is variable, in contrast to the SEA) in a bidirectional antagonist configuration. The GummiArm has two motors for each DOF, working in co-contraction [68], as human muscles do. For the variable stiffness, the device includes bioinspired elastic elements whose mechanical behaviour is similar to that of human tendons [69], composed of thermoplastic elastomer (TPE), high-modulus polyethylene (HMPE) fibres, and expanded polytetrafluoroethylene (ePTFE) filaments.
In CASTOR's arms, a single actuator drives each DOF in a bidirectional antagonist configuration (see Figure 7), removing the variable stiffness actuation. However, the stress level can be adjusted manually in an initial state (see Figure 7B) to change the robot's performance and the perception of the interaction. Each arm uses a pulley system that transmits the torque from the actuator to the joint through a bioinspired elastic element (see Figure 7A).
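A minimal way to formalize this transmission (our notation, not an equation from the paper): the joint torque follows from the difference between the tendon tensions on the two sides of the pulley,

$$\tau_{joint} = r_p\,(T_1 - T_2), \qquad T_i \approx k_t\,\Delta x_i,$$

where $r_p$ is the pulley radius, $T_1$ and $T_2$ are the agonist and antagonist tendon tensions, $k_t$ is the stiffness of the bioinspired elastic element, and $\Delta x_i$ is its stretch. Pre-tensioning both sides manually shifts the operating point, and thus the perceived compliance, without producing a net torque at rest.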
In general terms, CASTOR has two arms to create a humanoid appearance and to emulate physical gestures in real applications. Each arm has three DOFs actuated by servomotors with a continuous torque of 0.3 Nm, a turning speed of 59 rpm, and a weight of 53.5 g, which makes the arms easy to move, control, and assemble. The segment lengths are based on anthropometric measurements, to preserve the robot's symmetry, and on the minimum size required to couple the servomotors. Moreover, the robot integrates the actuators into the structure to increase the stiffness and robustness of the arm, whose total weight is 600 g.
C. Huggable Structure Module
The human body has different mechanisms to withstand physical interaction (e.g., the shoulder complex joints), which allow body deformation during shocks or hugs [70]. Thus, the CASTOR robot has a system inspired by those joints, with two passive DOFs. For this, the robot includes 60 N pneumatic pistons to absorb the interaction energy. Figure 8 shows the huggable system, where external forces ($F_{ext}$) deform the structure without damaging it. This module protects the structure from falls or collisions and enhances the interaction with the child.
D. Perception Module
Perception improves the interaction between the robot and the child [41], playing an important role in social development [71]. Accordingly, robotic platforms such as KASPAR, NAO, and Probo have integrated interactive modules composed of touch sensors, push buttons, and touch screens to identify haptic interactions. Several studies showed the relationship between the device's response to the child's stimulus and the advancement of the child's spontaneous interaction [41]. Likewise, this capacity fostered encouragement, motivation, and adherence to subsequent sessions [41].
Therefore, CASTOR incorporates a system based on touch sensors made of Velostat (Adafruit, New York, NY, USA) to detect the interaction (see Figure 9). The sensors were placed on the zones with the highest probability of physical contact (i.e., the antenna, head, hands, and shoes). Thus, this system identifies when the child makes direct contact with the device, which then responds with a programmed behaviour (e.g., movements or sounds). In this context, the robot uses a speaker to communicate verbally with the child, integrating text-to-speech software. Moreover, according to the circumstances and the therapy goal, the voice parameters (i.e., gender, type, or tone) can be modified.
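To make the sensing pipeline concrete, the sketch below shows one way the polling logic could look in Python. The zone-to-channel mapping, threshold value, and function names are assumptions for illustration, not the interfaces of the actual repository.

```python
import time

# Hypothetical mapping from touch zone to ADC channel on the sensor board.
TOUCH_ZONES = {"antenna": 0, "head": 1, "hands": 2, "shoes": 3}
THRESHOLD = 0.5  # normalized activation level, tuned per sensor

def read_sensor(channel):
    """Return a normalized [0, 1] Velostat reading.

    Placeholder so the sketch runs standalone; a real implementation
    would query the ADC that conditions the Velostat voltage divider.
    """
    return 0.0

def respond(zone):
    """Trigger the programmed behaviour associated with a touched zone."""
    print("Contact on %s: playing the associated sound/movement" % zone)

def poll(duration_s=10.0, rate_hz=20.0):
    """Poll every zone and respond whenever a reading crosses the threshold."""
    t_end = time.time() + duration_s
    while time.time() < t_end:
        for zone, channel in TOUCH_ZONES.items():
            if read_sensor(channel) > THRESHOLD:
                respond(zone)
        time.sleep(1.0 / rate_hz)

if __name__ == "__main__":
    poll()
```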
E. Electronic System
Considering CASTOR's modules, Figure 10 shows the electronic system and communication protocols implemented on the robot. For the hardware, the robot integrates a network of seven servomotors (AX-12, Dynamixel, Seoul, Korea) to move the arms and the neck. The actuators use a USB driver (U2D2, Dynamixel, Korea) to communicate with the processing unit. For the face movements, the robot has five low-cost servomotors (MG995, TowerPro, Taiwan) controlled by an OpsoroHAT board (OPSORO, Kortrijk, Belgium). This board also controls the speaker (Extra-bass, Sony, Japan) and the touch sensors (Velostat, Adafruit, USA) of the perception module. Moreover, an Eyes Bonnet board (Adafruit, USA) connected to the main computer controls the eyes' appearance and functionality. For processing, the robot incorporates two Raspberry Pi 3 boards (i.e., the first for the head and perception modules and the second for the arm module) running the Robot Operating System (ROS) on a Unix-based distribution. In terms of consumption, the robot requires a 12 V, 9.5 A power supply under normal conditions (i.e., without blocking states).
For the software, the device uses IVONA Text-to-Speech for the perception module's robot voice. Moreover, CASTOR has a web interface to configure and control the different modalities and applications from any smart device. In line with the modularity and replicability aimed at in this project, the software (i.e., controllers, sensor acquisition modules, and the device's functionalities) is distributed as ROS packages, available in a public repository at https://github.com/CastorProject/CASTOR_Robot/wiki.
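As an illustration of how such ROS packages could expose a joint, the minimal node below publishes a position set-point at a fixed rate. The topic name and message type are assumptions; the actual interfaces are defined in the repository's packages.

```python
#!/usr/bin/env python
import rospy
from std_msgs.msg import Float64

def main():
    rospy.init_node("castor_neck_demo")
    # Hypothetical topic; a position controller would subscribe to it.
    pub = rospy.Publisher("/castor/neck_position", Float64, queue_size=1)
    rate = rospy.Rate(10)  # 10 Hz command loop
    while not rospy.is_shutdown():
        pub.publish(Float64(0.0))  # hold the neck at its neutral angle
        rate.sleep()

if __name__ == "__main__":
    main()
```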

3.1.3. Robot Functionalities

As mentioned in the previous section, the actuators of the head (i.e., 2 DOFs for the eyebrows and 3 DOFs for the mouth) and the screens can generate facial expressions such as wonder, happiness, sadness, and anger (see Figure 11). The integrated screens provide three main capabilities: (1) controlling the eye movements along the Cartesian axes, (2) changing the iris and eyelid colour through a design editor, and (3) modifying the pupil size between contracted and dilated. For the face movements, CASTOR uses the OpsoroHAT board to establish the appropriate range of motion (ROM) of each motor to represent a specific emotion. Thus, the CASTOR robot has the gestures necessary for emotion recognition methods in therapy sessions [57].
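A natural way to encode this emotion-to-ROM mapping is a lookup table; the sketch below is illustrative only, with a hypothetical servo layout and angles rather than the calibrated values configured through the OpsoroHAT board.

```python
# Goal angles (degrees) for the 2 eyebrow and 3 mouth servomotors.
EXPRESSIONS = {
    #             l_brow  r_brow  mouth_l  mouth_c  mouth_r
    "happiness": (  20,     20,     45,      60,      45),
    "sadness":   ( -15,    -15,    -30,     -20,     -30),
    "anger":     ( -25,    -25,     10,       0,      10),
    "wonder":    (  30,     30,      0,      40,       0),
}

def set_expression(name, send_goal):
    """Send every facial servo its goal angle for the requested emotion.

    send_goal(servo_id, angle) stands in for the low-level position command.
    """
    for servo_id, angle in enumerate(EXPRESSIONS[name]):
        send_goal(servo_id, angle)
```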
Likewise, the arm and neck actuators allow movements such as waving hello/goodbye, giving a high-five, pointing to parts of the body, pointing to an object or place, and even dancing (see Figure 12). To this end, the CASTOR robot implements position controllers for each servomotor with parameters such as initial position, actuation range, and movement speed. Moreover, the inclusion of a speaker makes it possible to tell stories or play sounds, so the robot can combine movements and sounds, for instance saying "hello" while performing a greeting or playing a song while dancing (see the sketch below). Furthermore, the OPSORO board allows integrating 12 tactile sensors, establishing each sensor's activation threshold, and executing the facial movements or sounds associated with the interaction. Hence, these functionalities give the robot the capacity to respond to the child's stimulus, so CASTOR has the potential to be included in different therapy techniques such as imitation, proprioception, physical interaction, or following instructions.
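The sketch below illustrates how such a combined movement-plus-sound behaviour could be sequenced; the keyframe values and helper names are ours, not the repository's interface.

```python
import time

# Each keyframe: (goal angles in degrees per joint, hold time in seconds).
WAVE = [({"shoulder_abd": 90, "elbow": 45}, 0.4),
        ({"shoulder_abd": 90, "elbow": 105}, 0.4)] * 3

def greet(send_goals, say):
    """Wave while speaking: say() triggers the text-to-speech output and
    send_goals() forwards set-points to the servomotor position controllers."""
    say("hello")  # speech plays while the arm motion runs
    for goals, hold in WAVE:
        send_goals(goals)
        time.sleep(hold)
```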

3.2. Experimental Study

The experimental design proposed in this paper intends to show the applicability of the CASTOR robot in close interaction with children, without posing a risk to either the robot or the child. Therefore, based on the modules and functionalities presented above, this section presents the preliminary validation of CASTOR in two cases: (1) mechanical testing of critical joints to measure the device's response under blocking conditions and (2) a case study with three CwASD to assess the robot's performance in a real interaction scenario.

3.2.1. Mechanical Test

According to CASTOR's mechanical design, the mechanical test aims to show the device's capability to withstand physical interaction. Therefore, this paper assesses the most fragile parts of the CASTOR robot (i.e., the neck and arms). Specifically, the neck actuator is highly susceptible to mechanical blocking, related to the child's reactions and curiosity during therapy. Likewise, the arm actuators can suffer excessive forces because of both the inertia of the segments and external forces in an interaction scenario.
For the experimental procedure, a mechanical structure blocked the servomotors of the neck and the arm. The actuators received goal-position commands whose amplitude increased with each repetition, up to a maximum set-point equal to the ROM of each joint (i.e., 180 degrees for head rotation and 105 degrees for arm flexo-extension). During the trial, the actuator speed was configured at its maximum value (i.e., 55 rpm).
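For clarity, the command sequence can be sketched as follows; the helper names are placeholders, while the ROM and speed values come from the text.

```python
ROM = {"neck": 180.0, "arm": 105.0}  # joint range of motion, degrees
N_REPS = 10  # illustrative number of repetitions

def run_blocking_test(joint, set_speed_rpm, send_goal_position):
    """Drive a blocked joint with goal positions of increasing amplitude."""
    set_speed_rpm(55)  # maximum trial speed
    for rep in range(1, N_REPS + 1):
        # The set-point grows each repetition, up to the joint's full ROM.
        send_goal_position(joint, ROM[joint] * rep / N_REPS)
```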
The servomotors attempted to execute the positions in both stiff and flexible configurations. In the neck, adjusting the 3D-printed piece coupled to the bar mechanism modified the stiffness level; in the arm actuators, both configurations were achieved by stretching the elastic element. The device's load response was acquired using the rosbag package on an external computer (Pavilion Intel i5, HP, Palo Alto, CA, USA). Data extraction and processing were performed in MATLAB (R2018b, MathWorks, Natick, MA, USA).
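As a sketch of the logging side (the bag file and topic names are assumptions; the recorded data were processed in MATLAB), the load samples can be pulled from a bag with the rosbag Python API:

```python
import rosbag

# Collect (timestamp, load) pairs recorded during a blocking trial.
with rosbag.Bag("blocking_trial.bag") as bag:
    loads = [(t.to_sec(), msg.data)
             for _, msg, t in bag.read_messages(topics=["/castor/neck_load"])]
```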

3.2.2. Case Study

To evaluate CASTOR's performance in a child-robot interaction setting, a familiarization phase with diverse activities was proposed. The goal of this phase was to assess the mechanical design and actuator performance during physical interaction with CwASD. Three CwASD were enrolled in this study, forming the validation group (3 males, 6.66 ± 2.49 years old), none of whom had any visual, auditory, or cognitive impairment impeding the correct understanding of the activities. Additionally, the children did not present any comorbidities, such as fragile X syndrome or Down syndrome. The legal representative of each participant signed the informed consent before the study.
The experimental trials took place at the Howard Gardner Comprehensive Rehabilitation Clinic in Bogotá, Colombia. The familiarization phase was a standard procedure with a proprioception activity carried out in one session. At the beginning of the session, the children were socialized with the robot, and it was integrated into their environment. The therapist introduced the robot to the participant, and the child could freely explore it to feel safe and confident. At this stage, CASTOR asked the children their names and other questions of interest to establish a relationship with them. Afterwards, CASTOR invited the child to play with it; the game consisted of imitating and recognizing body parts (e.g., eyes, nose, mouth, arms). This activity aimed to promote physical interaction and, with that, verify the performance of CASTOR's compliant motion. To this end, three types of physical interactions were identified throughout the session: (1) blocking, (2) swing, and (3) hard interactions.
The first is a blocking interaction, which verifies the safety and naturalness of physical contact afforded by the hybrid compliant system. In this condition, the participant obstructs the robot's movement (e.g., the robot moves its arm, and the participant stops the motion with his/her hand).
The second is a swing interaction, which examines whether the robot can sustain interactive and playful physical contact with the aid of the hybrid compliant system. Here, the robot moves a joint, and the child pushes that joint in another direction, e.g., the child shakes the arms while the motors are active.
The third is a hard interaction, which verifies whether CASTOR's intrinsic compliance keeps both the children and the robot itself safe when a strong collision happens. In this condition, the participant impacts the robot or performs fast, heavy manipulation, e.g., the participant pushes hard on the robot's head in the direction that it is slowly turning.

4. Results

This section describes the results of both experimental validations proposed in this paper, i.e., the mechanical test of the critical joints under extreme conditions and the case study measuring the device's response in interaction scenarios. First, for the mechanical validation, Figure 13 shows the actuators' load response as a percentage of the maximum set-point value, normalized by the ROM of each joint. In the graph, red lines represent the flexible configuration of the two joints, and black lines denote the load response in the stiff condition.
For both joints, the stiff configuration reached 50% of the load capacity at 30% of the ROM, whereas the flexible configuration only reached this value at 70% of the ROM. Moreover, the actuators exhibited a saturation state in all tests. Nevertheless, the stiff trials also showed an overload event (i.e., close to the stall torque), which triggered the actuator's automatic switch-off. In contrast, in the flexible configuration, the actuator kept moving despite the saturation state.
In structural terms, CASTOR did not show any damage to the 3D-printed pieces, elastic elements, bar mechanism, or actuators during the trials. Likewise, CASTOR kept the initial configuration (i.e., stress level) of the assessed joints despite the blocking conditions imposed in this experiment.
Second, for the case study, Figure 14 illustrates the three physical interactions identified throughout the session: (1) blocking, (2) swing, and (3) hard, respectively. For the first interaction, Figure 14A shows subject 2 (S2) obstructing the movement of CASTOR's arm. For the swing, Figure 14B shows S2 producing a force against the motion of the robot's arm. Finally, for the hard interaction, Figure 14C shows S3 generating a substantial impact against CASTOR's neck and head.
Based on the events identified in the experiment (i.e., blocking, swing, and hard), Table 3 summarizes the time and number of events registered for each participant. It also presents the percentage of physical interaction between the child and the robot relative to the duration of the session and whether the child communicated verbally with CASTOR.
Each child involved in the case study showed at least one episode of physical interaction with the CASTOR robot, with S1 registering the highest percentage compared with S2 and S3. The hard event was the most recurrent episode in the experiment, reaching 23 incidents over a total time of 57 s, occurring mainly on CASTOR's arms. Regardless of the physical robot-child interactions, CASTOR did not hurt the participants and maintained its initial configuration and all its actuators' functionalities, as the mechanical test in the first part of this section showed.

5. Discussion

The primary goal of this paper is to present the design, development, and validation of a new social robot (the CASTOR robot), aimed at providing a tool for the treatment of CwASD. Thus, the experimental section assesses the device's response in an interaction context. The first part analysed two critical joints (i.e., the robot's neck and arms) under a blocking condition. The load response of the actuators changed with the flexibility of the system: the stiff actuation led to early saturation and an overload state, potentially damaging the actuator. These results correspond to the typical behaviour of SEAs, where the elastic element stores energy (e.g., torsion due to blocking) to decrease the load on the motor [63].
In this sense, stiff actuators, which are the type most commonly used in social robots (see Table 1), show limitations and risks in interaction scenarios. Therefore, robots based on SEAs could enable long-term therapies focused on physical interaction exercises to promote nonverbal communication and support the development of CwASD, without posing a risk to either the child or the robot.
Regarding the case study, the children displayed at least one physical interaction with CASTOR during the sessions, with blocking and hard being the predominant events. That is, the children obstructed the robot's movements and performed heavy manipulations of or impacts on CASTOR when interacting with it. Moreover, this interaction focused on the arms and the head, which supports the experimental design proposed for the mechanical test. On the other hand, only S1 and S2 showed verbal or nonverbal communication with CASTOR, whereas S3 only presented an aggressive response to the robot. The therapist informed us that S3 exhibited high anxiety levels before the session, and these levels did not decrease with the robot interaction.
These findings suggest that CASTOR is able to withstand physical contact such as blocking, swing, or hard interactions. Furthermore, despite the different events, CASTOR did not exhibit any damage to its actuators, mechanical structure, functionality, or electronic system. Moreover, thanks to its mechanical design, materials, and actuators, CASTOR proved to be a safe platform for the users, notwithstanding the different interactions they performed.
According to the observations in the experimental validations, intense interactions predominated during the study, and CASTOR was able to withstand them. This suggests that CASTOR has the potential to be used in ASD therapy, promoting safe physical child-robot interaction. Likewise, the device provided natural and safe interaction with the participants, thanks to the actuation principles and mechanisms used. To this end, it was essential to de-emphasize the precise movements of conventional social robots and instead focus on flexible actuation strategies based on human biomechanics.
Among the most common social robots used in ASD therapy reported in Table 1, some robots have SEA systems, such as Probo [46,47] and Huggable Bear [30,49]. However, these platforms do not have a wide range of functionalities to emulate nonverbal communication or physical interaction. More precisely, although Probo has 20 DOFs, these are focused only on reproducing facial expressions; Huggable Bear, despite having an SEA system, cannot reproduce facial expressions and has few DOFs. In this context, CASTOR proposes numerous functionalities based on the functional characterization of the most common SARs used in ASD therapy. Even though CASTOR does not have as many DOFs as other robots, this does not prevent it from generating the same functions. For instance, CASTOR has digital eyes, which reduce by six the number of DOFs needed to evoke facial expressions. This feature is also an advantage because the eyes can easily be changed (e.g., colour, pupil dilation) and, with that, the way of interacting with the users. More complex robotic platforms, such as KASPAR [38,41] and NAO [9,44,45], have a significant number of DOFs and are capable of emulating facial expressions and interacting with CwASD. However, their sophisticated designs make them expensive platforms that are complex to reproduce or replicate.
Considering the points mentioned above, CASTOR was designed and built with easily manufactured parts and readily available components such as screws and bearings. In this regard, CASTOR is an easily replicable platform that provides a wide range of functionalities.

6. Conclusions

This paper presents the prototype of the CASTOR robot for therapy with CwASD. Mechanical tests assessed the actuation principles applied in the robot. Likewise, a case study exposed the robot to an interaction scenario and provided encouraging results in terms of the safety of both the user and the robot. In general, the CASTOR robot showed potential for applications involving physical interaction, without posing a risk to the child or the structure. Moreover, the modularity of its mechanics and electronics makes the CASTOR robot an open-source robotic platform that is both easily replicable and modifiable; it is available in a public repository at https://github.com/CastorProject/CASTOR_Robot/wiki. Future work should focus mainly on validating the CASTOR robot in ASD therapy to assess its influence on the development of CwASD.

Author Contributions

Conceptualization, M.M., M.F.S., T.B., and C.A.C.; methodology, D.C.-B., M.J.P.-B., M.M., A.V.-M., and C.A.C.; software, D.G.-V. and J.M.; hardware, D.C.-B. and M.F.S.; validation, D.C.-B., D.G.-V., M.J.P.-B., and J.M.; resources, M.M., M.F.S., T.B., and C.A.C.; data curation, D.C.-B., D.G.-V., and M.J.P.-B.; writing, original draft preparation, D.C.-B., D.G.-V., M.J.P.-B., and J.M.; writing, review and editing, M.M., M.F.S., T.B., and C.A.C.; supervision, M.M., M.F.S., T.B., and C.A.C.; project administration, M.M. and C.A.C.; funding acquisition, M.M., M.F.S., T.B., and C.A.C. All authors read and agreed to the published version of the manuscript.

Funding

The Royal Academy of Engineering supported this work, CASTOR Project: CompliAnt SofT Robotics (Grant IAPP1-100126).

Acknowledgments

The authors would like to thank the members of the Center for Biomechatronics and Howard Gardner Clinic for supporting this research. Likewise, we are grateful to the children and their parents, without whom this work would not have been possible.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
ASD	Autism spectrum disorder
CASTOR	CompliAnt SofT Robotics
CwASD	Children with ASD
DOF	Degree of freedom
ePTFE	Expanded polytetrafluoroethylene
HMPE	High-modulus polyethylene
PD	Participatory design
PLA	Polylactic acid
ROM	Range of motion
ROS	Robot Operating System
SAR	Socially assistive robotics
SEA	Series elastic actuator
TPE	Thermoplastic elastomer
TPU	Thermoplastic polyurethane

References

  1. Cifuentes, C.A.; Pinto, M.J.; Céspedes, N.; Múnera, M. Social Robots in Therapy and Care. Curr. Robot. Rep. 2020, 1, 59–74. [Google Scholar] [CrossRef]
  2. Baio, J.; Wiggins, L.; Christensen, D.L.; Maenner, M.J.; Daniels, J.; Warren, Z.; Kurzius-Spencer, M.; Zahorodny, W.; Robinson, C.; Rosenberg, C.R.; et al. Prevalence of Autism Spectrum Disorder Among Children Aged 8 Years—Autism and Developmental Disabilities Monitoring Network, 11 Sites, United States, 2014. MMWR Surveill. Summ. 2018, 67, 1–23. [Google Scholar] [CrossRef] [PubMed]
  3. World Health Organization. Autism spectrum disorders. In Fact Sheets; WHO: Geneva, Switzerland, 2018. [Google Scholar]
  4. Eggebrecht, A.T.; Elison, J.T.; Feczko, E.; Todorov, A.; Wolff, J.J.; Kandala, S.; Adams, C.M.; Snyder, A.Z.; Lewis, J.D.; Estes, A.M.; et al. Joint attention and brain functional connectivity in infants and toddlers. Cereb. Cortex 2017, 27, 1709–1720. [Google Scholar] [CrossRef] [PubMed]
  5. American Psychiatric Association. DSM-5 Diagnostic Classification. In Diagnostic and Statistical Manual of Mental Disorders; American Psychiatric Association: Washington, DC, USA, 2013. [Google Scholar] [CrossRef]
  6. Belpaeme, T.; Baxter, P.E.; Read, R.; Wood, R.; Cuayáhuitl, H.; Kiefer, B.; Racioppa, S.; Kruijff-Korbayová, I.; Athanasopoulos, G.; Enescu, V.; et al. Multimodal Child-Robot Interaction: Building Social Bonds. J. Hum. Robot. Interact. 2012, 1, 33–53. [Google Scholar] [CrossRef] [Green Version]
  7. Cabibihan, J.J.; Javed, H.; Ang, M.; Aljunied, S.M. Why Robots? A Survey on the Roles and Benefits of Social Robots in the Therapy of Children with Autism. Int. J. Soc. Robot. 2013, 5, 593–618. [Google Scholar] [CrossRef]
  8. Pennisi, P.; Tonacci, A.; Tartarisco, G.; Billeci, L.; Ruta, L.; Gangemi, S.; Pioggia, G. Autism and social robotics: A systematic review. Autism Res. 2016, 9, 165–183. [Google Scholar] [CrossRef]
  9. Di Nuovo, A.; Conti, D.; Trubia, G.; Buono, S.; Di Nuovo, S. Deep learning systems for estimating visual attention in robot-assisted therapy of children with autism and intellectual disability. Robotics 2018, 7, 25. [Google Scholar] [CrossRef] [Green Version]
  10. Ramirez-Duque, A.A.; Frizera-Neto, A.; Bastos, T.F. Robot-Assisted Diagnosis for Children with Autism Spectrum Disorder Based on Automated Analysis of Nonverbal Cues. In Proceedings of the 2018 7th IEEE International Conference on Biomedical Robotics and Biomechatronics (Biorob), Enschede, The Netherlands, 26–29 August 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 456–461. [Google Scholar] [CrossRef]
  11. Anzalone, S.M.; Xavier, J.; Boucenna, S.; Billeci, L.; Narzisi, A.; Muratori, F.; Cohen, D.; Chetouani, M. Quantifying patterns of joint attention during human-robot interactions: An application for autism spectrum disorder assessment. Pattern Recognit. Lett. 2019, 118, 42–50. [Google Scholar] [CrossRef]
  12. Del Coco, M.; Leo, M.; Carcagni, P.; Fama, F.; Spadaro, L.; Ruta, L.; Pioggia, G.; Distante, C. Study of Mechanisms of Social Interaction Stimulation in Autism Spectrum Disorder by Assisted Humanoid Robot. IEEE Trans. Cogn. Dev. Syst. 2017, 8920, 1. [Google Scholar] [CrossRef]
  13. Yun, S.S.; Choi, J.; Park, S.K.; Bong, G.Y.; Yoo, H. Social skills training for children with autism spectrum disorder using a robotic behavioural intervention system. Autism Res. 2017, 10, 1306–1323. [Google Scholar] [CrossRef]
  14. Zheng, Z.; Das, S.; Young, E.M.; Swanson, A.; Warren, Z.E.; Sarkar, N. Autonomous robot-mediated imitation learning for children with autism. In Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China, 31 May–7 June 2014; pp. 2707–2712. [Google Scholar] [CrossRef]
  15. Costescu, C.A.; Vanderborght, B.; David, D.O. Reversal Learning Task in Children with Autism Spectrum Disorder: A Robot-Based Approach. J. Autism Dev. Disord. 2014, 45, 3715–3725. [Google Scholar] [CrossRef] [PubMed]
  16. Kim, E.S.; Berkovits, L.D.; Bernier, E.P.; Leyzberg, D.; Shic, F.; Paul, R.; Scassellati, B. Social robots as embedded reinforcers of social behaviour in children with autism. J. Autism Dev. Disord. 2013, 43, 1038–1049. [Google Scholar] [CrossRef] [PubMed]
  17. Feil-Seifer, D.; Mataric, M. Automated detection and classification of positive vs. negative robot interactions with children with autism using distance-based features. In Proceedings of the 6th International Conference on Human-Robot Interaction-HRI ’11, Lausanne, Switzerland, 8–11 March 2011; p. 323. [Google Scholar] [CrossRef] [Green Version]
  18. Vallès-Peris, N.; Angulo, C.; Domènech, M. Children’s imaginaries of human-robot interaction in healthcare. Int. J. Environ. Res. Public Health 2018, 15, 970. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  19. Randall, N.; Bennett, C.C.; Šabanović, S.; Nagata, S.; Eldridge, L.; Collins, S.; Piatt, J.A. More than just friends: In-home use and design recommendations for sensing socially assistive robots (SARs) by older adults with depression. Paladyn 2019, 10, 237–255. [Google Scholar] [CrossRef] [Green Version]
  20. Nakadoi, Y. Usefulness of Animal Type Robot Assisted Therapy for Autism Spectrum Disorder in the Child and Adolescent Psychiatric Ward. In New Frontiers in Artificial Intelligence; Otake, M., Kurahashi, S., Ota, Y., Satoh, K., Bekki, D., Eds.; Springer International Publishing: Cham, Switzerland, 2017; pp. 478–482. [Google Scholar]
  21. Bartneck, C.; Belpaeme, T.; Eyssel, F.; Kanda, T.; Keijsers, M.; Šabanović, S. Human-Robot Interaction: An Introduction; Cambridge University Press: Cambridge, UK, 2020. [Google Scholar]
  22. Fletcher-Watson, S.; Adams, J.; Brook, K.; Charman, T.; Crane, L.; Cusack, J.; Leekam, S.; Milton, D.; Parr, J.R.; Pellicano, E. Making the future together: Shaping autism research through meaningful participation. Autism 2018, 1–11. [Google Scholar] [CrossRef]
  23. Merter, S.; Hasırcı, D. A participatory product design process with children with autism spectrum disorder. CoDesign 2016, 14, 170–187. [Google Scholar] [CrossRef]
  24. Huijnen, C.A.G.J.; Lexis, M.A.S.; Jansens, R.; de Witte, L.P. Mapping Robots to Therapy and Educational Objectives for Children with Autism Spectrum Disorder. J. Autism Dev. Disord. 2016, 1–15. [Google Scholar] [CrossRef] [Green Version]
  25. Simut, R.; Vanderfaeillie, J.; Peca, A.; Van de Perre, G.; Vanderborght, B. Children with Autism Spectrum Disorders Make a Fruit Salad with Probo, the Social Robot: An Interaction Study. J. Autism Dev. Disord. 2016, 46, 113–126. [Google Scholar] [CrossRef]
  26. Wood, L.J.; Robins, B.; Lakatos, G.; Syrdal, D.S.; Zaraki, A.; Dautenhahn, K. Developing a protocol and experimental setup for using a humanoid robot to assist children with autism to develop visual perspective taking skills. Paladyn 2019, 10, 167–179. [Google Scholar] [CrossRef]
  27. Ramírez-Duque, A.A.; Aycardi, L.F.; Villa, A.; Munera, M.; Bastos, T.; Belpaeme, T.; Frizera-Neto, A.; Cifuentes, C.A. Collaborative and Inclusive Process with the Autism Community: A Case Study in Colombia About Social Robot Design. Int. J. Soc. Robot. 2020. [Google Scholar] [CrossRef] [Green Version]
  28. Argall, B.D.; Billard, A.G. A survey of Tactile HumanRobot Interactions. Robot. Auton. Syst. 2010, 58, 1159–1176. [Google Scholar] [CrossRef] [Green Version]
  29. Libin, A.; Libin, E. Person-robot interactions from the robopsychologists’ point of view: The robotic psychology and robotherapy approach. Proc. IEEE 2004, 92, 1789–1803. [Google Scholar] [CrossRef]
  30. Stiehl, W.; Lieberman, J.; Breazeal, C.; Basel, L.; Lalla, L.; Wolf, M. Design of a therapeutic robotic companion for relational, affective touch. In Proceedings of the ROMAN 2005 IEEE International Workshop on Robot and Human Interactive Communication, Nashville, TN, USA, 13–15 August 2005; IEEE: Piscataway, NJ, USA, 2005; pp. 408–415. [Google Scholar] [CrossRef]
  31. Iocchi, L.; L, M.T.; Jeanpierre, L. Interaction for Social Robots Assisting Users in Shopping Malls. Int. Conf. Soc. Robot. 2015, 1, 264–274. [Google Scholar] [CrossRef]
  32. Srinivasan, S.M.; Eigsti, I.M.; Neelly, L.; Bhat, A.N. The effects of embodied rhythm and robotic interventions on the spontaneous and responsive social attention patterns of children with autism spectrum disorder (ASD): A pilot randomized controlled trial. Res. Autism Spectr. Disord. 2016, 27, 54–72. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  33. Chevalier, P.; Martin, J.C.; Isableu, B.; Bazile, C.; Iacob, D.O.; Tapus, A. Joint Attention using Human-Robot Interaction: Impact of sensory preferences of children with autism. In Proceedings of the 25th IEEE International Symposium on Robot and Human Interactive Communication, RO-MAN 2016, New York, NY, USA, 26–31 August 2016; pp. 849–854. [Google Scholar] [CrossRef]
  34. So, W.C.; Wong, M.K.Y.; Lam, C.K.Y.; Lam, W.Y.; Chui, A.T.F.; Lee, T.L.; Ng, H.M.; Chan, C.H.; Fok, D.C.W. Using a social robot to teach gestural recognition and production in children with autism spectrum disorders. Disabil. Rehabil. Assist. Technol. 2018, 13, 527–539. [Google Scholar] [CrossRef] [PubMed]
  35. David, D.O.; Costescu, C.A.; Matu, S.; Szentagotai, A.; Dobrean, A. Developing Joint Attention for Children with Autism in Robot-Enhanced Therapy. Int. J. Soc. Robot. 2018, 10, 595–605. [Google Scholar] [CrossRef]
  36. Zheng, Z.; Zhao, H.; Swanson, A.R.; Weitlauf, A.S.; Warren, Z.E.; Sarkar, N. Design, Development, and Evaluation of a Noninvasive Autonomous Robot-Mediated Joint Attention Intervention System for Young Children with ASD. IEEE Trans. Hum. Mach. Syst. 2018, 48, 125–135. [Google Scholar] [CrossRef]
  37. Ramírez-Duque, A.A.; Bastos, T.; Munera, M.; Cifuentes, C.A.; Frizera-Neto, A. Robot-Assisted Intervention for children with special needs: A comparative assessment for autism screening. Robot. Auton. Syst. 2020, 127, 103484. [Google Scholar] [CrossRef]
  38. Dautenhahn, K.; Nehaniv, C.L.; Walters, M.L.; Robins, B.; Kose-Bagci, H.; Mirza, N.A.; Blow, M. KASPAR—A Minimally Expressive Humanoid Robot for Human–Robot Interaction Research. Appl. Bionics Biomech. 2009, 6, 369–397. [Google Scholar] [CrossRef] [Green Version]
  39. Robins, B.; Dautenhahn, K. Developing play scenarios for tactile interaction with a humanoid robot: A case study exploration with children with autism. In International Conference on Social Robotics; Springer: Berlin/Heidelberg, Germany, 2010; pp. 243–252. [Google Scholar]
  40. Wainer, J.; Dautenhahn, K.; Robins, B.; Amirabdollahian, F. Collaborating with Kaspar: Using an autonomous humanoid robot to foster cooperative dyadic play among children with autism. In Proceedings of the 2010 10th IEEE-RAS International Conference on Humanoid Robots, Nashville, TN, USA, 6–8 December 2010; IEEE: Piscataway, NJ, USA, 2010; pp. 631–638. [Google Scholar] [CrossRef] [Green Version]
  41. Costa, S.; Lehmann, H.; Dautenhahn, K.; Robins, B.; Soares, F. Using a humanoid robot to elicit body awareness and appropriate physical interaction in children with autism. Int. J. Soc. Robot. 2015, 7, 265–278. [Google Scholar] [CrossRef] [Green Version]
  42. Peca, A.; Simut, R.; Pintea, S.; Vanderborght, B. Are Children with ASD more Prone to Test the Intentions of the Robonova Robot Compared to a Human? Int. J. Soc. Robot. 2015, 7, 629–639. [Google Scholar] [CrossRef]
  43. Hanson, D.; Mazzei, D.; Garver, C.; Ahluwalia, A.; De Rossi, D.; Stevenson, M.; Reynolds, K. Realistic Humanlike Robots for Treatment of Autism. In Proceedings of the 5th International Conference on Pervasive Technologies Related to Assistive Environments PETRA, Corfu, Greece, 26–29 June 2012; pp. 1–7. [Google Scholar]
  44. Vandevelde, C.; Saldien, J.; Ciocci, M.C.; Vanderborght, B. The use of social robot ono in robot assisted therapy. In Proceedings of the International Conference on Social Robotics, Bristol, UK, 27–29 October 2013. [Google Scholar]
  45. Vandevelde, C.; Saldien, J.; Ciocci, C.; Vanderborght, B. Ono, a DIY open source platform for social robotics. In Proceedings of the 8th International Conference on Tangible, Embedded and Embodied Interaction, Munich, Germany, 16–19 February 2014. [Google Scholar]
  46. Goris, K.; Saldien, J.; Vanderborght, B.; Lefeber, D. Mechanical design of the huggable robot Probo. Int. J. Humanoid Robot. 2011, 8, 481–511. [Google Scholar] [CrossRef] [Green Version]
  47. Saldien, J.; Goris, K.; Yilmazyildiz, S.; Verhelst, W.; Lefeber, D. On the Design of the Huggable Robot Probo. J. Phys. Agents 2008, 2, 3–11. [Google Scholar] [CrossRef]
  48. Shibata, T.; Mitsui, T.; Wada, K.; Touda, A.; Kumasaka, T.; Tagami, K.; Tanie, K. Mental commit robot and its application to therapy of children. IEEE/ASME Int. Conf. Adv. Intell. Mechatron. AIM 2001, 2, 1053–1058. [Google Scholar] [CrossRef]
49. Stiehl, W.D.; Lieberman, J.; Breazeal, C.; Basel, L.; Cooper, R.; Knight, H.; Lalla, L.; Maymin, A.; Purchase, S. The Huggable: A therapeutic robotic companion for relational, affective touch. In Proceedings of the 2006 3rd IEEE Consumer Communications and Networking Conference, CCNC 2006, Las Vegas, NV, USA, 8–10 January 2006; Volume 2, pp. 1290–1291.
50. Grunberg, D.; Ellenberg, R.; Kim, Y.E.; Oh, P.Y. From RoboNova to HUBO: Platforms for robot dance. In FIRA RoboWorld Congress; Springer: Berlin/Heidelberg, Germany, 2009; pp. 19–24.
51. Shamsuddin, S.; Yussof, H.; Ismail, L.I.; Mohamed, S.; Hanapiah, F.A.; Zahari, N.I. Initial Response in HRI: A Case Study on Evaluation of Child with Autism Spectrum Disorders Interacting with a Humanoid Robot NAO. Procedia Eng. 2012, 41, 1448–1455.
52. Tapus, A.; Peca, A.; Aly, A.; Pop, C.; Jisa, L.; Pintea, S.; Rusu, A.S.; David, D.O. Children with autism social engagement in interaction with Nao, an imitative robot—A series of single case experiments. Interact. Stud. 2012, 13, 315–347.
53. Robins, B.; Dautenhahn, K. Kaspar, the social robot and ways it may help children with autism—An overview. Enfance 2018, 2018, 91–102.
54. Mehrabian, A. Communication without words. Commun. Theory 2008, 6, 193–200.
55. Samadiani, N.; Huang, G.; Cai, B.; Luo, W.; Chi, C.-H.; Xiang, Y.; He, J. A Review on Automatic Facial Expression Recognition Systems Assisted by Multimodal Sensor Data. Sensors 2019, 19, 1863.
56. Scassellati, B.; Admoni, H.; Matarić, M. Robots for Use in Autism Research. Annu. Rev. Biomed. Eng. 2012, 14, 275–294.
57. Ricks, D.J.; Colton, M.B. Trends and considerations in robot-assisted autism therapy. In Proceedings of the 2010 IEEE International Conference on Robotics and Automation, Anchorage, AK, USA, 3–7 May 2010; IEEE: Piscataway, NJ, USA, 2010; pp. 4354–4359.
58. Belpaeme, T.; Kennedy, J.; Ramachandran, A.; Scassellati, B.; Tanaka, F. Social robots for education: A review. Sci. Robot. 2018, 3, eaat5954.
59. Admoni, H.; Scassellati, B. Social eye gaze in human-robot interaction: A review. J. Hum.-Robot. Interact. 2017, 6, 25–63.
60. Welch, K.C.; Lahiri, U.; Warren, Z.; Sarkar, N. An approach to the design of socially acceptable robots for children with autism spectrum disorders. Int. J. Soc. Robot. 2010, 2, 391–403.
61. Marino, F.; Chilà, P.; Sfrazzetto, S.T.; Carrozza, C.; Crimi, I.; Failla, C.; Busà, M.; Bernava, G.; Tartarisco, G.; Vagni, D.; et al. Outcomes of a robot-assisted social-emotional understanding intervention for young children with autism spectrum disorders. J. Autism Dev. Disord. 2020, 50, 1973–1987.
62. Lee, C.; Kim, M.; Kim, Y.J.; Hong, N.; Ryu, S.; Kim, H.J.; Kim, S. Soft robot review. Int. J. Control. Autom. Syst. 2017, 15, 3–15.
63. Pratt, G.A.; Williamson, M.M. Series elastic actuators. IEEE Int. Conf. Intell. Robot. Syst. 1995, 1, 399–406.
64. Gomez, R.; Szapiro, D.; Galindo, K.; Nakamura, K. Haru: Hardware design of an experimental tabletop robot assistant. In Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction, Chicago, IL, USA, 5–8 March 2018; pp. 233–240.
65. Hopkins, J.B.; Culpepper, M.L. Synthesis of multi-degree of freedom, parallel flexure system concepts via Freedom and Constraint Topology (FACT)—Part I: Principles. Precis. Eng. 2010, 34, 259–270.
66. Ma, X.; Quek, F. Development of a child-oriented social robot for safe and interactive physical interaction. In Proceedings of the 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems, Taipei, Taiwan, 18–22 October 2010; IEEE: Piscataway, NJ, USA, 2010; pp. 2163–2168.
67. Stoelen, M.F.; Bonsignorio, F.; Cangelosi, A. Co-exploring actuator antagonism and bio-inspired control in a printable robot arm. In International Conference on Simulation of Adaptive Behavior; Springer: Berlin/Heidelberg, Germany, 2016; pp. 244–255.
68. Petit, F.; Friedl, W.; Höppner, H.; Grebenstein, M. Analysis and synthesis of the bidirectional antagonistic variable stiffness mechanism. IEEE/ASME Trans. Mechatron. 2014, 20, 684–695.
69. Casas, J.; Leal-Junior, A.; Díaz, C.R.; Frizera, A.; Múnera, M.; Cifuentes, C.A. Large-range polymer optical-fibre strain-gauge sensor for elastic tendons in wearable assistive robots. Materials 2019, 12, 1443.
70. Clayton, H.M.; Lanovaz, J.; Schamhardt, H.; Willemen, M.; Colborne, G. Net joint moments and powers in the equine forelimb during the stance phase of the trot. Equine Vet. J. 1998, 30, 384–389.
71. Tronick, E.Z.; Morelli, G.A.; Ivey, P.K. The Efe forager infant and toddler’s pattern of social relationships: Multiple and simultaneous. Dev. Psychol. 1992, 28, 568.
Sample Availability: Samples of the mechanical trials and the case study are available from the authors.
Figure 1. Real prototype and characteristics of the CASTOR robot. On the left, the robot is shown with its outfit; on the right, the mechanical structure. The measurements on CASTOR correspond to the actual dimensions.
Figure 2. Guidelines for the robot design derived from the study based on the participatory design methodology.
Figure 3. Experimental setup for the ergonomic analysis. (A) illustrates the sitting condition on a chair. (B) shows the sitting condition on the floor. (C) comprises the children’s range of vision for both scenarios, the robot’s line of sight, and the optimal height range for the robot. The variable U refers to the users involved in this analysis (U1: 10 years old; U2: 5 years old); H denotes the distance between the sacrum and the user’s line of sight; H_c represents the chair height; H_t the table height; H_1,max, H_1,min, H_2,max, and H_2,min indicate the extreme values of the visual range in each scenario; H_n,ideal constitutes the child’s line of sight; and D is the distance between the child and the robot. The numerical values of these variables are given in Table 2.
Figure 4. Mechanical structure and modules implemented in CASTOR. The lower right box summarizes the elements integrated into the mechanical structure, and the upper right box presents the robot’s modules.
Figure 5. Mechanisms and actuators for the CASTOR robot’s face. (A) shows the elements (i.e., actuators, screens, and 3D-printed pieces) involved in the facial expressions. (B) illustrates the speech-emulation system in two views (front and side). The numbers indicate the DOFs in this module.
Figure 6. Neck mechanism based on series elastic actuators. (A) illustrates the transmission from the motor in the robot base (black box) to the head. (B) shows the mechanism in two views and the elements involved (i.e., aluminium bars and 3D-printed pieces).
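For readers reproducing a similar neck, the following minimal sketch illustrates the standard series elastic actuator principle behind Figure 6 (see ref. [63]): the torque transmitted to the head is estimated from the deflection of the elastic element. The stiffness value and function names are illustrative assumptions, not parameters of the CASTOR hardware.

```python
# Minimal sketch of series elastic actuator (SEA) torque estimation for a
# neck joint like the one in Figure 6. All values are illustrative
# assumptions, not measurements from the CASTOR robot.

SPRING_STIFFNESS = 2.5  # N·m/rad, assumed stiffness of the elastic element

def sea_torque(motor_angle_rad: float, joint_angle_rad: float) -> float:
    """Estimate joint torque from the deflection of the series spring."""
    deflection = motor_angle_rad - joint_angle_rad
    return SPRING_STIFFNESS * deflection

# Example: the motor has rotated 0.30 rad but the head, blocked by a child,
# only reached 0.10 rad, so the spring absorbs the difference.
print(sea_torque(0.30, 0.10))  # 0.5 N·m transmitted through the spring
```

The key safety property is visible in the example: when the head is blocked, the interaction torque stays bounded by the spring deflection instead of rising with the motor’s stall torque.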
Figure 7. Mechanical design of CASTOR’s arm. (A) illustrates the joint movement relative to the actuator. (B) shows the adjustment process used to change the joint stiffness. The red lines represent the bioinspired elastic element.
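As a sketch of why the adjustment in Figure 7B changes the joint stiffness, consider an antagonistic pair of nonlinear (quadratic) elastic tendons, in the spirit of the variable stiffness mechanisms of refs. [67,68]. All coefficients below are assumptions chosen only to illustrate the mechanism, not CASTOR’s actual tendon parameters.

```python
# Sketch of antagonistic variable stiffness with quadratic tendons
# (force F = c * stretch^2) routed over a joint pulley, as in Figure 7.
# Coefficients are illustrative assumptions.

TENDON_COEFF = 2.0e4  # N/m^2, assumed quadratic tendon coefficient c
PULLEY_RADIUS = 0.02  # m, assumed joint pulley radius r

def restoring_torque(angle_rad: float, pretension_m: float) -> float:
    """Net restoring torque from two antagonistic quadratic tendons."""
    stretch_a = max(0.0, pretension_m + PULLEY_RADIUS * angle_rad)
    stretch_b = max(0.0, pretension_m - PULLEY_RADIUS * angle_rad)
    force_a = TENDON_COEFF * stretch_a ** 2
    force_b = TENDON_COEFF * stretch_b ** 2
    return -PULLEY_RADIUS * (force_a - force_b)  # pulls back toward zero

def joint_stiffness(pretension_m: float) -> float:
    """For quadratic tendons, K = 4*c*p*r^2: stiffness grows linearly with
    the pretension p, so tightening the tendons (Figure 7B) stiffens the
    joint without moving it."""
    return 4 * TENDON_COEFF * pretension_m * PULLEY_RADIUS ** 2

print(joint_stiffness(0.005), joint_stiffness(0.02))  # looser vs. tighter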
Figure 8. Mechanical design of the huggable structure. F_ext denotes external forces applied to the robot. The shaded drawing in the middle represents the movement performed by the system. The lower circle illustrates the pneumatic pistons for the huggable function.
Figure 9. Sensors and the actuator involved in the perception module. The highlighted points represent the haptic sensors, based on Velostat, located on the antenna, head, hands, and shoes. The other elements refer to the board and speaker placed on the lower part of the robot.
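As a hedged sketch of how such Velostat patches can be read in practice, the snippet below assumes a simple voltage divider feeding an ADC; Velostat is piezoresistive, so its resistance drops when pressed. Component values, thresholds, and function names are illustrative assumptions, not CASTOR’s actual circuit or firmware.

```python
# Minimal sketch of reading a Velostat haptic patch (Figure 9) through a
# voltage divider and mapping a firm press to a response, as Table 1
# describes. All values and names are illustrative assumptions.

V_SUPPLY = 3.3             # V, assumed divider supply
R_FIXED = 10_000.0         # ohm, assumed fixed resistor on the low side
TOUCH_THRESHOLD = 4_000.0  # ohm, assumed "pressed" resistance

def velostat_resistance(v_node: float) -> float:
    """Solve the divider (supply -> sensor -> node -> R_FIXED -> GND)."""
    return R_FIXED * (V_SUPPLY - v_node) / v_node

def play_phrase(name: str) -> None:
    """Stand-in for the robot's audio output through its speaker."""
    print(f"[audio] {name}")

def on_sample(v_node: float) -> None:
    """Answer with a phrase when the patch is pressed firmly."""
    if velostat_resistance(v_node) < TOUCH_THRESHOLD:
        play_phrase("hello")

on_sample(2.6)  # ~2.7 kOhm under pressure -> triggers a phrase
```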
Figure 10. Electronic system, connections, and communication protocols of CASTOR. The right boxes summarize the elements (top) and connection types (bottom) of the robot.
Figure 11. Emotions performed by CASTOR’s face: (A) wonder, (B) happiness, (C) sadness, and (D) anger.
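A minimal sketch of how these four expressions could be encoded as preset poses for the face DOFs of Figure 5 (mouth: 3, eyebrows: 2, per Table 1) is shown below; all angles and interface names are hypothetical, not CASTOR’s actual control code.

```python
# Hypothetical pose table mapping each emotion in Figure 11 to servo
# targets for the face DOFs in Table 1. Angles (degrees) are assumptions.

EMOTION_POSES = {
    "wonder":    {"mouth": (40, 40, 60), "eyebrows": (70, 70)},
    "happiness": {"mouth": (80, 80, 20), "eyebrows": (55, 55)},
    "sadness":   {"mouth": (20, 20, 30), "eyebrows": (30, 30)},
    "anger":     {"mouth": (30, 30, 10), "eyebrows": (15, 15)},
}

def express(emotion: str, set_servo) -> None:
    """Send the preset pose; set_servo is a stand-in for the servo API."""
    pose = EMOTION_POSES[emotion]
    for idx, angle in enumerate(pose["mouth"]):
        set_servo(f"mouth_{idx}", angle)
    for idx, angle in enumerate(pose["eyebrows"]):
        set_servo(f"eyebrow_{idx}", angle)

express("happiness", lambda name, deg: print(name, deg))
```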
Figure 12. CASTOR robot functionalities for potential use in therapy scenarios. (A) illustrates the greeting. (B) shows the robot pointing to the head. In (C), the robot points to the eyes, and (D) shows the robot dancing.
Figure 13. Motor load capacity for the neck and the arm in the blocking state. The X-axis represents the percentage of the element’s total ROM (i.e., 180 degrees for the neck and 105 degrees for the arm). The black lines represent the stiff condition for the neck (open markers, dashed line) and arm (filled markers, solid line). The red lines show the flexible condition for both parts using the same convention.
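A blocking test of this kind can be scripted as a simple sweep that logs motor load across the ROM while the output link is held. The sketch below assumes hypothetical set_position/read_load callables rather than the actual servo API used to obtain Figure 13.

```python
# Sketch of a blocking sweep like the one summarized in Figure 13: command
# positions across the ROM while the output is blocked externally, and log
# the motor load at each step. The callables are hypothetical stand-ins.

NECK_ROM_DEG = 180.0  # total ROM for the neck, per the caption

def blocking_sweep(set_position, read_load, rom_deg=NECK_ROM_DEG, steps=20):
    """Return (percent_of_rom, load) pairs for a blocked joint."""
    samples = []
    for i in range(steps + 1):
        target_deg = rom_deg * i / steps
        set_position(target_deg)  # the output link stays blocked
        samples.append((100.0 * i / steps, read_load()))
    return samples

# Dry run with dummy callables, just to show the data layout:
print(blocking_sweep(lambda deg: None, lambda: 0.0, steps=2))
```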
Figure 14. Physical interactions exhibited by the participants during the session: (A) blocking, (B) swing, and (C) hard.
Table 1. Characteristics of robotic platforms aimed at therapy for CwASD.

CASTOR
Type: Humanoid. Actuation: SEA. Facial expressions: Yes.
DOFs: 14 total (mouth: 3; eyebrows: 2; neck: 1; shoulders: 2 × 2; elbows: 1 × 2; passive: 2).
Sensors: Tactile sensors.
Materials: Low-cost 3D-printed parts (PLA).
Functionalities: displays facial expressions, including happy, sad, angry, and surprised; answers with a sentence, sound, movement, or a combination of them when a sensor is activated; active arm and head movements.

KASPAR [38,41]
Type: Humanoid. Actuation: Stiff. Facial expressions: Yes.
DOFs: 17 total (mouth: 2; eyes: 2; eyelids: 1; neck: 3; arms: 4 × 2; torso: 1).
Sensors: Cameras in eyes; force-sensitive resistors (FSRs).
Materials: Fibreglass body; aluminium frame and head parts; silicone rubber face.
Functionalities: detects faces and objects; minimally expressive face and arms able to produce gestures; answers with a sentence and classifies the force when an FSR is activated; active arm and head movements.

NAO [9,34]
Type: Humanoid. Actuation: Stiff. Facial expressions: Yes.
DOFs: 25 total (head: 2; arms: 5 × 2; hands: 1 × 2; pelvis: 1; legs: 5 × 2).
Sensors: OmniVision cameras; inertial sensor; sonar range finder; infrared sensors; tactile sensors; pressure sensors; microphones.
Materials: Polycarbonate-ABS plastic; polyamide; carbon-fibre-reinforced thermoplastic.
Functionalities: detects faces and objects; walking capabilities and grasping/gestural hands and arms; performs different gestures and plays audio clips while gesturing; active arm, leg, and head movements.

Zeno [43]
Type: Humanoid. Actuation: Stiff. Facial expressions: Yes.
DOFs: 36 total (face: 8; neck: 3; arms: 12; waist: 1; legs: 12).
Sensors: Cameras in eyes; gyroscope; accelerometer; compass; load sensor; encoder; cliff sensors; ground-contact sensors; infrared sensors; bump sensors (feet); grip-load sensors; microphones.
Materials: Frubber, plastic, and aluminium.
Functionalities: detects faces and objects; walking capabilities and grasping/gestural hands and arms; displays naturalistic expressions, including happy, sad, angry, disgust, fear, surprise, and neutral; active arm, leg, and head movements.

ONO [44,45]
Type: Humanoid. Actuation: Stiff. Facial expressions: Yes.
DOFs: 13 total (face: 13).
Sensors: Tactile sensors.
Materials: Polyurethane foam.
Functionalities: displays facial expressions, including happy, sad, angry, and surprised; answers with a sentence, a sound, or a combination of them when a sensor is activated.

Probo [46,47]
Type: Pet-like. Actuation: SEA. Facial expressions: Yes.
DOFs: 20 total (head: 3; mouth: 3; eyes: 3; eyelids: 2; eyebrows: 4; ears: 2; torso: 3).
Sensors: Camera; tactile sensors; microphones.
Materials: Body covered with FlexFoam-iT! X and cotton fabric.
Functionalities: detects faces and objects; detects the direction and intensity of sounds; locates where it is touched and the strength of the touch; active arm and head movements.

Paro [48]
Type: Pet-like. Actuation: Stiff. Facial expressions: No.
DOFs: 7 total (eyelids: 1 × 2; neck: 2; flippers: 1 × 2; tail: 1).
Sensors: Light sensors; temperature sensors; tactile sensors; microphones.
Materials: Plastic skeleton; body covered with soft white fur.
Functionalities: light and dark recognition; recognizes voice and sound direction; responds to touch and classifies it as a caress or a blow.

Huggable Bear [30,49]
Type: Pet-like. Actuation: Stiff and compliant mechanisms. Facial expressions: No.
DOFs: 8 total (eyebrows: 2; ears: 1; neck: 3; arms: 2).
Sensors: Cameras in eyes; temperature sensors; force sensors; electric-field sensors.
Materials: Body covered with soft fur.
Functionalities: detects faces; combines multiple sensors to determine its position and whether it is being touched by a person or an object.

Robonova [42,50]
Type: Humanoid. Actuation: Stiff. Facial expressions: No.
DOFs: 16 total (arms: 3 × 2; legs: 5 × 2).
Sensors: Infrared sensors.
Materials: Anodised aluminium plates.
Functionalities: active arm and leg movements.
Table 2. Variables used and parameters estimated in the ergonomic analysis.

Variable | Description | Range (cm)
H | Height from the base (sacrum) to the eyes | 43–67
H_c | Chair height | 30–40
H_t | Table height | 63–76
H_1,max | Maximum robot height, Scenario 1 | 29–67
H_1,min | Minimum robot height, Scenario 1 | 20–38
H_2,max | Maximum robot height, Scenario 2 | 49–95
H_2,min | Minimum robot height, Scenario 2 | 38–40
D | Distance from the child’s face to the robot | 35–62
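One plausible way to turn these measurements into a robot height specification is to intersect the per-scenario visual bands. The sketch below assumes the band for each scenario spans from its smallest minimum to its largest maximum; this is an illustrative interpretation, not the authors’ stated procedure.

```python
# Sketch of deriving a robot height band from Table 2, under the assumption
# that each scenario's usable band runs from min(H_min) to max(H_max) and
# the final band is their intersection. Values in cm, taken from Table 2.

def intersect(ranges):
    """Intersection of closed intervals given as (low, high) tuples."""
    low = max(r[0] for r in ranges)
    high = min(r[1] for r in ranges)
    return (low, high) if low <= high else None

scenario1 = (20, 67)  # sitting at a table (H_1,min to H_1,max extremes)
scenario2 = (38, 95)  # sitting on the floor (H_2,min to H_2,max extremes)
print(intersect([scenario1, scenario2]))  # -> (38, 67) cm, a usable band
```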
Table 3. Physical interaction during the familiarization phase. These data were computed from the duration and number of each participant’s interactions with the robot throughout the session. Quantity (Qty) denotes the number of times the event occurred.

Subject | Blocking Time (s) | Blocking Qty | Swing Time (s) | Swing Qty | Hard Time (s) | Hard Qty | Total Interaction (%) | Communication
Subject 1 (S1) | 21 | 8 | 28 | 7 | 41 | 14 | 36.4 | yes
Subject 2 (S2) | 15 | 5 | 8 | 3 | 12 | 6 | 19.6 | yes
Subject 3 (S3) | 0 | 0 | 0 | 0 | 4 | 3 | 1.7 | no
