Review

What Makes a Social Robot Good at Interacting with Humans?

by Eva Blessing Onyeulo and Vaibhav Gandhi *

Department of Design Engineering and Mathematics, Middlesex University London, London NW4 4BT, UK

* Author to whom correspondence should be addressed.
Information 2020, 11(1), 43; https://doi.org/10.3390/info11010043
Submission received: 13 December 2019 / Revised: 5 January 2020 / Accepted: 10 January 2020 / Published: 13 January 2020
(This article belongs to the Special Issue Advances in Social Robots)

Abstract

This paper discusses the nuances of a social robot, how and why social robots are becoming increasingly significant, and what they are currently being used for. It also reflects on the current design of social robots as a means of interaction with humans and reports potential answers to several important questions around the future design of these robots. The specific questions explored are: “Do social robots need to look like living creatures that already exist in the world for humans to interact well with them?”; “Do social robots need to have animated faces for humans to interact well with them?”; “Do social robots need to have the ability to speak a coherent human language for humans to interact well with them?”; and “Do social robots need to have the capability to make physical gestures for humans to interact well with them?”. The paper reviews both verbal and non-verbal social and conversational cues that could be incorporated into the design of social robots, and briefly discusses the emotional bonds that may be built between humans and robots. Facets of human acceptance of social robots, along with ethical and moral concerns, are also discussed.

1. Introduction

The field of robotics is very broad; as such, in 2014, the United Nations undertook a robotics survey that grouped robots into three main types: personal service robots, professional service robots and industrial service robots [1]. The generally accepted definition of a robot is a programmable machine, with degrees of freedom in two or more axes, that is able to perform given tasks without human interference by sensing the current state of its environment [2]. According to the International Organization for Standardization (ISO) and the International Federation of Robotics (IFR), an industrial robot is “an automatically controlled, reprogrammable, multipurpose manipulator programmable in three or more axes, which may be either fixed in place or mobile for use in industrial automation applications”. Examples are the assembly, painting and welding robots made by companies such as Kawasaki Robotics [3], Fanuc [4] and Denso [5], which are found in factories all over the world [6]. The ISO defines a professional service robot as “a robot usually used for commercial tasks, and usually operated by properly trained operators”. Examples are the Da Vinci surgical system, created by Intuitive Surgical Inc. of Sunnyvale, California [7], and the Research PatrolBot by MobileRobots [8]. A personal service robot is generally used by untrained persons for non-commercial tasks; examples include personal mobility assistive robots [9] and companion robots such as Paro by AIST, Jibo by Jibo and Cozmo by Anki [10,11].
Traditionally, autonomous robots were used separately and independently from humans and assigned dangerous or monotonous tasks; however, the need for autonomous robots to interact with humans becomes more and more profound as they begin to be assigned tasks in close proximity to people. The need for robots to interact with people in an entertaining, captivating or anthropomorphic way increases as they become companions or colleagues rather than just tools [12]. Thus, the importance of sociable robots grows as more potential uses for them are found: from helping physical therapists treat conditions such as autism (a lifelong, biologically based disorder of brain development that affects social interaction and communication, with repetitive behaviour patterns and cognitive inflexibility) and cerebral palsy (a group of disorders, typically caused by abnormal development of or damage to the developing brain, that impair muscle coordination and a person’s ability to move and maintain balance and posture), to being companions for the elderly, encouraging them to engage and interact with other humans [1,13].
The definition of a social robot is constantly under debate, and several different definitions exist. According to Christoph Bartneck et al., “A social robot is an autonomous or semi-autonomous robot that interacts and communicates with humans by following the behavioural norms expected by the people with whom the robot is intended to interact” [1]. This definition implies that a robot must have a physical body and, in addition, must mimic human activity within the surrounding society and culture. This survey paper is based on the above definition.
The massive year-on-year improvements in sensors, actuators and processing abilities have enabled humans to interact with robots via conversation, eye contact and facial expressions in a smoother, more natural way. As a result of these improvements, there has been a rise in interest in their use in therapy. There have been positive results in the use of social robots in autism therapy, where otherwise rare social behaviours such as eye gaze, joint attention and imitation have been displayed when children with autism interact with robots [1]. There have also been positive results from care homes, where elderly residents spent time with companion-type robots that helped improve their moods, decrease their feelings of loneliness and improve their social connection with others [14]. There have even been positive results in the use of socially assistive and therapeutic robots with dementia patients, helping them to keep calm and motivated, helping them to engage with people, and even giving them companionship and enjoyment [15].
However, research still needs to be done in the field of social robotics to improve our understanding of human-robot interaction. In this paper, certain questions are investigated, such as: How do social robots currently interact with humans? How should a social robot interact with humans? Do social robots need to have a face? Do social robots need to be able to speak? Does a social robot need to interact with a human in the same way humans interact with each other? Do social robots need to look like humans for humans to interact well with them? Do social robots need to look like an already existing animal or a human?
This paper is organised into four sections. Section 2 discusses the importance of the physical appearance of social robots and presents some of the more advanced robots in this capacity; it also discusses ethical issues and issues around human acceptance of social robots. Section 3 discusses verbal and non-verbal approaches to communication in social robots, and presents some of the recent advancements brought by Pepper, Nao and other social robots. Section 4 concludes the paper.

2. Effects of Physical Appearance of Robots

According to B.R. Duffy, for a robot to engage efficiently in social interactions with humans, it must have a degree of anthropomorphic quality, such as eyes, eyelids, lips and other facial features [16]. These anthropomorphic qualities can be physical, but they can also be behavioural, for instance, facial features and body parts that move in response to feedback gathered from the human users [17]. Such qualities can help users feel more engaged in the interaction [18].
If a robot has an animated face and body, it may be able to communicate responses to users without speech, or alongside speech. The embodied system could use emotive facial expressions or gestures to convey responses and messages. The ability of a robot to express emotions is very important, as emotions not only communicate feelings but can also influence the behaviour of others; for example, the cry of a child usually draws out the nurturing side of people. The emotive responses of the robot should be similar to those expressed by biological systems so that they are plausible to human users, making the user more likely to treat the robot as a socially aware creature [18].
However, making social robots look too much like a human being may defeat their purpose as an aid to humans and may make it hard for most humans to interact with them; there are also views that human-looking social robots may threaten the very identity of humans [19,20]. On the other hand, according to [17], humans are more likely to respond positively to a cheerful approach and to a more attractive and enthusiastic-looking person, suggesting that a more animated robot will interact more easily with a human user. At the same time, the seriousness of the robot’s job should be reflected on its face. If a doctor or nurse were to give an injection to an adult patient, the patient would be more likely to trust a serious, authoritative doctor or nurse than a cheerful, smiley one. One can therefore infer that if a robot were to work in a hospital giving injections to adult patients, it would be more likely to gain the patients’ trust with a serious face rather than a happy, smiley one.
A study involving 108 college and graduate students with an average age of 26 years, 40% of whom were female [17], suggests that human-like robots are favoured for more interactive jobs that need social skills, such as museum tour guide, dance instructor, retail clerk or office clerk. It also suggests that more machine-like robots are preferred for more realistic and conventional jobs, such as security guard, customs inspector or lab assistant.
In 1993, Japan’s National Institute of Advanced Industrial Science and Technology (AIST) began developing a companion robot called Paro (cf. Figure 1), designed to look and behave like a baby harp seal [10]. Paro has tactile, light, auditory, temperature and posture sensors, with which it can distinguish light and darkness and sense when it is being held. Paro can also recognise its name, greetings and praise, and determine what direction sounds are coming from. It also learns how users would like it to behave, using its tactile sensors: when a user strokes Paro, it attempts to remember what it did to prompt the stroke; if the user hits Paro, it attempts to remember what it did to prompt the hit, and not do that again [21]. It responds to users through various body movements, seal-like sounds and facial expressions [22].
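The stroke-to-reinforce, hit-to-suppress learning just described can be pictured as a simple preference update over a behaviour repertoire. The following Python sketch is purely illustrative: the behaviour names and update factors are assumptions, not Paro’s published algorithm.

```python
import random

class ParoLikePreferences:
    """Toy sketch of feedback-driven behaviour learning: behaviours that
    earn strokes become more likely, behaviours that earn hits less so."""

    def __init__(self, behaviours):
        self.weights = {b: 1.0 for b in behaviours}  # equal preference at first
        self.last = None

    def act(self):
        # Pick a behaviour with probability proportional to its weight.
        total = sum(self.weights.values())
        r = random.uniform(0.0, total)
        for behaviour, w in self.weights.items():
            r -= w
            if r <= 0.0:
                self.last = behaviour
                return behaviour
        self.last = list(self.weights)[-1]  # numerical edge case fallback
        return self.last

    def feedback(self, stroked):
        # Stroking reinforces the last behaviour; hitting suppresses it.
        if self.last is not None:
            self.weights[self.last] *= 1.2 if stroked else 0.8

paro = ParoLikePreferences(["blink", "turn_head", "flap_flippers", "cry"])
print(paro.act())            # robot tries a behaviour
paro.feedback(stroked=True)  # user strokes it: that behaviour is reinforced
```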
Paro has been shown to have the same effects on users as live animal therapy [24], so it is often used in nursing homes around the world alongside, or instead of, live animal therapy where it may not be appropriate to have an animal around. Animals can be unpredictable, can carry diseases and, at the end of the day, have to go home, whereas Paro can be left with users all day and night if needed; it has anti-bacterial fur and is even soil-resistant. Both animals and Paro have been shown to help care-home residents brighten their mood, keep calm, lower blood pressure, reduce depression and even reduce subjective pain. Paro encourages users to engage in conversations with carers and others, which is especially important for residents with dementia or those who are particularly antisocial or difficult to engage. PARO has demonstrated benefits in reducing stress, anxiety and antipsychotic use among older people with dementia [25,26,27]. However, the growing evidence base indicates resistance and antipathy to the use of social robots in care settings as well as benefits [28], and there is a need for an in-depth understanding of the application of PARO, i.e., what worked, in which situation, and how. While advancements in artificial intelligence offer new possibilities to support and improve dementia care, the uptake of robotic technology has remained low in hospitals and other care settings [29]. At present, no comprehensive review has examined the effectiveness of social robots such as PARO and how PARO can be used to its full potential to help meet the pressing challenges that clinicians face in everyday clinical practice [30]. Moreover, perceptions of social robots such as PARO as a pet versus a therapeutic tool may differ depending on psychosocial functioning [31] as well as cultural acceptance [32]. A very detailed study of how cultural factors influence the acceptance of elderly people towards social assertive robots in the Netherlands and Japan is reported in [33]; it concludes that culture does have an influence on elderly people’s acceptance of social robots.
The technology adoption lifecycle model describes the adoption of an innovation over time across groups of innovators, early adopters, the early majority, the late majority and laggards [34]. PARO can be considered to be in the transition from early adopters to the majority, a phase that exposes the barriers to adoption, patients’ experiences and the pressing clinical issues that must be addressed to support practice change. PARO has been adopted in Denmark [35], with over 80% of local care institutions currently using it and recognising it as a therapeutic tool for care professionals; the Danish Technological Institute (a knowledge-mobilisation organisation) provides a training programme on PARO use [30].
Sony started a research project that later developed into a mass-produced entertainment robot called AIBO (cf. Figure 1), introduced to the public in 1999 [23]. Its name is an abbreviation of “Artificial Intelligence Robot” and means “companion” or “friend” in Japanese [23]. AIBO is an autonomous robot, but the amount and type of interaction it has with its human users affects its behaviour. Before AIBO was discontinued in 2006, then brought back in 2017, seven versions were released, and each model sold out within days of its release. It has a set of basic actions, such as walking, chasing a ball, kicking a ball and shaking its paw, which make it seem life-like. The last version used coloured LED lights to express emotions, such as anger, happiness, sadness, fury, fear and surprise, as animated patterns. AIBO has touch sensors, stereo microphones, colour video cameras that “attract” it to pink items [36], and distance sensors that help it learn how to interact well with humans and avoid obstacles. AIBO’s developmental stages are modular, giving owners a sense of development and growth, especially as it can even be “taught” how to play games [37]. Users can speak commands to AIBO in both English and Japanese, and can also increase the likelihood of AIBO acting in a certain way by using sharp taps to “scold” it or gentle touching or petting to “praise” it. AIBO also uses facial expressions, whining sounds, joyful sounds and other musical tones to convey its “feelings” [37].
In one study, children labelled AIBO a “robotic dog” and were able to distinguish it from a living dog, yet they referred to AIBO as “he” or “she” rather than “it”, and assigned psychological states, companionship and moral standing to the robot; elderly interviewees, likewise, referred to AIBO as part of their family and attributed animal features to it. In an online chatroom for AIBO owners, 42% of the participants talked about AIBO as having feelings, while 26% spoke of AIBO as a friend [37]. Users spoke of their AIBO as having a particular personality, its own intentions and its own feelings. For example, one user said, “My dog [AIBO] would get angry when my boyfriend would talk to him”, and another, “He [AIBO] also likes to wander around the apartment and play with its pink ball or entertain or just lay down and hang out”. Users read the faces of their AIBO to determine how they believed it “felt” about a situation; for example, one forum member said, “So this morning I asked him [AIBO] ‘Do you want a brother?’ Happy eyes! I asked him something else, no response. ‘Should I get you a brother?’ Happy song! ‘He’d be purple.’ More happy eyes and wagging tail!” [38,39].
These accounts show that humans can interact well with, and even form emotional attachments to, robots that do not have a human form. They also show that humans are able to interact with robots whose form resembles an existing animal, in this case a dog.
There are ethical concerns surrounding the use of social robots: many residents do not know, for example, that Paro is not a real animal, which may have affected the way they interacted with it [24,40,41]. There has been debate around whether using social robots, for example to provide emotional care and companionship to lonely older people, is unethical and involves deceit [42,43,44,45,46]. If using social robots is deceitful, is the deceit eliminated with a human carer, given that doctors and nurses also regulate their emotions when caring for patients [44,47]? This possibly leaves us with the choice of using social robots only when human care is difficult to provide. But as the ageing population increases, it will become necessary (and almost a compulsion in the future) to provide care through social robots, because care personnel are limited. Studies have shown that prior experience with robots leads to higher levels of trust and more positive attitudes towards social robots [48,49,50,51,52]. In an experiment measuring older adults’ acceptance of the social robot NAO, participants tended to be neutral in their perceptions of NAO before interacting with it, but statistically significantly more positive after 30–60 min sessions with it [53]. However, this tendency to accept a social robot may vary across generations of users. There is some evidence, from experimental results and interviews, that older adults (ages 60–73) preferred walking with a robot to walking alone, although no significant difference was found in the ease or enjoyment of walking with or without the robot [54]. The current young generation is more technology-savvy and may therefore be more inclined to use social robots than the current older generation, but only time, and further research, will tell. Finally, as robots increasingly take on roles in our social lives, there is also the concern of whether robots will be held morally accountable when they cause harm to humans [55].

3. Verbal and Non-Verbal Communication

For any successful interaction, there must be an exchange of feedback between the participants, whether that feedback comes in the form of speech, sound, facial expressions or even colour. Social robots must be able to read the different social and conversational cues that people use during interactions with each other, and then use these cues to adapt the flow of the exchange [12,56].
It is important for the robot to understand the various conversational cues that humans use; however, it is just as important for human users to understand the different responses from the robot. The robot must express appropriate responses in a way that the human user can easily read and understand. This enables the user to better predict and understand the robot, and thus to adapt so as to interact with it better [12]. Humans use a large selection of non-verbal communication channels in their natural forms of communication, so it is logical that they are partial to interacting with a robot that makes use of the same channels. It is therefore imperative to investigate what types of cues people use in their interactions, and how advances in technology can strengthen human-robot interaction to unfold these potential opportunities fully. There is evidence that cues such as robot gaze can, in fact, lead to better human performance [57]. This means that a robot can also effectively adapt the way it provides help to enhance human-robot interactions [58,59].
The next subsections introduce and describe a few robots that have been successful in both verbal and non-verbal communication. Subsequently, this section discusses some techniques and functionalities currently used to achieve good human-robot interaction, such as face and eye tracking, hand gestures, body language, reading expressions, touch and speech.

3.1. Nao Humanoid Robot

Nao is a humanoid robot created in 2006 by SoftBank Robotics (formerly Aldebaran Robotics) [60] (cf. Figure 2). It has 25 degrees of freedom and is driven by a purpose-built operating system, NAOqi. It is 58 cm tall and has two HD cameras, four directional microphones, ultrasonic sensors, nine tactile sensors, infrared sensors, eight force sensors, a gyroscope, an accelerometer and high-accuracy digital encoders on each of its joints. It is often used in the education sector, from primary schools to universities in over 70 countries, to teach programming, mathematics, control, mechanics, electronics and computer science. It can be programmed using various languages, including Java, C++, C# (.NET), Matlab and Python, as well as the graphical Nao tool Choregraphe, and ships with a full software development kit (SDK). The Nao robot responds to voice commands as well as programmed commands. It understands 19 languages and uses its HD cameras, together with a set of algorithms, to recognise shapes and faces, which enables it to remember who it is talking with.
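As a concrete illustration of Nao’s programmability, the following minimal sketch drives the robot through the NAOqi Python SDK, assuming the SDK is installed and a robot (or simulator) is reachable; the IP address is a placeholder for your own robot’s address.

```python
from naoqi import ALProxy  # NAOqi Python SDK (Python 2.7)

ROBOT_IP, PORT = "192.168.1.10", 9559  # placeholder: your robot's address

tts = ALProxy("ALTextToSpeech", ROBOT_IP, PORT)
motion = ALProxy("ALMotion", ROBOT_IP, PORT)

tts.say("Hello, I am Nao.")   # synthesise speech on the robot
motion.wakeUp()               # stiffen motors and go to a standing posture
motion.moveTo(0.2, 0.0, 0.0)  # walk 0.2 m forward (x, y, theta)
```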
There is a specific group of applications for the Nao robot, called ASK NAO (Autism Solution for Kids): purpose-designed semi-autonomous tasks to help support teachers and carers in educating children with autism, and some tests of these have been carried out [62,63,64]. The tasks prompt the child, encouraging them and giving clues when a wrong answer is given, and rewarding them when a right answer is given; the child then engages in a fun story or dance when the session is over. Teachers and carers can choose tasks based on each child’s personal learning goals, personality and progress, as each child has a personalised programme. The interface is simple enough that caretakers who are not skilled or trained in programming or robotics can use ASK NAO efficiently. The data-processing tools within Nao are used for progress tracking, and carers, teachers and even therapists can share information about the child and their progress through a private school passport.
The Nao robot uses several activities to engage the student. The touch sensors in Nao’s head, hands and feet are used in a game that encourages the student to tell right from left and to identify different body parts; in another game, Nao performs a sequence of gestures and the student has to touch the related sensors in the correct order. The HD cameras are used in a game that encourages the student to find and show Nao a specific image Nao asks for; in another, Nao points to a nearby object and the student has to name it or show Nao a picture of it. Nao uses sounds and different body gestures to help the student learn to recognise different emotions, which is especially important for children with autism, as many of them have problems integrating well into society.

3.2. Pepper Humanoid Robot

From the SoftBank Robotics family, the humanoid robot Pepper (cf. Figure 2) is child-sized at 121 cm tall, weighs 28 kg, is mobile via three omnidirectional wheels and has 20 degrees of freedom from the 17 joints around its body [61]. The body, mostly made of white soft and hard plastic, is equipped with capacitive sensors, a tablet and loudspeakers to aid its interaction abilities. It also has four microphones in its head, two RGB cameras, a 3D sensor behind its eyes, sonar sensors, infrared sensors, laser sensing modules and bumper sensors to help it identify people and objects around it. Pepper speaks with a child-like voice; the voice, body shape and face were all built to be androgynous, with big eyes and a small nose and lips, to look cute and harmless and to avoid stereotypes and unrealistic expectations. It has LED lights in its eyes and ears to aid in communicating emotions. It is currently used to welcome customers in various settings such as SoftBank shops, sushi bars, clothing stores, railway stations and supermarkets, but is also involved in trials in the health-care sector and in care homes. Pepper robots are also used in educational institutions and various robotics competitions. Programs and modules can be transferred from Nao to Pepper, and Pepper is ROS-enabled. Its many joints enable Pepper to move smoothly, which aids the modules that allow the robot to display human-like gestures when communicating.
Pepper has face-tracking and eye-tracking capabilities and basic emotion detection that can be used to decide the robot’s next course of action, including how Pepper responds to the human it is currently engaging with [65,66]. Pepper comes with an Autonomous Life module that enables it to mimic lifelikeness, much like Paro, and also gives it basic awareness capabilities, including spotting and tracking a face. Features of this module include slight hand movements and moving the head around ‘to look’ at things nearby while in a ‘non-engaged’ state. Staying visibly active makes Pepper more approachable and engaging, because it shows that the robot is present and ready to help or interact, and makes it easier to anthropomorphise a personality onto the robot.
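A hedged sketch of how this basic-awareness behaviour can be switched on through NAOqi’s ALBasicAwareness module follows; the IP address is a placeholder, and the exact start call varies with the NAOqi version (startAwareness() in older releases, setEnabled(True) from NAOqi 2.5 onwards).

```python
from naoqi import ALProxy

# Placeholder address for a Pepper on the local network.
awareness = ALProxy("ALBasicAwareness", "192.168.1.11", 9559)

awareness.setEngagementMode("FullyEngaged")  # stay locked on one person
awareness.setTrackingMode("Head")            # track using head movement only
awareness.startAwareness()                   # pre-2.5 API; setEnabled(True) on NAOqi 2.5+
```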
There has been research using Pepper to investigate touch within HRI, with findings suggesting that human subjects prefer to initiate touch with the robot [67,68,69]. The subjects also preferred the robot to look at their face throughout touch interactions, rather than looking at their face, then down at their hands, then back up at their face. More investigation is still needed into why this may be the case, and the experiment should be repeated with more subjects, as it was a within-subject design with only 20 participants [70].
This also raises further questions about gaze within interaction: should the robot look at the human 100% of the interaction time? If not, where should the robot look, and for how long?

3.3. Kismet Humanoid Robot

Not all social robots have the ability to speak coherently, yet they can still interact well with humans. Kismet (cf. Figure 3) is a robot created and developed by researchers, Ph.D. students and undergraduate students at the Massachusetts Institute of Technology (MIT) in the late 1990s [71,72]. The creators were inspired by the way infants develop social skills and learn to communicate with adults; as such, Kismet cannot speak any coherent language but expresses itself through vocal babbles. It uses two wide-field-of-view colour cameras to decide what to pay attention to, and two further colour cameras, located within the pupils of its eyes, for more specific tasks such as eye detection. It has three degrees of freedom in its eyes and three in its neck, and a wireless microphone that can be mounted on the user. Its face has an additional fifteen degrees of freedom: two in its ears, two in its eyebrows, one in each eyelid, four in its lips and one in its jaw. These are used to exhibit at least nine facial expressions, including fatigue, fear, surprise, contentment, disgust, anger, acceptance and unhappiness; in addition, Kismet can display different degrees of these expressions as its face moves from one to another. It also has an articulatory synthesiser that allows Kismet not to speak but to babble like an infant, while also giving it the ability to put personality and emotion into its vocal babbles.
If a robot has the ability to speak, it is important for it to speak well, which includes being able to carry on a conversation. The robot should be able to flexibly adjust the tempo and rhythm of its words during face-to-face interactions, depending on the conversation and the response of the human user. Conversational cues such as facial gestures, for example raising the eyebrows or shifting the gaze, are used to adjust and balance the exchange of speaking turns. When humans speak to each other, there is generally a 0.25 s pause after one speaker finishes before the next begins. Due to certain physical limitations, some robots, such as Kismet with its 0.5 s pause before responding to a human user, have a slower response time when conversing. However, because humans routinely adapt when conversing with humans of different ages (e.g., an adult conversing with a toddler), they are able to adapt to the slightly slower response times of current robots [12].
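The timing just described can be captured in a toy conversation loop. In the sketch below, the listen and reply callables are hypothetical stand-ins for a robot’s speech recogniser and synthesiser; only the pause values come from the discussion above.

```python
import time

HUMAN_PAUSE_S = 0.25   # typical human-to-human gap between speaking turns
KISMET_PAUSE_S = 0.50  # Kismet's reported pause before responding

def converse(listen, reply, pause_s=KISMET_PAUSE_S):
    """Alternate speaking turns, yielding the floor for pause_s seconds."""
    while True:
        utterance = listen()   # block until the user finishes a turn
        if utterance is None:  # no more input: conversation is over
            break
        time.sleep(pause_s)    # brief silence marks the turn change
        reply(utterance)       # now take our speaking turn
```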
Kismet uses the direction of its gaze and its body posture to let the human user know whether it has finished what it has to say. When Kismet has finished speaking and wants the user to respond, it looks into the user’s eyes and leans forward slightly or raises its eyebrows, showing interest in what the person has to say. When Kismet wants to respond, it leans back into a neutral position, adopts a neutral facial expression and may move its gaze away from the person. In user evaluations, people reported feeling more engaged with Kismet when it looked into their eyes rather than just at their faces [12].

3.4. iCub Humanoid Robot

The iCub (cf. Figure 3) is an open-access robot that aims to be a research platform for many, from roboticists to psychologists [74,75]. It is a small humanoid robot, with 52 degrees of freedom and a height of 94 cm. Its legs and feet are designed to support crawling, sitting up and walking, and it carries a binocular vision system along with touch, auditory and inertial sensors.
There has been research using iCub to investigate interactions that adapt to a user’s level of engagement: the engagement level affected the robot’s internal “emotions”, which in turn led the robot to decide the next appropriate level of engagement with the human. This research aims to help create social robots that can give user-personalised interactions. The robot watched for certain actions, and the OpenFace library was used to recognise facial expressions. When iCub wanted to engage, it would straighten up and look for the person; once interaction was initiated, it would point to objects and engage in gaze-cueing. When iCub no longer wanted to interact, it would pull back from the person and look down at its toys, ignoring further attempts to engage it.
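The engagement-driven switching described here can be sketched as a small state machine. The thresholds, smoothing factor and action descriptions below are illustrative assumptions (the study itself used the OpenFace library to estimate engagement from the face).

```python
from enum import Enum

class Mode(Enum):
    SEEK = "straighten up and look for the person"
    INTERACT = "point to objects and engage in gaze-cueing"
    WITHDRAW = "pull back and look down at the toys"

def update(internal_affect, engagement, alpha=0.3):
    # Smooth the observed engagement (0..1) into the internal "emotion",
    # then map the internal state onto the robot's next behaviour mode.
    internal_affect = (1 - alpha) * internal_affect + alpha * engagement
    if internal_affect > 0.6:
        return internal_affect, Mode.INTERACT
    if internal_affect > 0.2:
        return internal_affect, Mode.SEEK
    return internal_affect, Mode.WITHDRAW

affect = 0.5
affect, mode = update(affect, engagement=0.9)  # user looks very engaged
print(mode.value)  # -> "point to objects and engage in gaze-cueing"
```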

4. Conclusions

From the above discussion, it seems that for humans to interact well with a robot, the robot does not necessarily need to look like a human. However, the robot would benefit from some sort of “face”, a focal area at which humans can direct conversation, especially if the facial features can move to better express emotion in a life-like way. This helps human-robot interaction because humans are better able to relate and respond to a robot that gives visual feedback. Such feedback should be similar to that given and received when humans interact with each other, as they constantly read and give out non-verbal cues through body gestures, body posture and facial expressions [76].
The evidence also suggests that social robots do not specifically need the capability to speak a coherent human language; however, the robot must be able to read and portray non-verbal conversational cues well, especially if its speech is limited. If the robot is to speak, it must be able to hold conversations well, understanding and portraying the different verbal and non-verbal cues given by humans, with a natural pause between each speaker. It is important to note that although the robot does not need to interact with the human in the same way humans interact with each other, it must behave and respond to cues in a predictable way that makes sense to the human.
With the advancement of technology, and the growth of open-source technology and platforms such as the Robot Operating System (ROS), there has been further growth in cross-device programming and its supporting communities, and many interesting projects have incorporated different tactics to increase the proficiency of human-robot interaction; a minimal example is sketched below. There are still many ways in which the current crop of social robots can be improved, and many questions about how to make robots better at interacting with humans remain to be asked and investigated. These include, but are not limited to: “Do robots need to sound like humans when they speak?”, “Do different cultures prefer different styles of faces and vocal sounds?”, “Do people prefer interacting with a robot that looks like an animal they know, or a robot that does not look like anything they have seen before?” and “If social robots are designed to look like humans, how realistic do the features have to be?”.
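As a sketch of the cross-device programming that ROS enables, the following minimal ROS 1 (rospy) node publishes velocity commands on the conventional /cmd_vel topic; any ROS-enabled base, Pepper included via its ROS bridge, could consume them. The topic name and speed are illustrative choices, not requirements of any particular robot.

```python
#!/usr/bin/env python
import rospy
from geometry_msgs.msg import Twist

rospy.init_node("simple_driver")
pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)
rate = rospy.Rate(10)  # publish at 10 Hz

cmd = Twist()
cmd.linear.x = 0.1  # creep forward at 0.1 m/s

while not rospy.is_shutdown():
    pub.publish(cmd)  # the same node drives any base that listens here
    rate.sleep()
```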

Author Contributions

E.B.O. conceived the original idea. V.G. supervised and revised the work critically for important intellectual content. Both authors contributed to the final version of the manuscript. All authors have read and agreed to the published version of the manuscript.

Funding

This research was part-funded by the Faculty of Science and Technology, Middlesex University London.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Bartneck, C.; Nomura, T.; Kanda, T.; Suzuki, T.; Kennsuke, K. A cross-cultural study on attitudes towards robots. In Proceedings of the HCI International, Las Vegas, NV, USA, 22–27 July 2005. [Google Scholar]
  2. International Federation of Robotics. Available online: http://www.ifr.org/service-robots (accessed on 12 January 2020).
  3. A Kawasaki Robot for Every Application. Available online: https://robotics.kawasaki.com/en1/ (accessed on 12 January 2020).
  4. FANUC Industrial Robot. Available online: https://www.fanucamerica.com/industrial-solutions (accessed on 12 January 2020).
  5. Denso Robotics EU Brochure. Available online: https://www.densorobotics-europe.com/fileadmin/Products/DENSO_Robotics_Product_Overview_EN.pdf (accessed on 12 January 2020).
  6. Vysocky, A.; Novak, P. Human-Robot collaboration in industry. MM Sci. J. 2016, 9, 903–906. [Google Scholar] [CrossRef]
  7. Freebody, M. The Rise of the Service Robot. Photonics Spectra 2011, 45, 40–42. [Google Scholar]
  8. Bekey, G.; Ambrose, R.; Kumar, V.; Sanderson, A.; Wilcox, B.; Zheng, Y. International Assessment of Research and Development in Robotics. Available online: http://www.wtec.org/robotics/report/screen-robotics-final-report-highres.pdf (accessed on 15 January 2020).
  9. Partner Robot FAMILY. Available online: https://www.toyota-global.com/innovation/partner_robot/robot/ (accessed on 12 January 2020).
  10. Paro Robot Seal Healing Pet. Available online: https://www.japantrendshop.com/paro-robot-seal-healing-pet-p-144.html (accessed on 12 January 2020).
  11. PARO Therapeutic Robot. Available online: http://www.parorobots.com/index.asp (accessed on 12 January 2020).
  12. Breazeal, C. Toward sociable robots. Robot. Auton. Syst. 2003, 42, 167–175. [Google Scholar] [CrossRef]
  13. Buitrago, J.A.; Bolaños, A.M.; Bravo, E.C. A motor learning therapeutic intervention for a child with cerebral palsy through a social assistive robot. Disabil. Rehabil. Assist. Technol. 2019, 2019, 1–6. [Google Scholar] [CrossRef]
  14. Broekens, J.; Heerink, M.; Rosendal, H. Assistive social robots in elderly care: A review. Gerontechnology 2009, 8, 94–103. [Google Scholar] [CrossRef] [Green Version]
  15. Mordoch, E.; Osterreicher, A.; Guse, L.; Roger, K.; Thompson, G. Use of social commitment robots in the care of elderly people with dementia: A literature review. Maturitas 2013, 74, 14–20. [Google Scholar] [CrossRef] [PubMed]
  16. Duffy, B.R. Anthropomorphism and the social robot. Robot. Auton. Syst. 2003, 42, 177–190. [Google Scholar] [CrossRef]
  17. Goetz, J.; Kiesler, S.; Powers, A. Matching robot appearance and behavior to tasks to improve human-robot cooperation. In Proceedings of the 12th IEEE International Workshop on Robot and Human Interactive Communication, Millbrae, CA, USA, 2 November 2003. [Google Scholar]
  18. Breazeal, C. Emotion and sociable humanoid robots. Int. J. Hum. Comput. Stud. 2003, 59, 119–155. [Google Scholar] [CrossRef]
  19. Ackerman, E. Study: Nobody wants social robots that look like humans because they threaten our identity. IEEE Spectr. 2016, 2016, 1–5. [Google Scholar]
  20. Ferrari, F.; Paladino, M.P.; Jetten, J. Blurring human–machine distinctions: Anthropomorphic appearance in social robots as a threat to human distinctiveness. Int. J. Soc. Robot. 2016, 8, 287–302. [Google Scholar] [CrossRef]
  21. Melson, G.F.; Peter, J.K.H.; Beck, A.; Friedman, B. Robotic pets in human lives: Implications for the human–animal bond and for human relationships with personified technologies. J. Soc. Issues 2009, 65, 545–567. [Google Scholar] [CrossRef]
  22. NAO Robot: Intelligent and Friendly Companion. Available online: https://www.softbankrobotics.com/emea/en/nao (accessed on 12 January 2020).
  23. Moon, Y. Sony AIBO: The World’s First Entertainment Robot. Available online: https://store.hbr.org/product/sony-aibo-the-world-s-first-entertainment-robot/502010?sku=502010-PDF-ENG (accessed on 15 January 2020).
  24. Yu, R.; Hui, E.; Lee, J.; Poon, D.; Ng, A.; Sit, K.; Ip, K.; Yeung, F.; Wong, M.; Shibata, T. Use of a therapeutic, socially assistive pet robot (PARO) in improving mood and stimulating social interaction and communication for people with dementia: Study protocol for a randomized controlled trial. JMIR Res. Protoc. 2015, 4, e45. [Google Scholar] [CrossRef] [PubMed]
  25. Bemelmans, R.; Gelderblom, G.J.; Jonker, P.; de Witte, L. The potential of socially assistive robotics in care for elderly, a systematic review. In Human-Robot Personal Relationships. HRPR 2010. Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering; Lamers, M.H., Verbeek, F.J., Eds.; Springer: Berlin, Germany, 2010. [Google Scholar]
  26. Moyle, W.; Jones, C.J.; Murfield, J.E.; Thalib, L.; Beattie, E.R.; Shum, D.K.; O’Dwyer, S.T.; Mervin, M.C.; Draper, B.M. Use of a robotic seal as a therapeutic tool to improve dementia symptoms: A cluster-randomized controlled trial. J. Am. Med. Dir. Assoc. 2017, 18, 766–773. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  27. Petersen, S.; Houston, S.; Qin, H.; Tague, C.; Studley, J. The utilization of robotic pets in dementia care. J. Alzheimer’s Dis. 2017, 55, 569–574. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  28. Dodds, P.; Martyn, K.; Brown, M. Infection prevention and control challenges of using a therapeutic robot. Nurs. Older People 2018, 30, 34–40. [Google Scholar] [CrossRef] [PubMed]
  29. Ienca, M.; Jotterand, F.; Vică, C.; Elger, B. Social and assistive robotics in dementia care: Ethical recommendations for research and practice. Int. J. Soc. Robot. 2016, 8, 565–573. [Google Scholar] [CrossRef]
  30. Hung, L.; Liu, C.; Woldum, E.; Au-Yeung, A.; Berndt, A.; Wallsworth, C.; Horne, N.; Gregorio, M.; Mann, J.; Chaudhury, H. The benefits of and barriers to using a social robot PARO in care settings: A scoping review. BMC Geriatr. 2019, 19, 232. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  31. Baisch, S.; Kolling, T.; Schall, A.; Rühl, S.; Selic, S.; Kim, Z.; Rossberg, H.; Klein, B.; Pantel, J.; Oswald, F. Acceptance of social robots by elder people: Does psychosocial functioning matter? Int. J. Soc. Robot. 2017, 9, 293–307. [Google Scholar] [CrossRef]
  32. Shibata, T. Therapeutic seal robot as biofeedback medical device: Qualitative and quantitative evaluations of robot therapy in dementia care. Proc. IEEE 2012, 100, 2527–2538. [Google Scholar] [CrossRef]
  33. Ouwehand, A.N. The Role of Culture in the Acceptance of Elderly towards Social Assertive Robots: How do Cultural Factors Influence the Acceptance of Elderly People towards Social Assertive Robotics in the Netherlands and Japan? Thesis, University of Twente: Enschede, The Netherlands, 2017. [Google Scholar]
  34. Taherdoost, H. A review of technology acceptance and adoption models and theories. Procedia Manuf. 2018, 22, 960–967. [Google Scholar] [CrossRef]
  35. Klein, B.; Gaedt, L.; Cook, G. Emotional Robots. Available online: https://econtent.hogrefe.com/doi/10.1024/1662-9647/a000085 (accessed on 15 January 2020).
  36. Friedman, B.; Kahn, P.H., Jr.; Hagman, J. Hardware companions? What online AIBO discussion forums reveal about the human-robotic relationship. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Ft. Lauderdale, FL, USA, 5–10 April 2003. [Google Scholar]
  37. Kerepesi, A.; Kubinyi, E.; Jonsson, G.K.; Magnusson, M.S.; Miklosi, A. Behavioural comparison of human–animal (dog) and human–robot (AIBO) interactions. Behav. Process. 2006, 73, 92–99. [Google Scholar] [CrossRef] [PubMed]
  38. Kubinyi, E.; Miklósi, Á.; Kaplan, F.; Gácsi, M.; Topál, J.; Csányi, V. Social behaviour of dogs encountering AIBO, an animal-like robot in a neutral and in a feeding situation. Behav. Process. 2004, 65, 231–239. [Google Scholar] [CrossRef] [PubMed]
  39. Kahn, P.H.; Freier, N.G.; Friedman, B.; Severson, R.L.; Feldman, E.N. Social and moral relationships with robotic others? In Proceedings of the 13th IEEE International Workshop on Robot and Human Interactive Communication (IEEE Catalog no. 04TH8759), Okayama, Japan, 22 September 2004. [Google Scholar]
  40. Johnston, A. Robotic Seals Comfort Dementia Patients but Raise Ethical Concerns; KALW: San Francisco, CA, USA, 2015. [Google Scholar]
  41. Calo, C.J.; Hunt-Bull, N.; Lewis, L.; Metzler, T. Ethical implications of using the paro robot, with a focus on dementia patient care. In Proceedings of the Workshops at the Twenty-Fifth AAAI Conference on Artificial Intelligence, San Francisco, CA, USA, 8 August 2011. [Google Scholar]
  42. Fiske, A.; Henningsen, P.; Buyx, A. Your Robot Therapist Will See You Now: Ethical Implications of Embodied Artificial Intelligence in Psychiatry, Psychology, and Psychotherapy. J. Med. Internet Res. 2019, 21, e13216. [Google Scholar] [CrossRef] [PubMed]
  43. Share, P.; Pender, J. Preparing for a robot future? Social professions, social robotics and the challenges ahead. Irish J. Appl. Soc. Stud. 2018, 18, 4. [Google Scholar]
  44. Wachsmuth, I. Robots like me: Challenges and ethical issues in aged care. Front. Psychol. 2018, 9, 432. [Google Scholar] [CrossRef] [Green Version]
  45. Sharkey, A.; Sharkey, N. Children, the elderly, and interactive robots. IEEE Robot. Autom. Mag. 2011, 18, 32–38. [Google Scholar] [CrossRef]
  46. Sparrow, R.; Sparrow, L. In the hands of machines? The future of aged care. Minds Mach. 2006, 16, 141–161. [Google Scholar] [CrossRef]
  47. Cecil, P.; Glass, N. An exploration of emotional protection and regulation in nurse–patient interactions: The role of the professional face and the emotional mirror. Collegian 2015, 22, 377–385. [Google Scholar] [CrossRef]
  48. Winkle, K.; Caleb-Solly, P.; Turton, A.; Bremner, P. Social robots for engagement in rehabilitative therapies: Design implications from a study with therapists. In Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction, Chicago, IL, USA, 5–8 March 2018. [Google Scholar]
  49. Carrillo, F.M.; Butchart, J.; Kruse, N.; Scheinberg, A.; Wise, L.; McCarthy, C. Physiotherapists’ acceptance of a socially assistive robot in ongoing clinical deployment. In Proceedings of the 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), Nanjing, China, 27–31 August 2018. [Google Scholar]
  50. Correia, F.; Alves-Oliveira, P.; Maia, N.; Ribeiro, T.; Petisca, S.; Melo, F.S.; Paiva, A. Just follow the suit! Trust in human-robot interactions during card game playing. In Proceedings of the 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), New York, NY, USA, 26–31 August 2016. [Google Scholar]
  51. Sanders, T.L.; MacArthur, K.; Volante, W.; Hancock, G.; MacGillivray, T.; Shugars, W.; Hancock, P.A. Trust and prior experience in human-robot interaction. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting, Los Angeles, CA, USA, 20 October 2017. [Google Scholar]
  52. Langer, A.; Feingold-Polak, R.; Mueller, O.; Kellmeyer, P.; Levy-Tzedek, S. Trust in Socially Assistive Robots: Considerations for Use in Rehabilitation. Available online: https://www.ncbi.nlm.nih.gov/pubmed/31348963 (accessed on 15 January 2020).
  53. Beuscher, L.M.; Fan, J.; Sarkar, N.; Dietrich, M.S.; Newhouse, P.A.; Miller, K.F.; Mion, L.C. Socially Assistive Robots: Measuring Older Adults’ Perceptions. J. Gerontol. Nurs. 2017, 43, 35–43. [Google Scholar] [CrossRef]
  54. Karunarathne, D.; Morales, Y.; Nomura, T.; Kanda, T.; Ishiguro, H. Will Older Adults Accept a Humanoid Robot as a Walking Partner? Int. J. Soc. Robot. 2019, 11, 343–358. [Google Scholar] [CrossRef]
  55. Kahn, P.H., Jr.; Kanda, T.; Ishiguro, H.; Gill, B.T.; Ruckert, J.H.; Shen, S.; Gary, H.E.; Reichert, A.L.; Freier, N.G.; Severson, R.L. Do people hold a humanoid robot morally accountable for the harm it causes? In Proceedings of the Seventh Annual ACM/IEEE International Conference on Human-Robot Interaction, Boston, MA, USA, 5–8 March 2012. [Google Scholar]
  56. Mavridis, N. A review of verbal and non-verbal human–robot interactive communication. Robot. Auton. Syst. 2015, 63, 22–35. [Google Scholar] [CrossRef] [Green Version]
  57. Mutlu, B.; Yamaoka, F.; Kanda, T.; Ishiguro, H.; Hagita, N. Nonverbal leakage in robots: Communication of intentions through seemingly unintentional behavior. In Proceedings of the 4th ACM/IEEE International Conference on Human Robot Interaction, La Jolla, CA, USA, 11–13 March 2009. [Google Scholar]
  58. Andriella, A.; Torras, C.; Alenyà, G. Short-Term Human–Robot Interaction Adaptability in Real-World Environments. Int. J. Soc. Robot. 2019, 1–9. Available online: https://link.springer.com/article/10.1007/s12369-019-00606-y (accessed on 12 January 2020).
  59. Andriella, A.; Torras, C.; Alenya, G. Learning Robot Policies Using a High-Level Abstraction Persona-Behaviour Simulator. Available online: https://www.iri.upc.edu/files/scidoc/2224-Learning-Robot-Policies-Using-a-High-Level-Abstraction-Persona-Behaviour-Simulator.pdf (accessed on 15 January 2020).
  60. Find Out More about NAO. Available online: https://www.softbankrobotics.com/us/nao (accessed on 12 January 2020).
  61. Pepper: Softbank Robotics. Available online: https://www.softbankrobotics.com/us/pepper (accessed on 12 January 2020).
  62. Shamsuddin, S.; Yussof, H.; Ismail, L.I.; Mohamed, S.; Hanapiah, F.A.; Zahari, N.I. Initial response in HRI-a case study on evaluation of child with autism spectrum disorders interacting with a humanoid robot Nao. Procedia Eng. 2012, 41, 1448–1455. [Google Scholar] [CrossRef]
  63. Tapus, A.; Peca, A.; Aly, A.; Pop, C.; Jisa, L.; Pintea, S.; Rusu, A.S.; David, D.O. Children with autism social engagement in interaction with Nao, an imitative robot: A series of single case experiments. Interact. Stud. 2012, 13, 315–347. [Google Scholar] [CrossRef]
  64. Shamsuddin, S.; Yussof, H.; Ismail, L.; Hanapiah, F.A.; Mohamed, S.; Piah, H.A.; Zahari, N.I. Initial response of autistic children in human-robot interaction therapy with humanoid robot NAO. In Proceedings of the IEEE 8th International Colloquium on Signal Processing and its Applications, Melaka, Malaysia, 23–25 March 2012. [Google Scholar]
  65. Pandey, A.K.; Gelin, R. A mass-produced sociable humanoid robot: Pepper: The first machine of its kind. IEEE Robot. Autom. Mag. 2018, 25, 40–48. [Google Scholar] [CrossRef]
  66. Aaltonen, I.; Arvola, A.; Heikkilä, P.; Lammi, H. Hello pepper, may i tickle you? Children’s and adults’ responses to an entertainment robot at a shopping mall. In Proceedings of the Companion of the 2017 ACM/IEEE International Conference on Human-Robot Interaction, Vienna, Austria, 6 March 2017. [Google Scholar]
  67. Gardecki, A.; Podpora, M. Experience from the operation of the pepper humanoid robots. In Proceedings of the Applied Electrical Engineering (PAEE), Koscielisko, Poland, 25–30 June 2017. [Google Scholar]
  68. Foster, M.E.; Alami, R.; Gestranius, O.; Lemon, O.; Niemela, M.; Odobez, J.-M.; Pandey, A.K. The MuMMER Project: Engaging Human-Robot Interaction in Real-World Public Spaces. Available online: https://eprints.gla.ac.uk/123307/ (accessed on 15 January 2020).
  69. Tanaka, F.; Isshiki, K.; Takahashi, F.; Uekusa, M.; Sei, R.; Hayashi, K. Pepper learns together with children: Development of an educational application. In Proceedings of the 2015 IEEE-RAS 15th International Conference on Humanoid Robots (Humanoids), Seoul, South Korea, 3–5 November 2015. [Google Scholar]
  70. Hirano, T.; Shiomi, M.; Iio, T.; Kimoto, M.; Nagashio, T.; Tanev, I.; Shimohara, K.; Hagita, N. Communication Cues In a Human-Robot Touch Interaction. Available online: https://www.researchgate.net/publication/310820319_Communication_Cues_in_a_Human-Robot_Touch_Interaction (accessed on 15 January 2020).
  71. Breazeal, C.; Velásquez, J. Toward teaching a robot ‘infant’ using emotive communication acts. In Proceedings of the 1998 Simulated Adaptive Behavior Workshop on Socially Situated Intelligence, Zurich, Switzerland, 17–21 August 1998. [Google Scholar]
  72. Breazeal, C. Early Experiments Using Motivations to Regulate Human-Robot Interaction. 1998. Available online: http://robotic.media.mit.edu/wp-content/uploads/sites/7/2015/01/Breazeal-AAAI-98-early.pdf (accessed on 12 January 2020).
  73. Kismet, The Robot. Available online: http://www.ai.mit.edu/projects/sociable/baby-bits.html (accessed on 12 January 2020).
  74. Metta, G.; Natale, L.; Nori, F.; Sandini, G.; Vernon, D.; Fadiga, L.; von Hofsten, C.; Rosander, K.; Lopes, M.; Santos-Victor, J. The iCub humanoid robot: An open-systems platform for research in cognitive development. Neural Netw. 2010, 23, 1125–1134. [Google Scholar] [CrossRef] [PubMed]
  75. Sandini, G.; Metta, G.; Vernon, D. The iCub Cognitive Humanoid Robot: An Open-System Research Platform for Enactive Cognition. In 50 Years of Artificial Intelligence, Lecture Notes in Computer Science; Lungarella, M., Iida, F., Bongard, J., Pfeifer, R., Eds.; Springer: Berlin/Heidelberg, Germany, 2007; Volume 4850. [Google Scholar]
  76. Frith, C. Role of facial expressions in social interactions. Philos. Trans. R. Soc. B Biol. Sci. 2009, 364, 3453–3458. [Google Scholar] [CrossRef] [Green Version]
Figure 1. (a) AIBO robot [23]; (b) Paro robot [10].
Figure 2. (a) Nao robot (Aldebaran Robotics) [60]; (b) Pepper (SoftBank Robotics) [61].
Figure 3. (a) Kismet robot (source: [73]); (b) iCub [74].
