A Survey of Behavioral Models for Social Robots
Abstract
1. Introduction
- Cognitive architectures—This term refers to research works that describe both abstract models of cognition and software instantiations of such models, as employed in the field of artificial intelligence [26]. Cognitive architectures play the fundamental role of enabling artificial intelligence in robotic agents, so that they can exhibit intelligent behaviors.
- Behavioral adaptation—Behavioral adaptation is defined as “learning new tasks and to adapt to changes in environmental conditions, or to failures in sensors and/or actuators” [27]. Thus, the papers included in this group describe a robot’s social abilities enhanced by its capability to adapt its behavior to the user’s needs and habits [28].
- Empathy—Empathy is defined as “the act of perceiving, understanding, experiencing, and responding to the emotional state and ideas of another person” [29]. In human-human relationships, this term denotes the capacity to take the role of the other and to adopt alternative perspectives [28]. Works clustered in this category place particular emphasis on attempts to reproduce this ability in robotic agents, so as to establish an empathetic connection with the user and thereby improve Human-Robot Interaction (HRI). Empathy is a sub-category of behavioral adaptation; however, we decided to keep it as a separate category to remain aligned with some recent papers [17,30,31].
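Many of the behavioral-adaptation works surveyed below use reinforcement learning to let a robot adjust its social behavior from user feedback. As a minimal illustration of this idea only (not taken from any surveyed system; the states, actions, and reward model here are invented for the sketch), a tabular Q-learning loop in Python:

```python
import random

# Illustrative sketch of behavioral adaptation via tabular Q-learning:
# the robot picks among a few social behaviors and adapts to simulated
# user feedback. States, actions, and rewards are invented, not drawn
# from any surveyed system.

ACTIONS = ["greet", "remind", "stay_quiet"]
STATES = ["user_busy", "user_idle"]
ALPHA, GAMMA, EPSILON = 0.5, 0.9, 0.1

q = {(s, a): 0.0 for s in STATES for a in ACTIONS}

def simulated_feedback(state, action):
    """Stand-in for real user feedback: +1 if the behavior fits the context."""
    if state == "user_busy":
        return 1.0 if action == "stay_quiet" else -1.0
    return 1.0 if action == "greet" else -1.0

def choose_action(state):
    if random.random() < EPSILON:                      # explore occasionally
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: q[(state, a)])   # otherwise exploit

random.seed(0)
for _ in range(500):
    state = random.choice(STATES)
    action = choose_action(state)
    reward = simulated_feedback(state, action)
    next_state = random.choice(STATES)                 # toy transition model
    best_next = max(q[(next_state, a)] for a in ACTIONS)
    q[(state, action)] += ALPHA * (reward + GAMMA * best_next - q[(state, action)])

# After training, the greedy policy reflects the user's preferences.
policy = {s: max(ACTIONS, key=lambda a: q[(s, a)]) for s in STATES}
print(policy)
```

In deployed systems the reward signal comes from explicit or implicit user feedback rather than a hard-coded function, and the state space is built from perception (see Sections 4.1 and 4.2).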
2. Materials and Methods
Study Selection Procedures
3. Results
3.1. Application Overview
3.2. Data Abstraction
3.3. Theoretical Works on the Development of Robotics Behavioral Models
3.3.1. Concepts for the Cognitive Application Area
3.3.2. Concepts for the Empathy Area
3.3.3. Concepts for Behavioral Adaptation Area
3.4. Experimental Works on the Development and Implementation of the Behavioral Model
3.4.1. Experimental Works for Cognitive Architectures
3.4.2. Experimental Works on Empathy
3.4.3. Experimental Works on Behavioral Adaptation
4. Discussion
4.1. Sensors Technology
4.2. Perception and Learning from the User
4.3. Architecture Design
4.4. Experimental Phase
4.5. Ethical, Legal, and Social Aspects
5. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
- Goodrich, M.A.; Schultz, A.C. Human-Robot Interaction: A Survey. Found. Trends Hum. Comput. Interact. 2008, 1, 203–275. [Google Scholar] [CrossRef]
- Conti, D.; Trubia, G.; Buono, S.; Di Nuovo, S.; Di Nuovo, A. Evaluation of a robot-assisted therapy for children with autism and intellectual disability. In Proceedings of the Annual Conference towards Autonomous Robotic Systems, Bristol, UK, 21 July 2018. [Google Scholar]
- Cavallo, F.; Aquilano, M.; Bonaccorsi, M.; Limosani, R.; Manzi, A.; Carrozza, M.C.; Dario, P. Improving Domiciliary Robotic Services by Integrating the ASTRO Robot in an AmI Infrastructure; Springer International Publishing: Cham, Switzerland, 2014; pp. 267–282. [Google Scholar]
- Sancarlo, D.; D’Onofrio, G.; Oscar, J.; Ricciardi, F.; Casey, D.; Murphy, K.; Giuliani, F.; Greco, A. MARIO Project: A Multicenter Survey About Companion Robot Acceptability in Caregivers of Patients with Dementia; Springer International Publishing: Cham, Switzerland, 2017; pp. 311–336. [Google Scholar]
- Loi, S.M.; Bennett, A.; Pearce, M.; Nguyen, K.; Lautenschlager, N.T.; Khosla, R.; Velakoulis, D. A pilot study exploring staff acceptability of a socially assistive robot in a residential care facility that accommodates people under 65 years old. Int. Psychogeriatr. 2018, 30, 1075–1080. [Google Scholar] [CrossRef] [PubMed]
- Limosani, R.; Manzi, A.; Fiorini, L.; Cavallo, F.; Dario, P. Enabling Global Robot Navigation Based on a Cloud Robotics Approach. Int. J. Soc. Robot. 2016, 8, 371–380. [Google Scholar] [CrossRef] [Green Version]
- Gerłowska, J.; Skrobas, U.; Grabowska-Aleksandrowicz, K.; Korchut, A.; Szklener, S.; Szczȩśniak-Stańczyk, D.; Tzovaras, D.; Rejdak, K. Assessment of perceived attractiveness, usability, and societal impact of a multimodal Robotic Assistant for aging patients with memory impairments. Front. Neurol. 2018, 9, 392. [Google Scholar] [CrossRef] [PubMed]
- Reppou, S.; Karagiannis, G. Progress in Automation, Robotics and Measuring Techniques; Springer International Publishing: Cham, Switzerland, 2015; Volume 352, pp. 233–234. [Google Scholar]
- Cesta, A.; Cortellessa, G.; Orlandini, A.; Tiberio, L. Long-Term Evaluation of a Telepresence Robot for the Elderly: Methodology and Ecological Case Study. Int. J. Soc. Robot. 2016, 8, 421–441. [Google Scholar] [CrossRef] [Green Version]
- Fasola, J.; Mataric, M. A Socially Assistive Robot Exercise Coach for the Elderly. J. Hum. Robot. Interact. 2013, 2, 3–32. [Google Scholar] [CrossRef] [Green Version]
- Fiorini, L.; Esposito, R.; Bonaccorsi, M.; Petrazzuolo, C.; Saponara, F.; Giannantonio, R.; Petris, G.D.; Dario, P.; Cavallo, F. Enabling personalised medical support for chronic disease management through a hybrid robot-cloud approach. Auton. Robot. 2017, 41, 1263–1276. [Google Scholar] [CrossRef]
- Burke, N.; Dautenhahn, K.; Saunders, J.; Koay, K.L.; Syrdal, D.S. “Teach Me–Show Me”—End-User Personalization of a Smart Home and Companion Robot. IEEE Trans. Hum. Mach. Syst. 2015, 46, 27–40. [Google Scholar]
- Chance, G.; Jevtic, A.; Caleb-Solly, P.; Alenya, G.; Torras, C.; Dogramadzi, S. “Elbows Out” - Predictive Tracking of Partially Occluded Pose for Robot-Assisted Dressing. IEEE Robot. Autom. Lett. 2018, 3, 3598–3605. [Google Scholar] [CrossRef]
- Woiceshyn, L.; Wang, Y.; Nejat, G. A Socially Assistive Robot to Help with Getting Dressed: Clothing Recommendation System. IEEE Robot. Autom. Lett. 2018, 4, 13–14. [Google Scholar]
- Turchetti, G.; Micera, S.; Cavallo, F.; Odetti, L.; Dario, P. Technology and innovative services. IEEE Pulse 2011, 2, 27–35. [Google Scholar] [CrossRef] [PubMed]
- García-Soler, Á.; Facal, D.; Díaz-Orueta, U.; Pigini, L.; Blasi, L.; Qiu, R. Inclusion of service robots in the daily lives of frail older users: A step-by-step definition procedure on users’ requirements. Arch. Gerontol. Geriatr. 2018, 74, 191–196. [Google Scholar] [CrossRef] [PubMed]
- Asada, M. Towards Artificial Empathy: How Can Artificial Empathy Follow the Developmental Pathway of Natural Empathy? Int. J. Soc. Robot. 2015, 7, 19–33. [Google Scholar] [CrossRef]
- Cross, E.S.; Hortensius, R.; Wykowska, A. From social brains to social robots: Applying neurocognitive insights to human-robot interaction. Philos. Trans. R. Soc. B Biol. Sci. 2019, 374, 5–8. [Google Scholar] [CrossRef] [PubMed]
- Chidambaram, V.; Chiang, Y.-H.; Mutlu, B. Designing Persuasive Robots: How Robots Might Persuade People Using Vocal and Nonverbal Cues. In Proceedings of the 7th Annual ACM/IEEE International Conference on Human-Robot Interaction (HRI ’12), New York, NY, USA, 5–8 March 2012; pp. 293–300. [Google Scholar]
- Cavallo, F.; Semeraro, F.; Fiorini, L.; Magyar, G.; Sinčák, P.; Dario, P. Emotion Modelling for Social Robotics Applications: A Review. J. Bionic Eng. 2018, 15, 185–203. [Google Scholar] [CrossRef]
- Hall, E.T. The Hidden Dimension; Doubleday: Garden City, NY, USA, 1966. [Google Scholar]
- Wiltshire, T.J.; Smith, D.C.; Keebler, J.R. Cybernetic Teams: Towards the Implementation of Team Heuristics in HRI; Springer International Publishing: Berlin, Germany, 2013; pp. 321–330. [Google Scholar]
- Wiltshire, T.J.; Warta, S.F.; Barber, D.; Fiore, S.M. Enabling robotic social intelligence by engineering human social-cognitive mechanisms. Cogn. Syst. Res. 2017, 43, 190–207. [Google Scholar] [CrossRef]
- Kato, Y.; Kanda, T.; Ishiguro, H. May I help you? In Proceedings of the 10th Annual ACM/IEEE International Conference on Human-Robot Interaction (HRI 2015), New York, NY, USA, 3–5 March 2015; pp. 35–42. [Google Scholar]
- Vircikova, M.; Magyar, G.; Sincak, P. The affective loop: A tool for autonomous and adaptive emotional human-robot interaction. Adv. Intell. Syst. Comput. 2015, 345, 247–254. [Google Scholar]
- Lieto, A.; Bhatt, M.; Oltramari, A.; Vernon, D. The role of cognitive architectures in general artificial intelligence. Cogn. Syst. Res. 2018, 48, 1–3. [Google Scholar] [CrossRef] [Green Version]
- Silva, F.; Correia, L.; Christensen, A.L. Evolutionary online behaviour learning and adaptation in real robots. R. Soc. Open Sci. 2017, 4, 160938. [Google Scholar] [CrossRef]
- Tapus, A.; Aly, A. User adaptable robot behavior. In Proceedings of the 2011 International Conference on Collaboration Technologies and Systems (CTS 2011), Philadelphia, PA, USA, 23–27 May 2011; pp. 165–167. [Google Scholar]
- Cuff, B.M.P.; Brown, S.J.; Taylor, L.; Howat, D.J. Empathy: A review of the concept. Emot. Rev. 2014, 8, 144–153. [Google Scholar] [CrossRef]
- Lewandowska-Tomaszczyk, B.; Wilson, P.A. Compassion, empathy and sympathy expression features in affective robotics. In Proceedings of the 2016 7th IEEE International Conference on Cognitive Infocommunications (CogInfoCom), Wroclaw, Poland, 16–18 October 2016; pp. 65–70. [Google Scholar]
- Damiano, L.; Dumouchel, P.; Lehmann, H. Artificial Empathy: An Interdisciplinary Investigation. Int. J. Soc. Robot. 2015, 7, 3–5. [Google Scholar] [CrossRef]
- Sonntag, D. Persuasive AI Technologies for Healthcare Systems. In Proceedings of the 2016 AAAI Fall Symposium Series, The Westin Arlington Gateway, Arlington, Virginia, 17–19 November 2016; pp. 165–168. [Google Scholar]
- Raymundo, C.R.; Johnson, C.G.; Vargas, P.A. An architecture for emotional and context-aware associative learning for robot companions. In Proceedings of the 2015 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), Kobe, Japan, 31 August–4 September 2015; pp. 31–36. [Google Scholar]
- Franchi, A.M.; Mutti, F.; Gini, G. From learning to new goal generation in a bioinspired robotic setup. Adv. Robot. 2016, 1864, 795–805. [Google Scholar] [CrossRef]
- Pieters, R.; Racca, M.; Veronese, A.; Kyrki, V. Human-aware interaction: A memory-inspired artificial cognitive architecture. Cogn. Robot. Archit. 2017, 1855, 38–39. [Google Scholar]
- Cutsuridis, V.; Taylor, J.G. A Cognitive Control Architecture for the Perception–Action Cycle in Robots and Agents. Cognit. Comput. 2013, 5, 383–395. [Google Scholar] [CrossRef]
- Haazebroek, P.; Van Dantzig, S.; Hommel, B. A computational model of perception and action for cognitive robotics. Cogn. Process. 2011, 12, 355–365. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Farahmand, A.M.; Ahmadabadi, M.N.; Lucas, C.; Araabi, B.N. Interaction of culture-based learning and cooperative co-evolution and its application to automatic behavior-based system design. IEEE Trans. Evol. Comput. 2010, 14, 23–57. [Google Scholar] [CrossRef]
- Vitale, J.; Williams, M.A.; Johnston, B.; Boccignone, G. Affective facial expression processing via simulation: A probabilistic model. Biol. Inspired Cogn. Archit. 2014, 10, 30–41. [Google Scholar] [CrossRef] [Green Version]
- Chen, L.F.; Liu, Z.T.; Wu, M.; Dong, F.Y.; Yamazaki, Y.; Hirota, K. Multi-robot behavior adaptation to local and global communication atmosphere in humans-robots interaction. J. Multimodal User Interfaces 2014, 8, 289–303. [Google Scholar] [CrossRef]
- Asprino, L.; Nuzzolese, A.G.; Russo, A.; Gangemi, A.; Presutti, V.; Nolfi, S. An ontology design pattern for supporting behaviour arbitration in cognitive agents. Adv. Ontol. Des. Patterns 2017, 32, 85–95. [Google Scholar]
- Wan, J.; Tang, S.; Hua, Q.; Li, D.; Liu, C.; Lloret, J. Context-aware cloud robotics for material handling in cognitive industrial Internet of Things. IEEE Internet Things J. 2018, 5, 2272–2281. [Google Scholar] [CrossRef]
- Looije, R.; Neerincx, M.A.; Cnossen, F. Persuasive robotic assistant for health self-management of older adults: Design and evaluation of social behaviors. Int. J. Hum. Comput. Stud. 2010, 68, 386–397. [Google Scholar] [CrossRef]
- Ficocelli, M.; Terao, J.; Nejat, G. Promoting interactions between humans and robots using robotic emotional behavior. IEEE Trans. Cybern. 2015, 46, 2911–2923. [Google Scholar] [CrossRef] [PubMed]
- Cao, H.L.; Van de Perre, G.; Kennedy, J.; Senft, E.; Esteban, P.G.; De Beir, A.; Simut, R.; Belpaeme, T.; Lefeber, D.; Vanderborght, B. A personalized and platform-independent behavior control system for social robots in therapy: Development and applications. IEEE Trans. Cogn. Dev. Syst. 2018, 8920, 1–13. [Google Scholar] [CrossRef]
- Karami, A.B.; Sehaba, K.; Encelle, B. Adaptive artificial companions learning from users’ feedback. Adapt. Behav. 2016, 24, 69–86. [Google Scholar] [CrossRef]
- De Greeff, J.; Belpaeme, T.; Bongard, J. Why robots should be social: Enhancing machine learning through social human-robot interaction. PLoS ONE 2015, 10, e0138061. [Google Scholar] [CrossRef] [PubMed]
- Garzotto, F.; Gelsomini, M.; Kinoe, Y. Puffy: A Mobile Inflatable Interactive Companion for Children with Neurodevelopmental Disorder. In Proceedings of the IFIP Conference on Human-Computer Interaction, Bombay, India, 25–29 September 2017; pp. 467–492. [Google Scholar]
- Chan, J.; Nejat, G. Social intelligence for a robot engaging people in cognitive training activities. Int. J. Adv. Robot. Syst. 2012, 9, 113. [Google Scholar] [CrossRef]
- Liu, X.; Xie, L.; Wang, Z. Empathizing with emotional robot based on cognition reappraisal. China Commun. 2017, 14, 100–113. [Google Scholar] [CrossRef]
- Sirithunge, H.P.C.; Viraj, M.A.; Muthugala, J.; Buddhika, A.G.; Jayasekara, P.; Chandima, D.P. Interpretation of interaction demanding of a user based on nonverbal behavior in a domestic environment. In Proceedings of the 2017 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE), Naples, Italy, 9–12 July 2017. [Google Scholar]
- Kim, S.; Yu, Z.; Lee, M. Understanding human intention by connecting perception and action learning in artificial agents. Neural Netw. 2017, 92, 29–38. [Google Scholar] [CrossRef]
- Dağlarlı, E.; Dağlarlı, S.F.; Günel, G.Ö.; Köse, H. Improving human-robot interaction based on joint attention. Appl. Intell. 2017, 47, 62–82. [Google Scholar] [CrossRef]
- Boucenna, S.; Gaussier, P.; Andry, P.; Hafemeister, L. A Robot Learns the Facial Expressions Recognition and Face/Non-face Discrimination Through an Imitation Game. Int. J. Soc. Robot. 2014, 6, 633–652. [Google Scholar] [CrossRef]
- Boucenna, S.; Gaussier, P.; Hafemeister, L. Development of first social referencing skills: Emotional interaction as a way to regulate robot behavior. IEEE Trans. Auton. Ment. Dev. 2014, 6, 42–55. [Google Scholar] [CrossRef]
- Granata, C.; Bidaud, P. A framework for the design of person following behaviors for social mobile robots. In Proceedings of the 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, Vilamoura, Portugal, 7–12 October 2012; pp. 4652–4659. [Google Scholar]
- Feil-Seifer, D.; Matarić, M. People-aware navigation for goal-oriented behavior involving a human partner. In Proceedings of the 2011 IEEE International Conference on Development and Learning (ICDL), Frankfurt am Main, Germany, 24–27 August 2011. [Google Scholar]
- Rodić, A.D.; Jovanović, M.D. How to Make Robots Feel and Social as Humans Building attributes of artificial emotional intelligence with robots of human-like behavior. In Proceedings of the 6th IARIA International Conference on Advanced Cognitive Technologies and Applications, Venice, Italy, 25–29 May 2014; pp. 133–139. [Google Scholar]
- Bethel, C.L.; Henkel, Z.; Eakin, D.K.; May, D.C.; Pilkinton, M. Moving toward an intelligent interactive social engagement framework for information gathering. In Proceedings of the 2017 IEEE 15th International Symposium on Applied Machine Intelligence and Informatics (SAMI), Herl’any, Slovakia, 26–28 January 2017; pp. 21–26. [Google Scholar]
- Infantino, I.; Augello, A.; Maniscalco, U.; Pilato, G.; Vella, F. A Cognitive Architecture for Social Robots. In Proceedings of the 2018 IEEE 4th International Forum on Research and Technology for Society and Industry (RTSI), Palermo, Italy, 10–13 September 2018. [Google Scholar]
- Rodić, A.; Urukalo, D.; Vujović, M.; Spasojević, S.; Tomić, M.; Berns, K.; Al-Darraji, S.; Zafar, Z. Embodiment of human personality with EI-Robots by mapping behaviour traits from live-model. Adv. Intell. Syst. Comput. 2017, 540, 438–448. [Google Scholar]
- Sarathy, V.; Scheutz, M. A Logic-Based Computational Framework for Inferring Cognitive Affordances. IEEE Trans. Cogn. Dev. Syst. 2018, 10, 26–43. [Google Scholar] [CrossRef]
- Awaad, I.; Kraetzschmar, G.K.; Hertzberg, J. The Role of Functional Affordances in Socializing Robots. Int. J. Soc. Robot. 2015, 7, 421–438. [Google Scholar] [CrossRef]
- Chumkamon, S.; Hayashi, E.; Koike, M. Intelligent emotion and behavior based on topological consciousness and adaptive resonance theory in a companion robot. Biol. Inspired Cogn. Archit. 2016, 18, 51–67. [Google Scholar] [CrossRef]
- Mart, F. Practical aspects of deploying Robotherapy systems. In Proceedings of the ROBOT 2017: Third Iberian Robotics Conference, Sevilla, Spain, 22–24 November 2018. [Google Scholar]
- Ugur, E.; Nagai, Y.; Sahin, E.; Oztop, E. Staged development of robot skills: Behavior formation, affordance learning and imitation with motionese. IEEE Trans. Auton. Ment. Dev. 2015, 7, 119–139. [Google Scholar] [CrossRef]
- Masuyama, N.; Loo, C.K.; Seera, M. Personality affected robotic emotional model with associative memory for human-robot interaction. Neurocomputing 2018, 272, 213–225. [Google Scholar] [CrossRef]
- Barros, P.; Jirak, D.; Weber, C.; Wermter, S. Multimodal emotional state recognition using sequence-dependent deep hierarchical features. Neural Netw. 2015, 72, 140–151. [Google Scholar] [CrossRef] [Green Version]
- Lemaignan, S.; Warnier, M.; Sisbot, E.A.; Clodic, A.; Alami, R. Artificial cognition for social human–robot interaction: An implementation. Artif. Intell. 2017, 247, 45–69. [Google Scholar] [CrossRef]
- Romay, A.; Kohlbrecher, S.; Stumpf, A.; von Stryk, O.; Maniatopoulos, S.; Kress-Gazit, H.; Schillinger, P.; Conner, D.C. Collaborative Autonomy between High-level Behaviors and Human Operators for Remote Manipulation Tasks using Different Humanoid Robots. J. Field Robot. 2017, 34, 333–358. [Google Scholar] [CrossRef]
- Maeda, G.J.; Neumann, G.; Ewerton, M.; Lioutikov, R.; Kroemer, O.; Peters, J. Probabilistic movement primitives for coordination of multiple human–robot collaborative tasks. Auton. Robot. 2017, 41, 593–612. [Google Scholar] [CrossRef]
- Modares, H.; Ranatunga, I.; Lewis, F.L.; Popa, D.O. Optimized Assistive Human-Robot Interaction Using Reinforcement Learning. IEEE Trans. Cybern. 2016, 46, 655–667. [Google Scholar] [CrossRef] [PubMed]
- Alemi, M.; Meghdari, A.; Ghazisaedy, M. The Impact of Social Robotics on L2 Learners’ Anxiety and Attitude in English Vocabulary Acquisition. Int. J. Soc. Robot. 2015, 7, 523–535. [Google Scholar] [CrossRef]
- Di Nuovo, A.; Broz, F.; Wang, N.; Belpaeme, T.; Cangelosi, A.; Jones, R.; Esposito, R.; Cavallo, F.; Dario, P. The multi-modal interface of Robot-Era multi-robot services tailored for the elderly. Intell. Serv. Robot. 2018, 11, 109–126. [Google Scholar] [CrossRef]
- Ghorbandaei Pour, A.; Taheri, A.; Alemi, M.; Meghdari, A. Human–Robot Facial Expression Reciprocal Interaction Platform: Case Studies on Children with Autism. Int. J. Soc. Robot. 2018, 10, 179–198. [Google Scholar] [CrossRef]
- Jones, A.; Castellano, G. Adaptive Robotic Tutors that Support Self-Regulated Learning: A Longer-Term Investigation with Primary School Children. Int. J. Soc. Robot. 2018, 10, 357–370. [Google Scholar] [CrossRef] [Green Version]
- Moulin-Frier, C.; Fischer, T.; Petit, M.; Pointeau, G.; Puigbo, J.Y.; Pattacini, U.; Low, S.C.; Camilleri, D.; Nguyen, P.; Hoffmann, M.; et al. DAC-h3: A Proactive Robot Cognitive Architecture to Acquire and Express Knowledge about the World and the Self. IEEE Trans. Cogn. Dev. Syst. 2018, 10, 1005–1022. [Google Scholar] [CrossRef]
- Costa, S.; Lehmann, H.; Dautenhahn, K.; Robins, B.; Soares, F. Using a Humanoid Robot to Elicit Body Awareness and Appropriate Physical Interaction in Children with Autism. Int. J. Soc. Robot. 2015, 7, 265–278. [Google Scholar] [CrossRef]
- Conti, D.; Di Nuovo, S.; Buono, S.; Di Nuovo, A. Robots in Education and Care of Children with Developmental Disabilities: A Study on Acceptance by Experienced and Future Professionals. Int. J. Soc. Robot. 2017, 9, 51–62. [Google Scholar] [CrossRef]
- Horii, T.; Nagai, Y.; Asada, M. Imitation of human expressions based on emotion estimation by mental simulation. Paladyn J. Behav. Robot. 2016, 7, 40–54. [Google Scholar] [CrossRef]
- Bechade, L.; Dubuisson-Duplessis, G.; Pittaro, G.; Garcia, M.; Devillers, L. Towards metrics of evaluation of pepper robot as a social companion for the elderly. Lect. Notes Electr. Eng. 2019, 510, 89–101. [Google Scholar]
- Liu, K.; Picard, R. Embedded empathy in continuous, interactive health assessment. In Proceedings of the CHI Workshop on HCI Challenges in Health Assessment, Portland, Oregon, 2–7 April 2005. [Google Scholar]
- Chen, L.; Wu, M.; Zhou, M.; She, J.; Dong, F.; Hirota, K. Information-Driven Multirobot Behavior Adaptation to Emotional Intention in Human-Robot Interaction. IEEE Trans. Cogn. Dev. Syst. 2018, 10, 647–658. [Google Scholar] [CrossRef]
- Admoni, H.; Scassellati, B. Nonverbal Behavior Modeling for Socially Assistive Robots. In Proceedings of the 2014 AAAI Fall Symposium Series, Arlington, VA, USA, 24 September 2014. [Google Scholar]
- Ham, J.; Cuijpers, R.H.; Cabibihan, J.J. Combining Robotic Persuasive Strategies: The Persuasive Power of a Storytelling Robot that Uses Gazing and Gestures. Int. J. Soc. Robot. 2015, 7, 479–487. [Google Scholar] [CrossRef] [Green Version]
- Martins, G.S.; Santos, L.; Dias, J. BUM: Bayesian User Model for Distributed Learning of User Characteristics from Heterogeneous Information. IEEE Trans. Cogn. Dev. Syst. 2018. [Google Scholar] [CrossRef]
- Billard, A.; Siegwart, R. Robot learning from demonstration. Robot. Auton. Syst. 2004, 47, 65–67. [Google Scholar] [CrossRef]
- Di Nuovo, A.; Jay, T. Development of numerical cognition in children and artificial systems: A review of the current knowledge and proposals for multi-disciplinary research. Cogn. Comput. Syst. 2019, 1, 2–11. [Google Scholar] [CrossRef]
- Martins, G.S.; Santos, L.; Dias, J. User-Adaptive Interaction in Social Robots: A Survey Focusing on Non-physical Interaction. Int. J. Soc. Robot. 2019, 11, 185–205. [Google Scholar] [CrossRef]
- Clarke, R. The regulation of civilian drones’ impacts on behavioural privacy. Comput. Law Secur. Rev. 2014, 30, 286–305. [Google Scholar] [CrossRef]
Research Keywords |
---|
(cult*) AND (adapt*) AND (behavio*) AND (model* OR system*) AND (robot*) |
(cult*) AND (adapt*) AND (cognitive) AND (model* OR architecture*) AND (robot*) |
affordance* AND (behavio*) AND (adapt*) AND (robot*) |
affordance* AND (cognitive) AND (model* OR architecture*) AND (robot*) |
fac* AND expression AND cognitive AND (model* OR architecture*) AND (robot*) |
fac* AND expression AND (behavio*) AND (model* OR system*) AND (robot*) |
cognitive AND robot* AND architecture* |
learning AND assistive AND robot* |
affective AND robot* AND behavio* |
empathy AND social AND robot* |
Ref. | Title and Year | Aim | Robot | Social Cues | Models | Area |
---|---|---|---|---|---|---|
[32] | Persuasive AI Technologies for Healthcare Systems (2016) | Development of an IoT toolbox toward AI-based persuasive technologies for healthcare systems | NAO | Gaze behavior, facial behavior, speech cues | A system composed of several persuasive AI components | Cognitive architectures |
[23] | Enabling robotic social intelligence by engineering human social-cognitive mechanisms (2017) | Suggestions and overviews on social cognitive mechanisms | - | - | A model that takes into consideration social-cognitive mechanisms to facilitate the design of robots | Cognitive architectures |
[33] | An Architecture for Emotional and Context-Aware Associative Learning for Robot Companions (2015) | Theoretical architectural model based on the brain’s fear learning system | - | Environmental cues | Theoretical architectural model that uses artificial neural networks (ANNs) representing sensory thalamus, sensory cortex, amygdala, and orbitofrontal cortex. | Cognitive architectures |
[34] | From learning to new goal generation in a bio-inspired robotic setup (2016) | Imitation of the neural plasticity, the property of the cerebral cortex supporting learning | NAO | Eye behavior | A model called Intentional Distributed Robotic Architecture (IDRA) that takes inspiration from the amygdala-thalamo-cortical circuit in the brain at its functional level | Cognitive architectures |
[35] | Human-aware interaction: A memory-inspired artificial cognitive architecture (2017) | Human-aware cognitive architecture to support HRI | Care-O-Bot 4 | Eyes and vocal behaviors | Partially observable Markov decision process (POMDP) model | Cognitive architectures |
[31] | Artificial Empathy: An Interdisciplinary Investigation (2015) | Overview of the research field aimed at building emotional and empathic robots, focusing on its main characteristics and ongoing transformations | - | Gestures and posture cues | - | Empathy |
[17] | How can artificial empathy follow the developmental pathway of natural empathy? (2015) | A conceptual model of artificial empathy is proposed and discussed with respect to several existing studies | Emotional communication robot, WAMOEBA - WE humanoid robot | - | Conceptual model of artificial empathy | Empathy |
[36] | A Cognitive Control Architecture for the Perception–Action Cycle in Robots and Agents (2013) | Shows how visual perception, recognition, attention, cognitive control, value attribution, decision-making, affordances, and action can be melded together coherently in a cognitive control architecture of the perception–action cycle for visually guided reaching and grasping of objects by a robot or an agent | - | - | Model composed of modules for object localization and recognition, cognitive control, decision-making and value attribution, affordances, and motion planning | Cognitive architectures |
[37] | A computational model of perception and action for cognitive robotics (2011) | A novel computational cognitive model that allows for direct interaction between perception and action as well as for cognitive control, demonstrated by task-related attentional influences. | - | - | HiTEC architecture composed of a task level, a feature level, and a sensory motor level | Cognitive architectures |
[38] | Interaction of culture-based learning and cooperative co-evolution and its application to automatic behavior-based system design (2010) | A bio-inspired hybridization of reinforcement learning, cooperative co-evolution, and a cultural-inspired memetic algorithm for the automatic development of behavior-based agents | - | - | Reinforcement learning for structure learning that finds the organization of behavior modules during the agent’s lifetime | Behavioral adaptation |
[39] | Affective Facial Expression Processing via Simulation: A Probabilistic Model (2014) | Simulation Theory and neuroscience findings on Mirror-Neuron Systems are used as the basis for a novel computational model for handling affective facial expressions | - | Postural and vocal cues | Probabilistic computational model based on Simulation Theory and neuroscience findings on Mirror-Neuron Systems | Cognitive architectures |
[40] | Multi-robot behavior adaptation to local and global communication atmosphere in humans-robots interaction (2014) | A multi-robot behavior adaptation mechanism based on cooperative–neutral–competitive fuzzy-Q learning is developed in a robot | - | Eyes cues | Cooperative–neutral–competitive fuzzy-Q learning | Behavioral adaptation |
[41] | An Ontology Design Pattern for supporting behavior arbitration in cognitive agents (2017) | An Ontology Design Pattern for the definition of situation-driven behavior selection and arbitration models for cognitive agents | MARIO robot | Affordance cues | Affordance Ontology Design Pattern (ODP) | Cognitive architectures |
[42] | Context-Aware Cloud Robotics for Material Handling in Cognitive Industrial Internet of Things (2018) | In this paper, a cognitive industrial entity called context-aware cloud robotics (CACR) for advanced material handling is introduced and analyzed. | - | - | Energy-efficient and cost-saving algorithms for material handling | Cognitive architectures |
Ref. | Title and Year | Aim | Robot | Social Cues | Participants | Models | Area |
---|---|---|---|---|---|---|---|
[30] | Compassion, empathy, and sympathy expression features in affective robotics (2017) | Comparison between expression features of compassion, sympathy, and empathy in British English and Polish that need to be tuned in social robots to enable them to operate successfully | Generic assistive robots | Linguistic cues, facial cues, movement cues, and physiological features | British English-speaking participants (mean age 23.2 years, 21 females) and 29 Polish-speaking subjects (mean age 25.6 years, 26 females) | Creating culture-specific emotion (compassion, sympathy, and empathy) models | Empathy |
[43] | Persuasive robotic assistant for health self-management of older adults: Design and evaluation of social behaviors (2010) | Comparison of a text-based interface with a persuasive character interface | iCat | Gaze and posture cues | 24 middle-aged adults (45 to 65 years old) | The model is based on persuasion and uses the Big Five personality questionnaire | Empathy |
[16] | Inclusion of service robots in the daily lives of frail older users: A step-by-step definition procedure on users’ requirements (2018) | Definition of metrics to build an empathic robot that helps elderly people | Pepper | - | 42 participants (elderly people) | The model is based on an emotion recognition algorithm | Empathy |
[44] | Promoting interactions between humans and robots using robotic emotional behavior (2015) | Emotion-based assistive behavior for a socially assistive robot | Brian | Gestures and emotional state cues | 34 subjects (17 to 68 years old) | Markov model that uses a human affective state classifier and non-verbal interaction and states analysis (NISA) | Empathy |
[45] | A personalized and platform-independent behavior control system for social robots in therapy: development and applications (2018) | A behavior control system for social robots in therapies is presented, with a focus on personalization and platform-independence | NAO, Pepper | - | Children and elderly people | Model based on a Body Action Coding System (BACS) | Behavioral adaptation |
[46] | Adaptive artificial companions learning from users’ feedback (2017) | Capacity of an intelligent system to learn and adapt its behavior/actions | EMOX | Current user(s) attributes and current environmental attribute cues | Children, teenagers, and adults | Markov Decision Process (MDP) model | Behavioral adaptation |
[47] | Why robots should be social: Enhancing machine learning through social human-robot interaction (2015) | How additional social cues can improve learning performance | Robot head mounted on an articulated robot arm | Gaze behavior | 41 healthy young participants (average age 24 years) | The interaction model of the robot is based on language games | Behavioral adaptation |
[25] | The affective loop: A tool for autonomous and adaptive emotional human-robot interaction (2015) | Affective model for social robotics | NAO | - | Children aged 5 to 7 years | Plutchik emotional model; uses fuzzy rules and learning by demonstration | Empathy |
[48] | Puffy: A Mobile Inflatable Interactive Companion for Children with Neurodevelopmental Disorder (2017) | Robot’s ability in gesture interpretation | Puffy | Visual, auditory, and tactile cues | 19 children (average age 6 years) | Interactional Spatial Model that considers interpersonal distance, relative (child-robot) bodily orientations, child’s and robot’s movements in space, child’s emotional state, child’s eye contact, and robot’s emotional state | Behavioral adaptation |
[49] | Social intelligence for a robot engaging people in cognitive training activities (2012) | Definition of user state and task performance, adjusting robot's behavior | Brian 2.0 | - | 10 healthy, young participants from 20 to 35 years old | A hierarchical reinforcement learning approach is used to create a model that allows the robot to learn appropriate assistive behaviors based on the structure of the activity | Behavioral adaptation |
[50] | Empathizing with emotional robot based on cognition reappraisal (2017) | Continuous cognitive emotional regulation model for robot | - | - | 10 subjects | Hidden Markov Model; uses cognitive reappraisal strategy | Behavioral adaptation |
[51] | Interpretation of interaction demanding of a user based on nonverbal behavior in a domestic environment (2017) | Model for decision making on a user's non-verbal interaction demand | MIRob | Eye behavior | Eight subjects from 25 to 58 years old | A model to decide when to interact with the user. It observes the movements and behavior of the patient and puts them through a module called the Interaction Demanding Pose Identifier. The data obtained are fed into the Fuzzy Interaction Decision Making Module to interpret the user's degree of interaction demand. | Behavioral adaptation |
[52] | Understanding human intention by connecting perception and action learning in artificial agents (2017) | System inspired by humans' psychological and neurological phenomena | - | Eye behavior | - | Generic model; uses supervised multiple timescale recurrent neural networks | Cognitive architectures |
[53] | Improving human-robot interaction based on joint attention (2017) | A novel cognitive architecture for a computational model of the limbic system is proposed, inspired by human brain activity, which improves interactions between a humanoid robot and preschool children | Robotis Bioloid | Eyes, auditory, and sensory cues | 16 pre-school children from 4 to 6 years old | Dynamic neural fields (DNFs) model; used with reinforcement and unsupervised learning-based adaptation processes | Cognitive architectures |
[54] | A Robot Learns the Facial Expressions Recognition and Face/Non-face Discrimination Through an Imitation Game (2014) | A robotic system that learns online to recognize facial expressions, without an explicit teaching signal associating each facial expression | - | Gestures, gaze direction, vocalization cues | 20 persons | Theoretical model for online learning of facial expression recognition | Cognitive architectures |
[55] | Development of first social referencing skills: Emotional interaction as a way to regulate robot behavior (2014) | Studying how emotional interactions with a social partner can bootstrap increasingly complex behaviors such as social referencing | - | Facial expression cues | 20 persons | Model that uses the child’s affective states and adapts its affective and social behavior in response to the affective states of the child | Behavioral adaptation |
[24] | May I help you? (2015) | Model of adaptive behaviors to pedestrians’ intentions | Robovie | Body and facial expression cues | The participants were visitors of the shopping mall where the robot was placed. | State transition model; used with an Intention-Estimation Algorithm | Behavioral adaptation |
[56] | A framework for the design of person following behaviors for social mobile robots (2012) | Framework for people detection, state estimation, and trajectories generation for an interactive social behavior | Kompai | Body and facial expression cues | - | The model combines perception, decision, and action and uses fuzzy logic and SLAM algorithms | Behavioral adaptation |
[57] | People-aware navigation for goal-oriented behavior involving a human partner (2011) | Person-aware navigation system | - | Body and facial expression cues | 8 healthy, young subjects (7 male and 1 female) with an average age of 20.8 years | The model weights the trajectories of a robot's navigation system for autonomous movement using algorithms from the ROS navigation stack | Behavioral adaptation |
[58] | Promoting interactions between humans and robots using robotic emotional behavior (2015) | Map of human psychological traits to make robots emotive and sociable | NAO | Body and facial expression cues | 10 subjects | Emotional Intelligence (EI) model; uses Myers-Briggs theory (MBT) and fuzzy logic | Behavioral adaptation |
[59] | Moving toward an intelligent interactive social engagement framework for information gathering (2017) | To develop an integrated robotic framework including a novel architecture and an interactive user interface to gather information | NAO, Milo, and Alice | Body and facial expression cues | 186 children from 8 to 12 years old | Interactive Social Engagement Architecture (ISEA) is designed to integrate behavior-based robotics, human behavior models, cognitive architectures, and expert user input to increase social engagement between a human and system | Cognitive architectures |
[60] | Moving toward an intelligent interactive social engagement framework for information gathering (2018) | Software architecture allowing a robot to socially interact with human beings, sharing with them some basilar cognitive mechanisms | NAO | Non-verbal cues such as social signals | Children | Hidden Markov Model (HMM) used as a reasoning approach. The model encodes social interaction and uses natural and intuitive communication channels, both to interpret human behavior and to transfer knowledge to the human | Cognitive architectures |
[61] | Persuasive robotic assistant for health self-management of older adults: Design and evaluation of social behaviors (2010) | To implement models obtained from biological systems to humanoid robots | Robothespian | Body and facial expression cues | 237 subjects of different age, gender, and education | Model that uses a facial action coding system | Behavioral adaptation |
[62] | A Logic-Based Computational Framework for Inferring Cognitive Affordances (2018) | A framework that reasons about affordances in a more general manner than described in the existing literature | - | Eye gaze, body orientation; verbal requests | The robot is tested in different scenarios, for example, at an elder care facility | A Dempster-Shafer (DS) theory framework, often interpreted as a generalization of the Bayesian framework | Cognitive architectures |
[63] | The Role of Functional Affordances in Socializing Robots (2015) | A paper where affordances of objects were used as a starting point for the socialization of robots | Jenny Robot | gesture, gaze, head movements, vocal features, posture, proxemics, and touch and affordances cues | Jenny is implemented for the tea-making task under various scenarios | OWL-DL model | Cognitive architectures |
[64] | Intelligent emotion and behavior based on topological consciousness and adaptive resonance theory in a companion robot (2016) | An artificial topological consciousness that uses a synthetic neurotransmitter and motivation, including a logically inspired emotion system, is proposed | CONBE robot | Gaze cues | Experiments on object recognition and face recognition | Behavioral model for the recognition of objects and faces | Cognitive architectures |
[65] | Practical aspects of deploying Robotherapy systems (2018) | A robotic system that helps therapists in sessions of cognitive stimulation | NAO | Orientation of the gaze | Robot tested with users from different cultures | A new communication module implemented within the BICA behavioral architecture | Behavioral adaptation |
[66] | Staged Development of Robot Skills: Behavior Formation, Affordance Learning and Imitation with Motions (2015) | Realization of an integrated developmental system where the structures emerging from the sensorimotor experience of an interacting real robot are used as the sole building blocks of the subsequent stages that generate increasingly more complex cognitive capabilities | 7 DOF Motoman robot arm | Motionese cues | The robot performed 64 swipe action executions towards a graspable object placed in a reachable random position | Model that includes three stages: discovering behavior primitives, learning to detect affordances, and learning to predict effects | Behavioral adaptation |
[67] | Personality affected robotic emotional model with associative memory for human-robot interaction (2016) | This paper discusses human psychological phenomena during communication from the point of view of internal and external factors, such as perception, memory, and emotion | Iphonod robot | Facial, gesture, voice cues | The experiment is divided into two parts: first, the processing of multi-modal information into emotional information in the emotion model is simulated; next, based on the multi-modal and emotional information from the first part, the association process is performed to determine the robot's behaviors | Model for object, facial and gesture, voice, and biometric recognition | Behavioral adaptation |
[68] | Multimodal emotional state recognition using sequence-dependent deep hierarchical features (2015) | This model uses a hierarchical feature representation to deal with spontaneous emotions and learns how to integrate multiple modalities for non-verbal emotion recognition, which makes it suitable for use in an HRI scenario | - | Facial expressions | Three experiments are executed and evaluated. The first uses information from facial expressions to determine emotional states; the second extracts information from body motion, composed of arm, torso, and head movements; and the third uses both types of information | Multichannel Convolutional Neural Network (MCCNN) to extract hierarchical features | Behavioral adaptation |
[69] | Artificial cognition for social human–robot interaction: An implementation (2015) | This article is an attempt to characterize these challenges and to exhibit a set of key decisional issues that need to be addressed for a cognitive robot to successfully share space and tasks with a human | Manipulator | Verbal communication, gestures, and social gaze | A scenario involving multi-modal, interactive grounding: the humans can refer to invisible or ambiguous objects that the robots anchor to physical objects through multi-modal interactions with the user. A second task that the robot has to achieve is cleaning a table | Model that has collaborative cognitive skills: geometric reasoning and situation assessment based on perspective-taking and affordance analysis; acquisition and representation of knowledge models for multiple agents (humans and robots, with their specificities); natural and multi-modal dialogue; human-aware task planning; human–robot joint task achievement. | Cognitive architectures |
[70] | Collaborative Autonomy between High-level Behaviours and Human Operators for Remote Manipulation Tasks using Different Humanoid Robots (2017) | This article discusses the technical challenges that two teams faced and overcame during a DARPA competition to allow human operators to interact with a robotic system at a higher level of abstraction and share control authority with it | THORMANG "Johnny" and Atlas "Florian" | Affordances, visual cues | The two robots have to complete the two tasks of opening a door and a valve | Model for a humanoid robot composed of a remote manipulation control approach, a high-level behavior control approach, an overarching principle, and a collaborative autonomy, which brings together the remote manipulation and high-level control approaches | Behavioral adaptation |
[71] | Probabilistic Movement Primitives for Coordination of Multiple Human-Robot Collaborative Tasks (2017) | This paper proposes an interaction learning method for collaborative and assistive robots based on movement primitives | Dual arm manipulator | Social-cognitive cues | A robot co-worker must recognize the intention of the human to decide some actions: if it should hand over a screwdriver or hold the box or coordinate the location of the handover of a bottle with respect to the location of the hand of the human. | Imitation learning to construct a mixture model of human-robot interaction primitives. This probabilistic model allows the assistive trajectory of the robot to be inferred from human observations | Cognitive architectures |
[2] | Robots in Education and Care of Children with Developmental Disabilities: A Study on Acceptance by Experienced and Future Professionals (2017) | A study on the acceptance of robots by experienced practitioners and university students in psychology and education sciences is presented | NAO | Verbal cues | Demonstration of the capabilities of the robot in front of participants, who filled in a questionnaire at the end based on the robot's behavior | The aim is to examine the acceptance factors through the Unified Theory of Acceptance and Use of Technology (UTAUT) model | Cognitive architectures |
[72] | Optimized Assistive Human–Robot Interaction Using Reinforcement Learning (2016) | The proposed HRI system assists the human operator to perform a given task with minimum workload demands and optimizes the overall human–robot system performance. | PR2 | - | The robot has to draw a prescribed trajectory | Reinforcement Learning LQR Method | Behavioral adaptation |
[73] | The Impact of Social Robotics on L2 Learners' Anxiety and Attitude in English Vocabulary Acquisition (2015) | This study aimed to examine the effect of robot-assisted language learning (RALL) on the anxiety level and attitude in English vocabulary acquisition amongst Iranian EFL junior high school students | NAO | - | Iranian EFL junior high school students: forty-six female students, beginners aged 12, participated in this study | FLCAS questionnaire that evaluates anxiety | Cognitive architectures |
[74] | The multi-modal interface of Robot-Era multi-robot services tailored for the elderly (2018) | The Robot-Era project has the objective of implementing an easy and acceptable service robotic system for the elderly | Three robots: Coro, Doro, Oro | Gaze, vocal, facial cues | Elderly people | Architecture based on emotion recognition and object recognition | Cognitive architectures |
[75] | Human–Robot Facial Expression Reciprocal Interaction Platform: Case Studies on Children with Autism (2018) | In this research, a robotic platform has been developed for reciprocal interaction consisting of two main phases, namely Non-structured and Structured interaction modes | Mina Robot | Facial expressions | Children | The model is composed of two modules: Non-structured and Structured interaction modes. In the Non-structured interaction mode, a vision system recognizes the facial expressions of the user through a fuzzy clustering method. In the Structured interaction mode, a set of imitation scenarios with eight different posed facial behaviors were designed for the robot | Cognitive architectures |
[76] | Adaptive Robotic Tutors that Support Self-Regulated Learning: A Longer-Term Investigation with Primary School Children (2018) | This paper explores how personalized tutoring by a robot, achieved using an open learner model (OLM), promotes self-regulated learning (SRL) processes and how this can impact learning and SRL skills compared to personalized domain support alone | NAO | Gaze | Children | Using an open learner model (OLM) to learn SRL processes | Behavioral adaptation |
[77] | DAC-h3: A Proactive Robot Cognitive Architecture to Acquire and Express Knowledge About the World and the Self (2018) | This paper introduces a cognitive architecture for a humanoid robot to engage in a proactive, mixed-initiative exploration and manipulation of its environment, where the initiative can originate from both humans and robots | iCub | Gaze | Picking objects | The framework, based on a biologically grounded theory of the brain and mind, integrates a reactive interaction engine, a number of state-of-the-art perceptual and motor learning algorithms, as well as planning abilities and an autobiographical memory | Cognitive architectures |
[78] | Using a Humanoid Robot to Elicit Body Awareness and Appropriate Physical Interaction in Children with Autism (2015) | A human–robot interaction study, focusing on tactile aspects of interaction, in which children with autism interacted with the child-like humanoid robot KASPAR | KASPAR | Tactile, gaze cues | Children | Model for the behavior analysis | Cognitive architectures |
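Several of the behavioral adaptation models summarized above rely on reinforcement learning over a user-state/action space (e.g., the MDP-based companion of [46], the hierarchical reinforcement learning of [49], and the RL-optimized assistance of [72]). The following minimal Python sketch is purely illustrative: the states, actions, and simulated user feedback are hypothetical and not taken from any of the cited papers; it only shows the bandit-style Q-update that such adaptation loops build on.

```python
import random

# Coarse, hypothetical user states and robot behaviors (not from the cited works).
STATES = ["engaged", "distracted"]
ACTIONS = ["encourage", "demonstrate", "wait"]

def simulated_user_feedback(state, action):
    """Hypothetical reward: an engaged user prefers encouragement,
    a distracted user responds best to a demonstration."""
    preferred = {"engaged": "encourage", "distracted": "demonstrate"}
    return 1.0 if action == preferred[state] else 0.0

def train(episodes=2000, alpha=0.1, epsilon=0.1, seed=0):
    """Learn a state-action value table from simulated user feedback."""
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in STATES for a in ACTIONS}
    for _ in range(episodes):
        state = rng.choice(STATES)
        # Epsilon-greedy action selection: mostly exploit, sometimes explore.
        if rng.random() < epsilon:
            action = rng.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: q[(state, a)])
        reward = simulated_user_feedback(state, action)
        # One-step Q-update (no state transition, i.e., a contextual bandit).
        q[(state, action)] += alpha * (reward - q[(state, action)])
    return q

q = train()
# The adapted policy picks the highest-valued behavior for each user state.
policy = {s: max(ACTIONS, key=lambda a: q[(s, a)]) for s in STATES}
```

After training, the learned policy maps each user state to the behavior that received the most positive feedback; in a real system, the reward would come from observed user cues rather than a simulator.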
Keywords | Barriers/Limitations | Challenges and Opportunities |
---|---|---|
Sensors Technology | Multimodal sensors [32] | A multisensory system should be implemented in the model of a robot to create an improved architecture |
 | Reliable and usable sensor technology [32] | Sensors should be designed to be reliable and acceptable in real-life situations to reduce the time-to-market |
Perception | Real-time learning [28,84] | Real-time learning should be developed to adapt the behavior of the robot according to the changing needs of the user |
 | Emotional state transitions [44] | Research on the emotional state module should be carried out in more depth |
 | Improving object detection [52] | Different approaches in the area of object detection should be investigated to obtain a robust model of the robot |
 | Learning from the user [46,71] | The robot should be able to learn from the user in order to accomplish complex tasks |
 | Affordances [41,62,63] | Affordances are important elements to be analyzed when implementing a behavioral model for a cognitive robot |
Experimental | Experimental session [35,61,85] | The model should be implemented on a real robot and tested to evaluate the proposed artificial cognitive architecture in dynamic environments |
Architecture Design | Brain-inspired architecture [34] | Research on a robot's behavioral model should be conceived with a multidisciplinary approach to be able to adapt to the user's needs |
 | Modular and flexible architecture [19] | A robot should adapt to different contexts and different preferences, which could change over time; therefore, the architecture of a robot should be modular and flexible |
 | Behavioral consistency, predictability, and repeatability [48] | These requirements should be investigated to obtain a complex model |
 | Standardization | Having a high level of interoperability |
Ethical, legal, and social | Ethical and social aspects [19] | Ethical implications should also be investigated when creating a new model for a robot |
 | Legal aspect | No rules can be found in the legal field of social robotics |
 | Cultural adaptation [24,40,65] | The robot should be able to adjust parameters for different cultures |
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Nocentini, O.; Fiorini, L.; Acerbi, G.; Sorrentino, A.; Mancioppi, G.; Cavallo, F. A Survey of Behavioral Models for Social Robots. Robotics 2019, 8, 54. https://doi.org/10.3390/robotics8030054