A Robot Has a Mind of Its Own Because We Intuitively Share It
Abstract
1. Introduction
2. Materials and Methods
2.1. Participants
2.2. Measurement
2.3. Facial Expressions of a Virtual Agent
2.4. Statistical Analysis
3. Results
4. Discussion
Author Contributions
Funding
Conflicts of Interest
References
Subject | Age | Gender | Pain Intensity Rating # (Neutral) | Pain Intensity Rating # (Painful) | Pain Intensity Rating # (Unhappy) | Cold Pain Detection Threshold (°C) * (Control) | Cold Pain Detection Threshold (°C) * (Neutral) | Cold Pain Detection Threshold (°C) * (Painful) | Cold Pain Detection Threshold (°C) * (Unhappy)
---|---|---|---|---|---|---|---|---|---
1 | 37 | male | 1 | 5 | 8 | 17.5 ± 0.75 | 11.6 ± 0.90 | 13.4 ± 3.18 | 13.3 ± 2.35 |
2 | 24 | female | 1 | 8 | 5 | 10.2 ± 1.15 | 11.1 ± 0.72 | 12.4 ± 0.95 | 9.50 ± 0.55 |
3 | 26 | female | 7 | 8 | 5 | 17.9 ± 1.99 | 20.7 ± 1.32 | 20.3 ± 0.35 | 16.5 ± 0.31 |
4 | 35 | male | 8 | 8 | 4 | 28.5 ± 0.25 | 29.8 ± 0.12 | 29.0 ± 1.24 | 29.4 ± 0.68 |
5 | 28 | female | 6 | 7 | 7 | 23.0 ± 2.78 | 23.0 ± 0.97 | 22.7 ± 3.10 | 23.4 ± 1.99 |
6 | 34 | male | 3 | 8 | 1 | 28.4 ± 0.32 | 28.7 ± 0.36 | 29.6 ± 0.58 | 28.3 ± 0.15 |
7 | 31 | female | 1 | 5 | 3 | 16.7 ± 4.60 | 18.4 ± 1.10 | 23.7 ± 1.03 | 20.2 ± 1.22 |
8 | 26 | male | 1 | 3 | 5 | 24.4 ± 1.45 | 25.2 ± 0.38 | 24.3 ± 1.12 | 26.6 ± 0.81 |
9 | 34 | male | 1 | 8 | 3 | 16.3 ± 1.40 | 16.2 ± 1.50 | 20.5 ± 0.81 | 13.3 ± 3.86 |
10 | 33 | male | 7 | 1 | 8 | 19.5 ± 0.44 | 23.2 ± 0.93 | 16.4 ± 0.32 | 23.4 ± 1.10 |
11 | 25 | female | 2 | 6 | 5 | 5.90 ± 4.40 | 10.1 ± 2.40 | 16.5 ± 2.06 | 13.9 ± 1.64 |
12 | 44 | male | 1 | 9 | 6 | 1.20 ± 1.22 | 1.10 ± 1.15 | 11.0 ± 0.87 | 5.90 ± 3.97 |
13 | 25 | female | 1 | 7 | 4 | 12.6 ± 3.25 | 16.8 ± 1.87 | 15.3 ± 1.56 | 19.6 ± 1.81 |
14 | 35 | male | 1 | 9 | 4 | 4.50 ± 2.57 | 6.50 ± 1.21 | 22.7 ± 1.46 | 10.8 ± 0.30 |
15 | 27 | female | 6 | 10 | 8 | 21.8 ± 0.10 | 22.1 ± 2.13 | 20.6 ± 2.51 | 19.3 ± 2.25 |
16 | 27 | female | 1 | 6 | 3 | 28.1 ± 0.20 | 28.0 ± 0.91 | 29.4 ± 0.25 | 28.5 ± 0.71 |
17 | 27 | female | 2 | 5 | 5 | 27.2 ± 1.59 | 21.9 ± 3.47 | 26.0 ± 1.73 | 22.6 ± 1.82 |
18 | 36 | female | 1 | 4 | 8 | 20.3 ± 0.50 | 20.2 ± 0.47 | 22.5 ± 0.80 | 24.3 ± 0.50 |
19 | 42 | male | 1 | 9 | 7 | 18.3 ± 2.20 | 17.6 ± 0.90 | 23.9 ± 1.60 | 20.3 ± 1.96 |
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Sumitani, M.; Osumi, M.; Abe, H.; Azuma, K.; Tsuchida, R.; Sumitani, M. A Robot Has a Mind of Its Own Because We Intuitively Share It. Appl. Sci. 2020, 10, 6531. https://doi.org/10.3390/app10186531