Age-Related Differences in Fixation Pattern on a Companion Robot
Abstract
1. Introduction
2. Literature Review
2.1. Design of a Companion Robot
2.2. Eye Tracking for Evaluating the Design of a Product
2.3. Eye Tracking for Investigating the Design of a Robot
3. Method
3.1. Participants
3.2. Companion Robot Mockup
3.3. Experimental Setting
3.4. Procedure
- Pre-questionnaire: Prior to responding to the survey, all participants were instructed to read the definition and the role of a companion robot described in the questionnaire [56]. The questionnaire included items about demographic information such as gender, age, and living arrangement. Screening questions were included to check whether the participants had a history of ocular disease.
- Eye tracking: After the participants completed the questionnaire, they adjusted their chair to a position marked by tape on the floor. We asked them to sit comfortably, leaning against the back of the chair. Instructors explained the eye-tracking procedure and how to wear the eye-tracking glasses (Tobii Pro Glasses 2; 50 Hz sampling; resolution: 1920 × 1080 pixels). The instructors asked the participants to keep their heads as still as possible to facilitate more accurate data collection and analysis [47,57,58]. After confirming that the calibration had been completed successfully, one of the instructors removed the blanket from the mockup, and data collection began. The eye-tracking data were collected for 15 s [8].
- Post-questionnaire: The participants evaluated the physical attractiveness and the social likeability of the bear on a five-point Likert scale. The questionnaire used was the BEHAVE measurement tool [13], which includes five items on physical attractiveness and five items on social likeability [59]. However, one physical-attractiveness item ("the robot is very sexy looking") was excluded because it was beyond the scope of this study. In addition to completing the BEHAVE measurement, the participants rated the bear's design on a five-point Likert scale.
- Individual interview: We asked the participants which parts of the robot they had focused on. They were also interviewed about their opinions of the bear's design. Next, they were asked to explain the reasons behind their evaluations of the robot's design. Finally, we asked for their opinions about the size of the robot. The interviews lasted about 10 min on average.
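The three fixation metrics analyzed in this study (total fixation duration, average fixation duration, and number of fixations) are typically derived from raw gaze samples with a velocity-threshold (I-VT) fixation filter. The following is an illustrative sketch only, not the study's actual pipeline; the velocity threshold and minimum fixation duration below are assumed defaults, not values reported by the authors.

```python
SAMPLE_RATE_HZ = 50          # Tobii Pro Glasses 2 sampling rate used in this study
VELOCITY_THRESHOLD = 30.0    # deg/s; a common I-VT default (assumption)
MIN_FIXATION_S = 0.06        # discard shorter fixations (assumption)

def fixation_metrics(velocities_deg_s):
    """Group consecutive sub-threshold gaze samples into fixations and
    return (total_fixation_duration_s, average_fixation_duration_s,
    number_of_fixations)."""
    dt = 1.0 / SAMPLE_RATE_HZ
    durations, run = [], 0
    # Sentinel value flushes the final run of sub-threshold samples.
    for v in list(velocities_deg_s) + [float("inf")]:
        if v < VELOCITY_THRESHOLD:
            run += 1
        else:
            if run * dt >= MIN_FIXATION_S:
                durations.append(run * dt)
            run = 0
    total = sum(durations)
    count = len(durations)
    average = total / count if count else 0.0
    return total, average, count
```

Per-AOI statistics such as those reported in Section 4 would then come from running this over the gaze samples mapped to each area of interest.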
3.5. Data Analysis
3.5.1. Determining Areas of Interest
3.5.2. Eye-Tracking Data Analysis
3.5.3. Statistical Analysis
- H1: The eye-tracking statistics (total fixation duration, average fixation duration, and number of fixations) for each AOI of the robot will differ by age group.
- H2: The rating of the robot's design will differ by age group.
- H3: The physical attractiveness and the social likeability of the robot will differ by age group.
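The comparison behind each hypothesis is a Mann-Whitney U test between the two age groups, with Cohen's d as the effect size (as reported in the tables in Section 4). The sketch below is illustrative only, not the authors' code, and all sample values are hypothetical.

```python
from math import sqrt, inf
from statistics import mean, stdev

def mann_whitney_u(a, b):
    """U statistic for sample a: number of pairs with a_i > b_j,
    counting ties as 0.5 (statistic only; no p-value)."""
    return sum(1.0 if x > y else 0.5 if x == y else 0.0
               for x in a for y in b)

def cohens_d(a, b):
    """Cohen's d using the pooled standard deviation; returns infinity
    when the pooled standard deviation is zero (degenerate samples)."""
    n1, n2 = len(a), len(b)
    pooled = sqrt(((n1 - 1) * stdev(a) ** 2 + (n2 - 1) * stdev(b) ** 2)
                  / (n1 + n2 - 2))
    return inf if pooled == 0 else abs(mean(a) - mean(b)) / pooled
```

For example, with hypothetical groups [1, 2, 3] and [4, 5, 6], `cohens_d` gives 3.0 (means differ by 3 with a pooled SD of 1).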
4. Eye-Tracking and Survey Results
5. Interview Results
5.1. Overall Impressions
- “The first thing I noticed was the bear’s face. Just like when you talk to people, you see them face to face.”—P25, older.
- “When you see another person, you first look at their eyes. I think it’s just a habit to have eye contact.”—P13, older.
- “Because people see each other’s face and eyes when they first meet, the face naturally comes in the sight first. You can’t just look at their legs first, because that’s not what they’re used to.”—P11, older.
- “I recognized the robot as a bear at a first sight. Overall, I think the quality of it is high. It almost looks like a finished product. The balance between the top and bottom parts is good and I personally can’t seem to find a flaw.”—P5, older.
- “At the first glance, its shape, a bit like a teddy bear, caught my eye. I know why it caught my eye. Because we are familiar with teddy bears in general, and its overall impression was just like a teddy bear. Eyes, then ears, and then this round face. From these features, I had an instant feeling that this was a teddy bear.”—P4, older.
- “I didn’t think it was a bear, maybe because its face was too wide? And its body is nothing like what I know of a teddy bear. I think that’s because its arms are thin, and legs are bulky.”—P14, younger.
- “On the whole, it is felt a little different from the other bears I’ve been familiar with. The face is too long in its width and the arms and legs also look different. And for legs, you see, one of the main features of a bear is its paws. But there are no such parts with this robot, and I have a kind of wonder whether this is really a bear.”—P25, younger.
5.2. Head
- “When I first saw it, I thought it was cute because its eyes, nose, and mouth reminded me of a teddy bear. But then, later, I thought it was a little weird. (laugh) The face is too big and its facial features are cute yet a little weird.”—P6, younger.
- “The face is too.. wide. Haha. I… um, I don’t think it’s good.”—P9, younger.
5.3. Body
- “It is round with no sharp edges, which makes it quite friendly and cute. In general, there are no sharp edges, and its belly, ears, eyes, and face are all round, which makes the bear look friendly and soft. I think the belly poking out is so cute.”—P23, older.
- “The belly pokes out, so it looks cute. You know, it reminds me of those mischievous plush toys kids play with. Their bellies are sticking out like this. Or those other toys that look at you like this, showing their belly button. I think it is designed well.”—P24, older.
- “It’s so cute. It looks like it’s going to talk to me like a little child. Oh, I like that feeling. I think everything about this is cute. I can’t think of anything wrong with it. I feel I can have easy conversations with it like I’m talking to a friend.”—P7, older.
- “Overall, it looks cute and funny. I feel it’s going to do something fun with me and talk to me. I feel it’s going to give me a lot of joy.”—P8, older.
- “It seems to be bright and affable. I feel it is very friendly just like a puppy, like a pet dog. I feel I should protect this robot and it will be my friend. I think it will run errands for me.”—P34, older.
- “Pokey bellies are cute. Winnie-the-Pooh or other plush toys usually have big bellies. I think that’s because people like it. I think that’s why I first looked at its belly. This one feels more friendly, even though it’s a machine, because it has a pokey belly.”—P5, younger.
- “Its belly is poking out, so it didn’t occur to me that it would be offensive. In contrast, it felt so friendly, the pokey belly. Like Winnie-the-Pooh. This robot has a similar body shape to little toddlers.”—P31, younger.
5.4. Arms and Legs
- “When I first saw this bear, I thought to myself, I could just hold both of its hands! I felt that I would just hold its hands, if given some kind of assurance that I may hold them.”—P2, older.
- “The arms were sticking out, just like how other teddy bear plush toys look and beg for your hug. It was cute, so I think that’s why I kept looking at its arms.”—P33, younger.
- “The legs looked quite puffy, which makes it unbalanced throughout its arms, belly, and legs. The legs are as large as human. So, I thought it was a little odd, and asked myself whether teddy bears usually look like this. I think the shape is quite.. different from what people generally think of teddy bears. On the positive side, its ugliness makes it cute.”—P29, younger.
- “The arms.. do not look like bear’s. They’re too thin.”—P22, younger.
- “Right now, it is more like a robot so I think it will be better if it looked a little more natural. This may be difficult because of some parts, but I think it would have looked better with bigger arms.”—P10, younger.
5.5. Size
- “I think people who live alone would like this bear better. People who live alone usually don’t live in a big house. Since they live in small places, I think it’ll be irrelevant if it’s even smaller than this bear.”—P27, older.
- “I think it’ll have to be a little bigger than the current size. I think this is too small. You need to have an even eye level to be able to have conversations. With this size, people will have to bend their back, which makes it uncomfortable. Even if people talk to this bear from a distance, they still need a taller robot.”—P9, older.
- “Perhaps a little larger than this one. The robot would look out of place if it were too big. If it is in the house, it needs to be seen easily. In that sense, it should be a little bigger than now.”—P15, older.
- “When I stand up, this robot is on the floor, so I can’t see the screen well. Things like the texts on the screen get farther away in those cases, so I think it might be better if its height were a little taller.”—P18, older.
6. Discussion
7. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
- Moyle, W.; Jones, C.; Sung, B.; Bramble, M.; O’Dwyer, S.; Blumenstein, M.; Estivill-Castro, V. What Effect Does an Animal Robot Called CuDDler Have on the Engagement and Emotional Response of Older People with Dementia? A Pilot Feasibility Study. Int. J. Soc. Robot. 2016, 8, 145–156.
- Banks, M.R.; Willoughby, L.M.; Banks, W.A. Animal-Assisted Therapy and Loneliness in Nursing Homes: Use of Robotic Versus Living Dogs. J. Am. Med. Dir. Assoc. 2008, 9, 173–177.
- Wada, K.; Shibata, T. Social and Physiological Influences of Robot Therapy in a Care House. Interact. Stud. 2008, 9, 258–276.
- Weiss, A.; Bartneck, C. Meta analysis of the usage of the Godspeed Questionnaire Series. In Proceedings of the 2015 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), Kobe, Japan, 31 August–4 September 2015.
- Rosenthal-von der Pütten, A.M.; Krämer, N.C. How Design Characteristics of Robots Determine Evaluation and Uncanny Valley Related Responses. Comput. Human Behav. 2014, 36, 422–439.
- Wu, Y.-H.; Fassert, C.; Rigaud, A.-S. Designing Robots for the Elderly: Appearance Issue and Beyond. Arch. Gerontol. Geriatr. 2012, 54, 121–126.
- Shibata, T.; Kawaguchi, Y.; Wada, K. Investigation on People Living with Seal Robot at Home. Int. J. Soc. Robot. 2012, 4, 53–63.
- Bergmann, K.; Eyssel, F.; Kopp, S. A second chance to make a first impression? How appearance and nonverbal behavior affect perceived warmth and competence of virtual agents over time. In Proceedings of the International Conference on Intelligent Virtual Agents, Santa Cruz, CA, USA, 12–14 September 2012.
- Goetz, J.; Kiesler, S.; Powers, A. Matching robot appearance and behavior to tasks to improve human-robot cooperation. In Proceedings of the 12th IEEE International Workshop on Robot and Human Interactive Communication, Millbrae, CA, USA, 31 October–2 November 2003.
- Lazar, A.; Thompson, H.J.; Piper, A.M.; Demiris, G. Rethinking the design of robotic pets for older adults. In Proceedings of the 2016 ACM Conference on Designing Interactive Systems, Brisbane, Australia, 4–8 June 2016.
- Frennert, S.; Östlund, B. Review: Seven Matters of Concern of Social Robots and Older People. Int. J. Soc. Robot. 2014, 6, 299–310.
- Dziergwa, M.; Frontkiewicz, M.; Kaczmarek, P.; Kędzierski, J.; Zagdańska, M. Study of a social robot’s appearance using interviews and a mobile eye-tracking device. In Proceedings of the 5th International Conference on Social Robotics, Bristol, UK, 27–29 October 2013.
- Joosse, M.; Sardar, A.; Evers, V. BEHAVE: A set of measures to assess users’ attitudinal and non-verbal behavioral responses to a robot’s social behaviors. In Proceedings of the International Conference on Social Robotics, Amsterdam, The Netherlands, 24–25 November 2011.
- Okita, S.Y.; Schwartz, D.L. Young children’s understanding of animacy and entertainment robots. Int. J. Humanoid Robot. 2006, 3, 393–412.
- Oh, S.; Oh, Y.H.; Ju, D.Y. Understanding the preference of the elderly for companion robot design. In Proceedings of the International Conference on Applied Human Factors and Ergonomics, Washington, DC, USA, 24–28 July 2019.
- Oh, Y.H.; Kim, J.; Ju, D.Y. Investigating the Preferences of Older Adults Concerning the Design Elements of a Companion Robot. Interact. Stud. 2019, 20, 426–454.
- Wada, K.; Shibata, T. Living With Seal Robots—Its Sociopsychological and Physiological Influences on the Elderly at a Care House. IEEE Trans. Robot. 2007, 23, 972–980.
- Li, D.; Rau, P.L.P.; Li, Y. A Cross-cultural Study: Effect of Robot Appearance and Task. Int. J. Soc. Robot. 2010, 2, 175–186.
- Syrdal, D.S.; Dautenhahn, K.; Woods, S.N.; Walters, M.L.; Koay, K.L. Looking good? Appearance preferences and robot personality inferences at zero acquaintance. In Proceedings of the AAAI Spring Symposium: Multidisciplinary Collaboration for Socially Assistive Robotics, Stanford, CA, USA, 26–28 March 2007.
- Van Wynsberghe, A. Healthcare Robots: Ethics, Design and Implementation, 1st ed.; Ashgate Publishing: Farnham, Surrey, UK, 2015; ISBN 978-1-4724-4433-2.
- Heerink, M.; Albo-Canals, J.; Valenti-Soler, M.; Martinez-Martin, P.; Zondag, J.; Smits, C.; Anisuzzaman, S. Exploring requirements and alternative pet robots for robot assisted therapy with older adults with dementia. In Proceedings of the International Conference on Social Robotics, Bristol, UK, 27–29 October 2013.
- Kim, S.; Lee, I. Components of Geriatric Nursing Robot for Korean Elderly: Based on the Focus Group Interview. J. Korea Acad. Coop. Soc. 2016, 17, 527–536.
- Moyle, W.; Bramble, M.; Jones, C.; Murfield, J. Care Staff Perceptions of a Social Robot Called Paro and a Look-Alike Plush Toy: A Descriptive Qualitative Approach. Aging Ment. Health 2018, 22, 330–335.
- Batra, R.; Seifert, C.; Brei, D. The Psychology of Design: Creating Consumer Appeal; Batra, R., Seifert, C., Brei, D., Eds.; Routledge: New York, NY, USA, 2015; ISBN 1317502108.
- Schifferstein, H.N.J.; Desmet, P.M.A. The Effects of Sensory Impairments on Product Experience and Personal Well-Being. Ergonomics 2007, 50, 2026–2048.
- Ho, C.-H.; Lu, Y.-N. Can Pupil Size be Measured to Assess Design Products? Int. J. Ind. Ergon. 2014, 44, 436–441.
- Liu, Y.; Li, F.; Tang, L.H.; Lan, Z.; Cui, J.; Sourina, O.; Chen, C. Detection of humanoid robot design preferences using EEG and eye tracker. In Proceedings of the 2019 International Conference on Cyberworlds (CW), Toronto, ON, Canada, 15–17 October 2019.
- Kanda, T.; Miyashita, T.; Osada, T.; Haikawa, Y.; Ishiguro, H. Analysis of Humanoid Appearances in Human–Robot Interaction. IEEE Trans. Robot. 2008, 24, 725–735.
- Riek, L.D. Healthcare robotics can provide health and wellness support to billions of people. Commun. ACM 2017, 60, 68–78.
- Husić-Mehmedović, M.; Omeragić, I.; Batagelj, Z.; Kolar, T. Seeing is not Necessarily Liking: Advancing Research on Package Design with Eye-Tracking. J. Bus. Res. 2017, 80, 145–154.
- Yu, J.A.; Kim, H.S. A Study of Makgeolli Container and Label Design Elements Utilized Eye Tracker for Promotion of 2030 Generation Consumer. J. Korean Soc. Des. Cult. 2016, 22, 471–482.
- Choi, S.; Oh, Y.H.; Ju, D.Y. Analysis of companion robot design AOI with eye tracker. In Proceedings of the 2019 Korean Society for Emotion and Sensibility Annual Spring Conference, Yeosu, Korea, 31 May–1 June 2019.
- Park, E.; Kim, K.J.; del Pobil, A.P. Facial Recognition Patterns of Children and Adults Looking at Robotic Faces. Int. J. Adv. Robot. Syst. 2012, 9, 28.
- Lindgaard, G.; Fernandes, G.; Dudek, C.; Brown, J. Attention Web Designers: You Have 50 Milliseconds to Make a Good First Impression! Behav. Inf. Technol. 2006, 25, 115–126.
- Pinto, Y.; van der Leij, A.R.; Sligte, I.G.; Lamme, V.A.F.; Scholte, H.S. Bottom-Up and Top-Down Attention are Independent. J. Vis. 2013, 13, 1–14.
- Qu, Q.-X.; Zhang, L.; Chao, W.-Y.; Duffy, V. User experience design based on eye-tracking technology: A case study on smartphone APPs. In Proceedings of the AHFE 2016 International Conference on Digital Human Modeling and Simulation, Walt Disney World®, FL, USA, 27–31 July 2016.
- Khalighy, S.; Green, G.; Scheepers, C.; Whittet, C. Quantifying the Qualities of Aesthetics in Product Design Using Eye-Tracking Technology. Int. J. Ind. Ergon. 2015, 49, 31–43.
- Kumar, M.; Garg, N. Aesthetic Principles and Cognitive Emotion Appraisals: How Much of the Beauty Lies in the Eye of the Beholder? J. Consum. Psychol. 2010, 20, 485–494.
- Stoll, M.; Baecke, S.; Kenning, P. What They See is What They Get? An fMRI-Study on Neural Correlates of Attractive Packaging. J. Consum. Behav. 2008, 7, 342–359.
- Bergstrom, J.R.; Schall, A. (Eds.) Eye Tracking in User Experience Design, 1st ed.; Elsevier: Amsterdam, The Netherlands, 2014; ISBN 978-0-12-408138-3.
- Tullis, T.S. Older adults and the web: Lessons learned from eye-tracking. In Proceedings of the 4th International Conference on Universal Access in Human-Computer Interaction, Beijing, China, 22–27 July 2007.
- Poulsen, A.; Burmeister, O.K.; Kreps, D. The ethics of inherent trust in care robots for the elderly. In Proceedings of the 13th IFIP TC 9 International Conference on Human Choice and Computers, Poznan, Poland, 19–21 September 2018.
- Zafrani, O.; Nimrod, G. Towards a Holistic Approach to Studying Human–Robot Interaction in Later Life. Gerontologist 2018, 59, e26–e36.
- de Graaf, M.M.A.; Allouch, S.B.; Klamer, T. Sharing a life with Harvey: Exploring the Acceptance of and Relationship-Building with a Social Robot. Comput. Human Behav. 2015, 43, 1–14.
- Kaspersen, T. Objective Measurement of the Experience of Agency During Myoelectric Pattern Recognition Based Prosthetic Limb Control Using Eye-Tracking. Master’s Thesis, Chalmers University of Technology, Gothenburg, Sweden, 2019.
- Niehorster, D.C.; Santini, T.; Hessels, R.S.; Hooge, I.T.C.; Kasneci, E.; Nyström, M. The Impact of Slippage on the Data Quality of Head-Worn Eye Trackers. Behav. Res. Methods 2020, 52, 1140–1160.
- Pernice, K.; Nielsen, J. How to Conduct Eyetracking Studies; Nielsen Norman Group: Fremont, CA, USA, 2009.
- Kim, S.Y.; Oh, Y.H.; Ju, D.Y. A Study on the Design of Companion Robots Preferred by the Elderly. In Proceedings of the International Conference on Applied Human Factors and Ergonomics, Washington, DC, USA, 24–28 July 2019; Springer International Publishing: Cham, Switzerland, 2019; pp. 104–115.
- Becker-Asano, C.; Ishiguro, H. Evaluating facial displays of emotion for the android robot Geminoid F. In Proceedings of the 2011 IEEE Workshop on Affective Computational Intelligence (WACI), Paris, France, 11–15 April 2011.
- Breazeal, C. Designing Sociable Robots; MIT Press: London, UK, 2002; ISBN 9780262025102.
- Goldinger, S.D.; He, Y.; Papesh, M.H. Deficits in Cross-Race Face Learning: Insights From Eye Movements and Pupillometry. J. Exp. Psychol. Learn. Mem. Cogn. 2009, 35, 1105–1122.
- Hessels, R.S.; Kemner, C.; van den Boomen, C.; Hooge, I.T.C. The Area-of-Interest Problem in Eyetracking Research: A Noise-Robust Solution for Face and Sparse Stimuli. Behav. Res. Methods 2016, 48, 1694–1712.
- Chen, C.-Y. Using an eye tracker to investigate the effect of sticker on LINE APP for older adults. In Proceedings of the HCII 2019: Human-Computer Interaction. Recognition and Interaction Technologies, Orlando, FL, USA, 26–31 July 2019.
- Besdine, R.W. Changes in the Body with Aging. Available online: https://www.msdmanuals.com/home/older-people’s-health-issues/the-aging-body/changes-in-the-body-with-aging (accessed on 25 May 2020).
- Pfleging, B.; Fekety, D.K.; Schmidt, A.; Kun, A.L. A model relating pupil diameter to mental workload and lighting conditions. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, San Jose, CA, USA, 7–12 May 2016; Association for Computing Machinery (ACM): New York, NY, USA.
- Heerink, M.; Kröse, B.; Evers, V.; Wielinga, B. Assessing Acceptance of Assistive Social Agent Technology by Older Adults: The Almere Model. Int. J. Soc. Robot. 2010, 2, 361–375.
- Broz, F.; Lehmann, H.; Nehaniv, C.L.; Dautenhahn, K. Mutual gaze, personality, and familiarity: Dual eye-tracking during conversation. In Proceedings of the 2012 IEEE RO-MAN: The 21st IEEE International Symposium on Robot and Human Interactive Communication, Paris, France, 7–12 September 2012.
- Macinnes, J.J.; Iqbal, S.; Pearson, J.; Johnson, E.N. Wearable Eye-Tracking for Research: Automated Dynamic Gaze Mapping and Accuracy/Precision Comparisons Across Devices. bioRxiv 2018, 299925.
- McCroskey, J.C.; McCain, T.A. The measurement of interpersonal attraction. Speech Monogr. 1974, 41, 261–266.
- Orquin, J.; Scholderer, J. Attention to Health Cues on Product Packages. J. Eye Tracking, Vis. Cogn. Emot. 2011, 1, 59–63.
- Al-Samarraie, H.; Sarsam, S.M.; Guesgen, H. Predicting User Preferences of Environment Design: A Perceptual Mechanism of User Interface Customisation. Behav. Inf. Technol. 2016, 35, 644–653.
- Olsen, A. The Tobii I-VT Fixation Filter; Tobii: Stockholm, Sweden, 2012.
- Takahashi, R.; Suzuki, H.; Chew, J.Y.; Ohtake, Y.; Nagai, Y.; Ohtomi, K. A System for Three-Dimensional Gaze Fixation Analysis Using Eye Tracking Glasses. J. Comput. Des. Eng. 2018, 5, 449–457.
- Manual and Assisted Mapping of Gaze Data on to Snapshots and Screenshots. Available online: https://www.tobiipro.com/learn-and-support/learn/steps-in-an-eye-tracking-study/data/manual-and-assisted-mapping/ (accessed on 12 May 2020).
- Bonensteffen, F. Does He Mean What He Says?: Using Eye Tracking to Understand Victim-Offender Mediation. Master’s Thesis, University of Twente, Enschede, The Netherlands, 2018.
- Niehorster, D.C.; Hessels, R.S.; Benjamins, J.S. GlassesViewer: Open-Source Software for Viewing and Analyzing Data from the Tobii Pro Glasses 2 Eye Tracker. Behav. Res. Methods 2020, 52, 1244–1253.
- Chen, H.-E.; Sonntag, C.C.; Pepley, D.F.; Prabhu, R.S.; Han, D.C.; Moore, J.Z.; Miller, S.R. Looks Can be Deceiving: Gaze Pattern Differences Between Novices and Experts During Placement of Central Lines. Am. J. Surg. 2019, 217, 362–367.
- Rogers, S.L.; Speelman, C.P.; Guidetti, O.; Longmuir, M. Using Dual Eye Tracking to Uncover Personal Gaze Patterns During Social Interaction. Sci. Rep. 2018, 8, 4271.
- Judd, T.; Ehinger, K.; Durand, F.; Torralba, A. Learning to predict where humans look. In Proceedings of the 2009 IEEE 12th International Conference on Computer Vision, Kyoto, Japan, 29 September–2 October 2009.
- Prakash, A.; Kemp, C.C.; Rogers, W.A. Older adults’ reactions to a robot’s appearance in the context of home use. In Proceedings of the 2014 9th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Bielefeld, Germany, 3–6 March 2014.
- Anzalone, S.M.; Boucenna, S.; Ivaldi, S.; Chetouani, M. Evaluating the Engagement with Social Robots. Int. J. Soc. Robot. 2015, 7, 465–478.
- Guo, F.; Ding, Y.; Liu, W.; Liu, C.; Zhang, X. Can Eye-Tracking Data be Measured to Assess Product Design?: Visual Attention Mechanism should be Considered. Int. J. Ind. Ergon. 2016, 53, 229–235.
- Henderson, J.M.; Williams, C.C.; Falk, R.J. Eye Movements are Functional During Face Learning. Mem. Cognit. 2005, 33, 98–106.
- Zebrowitz, L.A.; Montepare, J.M. Social Psychological Face Perception: Why Appearance Matters. Soc. Personal. Psychol. Compass 2008, 2, 1497–1517.
- Norman, D.A. Emotional Design: Why We Love (or Hate) Everyday Things, 1st ed.; Basic Books: New York, NY, USA, 2004.
- Lorenz, K. Die angeborenen Formen möglicher Erfahrung. Z. Tierpsychol. 1943, 5, 235–409.
- Hinde, R.A.; Barden, L.A. The evolution of the teddy bear. Anim. Behav. 1985, 33, 1371–1373.
- Breazeal, C.; Foerst, A. Schmoozing with robots: Exploring the boundary of the original wireless network. In Proceedings of the 1999 Conference on Cognitive Technology (CT99), San Francisco, CA, USA, 11–14 August 1999.
- Deutsch, I.; Erel, H.; Paz, M.; Hoffman, G.; Zuckerman, O. Home Robotic Devices for Older Adults: Opportunities and Concerns. Comput. Human Behav. 2019, 98, 122–133.
- Cho, M.; Lee, S.; Lee, K.-P. Once a kind friend is now a thing: Understanding how conversational agents at home are forgotten. In Proceedings of the 2019 Designing Interactive Systems Conference, New York, NY, USA, 23–28 June 2019.
- de Graaf, M.M.A.; Ben Allouch, S.; van Dijk, J.A.G.M. Long-Term Evaluation of a Social Robot in Real Homes. Interact. Stud. 2016, 17, 462–491.
- Fukuda, R.; Bubb, H. Eye Tracking Study on Web-Use: Comparison Between Younger and Elderly Users in Case of Search Task with Electronic Timetable Service. PsychNology J. 2003, 1, 202–228.
- Al-Showarah, S.; AL-Jawad, N.; Sellahewa, H. Effects of user age on smartphone and tablet use, measured with an eye-tracker via fixation duration, scan-path duration, and saccades proportion. In Proceedings of the 8th International Conference on Universal Access in Human-Computer Interaction, Crete, Greece, 22–27 June 2014.
- Cohen, J. Statistical Power Analysis for the Behavioral Sciences, 2nd ed.; Lawrence Erlbaum Associates: Hillsdale, NJ, USA, 1988; ISBN 0-8058-0283-5.
- Sawilowsky, S.S. New Effect Size Rules of Thumb. J. Mod. Appl. Stat. Methods 2009, 8, 597–599.
- Hill, R.L.; Dickinson, A.; Arnott, J.L.; Gregor, P.; McIver, L. Older web users’ eye movements: Experience counts. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Vancouver, BC, Canada, 7–12 May 2011.
| Demographic Information | Older | Younger |
|---|---|---|
| number of participants | 31 | 31 |
| number of male participants | 7 | 16 |
| number of female participants | 24 | 15 |
| mean age in years (range) | 62.3 (55–76) | 23.3 (18–29) |
| AOI | Dependent Variable | Group | Mean | U | Sig. | Effect Size (d) |
|---|---|---|---|---|---|---|
| Face | Total fixation duration (s) | younger | 4.271 | 163.00 | 0.000 ¹ | 1.357 |
| | | older | 6.883 | | | |
| | Average fixation duration (s) | younger | 0.302 | 411.00 | 0.327 | 0.354 |
| | | older | 0.344 | | | |
| | Number of fixations | younger | 14.194 | 180.00 | 0.000 ¹ | 1.257 |
| | | older | 21.065 | | | |
| Left ear | Total fixation duration (s) | younger | 0.088 | 341.00 | 0.003 ¹ | 0.667 |
| | | older | 0.008 | | | |
| | Average fixation duration (s) | younger | 0.084 | 342.00 | 0.003 ¹ | 0.642 |
| | | older | 0.008 | | | |
| | Number of fixations | younger | 0.355 | 340.50 | 0.003 ¹ | 0.787 |
| | | older | 0.032 | | | |
| Right ear | Total fixation duration (s) | younger | 0.139 | 373.00 | 0.039 ¹ | 0.619 |
| | | older | 0.023 | | | |
| | Average fixation duration (s) | younger | 0.100 | 376.00 | 0.044 ¹ | 0.604 |
| | | older | 0.023 | | | |
| | Number of fixations | younger | 0.452 | 381.50 | 0.056 | 0.543 |
| | | older | 0.129 | | | |
| Body | Total fixation duration (s) | younger | 1.144 | 234.50 | 0.000 ¹ | 0.747 |
| | | older | 0.383 | | | |
| | Average fixation duration (s) | younger | 0.254 | 205.00 | 0.000 ¹ | 1.085 |
| | | older | 0.076 | | | |
| | Number of fixations | younger | 3.613 | 267.00 | 0.002 ¹ | 0.600 |
| | | older | 1.774 | | | |
| Left arm | Total fixation duration (s) | younger | 0.083 | 402.50 | 0.046 ¹ | 0.451 |
| | | older | 0.011 | | | |
| | Average fixation duration (s) | younger | 0.075 | 402.00 | 0.044 ¹ | 0.472 |
| | | older | 0.006 | | | |
| | Number of fixations | younger | 0.226 | 405.50 | 0.054 | 0.371 |
| | | older | 0.065 | | | |
| Right arm | Total fixation duration (s) | younger | 0.090 | 372.00 | 0.005 ¹ | Infinite ² |
| | | older | 0.000 | | | |
| | Average fixation duration (s) | younger | 0.065 | 372.00 | 0.005 ¹ | Infinite ² |
| | | older | 0.000 | | | |
| | Number of fixations | younger | 0.323 | 372.00 | 0.005 ¹ | Infinite ² |
| | | older | 0.000 | | | |
| Left leg | Total fixation duration (s) | younger | 0.190 | 232.50 | 0.000 ¹ | Infinite ² |
| | | older | 0.000 | | | |
| | Average fixation duration (s) | younger | 0.124 | 232.50 | 0.000 ¹ | Infinite ² |
| | | older | 0.000 | | | |
| | Number of fixations | younger | 0.774 | 232.50 | 0.000 ¹ | Infinite ² |
| | | older | 0.000 | | | |
| Right leg | Total fixation duration (s) | younger | 0.144 | 288.00 | 0.000 ¹ | 0.980 |
| | | older | 0.004 | | | |
| | Average fixation duration (s) | younger | 0.090 | 288.00 | 0.000 ¹ | 1.027 |
| | | older | 0.004 | | | |
| | Number of fixations | younger | 0.710 | 291.50 | 0.000 ¹ | 0.937 |
| | | older | 0.032 | | | |
| Dependent Variable | Group | Mean | U | Sig. | Effect Size (d) |
|---|---|---|---|---|---|
| Design preference | younger | 3.665 | 155.0 | 0.000 ¹ | 1.515 |
| | older | 4.516 | | | |
| BEHAVE physical attractiveness | younger | 13.839 | 198.0 | 0.000 ¹ | 1.216 |
| | older | 16.968 | | | |
| BEHAVE social likeability | younger | 18.419 | 221.0 | 0.000 ¹ | 0.971 |
| | older | 21.290 | | | |
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Oh, Y.H.; Ju, D.Y. Age-Related Differences in Fixation Pattern on a Companion Robot. Sensors 2020, 20, 3807. https://doi.org/10.3390/s20133807