The Town Crier: A Use-Case Design and Implementation for a Socially Assistive Robot in Retirement Homes †
Abstract
1. Introduction
2. Current Challenges for Socially Assistive Robots
3. Definition of the Use-Case
3.1. Capture of Retirement Home Needs and Viability Study
3.2. Design of the First Use-Case: Announcer Task
4. Robotic Platform and Interfaces
- The teleoperator interface, which allows a remote operator to control the robot from a computer, tablet or mobile phone through a web browser. Its first version (Figure 4a) offered the operator the following options: (i) announce a message written by the teleoperator or selected from a list; (ii) announce a randomly generated greeting to the residents; (iii) play specific sounds (a horn); (iv) set the robot’s speaking volume; and (v) send the robot to a mapped location. Additionally, this interface displays live video streamed from an IP camera installed in the robot’s head, the battery levels of both the robot and the teleoperation joystick, the selected navigation mode (joystick/autonomous) and a text console with information about the actions executed by the robot. The interface was later updated with additional features, as explained in Section 5.6.
- The interface of the robot with the residents. This interface meets the following requirements, gathered iteratively from stakeholder recommendations during the participatory design and from the users’ abilities, disabilities and interaction needs following a user-centred approach: (i) be multimodal, supporting voice, images and text displayed on a screen to inform about the events; (ii) display specific events associated with birthdays or anniversaries; (iii) announce upcoming information with a repetitive horn sound to attract attention; and (iv) say goodbye and return to the home position after the announcement. The touch screen placed on the robot’s torso (see Figure 2) is used to present the visual information targeted at the residents. In its first version, the screen showed a condensed list of the daily agenda, together with a transcript of the robot’s speech at any given moment or a loudspeaker icon while the horn sound was being played (Figure 5a). This interface was deeply modified after the pilot tests, as explained in Section 5.6; its design and evaluation process is detailed in Section 6. A simplified sketch of the announcement sequence is shown below.
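The announcement flow described in these two items can be summarised as: go to the target location, attract attention with the horn, speak the message while displaying it on the screen, say goodbye and return to the home position. The following Python sketch illustrates this sequence only as an illustration; the Announcement structure and the helper functions (navigate_to, play_horn, speak_and_caption) are hypothetical placeholders and do not correspond to the actual components of the robot’s software architecture.

```python
from dataclasses import dataclass

@dataclass
class Announcement:
    """One event to be announced to the residents (hypothetical structure)."""
    text: str              # message spoken by the robot and shown as a caption
    location: str          # mapped location where the announcement is made
    horn_repetitions: int = 3

# Placeholder stubs for the real navigation, sound and speech components,
# which are not detailed here.
def navigate_to(location: str) -> None: ...
def play_horn(times: int) -> None: ...
def speak_and_caption(text: str) -> None: ...

def announce(event: Announcement, home: str = "home") -> None:
    """Run the town-crier sequence for a single announcement."""
    navigate_to(event.location)        # go to the mapped location
    play_horn(event.horn_repetitions)  # repetitive horn sound to attract attention
    speak_and_caption(event.text)      # speak the event and show it on the touch screen
    speak_and_caption("Goodbye!")      # say goodbye...
    navigate_to(home)                  # ...and return to the home position

announce(Announcement(text="Bingo starts at 5 pm in the main hall.",
                      location="dining room"))
```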
5. Pilot Experiment
5.1. Aims
- From a Technological Perspective, the objectives were (i) to detect possible issues in the SAR’s performance during the interaction in the retirement home (a real environment, in contrast to a controlled one); and (ii) to technically evaluate the interfaces in order to detect accessibility and usability barriers.
- From a Social Perspective, the objectives were (i) to introduce the robot to the residents, to explain the main aims of the project to them, and to make them aware that the robot would be present in their daily lives for a few weeks during the first pilot experiment; and (ii) to evaluate the residents’ first impressions of the SAR as a phase of the participatory design, involving them in decisions about certain aspects of the use-case design, the robot’s functionality and recommendations for the interface design.
5.2. Participants
5.3. Material
5.4. Environment
5.5. Procedure
5.6. Results
5.6.1. Technical Issues and Recommendations
5.6.2. User’s Acceptance of the Robotic System
6. Robot Information Interface: Co-Design Process
- First evaluation. Evaluation in a controlled environment: the interfaces were first improved by taking into account the opinions of real users in a controlled environment, where people less vulnerable to COVID-19 than the older adults could give feedback and collaborate in participatory sessions on the design of the interfaces. This evaluation was carried out in one of the research laboratories of the authors of this paper. It is described in detail in Section 6.1 and led to the improvement of the interfaces prior to their use with older people in the retirement home.
- Second evaluation. Evaluation in the retirement home: this evaluation implements the second phase of the pilot experiment, in which the improved interfaces were used for five months in the retirement home. Older people and the staff at the retirement home could interact with the robot, and objective data about this interaction are detailed in Section 6.2. After the evaluation, the stakeholders participated in a co-design session to provide further feedback on the interface and to propose new interfaces and protocols, including the colour and position of buttons, navigation, the robot’s voice, the sequence of activities performed by the robot, etc.
6.1. Evaluation in a Controlled Environment
6.1.1. Participants
6.1.2. Environment
6.1.3. Materials
- Questionnaires. Pre-test and post-test questionnaires were used to collect data from the participants. The pre-test questionnaire contained eighteen questions covering the participants’ socio-demographic information and their experience with electronic devices and robots. These data allowed the researchers to analyse the results in relation to the participants’ previous experience and socio-demographic background. The post-test questionnaire, detailed in Appendix A, included 26 questions investigating the participants’ responses on usability, accessibility and acceptance aspects of both the hardware and software interfaces of the use-case. These questions were answered on a 5-point Likert scale, where 1 corresponds to “strongly disagree” and 5 corresponds to “totally agree”.
- Informal interview. After completing the questionnaires, users provided valuable informal feedback on their interactions with the robotic platform, proposing changes to the information interface design and reporting issues encountered during the interactions.
6.1.4. Procedure
- Introduction and pre-test information. Once the participants were selected, the goal of the evaluation process was explained to them. Then, the participants signed a consent form. Finally, the ten participants answered the pre-test questionnaire.
- Evaluation session. In order to test the robot interface in depth, the participants were provided with a list of tasks, to be accomplished during their interaction with the robot. Seventeen short and direct tasks were designed to ensure all possible robot interaction modes were used during the experiment. These tasks involved obtaining information about the scheduled activities in their daily agenda, checking the date and time, information about the weather and information about the birthdays of the residents and staff. Moreover, the users could adjust the robot’s voice settings.
- Post-test feedback. After each evaluation session, the researchers had an informal interview with the participants, in which they also filled in the post-test questionnaire using a computer.
6.1.5. Evaluation Results
- Usability. Table A3 shows the six questions that evaluated usability: five were answered on a 5-point Likert scale and one was open-ended. Global usability was rated 4.3 on average, with a standard deviation of 0.76 (Figure 6). The questions focused on four factors: intuitive interaction, effectiveness, interface appearance and robustness. In summary, the majority of participants (7 of 10) agreed that the interfaces were intuitive and easy to understand; understanding was limited (neutral answers) for one user with a foreign mother tongue and for another with a moderate visual disability. All participants agreed that the robot was effective in helping them complete the required tasks, except for one user who reported little experience with electronic devices and had both hearing and visual disabilities (neutral answer). The interface appearance was appreciated by all users except those with moderate and severe visual disabilities. The interface was robust in general, and only 4 out of 10 users detected issues: (i) the robot’s voice did not match the subtitle displayed on the screen (they were not synchronised, as the subtitle lagged one to two seconds behind the robot’s speech); and (ii) the buttons were small and users confused the buttons controlling the voice parameters, such as volume, speed and tone.
- Accessibility. Fourteen 5-point Likert scale questions assessed the accessibility of the robot’s interfaces; more precisely, they evaluated to what extent the interfaces were perceivable, understandable and operable. The results for each subfactor are detailed in Table A4, Table A5 and Table A6, with average values of 4.2, 4.7 and 4.6 and standard deviations of 0.91, 0.45 and 0.74, respectively (see Figure 6). The users indicated that the robot’s voice was not clear enough and should be corrected before its integration in the retirement home; the information provided through the display (including the captioning of the robot’s voice) was clearly perceived by the majority of the users, all of whom were able to read everything from one meter away from the screen. Regarding the understanding factor, the participants were able to understand the robot’s speech, the purpose of each interface of the events announcer application and the flow of interaction between these interfaces; moreover, they all found the application windows logically ordered. Finally, regarding the operability factor, the participants were in general able to operate the application’s software and hardware interaction components, such as the touch screen and voice volume buttons, and to navigate through the town crier application windows. Eight out of ten users attempted to increase the volume of the robot platform, and only one user encountered difficulties adjusting the volume due to the small size of the buttons.
- User acceptance. Table A7 contains the questions designed to ascertain whether participants were satisfied with how the robot performed its functions and to determine their willingness to use the robot in the future. An average value of 4.5 was obtained for this factor, with a standard deviation of 0.76 (see Figure 6). In the interviews, the participants generally indicated that they were satisfied with the robot’s interface and that they would like to use it in the future, both for the same purpose and for other tasks. (The computation of these factor averages from the individual Likert answers is illustrated in the sketch below.)
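As a side note, the factor averages and standard deviations reported in this section can be reproduced from the raw 5-point Likert answers with a few lines of code. The snippet below is an illustrative sketch using made-up responses (not the data collected in the study) and assumes the sample standard deviation; the paper does not state which estimator was used.

```python
from statistics import mean, stdev

# Hypothetical example: answers of the ten participants (1-5 Likert scale)
# to the questions grouped under one factor, e.g. user acceptance.
responses = {
    "Q22": [5, 4, 5, 4, 5, 3, 5, 5, 4, 5],
    "Q23": [4, 4, 5, 5, 4, 3, 5, 4, 4, 5],
}

# Per-question statistics, as reported in Tables A3-A7.
for question, answers in responses.items():
    print(f"{question}: mean={mean(answers):.1f}, sd={stdev(answers):.2f}")

# Factor-level score: average over all answers to all questions in the factor.
all_answers = [a for answers in responses.values() for a in answers]
print(f"factor: mean={mean(all_answers):.1f}, sd={stdev(all_answers):.2f}")
```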
6.1.6. Evaluation Recommendations
6.1.7. Advantages and Limits
6.2. Evaluation in the Retirement Home
7. Discussion
7.1. Methodology
7.2. Main Outcomes
- People detection. Use an effective detection system for crowded environments.
- Robot voice and volume. The speakers should be powerful enough to allow interaction with people with moderate hearing loss.
- Use attention-grabbing mechanisms, such as a non-monotonous voice, moving while speaking, non-distracting sounds, etc.
- Avoid false expectations about the robot’s specific tasks.
- Teleoperator interface. If a teleoperator is responsible for interacting with users, the interface should be as simple and fast as clicking on predefined phrases to avoid wasting time writing new phrases or selecting phrases from a large list.
- Information interface. Since people with different abilities, disabilities and contexts of use will interact with the robot, it is essential to implement accessible interfaces that use multiple channels of communication: a loud robot voice, captioning, audio description, touch-screen and voice-accessible menus, support for connecting assistive tools, and a careful use of icons and colours, which is helpful for people with cognitive disabilities. A proposal of accessibility guidelines has been published by some of the authors in [25]. It is also important to avoid delays between the robot’s voice and the subtitles. Navigation through the interface should be usable, including the position, size and icons used for buttons, information, etc. We recommend allowing different configurations of all these factors, according to the abilities and context of use of the users interacting with the robot at any given moment; a minimal sketch of such a configuration is given below.
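As an illustration of this last recommendation, the sketch below groups the configurable presentation factors into a small per-user profile. The AccessibilityProfile class and its field names are our own assumptions for the example; they are not the configuration mechanism actually implemented on the robot.

```python
from dataclasses import dataclass

@dataclass
class AccessibilityProfile:
    """Per-user presentation settings for the information interface (hypothetical)."""
    voice_volume: int = 70         # loudspeaker volume, 0-100
    speech_rate: float = 1.0       # 1.0 = default speed; lower for easier listening
    captions_enabled: bool = True  # show subtitles synchronised with the voice
    font_scale: float = 1.0        # multiplier on the base font size
    high_contrast: bool = False    # high-contrast colour scheme
    button_scale: float = 1.0      # enlarge touch targets for users with motor issues

# Example profiles for different abilities and contexts of use.
default_profile = AccessibilityProfile()
low_vision = AccessibilityProfile(font_scale=1.8, high_contrast=True, button_scale=1.5)
hearing_loss = AccessibilityProfile(voice_volume=100, speech_rate=0.8, captions_enabled=True)
```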
7.3. Ongoing Work and Research Directions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
Abbreviation | Meaning |
---|---|
AAL | Ambient Assisted Living |
DSR | Deep State Representation |
HCI | Human–Computer Interaction |
HRI | Human–Robot Interaction |
ICT | Information and Communication Technologies |
ROS | Robot Operating System |
SAR | Socially Assistive Robot |
Appendix A. HRI User’s Evaluation: Participant’s Characteristics
Characteristics | Categories | Percentages |
---|---|---|
Age | 20–30 | 60% |
| | 40–50 | 10% |
| | 50–60 | 20% |
| | 60–70 | 10% |
Gender | Male | 60% |
| | Female | 40% |
Nationality | Spanish | 90% |
| | Iranian | 10% |
Type of disabilities (if any) | Hearing disabilities | (High) 10% (Moderate) 10% (Low) 30% (None) 50% |
| | Hearing aids | (None) 100% |
| | Visual disabilities | (High) 20% (Moderate) 40% (Low) 20% (None) 20% |
| | Visual aids | (Glasses) 70% |
| | Motor disabilities | (Moderate) 10% (Low) 20% (None) 70% |
| | Motor aids | (None) 100% |
| | Reading agility | (Moderate) 20% (Low) 30% (None) 50% |
Characteristics | Categories | Percentages |
---|---|---|
Experience in using electronic devices | Using mobile phones | (Frequently) 30% (Moderate use) 40% (Low use) 20% (None) 10% |
| | Other uses of the phone besides calls | (Messaging, calls, social networks and surfing the internet) 80% (Games and shopping) 20% |
| | Using computers or tablets | (Frequently/High use) 30% (Moderate use) 50% (Low use) 10% (None) 10% |
| | Using computer or tablet for | (For work and study) 60% (For work, study and entertainment) 40% |
Experience in interaction with robots | I have interacted with a robot | (None) 30% (Rarely) 30% (Low use) 20% (Moderate use) 10% (Frequently/high use) 10% |
| | Type of used robot and what for | (Intelligent virtual assistant) 30% (Study and research) 30% (Drones) 10% |
| | Opinions about interacting with robots and the preferred use cases of robots | (I don’t like to interact with robots) 20% (Welfare) 10% (Work and home) 40% (Personal assistive) 10% (Entertainment) 10% (In all aspects of life without replacing individuals) 10% |
Appendix B. HRI User’s Evaluation: Questionnaire Results
# | Factors | Questions | Mean | Standard Deviation (SD) |
---|---|---|---|---|
1 | Intuitive interaction | I find the robotic application intuitive. | 4.2 | 0.63 |
2 | Intuitive interaction | The information displayed on the screen is easy to understand. | 4.2 | 0.91 |
3 | Effectiveness | I have been able to perform tasks easily. | 4.3 | 0.67 |
4 | Effectiveness | I have not needed any help during the completion of the tasks. | 4.4 | 0.52 |
5 | Interfaces appearance | The interfaces’ appearance helped me to clearly distinguish the different available functions. | 4.3 | 1.06 |
6 | Robustness | I have found errors in the interfaces, and they are… | * | * |
# | Factors | Questions | Mean | Standard Deviation (SD) |
---|---|---|---|---|
7 | Perception | The robot voice was clear to me. | 3.3 | 0.82 |
8 | | I was able to read the displayed subtitles on the robot screen at all times. | 4.6 | 0.70 |
9 | | It was easy to perceive the displayed messages and subtitles at the same time and along with the robot voice. | 4.2 | 0.79 |
10 | | The colors chosen for the interfaces made it easy to read the information. | 4.8 | 0.42 |
11 | | The used font size was appropriate for reading at a one-meter distance. | 4.2 | 1.03 |
# | Factors | Questions | Mean | Standard Deviation (SD) |
---|---|---|---|---|
12 | Understanding | I was able to understand what the robot was saying at all times. | 4.7 | 0.48 |
13 | | I have clearly understood that the purpose of the main screen was to click on each button to access the calendar, birthdays, weather and activity interfaces, respectively. | 4.9 | 0.32 |
14 | | I have clearly understood that the purpose of the calendar interfaces was to know the current day and time, and to click on any day to check the scheduled activities for that day. | 4.6 | 0.52 |
15 | | I have clearly understood that the purpose of the birthday interface was to show all of today’s birthdays and those for the next two days. | 4.7 | 0.48 |
16 | | I have clearly understood that the purpose of the weather interface was to know the weather forecast for today and the next few days. | 5.0 | 0.00 |
17 | | I have clearly understood that the purpose of the activities interface was to know the schedule of activities for today. | 4.8 | 0.42 |
18 | | I have found the order of the application windows logical. | 4.4 | 0.52 |
# | Factors | Questions | Mean | Standard Deviation (SD) |
---|---|---|---|---|
19 | Operating | At all times, I have been able to tap the robot screen to navigate through the application windows. | 4.9 | 0.33 |
20 | | The application has enabled me to control the volume of the robot voice, so I can hear it perfectly. | 4.3 | 0.95 |
21 | | At all times, I have known what the running function of the application is, and how to return to the main interface. | 4.5 | 0.71 |
# | Factors | Questions | Mean | Standard Deviation (SD) |
---|---|---|---|---|
22 | Satisfaction | I liked how the robot told me what day it was, and what activities were on the calendar for that day. I’d also like it to do so in the future. | 4.6 | 0.70 |
23 | | I liked how the robot told me the birthdays for today and for the next few days. I’d also like it to do so in the future. | 4.4 | 0.84 |
24 | | I liked how the robot told me the forecast for today and for the next few days. I’d also like it to do so in the future. | 4.6 | 0.70 |
25 | | I liked how the robot told me the scheduled activities of the day. I’d like it to do so in the future. | 4.5 | 0.71 |
26 | | In the future, I’d like the robot to be more complete and to include new tasks. | 4.2 | 0.92 |
References
1. Servicio de Difusión y Publicaciones del Instituto de Estadística y Cartografía de Andalucía. Proyección de Población de Andalucía por Ámbitos Subregionales 2009–2035; Junta de Andalucía: Seville, Spain, 2012.
2. Feil-Seifer, D.; Mataric, M. Defining Socially Assistive Robotics. In Proceedings of the 2005 IEEE 9th International Conference on Rehabilitation Robotics, Chicago, IL, USA, 28 June–1 July 2005; pp. 465–468.
3. Li, Y.; Liang, N.; Effati, M.; Nejat, G. Dances with Social Robots: A Pilot Study at Long-Term Care. Robotics 2022, 11, 96.
4. Abdi, J.; Al-Hindawi, A.; Ng, T.; Vizcaychipi, M.P. Scoping review on the use of socially assistive robot technology in elderly care. BMJ Open 2018, 8, e018815.
5. Anghel, I.; Cioara, T.; Moldovan, D.; Antal, M.; Pop, C.D.; Salomie, I.; Pop, C.B.; Chifu, V.R. Smart Environments and Social Robots for Age-Friendly Integrated Care Services. Int. J. Environ. Res. Public Health 2020, 17, 3801.
6. Hall, A.; Brown, C.; Stanmore, E.; Todd, C. Implementing monitoring technologies in care homes for people with dementia: A qualitative exploration using Normalization Process Theory. Int. J. Nurs. Stud. 2017, 7, 60–70.
7. Seibt, J.; Damholdt, M.F.; Vestergaard, C. Integrative social robotics, value-driven design, and transdisciplinarity. Interact. Stud. 2020, 21, 111–144.
8. Brown, B.; Reeves, S.; Sherwood, S. Into the wild: Challenges and opportunities for field trial methods. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’11), Vancouver, BC, Canada, 7–12 May 2011; pp. 1657–1666.
9. Weiss, A.; Bernhaupt, R.; Lankes, M.; Tscheligi, M. The USUS evaluation framework for human-robot interaction. In Proceedings of the AISB 2009 Symposium on New Frontiers in Human-Robot Interaction, Edinburgh, UK, 8–9 April 2009; Volume 4, pp. 11–26.
10. Iglesias, A.; Viciana, R.; Pérez-Lorenzo, J.; Lan Hing Ting, K.; Tudela, A.; Marfil, R.; Dueñas, A.; Bandera, J. Towards long term acceptance of Socially Assistive Robots in retirement houses: Use-case definition. In Proceedings of the 2020 IEEE International Conference on Autonomous Robot Systems and Competitions (ICARSC), Ponta Delgada, Portugal, 15–17 April 2020; pp. 134–139.
11. Booth, K.E.; Mohamed, S.C.; Rajaratnam, S.; Nejat, G.; Beck, J.C. Robots in retirement homes: Person search and task planning for a group of residents by a team of assistive robots. IEEE Intell. Syst. 2017, 32, 14–21.
12. Kriegel, J.; Grabner, V.; Tuttle-Weidinger, L.; Ehrenmüller, I. Socially Assistive Robots (SAR) in In-Patient Care for the Elderly. Stud. Health Technol. Inform. 2019, 260, 178–185.
13. Kachouie, R.; Sedighadeli, S.; Khosla, R.; Chu, M.T. Socially Assistive Robots in Elderly Care: A Mixed-Method Systematic Literature Review. Int. J. Hum.-Comput. Interact. 2014, 30, 369–393.
14. Fan, J.; Bian, D.; Zheng, Z.; Beuscher, L.; Newhouse, P.; Mion, L.; Sarkar, N. A Robotic Coach Architecture for Elder Care (ROCARE) Based on Multi-user Engagement Models. IEEE Trans. Neural Syst. Rehabil. Eng. 2016, 25, 1153–1163.
15. Breazeal, C. Affective Interaction between Humans and Robots. In Advances in Artificial Life; Kelemen, J., Sosík, P., Eds.; Springer: Berlin/Heidelberg, Germany, 2001; pp. 582–591.
16. Obrenovic, Z.; Abascal, J.; Starcevic, D. Universal accessibility as a multimodal design issue. Commun. ACM 2007, 50, 83–88.
17. Courbet, L.; Morin, A.; Bauchet, J.; Rialle, V. Preliminary Evaluation of a Digital Diary for Elder People in Nursing Homes. In Smart Technologies in Healthcare; CRC Press: Boca Raton, FL, USA, 2017; pp. 178–194.
18. Olde Keizer, R.A.; van Velsen, L.; Moncharmont, M.; Riche, B.; Ammour, N.; Del Signore, S.; Zia, G.; Hermens, H.; N’Dja, A. Using socially assistive robots for monitoring and preventing frailty among older adults: A study on usability and user experience challenges. Health Technol. 2019, 9, 595–605.
19. Voilmy, D.; Suárez, C.; Romero-Garcés, A.; Reuther, C.; Pulido, J.C.; Marfil, R.; Manso, L.J.; Ting, K.L.H.; Iglesias, A.; González, J.C.; et al. CLARC: A Cognitive Robot for Helping Geriatric Doctors in Real Scenarios. In Proceedings of ROBOT (1), Advances in Intelligent Systems and Computing, Madrid, Spain, 26–28 July 2017; Springer: Berlin/Heidelberg, Germany, 2017; Volume 693, pp. 403–414.
20. Astorga, M.; Cruz-Sandoval, D.; Favela, J. A Social Robot to Assist in Addressing Disruptive Eating Behaviors by People with Dementia. Robotics 2023, 12, 29.
21. Winkle, K.; Caleb-Solly, P.; Turton, A.; Bremner, P. Mutual shaping in the design of socially assistive robots: A case study on social robots for therapy. Int. J. Soc. Robot. 2019, 12, 847–866.
22. World Wide Web Consortium. Web Content Accessibility Guidelines (WCAG) 2.0; World Wide Web Consortium: San Francisco, CA, USA, 2008.
23. Funka Nu. Mobile Navigation Guidelines. 2014. Available online: https://www.funka.com/contentassets/d005946001ef460eb4df58a4fc967b83/mobile-navigation-guidelines-funka-2014.pdf (accessed on 3 April 2024).
24. BBC. Accessibility Standards and Guidelines. 2014. Available online: https://www.w3.org/WAI/GL/mobile-a11y-tf/wiki/BBC_Mobile_Accessibility_Standards_and_Guidelines (accessed on 3 April 2024).
25. Qbilat, M.; Iglesias, A. Accessibility Guidelines for Tactile Displays in Human-Robot Interaction. A Comparative Study and Proposal. In Proceedings of the International Conference on Computers Helping People with Special Needs, Linz, Austria, 11–13 July 2018; Springer: Berlin/Heidelberg, Germany, 2018; pp. 217–220.
26. Abras, C.; Maloney-Krichmar, D.; Preece, J. User-centered design. In Encyclopedia of Human-Computer Interaction; Bainbridge, W., Ed.; Sage Publications: Thousand Oaks, CA, USA, 2004; Volume 37, pp. 445–456.
27. Bannon, L. Reimagining HCI: Toward a More Human-Centered Perspective. Interactions 2011, 18, 50–57.
28. Bannon, L.J.; Ehn, P. Design: Design matters in Participatory Design. In Routledge International Handbook of Participatory Design; Routledge: London, UK, 2012; pp. 37–63.
29. Vargas, C.; Whelan, J.; Brimblecombe, J.; Allender, S. Co-creation, co-design, co-production for public health: A perspective on definition and distinctions. Public Health Res. Pract. 2022, 32, e3222211.
30. Suchman, L. Human-Machine Reconfigurations: Plans and Situated Actions, 2nd ed.; Learning in Doing: Social, Cognitive and Computational Perspectives; Cambridge University Press: Cambridge, UK, 2006.
31. Heinzmann, J.; Zelinsky, A. Building Human-Friendly Robot Systems. In Robotics Research; Springer: London, UK, 2000.
32. Zinn, M.; Roth, B.; Khatib, O.; Salisbury, J.K. A New Actuation Approach for Human Friendly Robot Design. Int. J. Robot. Res. 2004, 23, 379–398.
33. Bustos, P.; Manso, L.J.; Bandera, A.J.; Bandera, J.P.; Garcia-Varea, I.; Martinez-Gomez, J. The CORTEX cognitive robotics architecture: Use cases. Cogn. Syst. Res. 2019, 55, 107–123.
34. Stanford Artificial Intelligence Laboratory. Robot Operating System. 2007. Available online: https://historyofinformation.com/detail.php?id=3661 (accessed on 3 April 2024).
35. Henning, M. A new approach to object-oriented middleware. IEEE Internet Comput. 2004, 8, 66–75.
36. Tombaugh, T.N.; McIntyre, N.J. The mini-mental state examination: A comprehensive review. J. Am. Geriatr. Soc. 1992, 40, 922–935.
37. Nielsen, J. Usability Engineering; Morgan Kaufmann Publishers, Inc.: San Francisco, CA, USA, 1993.
38. Knoblauch, H.; Tuma, R. Videography: An interpretive approach to video-recorded micro-social interaction. In The Sage Handbook of Visual Methods; Sage Publications: Thousand Oaks, CA, USA, 2011; pp. 414–430.
39. Sacks, H.; Schegloff, E.; Jefferson, G. A Simplest Systematics for the Organization of Turn Taking for Conversation. In Studies in the Organization of Conversational Interaction; Schenkein, J., Ed.; Academic Press: New York, NY, USA, 1978; pp. 7–55.
40. Heritage, J. Conversation analysis as social theory. In The New Blackwell Companion to Social Theory; Turner, B.S., Ed.; Blackwell: Oxford, UK, 2008; pp. 300–320.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).