Article

Socially Assistive Robotics: Robot Exercise Trainer for Older Adults †

School of Science and Technology, Nottingham Trent University, Clifton Lane, Nottingham NG11 8NS, UK
* Author to whom correspondence should be addressed.
This paper is an extended version of our paper published in Proceedings of the 10th International Conference on PErvasive Technologies Related to Assistive Environments (PETRA ‘17), Island of Rhodes, Greece, 21–23 June 2017.
Technologies 2018, 6(1), 32; https://doi.org/10.3390/technologies6010032
Submission received: 1 November 2017 / Revised: 6 March 2018 / Accepted: 7 March 2018 / Published: 10 March 2018

Abstract

Physical activity has tremendous benefits for older adults. A report from the World Health Organization notes that lack of physical activity contributes to around 3.2 million premature deaths annually worldwide. Research also shows that regular exercise helps older adults by improving their physical fitness, immune system, sleep and stress levels, as well as reducing the risk of numerous health problems such as diabetes, cardiovascular disease, dementia, obesity and joint pain. The research reported in this paper introduces a Socially Assistive Robot (SAR) that engages, coaches, assesses and motivates older adults in physical exercises recommended by the National Health Services (NHS) in the UK. With the population of older adults expected to triple by 2050, this SAR aims to improve the quality of life for a significant proportion of the population. To assess the proposed robot exercise trainer, an observational user evaluation with 17 participants was conducted. Participants were generally happy with the proposed platform as a means of encouraging them to do regular exercise correctly.

1. Introduction

The benefits of physical activity cannot be over-emphasised, especially for older adults. Health issues such as diabetes, dementia, heart disease, high blood pressure and obesity are improved with physical exercise. Moreover, as people become older, their body tissues take longer to repair, and exercise is a key factor in counteracting this [1]. According to a World Health Organisation (WHO) report, premature deaths attributed to lack of physical activity have risen to 3.2 million annually worldwide [2]. Meanwhile, the older adult population is rapidly increasing and, according to a survey, will triple by 2050 [3]. Therefore, to improve the quality of life for older adults, regular physical activity should be part of their daily routines. Additional benefits of physical activity include an improved immune system, good blood circulation and the removal of skin toxins. Research shows that many older adults fail to meet the level of exercise that their bodies need. In America, over 60% of people above 50 years of age fail to meet the expected level. In England, only 18% of people between 65 and 74 meet the required level, and for people above 75 years, only 7–8% are able to meet their required exercise level [2].
There is therefore a need for exercise training instructors. However, providing a human instructor for each elderly person is not feasible given the growing population of older adults. A possible solution is a Socially Assistive Robot (SAR) that engages older adults in the needed physical activity and also coaches and motivates them while providing performance assessment. A formal definition of SAR is provided in [4].
The aim of the research reported in this paper is to investigate the most effective and safest physical activities (at an affordable cost) for elderly people, and to implement an assistive robot that engages older adults in the proposed exercises in both sitting and standing positions. This is achieved by asking participants to mimic the robot, assessing their performance and providing both visual and audio feedback with facial expressions, motivational words and praise depending on the user's performance. The effectiveness of the new system is evaluated by running a test with users and collecting their feedback.
This paper is structured as follows. In Section 2, a review of existing research in this field is presented in a wider context. In Section 3, our methodology for developing the proposed system is explained. The implementation, evaluation and conclusions are presented in Section 4, Section 5 and Section 6, respectively.

2. Related Work

A growing challenge of the 21st century is the rise in the population of older adults. In the US alone, 12.9% of the population (i.e., 39.6 million people) were aged 65 and above in 2009; this is estimated to rise to 72 million by 2030, amounting to 19% of the population. Similarly, 26% of Japan's population was predicted to be above 65 in 2015 [5]. The United Nations (UN) has projected that Europe will have a high percentage of older adults, with over 30% of the population being elderly by 2060 [6].
The challenge is that, as the population of older adults grows, the number of people available to care for them is decreasing and cannot meet the need. To reduce this problem, scientific solutions are needed to assist older adults in their homes and in public places such as hospitals and shopping malls.
Research shows that the fear of falling is ranked first among older adults compared to other health-related issues such as choking and seizures [7]. While assistive technologies have been developed to handle more challenging and seemingly impossible tasks in the fields of manufacturing and agriculture [8], attempts in the eighties to develop home robots were not successful because of their inability to perform practical tasks; in general, they were geared more towards entertainment and education. However, after the success of robotic vacuum cleaners in the 2000s, studies of household robots began to witness a significant rise [7].

2.1. Robots in Homes

Household robots can best be described in terms of the functions they perform. Their purpose is to assist in daily tasks such as cleaning, surveillance, entertainment, storage, etc. These robots do not interact with humans, but rather perform the task they are meant for. They lack social capability and emotion but they offer much-needed physical assistance.
On the other hand, social robots offer more: they interact with humans and even offer social assistance. As described by Pieskä et al. [6], they tend to behave and interact like partners, peers and assistants. The term “Socially Assistive Robots (SAR)” is widely used in the literature to represent this group of robots [6]; the expression was first defined in [4]. Comprehensive research presented in [9] has explored how a robot's physical presence affects human judgments of the robot as a social partner.
Several assistive robots, such as the German Care-O-Bot, have been well publicised. Figure 1a shows the Care-O-Bot in a kitchen. These robots are able to learn from their environment and interact with the user. When considering the cost of a personal assistive robot such as Care-O-Bot, potential users and their caregivers have to weigh it against the alternative costs of homecare workers or institutionalisation for eldercare. SARs could well be a cost-effective alternative.

2.2. Robots for Older Adult Care

Socially assistive robots, if properly utilised, would assist older people in their daily routines and increase their quality of life by performing much-needed functions such as reminding them to take meals and medication, offering suggestions for activities and encouraging social interaction; in other words, providing the support that would normally be given by a human care worker.
For example, the Pearl robot can remind people about daily routines such as taking medication, eating and bathing, and can also guide them through their environment [10]. The Paro robot, shown in Figure 1b, is used in therapy for people with dementia [11]. Robinson et al. [11] reported that the Paro robot was very effective in reducing depression and the effects of loneliness in older adults. Pollack et al. [10] assert that in large environments such as malls and hospitals, where older adults require the help of nurses or porters to navigate, robots can be used to offer these services, thus allowing humans to concentrate on more crucial tasks.
Other robotic technologies used to assist older adults as well as the disabled are exoskeletons, electrical wheelchairs and other similar devices. However, in most of these technologies, the intelligence incorporated into them is very limited and therefore they cannot interact with humans effectively [12].
A comprehensive review of smart home technology and socially assistive robots to allow the extension of independent living for elderly people is presented in [13]. In [14], a software platform called RAPP is presented to deliver smart robotic applications for empowering elderly users.

2.3. Robots for Exercise Training

In the very early stages of deploying robots for exercise training, a virtual exercise advisor was developed by Bickmore and Picard in 2005 [15]. An immobile assistive robot that advises wheelchair users on avoiding dangerous movements was developed in 2008 [16]. Johnson et al. [17] developed an exercise robot called ADLER for therapy with post-stroke patients. This is an arm exercise robot that assists with moving the arm, grasping an object and transporting it.
The Taizo humanoid robot, developed in 2009, is also an exercise trainer, and results from its use have shown it to be very effective [18]. In 2015, researchers in Singapore developed an exercise robot called Xuan to carry out training sessions for older adults [19]. Fasola and Mataric [3] developed a robot that engages older adults in a seated arm exercise, motivating users by giving praise and feedback based on an evaluation of their performance. A further study by Fasola and Mataric [16] showed that people prefer to be trained by a physically present agent (e.g., a robot) rather than by virtual training software. Despite the importance of exercise to older adults, socially assistive robots that engage them in physical activities have not been given much-needed attention. The three robots that perform this task are the Japanese Taizo, the Singaporean Xuan and the seated arm exercise robot developed by Fasola and Mataric.
Taizo is very small (approximately the size of a table lamp) and lacks a display screen for visual feedback, while Fasola and Mataric's robot can only administer seated arm exercises, lacks full-body locomotion and does not have a display screen for visual feedback. The Xuan robot bridges this gap by being larger (almost human-sized), being equipped with arms and, above all, having a display screen for visual feedback. However, it also administers only seated exercises.
Figure 2a,b shows Robo Coach Xuan and Taizo robot, respectively, during training exercises.
Considering the existing research and development, there is a compelling need for a more interactive robot that engages older adults not just in seated exercises but also in standing positions, and that is equipped with a display screen for visual feedback.

3. Methodology

Our aim, therefore, is to develop a social assistive robot that will coach older adults in an appropriate set of exercises, monitor and evaluate their performance, and then provide suitable feedback to encourage further exercise. For this, we need suitable exercises, a means for assessing performance, and a platform that will integrate all the system requirements: to show the exercises, gather the assessment data and provide feedback.

3.1. Choice of Exercises

Exercise can be defined as any physical activity that is repetitive and structured to maintain the fitness or condition of the body. Exercises can be further classified into aerobic, anaerobic and flexibility exercises, as suggested by the European Food Information Council [20].
Aerobic exercise, also known as cardiovascular exercise, is any physical activity that involves rapid intake of oxygen by the body, mostly performed for a long period of time at low or medium intensity, such as walking, running, cycling or swimming [20].
Anaerobic exercise, also termed weight training, affects muscular power, structure and size. It is mostly performed for a short period of time at high intensity.
Flexibility exercise, as the name implies, involves stretching the body's muscles and joints. This helps improve the overall movement of the body.
Regular exercise is beneficial to the mental and physical well-being of both young and older adults. It has been proven medically that regular exercise improves health and makes a person more resistant to disease.
According to the UK Chief Medical Officers' guidelines, older people stand to gain much from regular exercise [21]. Some of the benefits of exercise for older adults are listed below:
  • Improves sleep
  • Manages and reduces stress
  • Improves the quality of life
  • Helps in maintaining healthy weight and overall increase in health
In the same report, it was found that regular exercises reduce the chances of:
  • Type II Diabetes by 40%
  • Cardiovascular disease by 35%
  • Joint and back pain by 25%
  • Falls, dementia, and depression by over 30%
  • Colon and breast cancer by 20%
The required level of exercise for older adults is obviously different from that of younger people. According to LaVona Traywick [22], the recommended level of exercise for older adults is 150 min weekly at a moderate level. This recommendation is in line with the National Health Services (NHS) guideline for older adults' exercise in the UK [21,23]. The 150 min can, however, be supplemented with 75 min of vigorous weekly exercise [21]. Traywick [22] believes that, in most cases, it is better to start at a slower pace, such as 10 min a day, so that the body can adapt to the physical activity.
The NHS in the UK has published an exercise handbook containing the recommended exercises for older adults. This includes written notes and screenshots of the various exercise poses. Exercises are divided into sitting, strength, flexibility and balance exercises, but all fall within the three general categories mentioned earlier [23]. Figure 3 shows a sample of the sitting, flexibility, strength and balance exercises recommended by the NHS in the UK for elderly people.

3.2. Assessment and Monitoring

To give relevant and appropriate feedback to the person doing the exercise, the SAR has to have some information about the positions and activities of their limbs. Although wearable sensors are a possibility, accurate information about the position of each limb would require multiple sensors to be worn, making the exercise more onerous for the elderly person to start. A better, less obtrusive way of assessing the exercise activity is image analysis, either via standard video images or via a sensor system such as the Kinect that captures 3D information.
The difference between the image produced by a depth sensor and that of an ordinary camera is that each pixel encodes the distance between the object and the sensor instead of the colour intensity produced by conventional cameras. Depth images present several advantages, including the ability to function in low-light environments, easy background removal and silhouette creation [24].
The depth data produced by 3D devices allow the human skeleton to be inferred. While motion capture devices use markers to track the various human joints, structured-light-based sensors emit infrared light in a certain pattern to determine depth, which is then used in skeleton tracking. Time-of-Flight cameras function similarly to structured-light systems, although without needing an infrared sensor [25].
The depth sensor system creates a depth image map by emitting an infrared light (structured light), and then infers the human skeleton using a randomised decision forest. Figure 4 shows the various transformation stages that the depth image undergoes. This includes (Figure 4a) colour stream, (Figure 4b) depth stream, (Figure 4c) skeleton, and (Figure 4d) tracked skeleton and joints.

3.3. System Platform Design

The targeted SAR should be able to engage older adults in the aforementioned recommended exercises. Personalisation and responsiveness of the SAR are also factors that need to be considered to ensure that the SAR accurately assesses the performance of older adults to provide timely, relevant and appropriate feedback to encourage the user.
The system diagram of the proposed exercise system is illustrated in Figure 5. The proposed system has several modules, which are described below; a minimal code sketch of their roles follows the list:
  • The Vision Module is responsible for tracking the user’s activity. Due to the skeleton tracking required, the Kinect Sensor (Version 2) is used.
  • The Display Module’s function is to present a visual feedback (in the form of a facial expression) and display the exercise poses to the participant. This is an iPad screen which is part of the Double robot platform. The robot manufactured by Double Robotics (http://www.doublerobotics.com/) is a telescopic robotic platform with a removable iPad tablet.
  • The Communication/Sound Module provides speech recognition and audio feedback. The inbuilt speaker and microphone of the iPad/Double robot serve the purpose of receiving voice commands from the user, and providing prompts, praise and feedback.
  • The Processing Module is the core unit of the system where the data obtained from the vision module is processed. Activity recognition is performed, and the desired data for comparison are retrieved from the database. Output from this comparison process is then routed to the display and sound modules. This also handles user commands received via the sound unit.
  • The Database Module stores the exercise details, participant’s information (e.g., name, gender, and height) and his/her performance record.
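To make the module decomposition concrete, the sketch below expresses the roles of the five modules as minimal C# interfaces. All type and member names are illustrative assumptions drawn from this architecture description, not the actual classes used in the implementation (which is split between a C#/WPF application and an iPad application).

```csharp
using System;

// Illustrative module contracts for the proposed system (Figure 5).
// All names are hypothetical and shown only to clarify responsibilities.

public struct JointFrame
{
    // Simplified stand-in for one frame of tracked skeleton data.
    public DateTime Timestamp;
    public float[] JointAngles;   // e.g., shoulder, elbow, knee angles in degrees
}

public interface IVisionModule
{
    // Wraps the Kinect Sensor (Version 2) and returns the latest tracked frame.
    JointFrame CaptureFrame();
}

public interface IDatabaseModule
{
    // Retrieves the reference joint angles for a named exercise pose
    // and stores the participant's performance record.
    float[] GetReferencePose(string exerciseName);
    void SavePerformance(string userName, string exerciseName, double score);
}

public interface IDisplayModule
{
    // Shows the exercise video and a facial expression as visual feedback.
    void ShowExercise(string exerciseName);
    void ShowExpression(string expression);   // "idle", "happy", "angry"
}

public interface ISoundModule
{
    void Speak(string message);                // audio prompts and praise
    event EventHandler<string> CommandHeard;   // recognised voice commands
}

public interface IProcessingModule
{
    // Compares the captured frame with the reference pose and routes
    // the outcome to the display and sound modules.
    void Evaluate(JointFrame frame, string exerciseName);
}
```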

3.4. Activity Recognition and Pose Matching

The Kinect Sensor is used to determine the user's skeleton and joint positions. The skeleton data are used to ascertain whether the user is performing the exercises correctly; this involves extracting the joint coordinates and calculating the joint angles.
Since the Kinect sensor provides 3D coordinates of all the tracked joints, the angles between the joints can be calculated using an Angular Kinematics Model. Once the joint angles and coordinates are obtained, the resulting values are compared to the required outcomes up to a tolerance level. Figure 6a shows the tracked skeleton while Figure 6b shows how the joints are mapped to the required exercise pose. For example, the wrist of the left hand must be at the same height level as that of the right hand, etc. Figure 6c shows how the joint angles will be compared. For example, the right and left shoulder angles must be approximately 120 degrees while both ankle angles must be approximately 180 degrees.
These criteria will be applied for all exercise types to verify if the pose of the participant is in line with the required pose.
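As a concrete illustration of this pose-matching step, the following sketch computes the angle at a joint from three tracked 3D points (for example, hip–shoulder–elbow for a shoulder angle) using the dot product, and checks it against a required angle within a tolerance band. The Point3D type, the example target values and the 15-degree tolerance are assumptions made for illustration; the actual implementation relies on the Kinect SDK and the Vitruvius library.

```csharp
using System;

public struct Point3D
{
    public double X, Y, Z;
    public Point3D(double x, double y, double z) { X = x; Y = y; Z = z; }
}

public static class PoseMatching
{
    // Angle (in degrees) at joint B formed by segments B->A and B->C,
    // computed from the dot product of the two segment vectors.
    public static double JointAngle(Point3D a, Point3D b, Point3D c)
    {
        double ux = a.X - b.X, uy = a.Y - b.Y, uz = a.Z - b.Z;
        double vx = c.X - b.X, vy = c.Y - b.Y, vz = c.Z - b.Z;

        double dot = ux * vx + uy * vy + uz * vz;
        double magU = Math.Sqrt(ux * ux + uy * uy + uz * uz);
        double magV = Math.Sqrt(vx * vx + vy * vy + vz * vz);

        double cos = dot / (magU * magV);
        cos = Math.Max(-1.0, Math.Min(1.0, cos));   // guard against rounding error
        return Math.Acos(cos) * 180.0 / Math.PI;
    }

    // A pose element is accepted if the measured angle falls within a
    // tolerance band around the required angle (tolerance is illustrative).
    public static bool AngleWithinTolerance(double measured, double required,
                                            double toleranceDegrees = 15.0)
    {
        return Math.Abs(measured - required) <= toleranceDegrees;
    }
}

// Example: check that a shoulder angle is approximately 120 degrees.
// double angle = PoseMatching.JointAngle(elbow, shoulder, hip);
// bool ok = PoseMatching.AngleWithinTolerance(angle, 120.0);
```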

3.5. Participants

The proposed system has been tested with a group of volunteers and, as part of our ongoing research, ethical approval was obtained for the research reported in this paper. All subjects gave their informed consent for inclusion before they participated in the study. The protocol was approved by the Non-Invasive Human Ethics Committee at the School of Science and Technology, Nottingham Trent University.
The evaluation process was conducted in two parts. Initially, some students and staff on the university campus were recruited to take part in testing the system. This group of volunteers was not in the target age group, and the main purpose of the experiment was to test and evaluate the hardware and software developed for this research. After completion of this preliminary evaluation, 17 volunteers took part in the second phase, where they had the opportunity to interact with the robot platform and follow the recommended exercises. Female and male participants from a relatively wide age range were involved in this study. Figure 7 depicts the distribution of participants based on their age group, gender and prior daily exercise routine.

4. Implementation

To evaluate the proposed system, a fully functional system, as shown in Figure 5, was developed. The hardware elements of this Socially Assistive Robot are:
  • Double robotic platform with a removable iPad tablet (Display and Communication/Sound Modules)
  • Microsoft Kinect Sensor (Version 2) mounted on a tripod (Vision Module)
  • A PC running Windows (Processing and Database Modules)
The software libraries and facilities used were as follows:
  • Kinect SDK/Runtime
  • Vitruvius Kinect library
  • Double Robot SDK
  • Nuance Dragon Speech Recognition SDK
  • Bonjour SDK for Windows
  • Visual Studio and XCODE
As mentioned earlier, an important consideration in implementing this solution is affordability. Therefore, the choice of hardware and the total cost were important factors when the exploitation plan of this project was considered.
The system was developed as two applications: the Processing Module Application and the Display Module Application.

4.1. Processing Module Application

This application runs on the Processing Module, i.e., the PC. It interfaces with the Kinect Sensor to detect the user's skeleton. It also performs all the necessary computations, such as determining joint positions and joint angles and converting 3D coordinates to 2D space, thereby ascertaining whether the user is performing the exercise correctly, and sends feedback to the Display Module.
This application was developed in C# and WPF, using a third-party library called Vitruvius, which provides useful functionality for analysing body pose, joints, etc. Software was written to detect the position of the user: sitting, standing, raising hands, opening arms, etc. For example, to detect whether the user is sitting, the hips and knees should be at a similar height. To detect whether the user's arms are wide open, the shoulder and elbow joints should form an angle of approximately 180 degrees while the wrist, elbow and shoulder are at an equal height.
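A minimal sketch of these two detection rules is given below, reusing the illustrative Point3D type and JointAngle helper from Section 3.4. The height and angle tolerances are assumed values for illustration, not the exact thresholds used in the application.

```csharp
using System;

public static class PoseRules
{
    // Sitting: hips and knees are at a similar height (Y coordinate).
    public static bool IsSitting(Point3D hip, Point3D knee,
                                 double heightTolerance = 0.10) // metres, illustrative
    {
        return Math.Abs(hip.Y - knee.Y) <= heightTolerance;
    }

    // Arms wide open: shoulder, elbow and wrist roughly form a straight line
    // (about 180 degrees at the elbow) and are at approximately equal heights.
    public static bool ArmsWideOpen(Point3D shoulder, Point3D elbow, Point3D wrist,
                                    double angleTolerance = 15.0,
                                    double heightTolerance = 0.10)
    {
        double elbowAngle = PoseMatching.JointAngle(shoulder, elbow, wrist);
        bool straight = Math.Abs(elbowAngle - 180.0) <= angleTolerance;
        bool level = Math.Abs(shoulder.Y - elbow.Y) <= heightTolerance
                  && Math.Abs(elbow.Y - wrist.Y) <= heightTolerance;
        return straight && level;
    }
}
```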
Figure 8 shows the main content area of the Processing module interface. It presents a live feed from the Kinect Sensor showing the tracked skeleton and the various computed joint angles highlighted in different colours.

4.2. Display Module Application

Developed in Objective-C, this application runs on a tablet (iPad) mounted on the Double Robot. The exercise scenario is presented to the user as an MP4 video clip with an audio explanation. The user is given feedback in the form of voice notes and a facial expression, and the application receives voice commands from the participant and controls the Double Robot. The Nuance Dragon Speech Recognition SDK is used with a few defined keywords that navigate the robot and perform tasks such as starting or stopping an exercise. Three different facial expressions are used to provide visual feedback to the user: (a) idle; (b) happy/approving; and (c) angry/disapproving. A sample of the visual feedback is shown in Figure 9c.
The process is straightforward for the user. The user can use voice commands to move the robot to a correct position before the exercise commences. During the exercise, audio and visual feedback are given to help the user improve and complete the exercise correctly.
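For illustration only, the sketch below shows how recognised phrases could be mapped to actions through a simple dispatch table. The keyword strings and actions are assumptions; the actual application is written in Objective-C and obtains the recognised text through the Nuance Dragon Speech Recognition SDK, which is not shown here.

```csharp
using System;
using System.Collections.Generic;

public class VoiceCommandDispatcher
{
    // Illustrative keyword-to-action table; the real keyword list and the
    // speech recognition step (on the iPad) are outside this sketch.
    private readonly Dictionary<string, Action> _commands;

    public VoiceCommandDispatcher()
    {
        _commands = new Dictionary<string, Action>(StringComparer.OrdinalIgnoreCase)
        {
            { "start exercise", () => Console.WriteLine("Starting exercise video") },
            { "stop exercise",  () => Console.WriteLine("Stopping exercise") },
            { "move forward",   () => Console.WriteLine("Driving robot forward") },
            { "move back",      () => Console.WriteLine("Driving robot backward") }
        };
    }

    // Called with the text returned by the speech recogniser.
    public void Handle(string recognisedPhrase)
    {
        if (_commands.TryGetValue(recognisedPhrase.Trim(), out var action))
            action();
        else
            Console.WriteLine($"Unrecognised command: {recognisedPhrase}");
    }
}
```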
Figure 9 shows the Display Module, which can display the main window, the exercise information and feedback. From the main page, the user can navigate to the score page, which shows the user's performance chart, or to the settings page, which allows the user to input details such as name, age and height.

5. Evaluation

Figure 10a,b shows a subject conducting an exercise. The developed system allows the user to be instructed in exercises and then assessed as they perform them, with feedback given on their level of achievement. The assessment and feedback were done in real time, so that the user got an immediate indication of whether they were doing the exercise correctly. Some issues were found with the evaluation of exercises that involve the sides of the body. However, this could be addressed in a future version of the SAR equipped with an on-board depth sensor, which could autonomously move around the person doing the exercise in order to assess different aspects of the movement.
After an initial trial with a group of students and staff from the university campus, the performance of the SAR was evaluated by external participants to determine the effectiveness of the proposed system in terms of its administering, motivating and assessing capabilities. Seventeen participants in three age groups (50s, 60s and 70s) volunteered to interact with the SAR system and conduct the recommended exercises. The participants used the system for a ten-minute exercise session and then answered a questionnaire. They found that the system engaged them, that the feedback was appropriate and timely, and that they would recommend it to others. Female participants were generally more engaged with the exercise than male participants, but both groups found that the interactive process encouraged them to engage with the exercise. Compared with the instructions provided to them in a leaflet, all participants agreed that going through the exercise with the SAR system was more fun and made it more likely that they would take part in daily exercise.
The only criticism received from some participants concerned the configuration and set-up of the Double Robot platform and the tripod used to mount the depth sensor. Their suggestion was to integrate the robot and the depth sensor to make it less cumbersome to start the exercise.
A couple of participants who had experience of following exercise instructions on TV found the SAR system to be a better way of interacting with a digital screen.
Usefully, two participants from the earlier trial suggested having the robot record a video of the exercise sessions. They believed that this would be useful for therapy so that physical therapists could review where the participant might be having problems such as being unable to raise his/her legs, unable to perform standing exercises, etc. This would be a relatively simple enhancement that could also act as further motivation and feedback for the users.

Achievements

As highlighted in the introduction, the robot developed by Fasola and Mataric, the Singaporean Xuan and the Japanese Taizo robot have several shortcomings which this SAR was able to overcome:
  • This SAR is the first of its kind to administer solely the NHS-approved exercises for older adults.
  • The robots named above only administer seated exercises, whereas this SAR administers both seated and standing exercises.
  • The Taizo robot and that of Fasola and Mataric lack display screens to provide visual feedback. This SAR tackles this limitation by presenting facial expressions as feedback along with voice (audio) encouragements or suggestions.
  • Communication between the participant and the robot is limited to button presses in the case of Fasola and Mataric’s robot. However, this SAR communicates via speech and responds to voice commands. Additionally, this SAR is mobile and can be made to navigate with voice commands.

6. Conclusions

This research investigates effective and approved exercises for older adults and implements a Socially Assistive Robot (SAR) that coaches older adults in those exercises. The robot also motivates and assesses the participants’ performance while presenting feedback in the form of facial expressions and voice.
A user evaluation has been conducted to ascertain the effectiveness of the implemented SAR. The results of the evaluation show some success, with the participants indicating high satisfaction and willingness to recommend the SAR to others. However, for commercialisation of this system, further modification and trials are needed. A long-term evaluation of this SAR with an elderly focus group will lead to more findings regarding participants' perception of, and general preference towards, the SAR.
This SAR currently lacks the ability to administer exercises that use the sides of the body. This prevents participants from performing up to eight different recommended exercises that are beneficial to their well-being. A version with an on-board Kinect that autonomously moves to positions from which it can assess side body movements would be a potential solution.
Finally, the interaction between the participant and this SAR was limited to the context of the exercise and some defined keywords. Enlarging the scope of the interaction could be of great benefit to elderly users, providing them with a socially assistive robot that acted more like a mentor, coach and companion.

Author Contributions

This project was conducted by Salisu Wada Yahaya as part of his MSc project at NTU. The project was supervised by Ahmad Lotfi. The work reported here is part of ongoing research conducted by Ahmad Lotfi and Caroline Langensiepen. Salisu Wada Yahaya conducted the experiments and developed the software interface. All authors have contributed to the preparation of this paper.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
SAR   Socially Assistive Robot
WHO   World Health Organization
NHS   National Health Services

References

  1. Sollitto, M. Exercise for the Elderly. Available online: https://goo.gl/OijaNF (accessed on 5 March 2017).
  2. Taylor, D. Physical activity is medicine for older adults. Postgrad. Med. J. 2013, 90, 26–32. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  3. Fasola, J.; Mataric, M.J. Robot exercise instructor: A socially assistive robot system to monitor and encourage physical exercise for the elderly. In Proceedings of the IEEE 19th International Symposium in Robot and Human Interactive Communication, Viareggio, Italy, 13–15 September 2010; pp. 416–421. [Google Scholar]
  4. Feil-Seifer, D.; Mataric, M.J. Defining socially assistive robotics. In Proceedings of the 9th International Conference on Rehabilitation Robotics, Chicago, IL, USA, 28 June–1 July 2005; pp. 465–468. [Google Scholar]
  5. Bogue, R. Robots to aid the disabled and the elderly. Ind. Robot 2013, 40, 519–524. [Google Scholar] [CrossRef]
  6. Pieskä, S.; Luimula, M.; Jauhiainen, J.; Spiz, V. Social Service Robots in Public and Private Environments. In Proceedings of the 11th WSEAS International Conference on Instrumentation, Measurement, Circuits and Systems, Rovaniemi, Finland, 18–20 April 2012; World Scientific and Engineering Academy and Society (WSEAS): Stevens Point, WI, USA, 2012; pp. 190–195. [Google Scholar]
  7. Kantorovitch, J.; Väre, J.; Pehkonen, V.; Laikari, A.; Seppälä, H. An assistive household robot—Doing more than just cleaning. J. Assist. Technol. 2014, 8, 64–76. [Google Scholar] [CrossRef]
  8. Hagele, M. Robots Conquer the World [Turning Point]. IEEE Robot. Autom. Mag. 2016, 23. [Google Scholar] [CrossRef]
  9. Bainbridge, W.A.; Hart, J.W.; Kim, E.S.; Scassellati, B. The Benefits of Interactions with Physically Present Robots over Video-Displayed Agents. Int. J. Soc. Robot. 2011, 3, 41–52. [Google Scholar] [CrossRef]
  10. Pollack, M.E.; Brown, L.; Colbry, D.; Orosz, C.; Peintner, B.; Ramakrishnan, S.; Engberg, S.; Matthews, J.T.; Dunbar-Jacob, J.; McCarthy, C.E. Pearl: A Mobile Robotic Assistant for the Elderly. In Proceedings of the Workshop on Automation as Caregiver: The Role of Intelligent Technology in Elder Care (AAAI), Edmonton, AB, Canada, 28 July–1 August 2002. [Google Scholar]
  11. Robinson, H.; MacDonald, B.; Kerse, N.; Broadbent, E. The Psychosocial Effects of a Companion Robot: A Randomized Controlled Trial. J. Am. Med. Dir. Assoc. 2013, 14, 661–667. [Google Scholar] [CrossRef] [PubMed]
  12. Pearce, A.J.; Adair, B.; Miller, K.; Ozanne, E.; Said, C.; Santamaria, N.; Morris, M.E. Robotics to Enable Older Adults to Remain Living at Home. J. Aging Res. 2012, 2012, 538169. [Google Scholar] [CrossRef] [PubMed]
  13. Johnson, D.O.; Cuijpers, R.H.; Juola, J.F.; Torta, E.; Simonov, M.; Frisiello, A.; Bazzani, M.; Yan, W.; Weber, C.; Wermter, S.; et al. Socially Assistive Robots: A Comprehensive Approach to Extending Independent Living. Int. J. Soc. Robot. 2014, 6, 195–211. [Google Scholar] [CrossRef]
  14. Reppou, S.E.; Tsardoulias, E.G.; Kintsakis, A.M.; Symeonidis, A.L.; Mitkas, P.A.; Psomopoulos, F.E.; Karagiannis, G.T.; Zielinski, C.; Prunet, V.; Merlet, J.P.; et al. RAPP: A Robotic-Oriented Ecosystem for Delivering Smart User Empowering Applications for Older People. Int. J. Soc. Robot. 2016, 8, 539–552. [Google Scholar] [CrossRef]
  15. Bickmore, T.; Picard, R.W. Future of caring machines. Stud. Health Technol. Inform. 2005, 45, 118–132. [Google Scholar]
  16. Fasola, J.; Mataric, M.J. Using Socially Assistive Human—Robot Interaction to Motivate Physical Exercise for Older Adults. Proc. IEEE 2012, 100, 2512–2526. [Google Scholar] [CrossRef]
  17. Johnson, M.J.; Wisneski, K.J.; Anderson, J.; Nathan, D.; Smith, R.O. Development of ADLER: The Activities of Daily Living Exercise Robot. In Proceedings of the BioRob 2006, The First IEEE/RAS-EMBS International Conference on Biomedical Robotics and Biomechatronics, Pisa, Italy, 20–22 February 2006; pp. 881–886. [Google Scholar]
  18. Matsusaka, Y.; Fujii, H.; Okano, T.; Hara, I. Health exercise demonstration robot TAIZO and effects of using voice command in robot-human collaborative demonstration. In Proceedings of the RO-MAN 2009—The 18th IEEE International Symposium on Robot and Human Interactive Communication, Toyama, Japan, 27 September–2 October 2009; pp. 472–477. [Google Scholar]
  19. Wong, T. Singapore’s Robot Exercise Coach for the Elderly. Available online: https://goo.gl/rJgZ8V (accessed on 5 March 2017).
  20. The European Food Information Council (EUFIC). Types of Exercise. Available online: https://goo.gl/Xr19cj (accessed on 5 March 2017).
  21. GOV.UK. Physical Activity Benefits for Adults and Older Adults. Available online: https://goo.gl/Q1Hwy6 (accessed on 5 March 2017).
  22. Traywick, L.S. Exercise Recommendations for Older Adults. Available online: http://goo.gl/ZfIFJe (accessed on 5 March 2017).
  23. GOV.UK. Exercises for Older People. Available online: http://www.nhs.uk/Tools/Pages/Exercises-for-older-people.aspx (accessed on 15 August 2016).
  24. Shotton, J.; Sharp, T.; Kipman, A.; Fitzgibbon, A.; Finocchio, M.; Blake, A.; Cook, M.; Moore, R. Real-time Human Pose Recognition in Parts from Single Depth Images. Commun. ACM 2013, 56, 116–124. [Google Scholar] [CrossRef]
  25. Han, F.; Reily, B.; Hoff, W.; Zhang, H. Space-time representation of people based on 3D skeletal data: A review. Comput. Vis. Image Underst. 2017, 158, 85–105. [Google Scholar] [CrossRef]
Figure 1. Examples of socially assistive robots: (a) Care-O-Bot robot; and (b) Paro robot.
Figure 2. Social robots in action: (a) Robo Coach Xuan; and (b) Taizo robot.
Figure 3. A sample of exercises recommended by National Health Services in the UK for older adults: (a) sitting exercises; (b) flexibility exercises; (c) strength exercises; and (d) balance exercises.
Figure 4. Illustration of depth image transformation: (a) colour stream; (b) depth stream; (c) skeleton (joints are shown as green dots); and (d) tracked skeleton and joints (similar joints are presented with the same colour).
Figure 5. The system diagram of the proposed system.
Figure 6. Exercise pose matching: (a) tracked skeleton; (b) comparing joint positions; and (c) comparing joint angles.
Figure 7. Distribution of participants based on their age, gender and daily exercise.
Figure 8. Skeleton and joint assessment in SAR. Similar joints are presented with the same colour.
Figure 9. Display module application: (a) main menu; (b) exercise page; (c) feedback page; and (d) settings.
Figure 10. Participant evaluation: (a) Participant 1; and (b) Participant 2.
