A Low-Cost Assistive Robot for Children with Neurodevelopmental Disorders to Aid in Daily Living Activities
Abstract
1. Introduction
- We introduce a novel, low-cost assistive robot to attend to children and youth with NDDs. The mechanical design of the platform is entirely new, so we do not depend on any expensive commercial solution. This also allows us to perform any (mechanical) adaptation that the final user might need, a fundamental feature for an ARD, because the final design must ensure the user's satisfaction and the device's usefulness, avoiding difficulties and disappointments during use [9].
- Our robotic platform integrates an online action detection approach for monitoring how users perform their ADLs.
- We propose two applications specially designed to assist children with NDDs. The first helps children develop a correct sequencing of ADLs. The second focuses on helping users practice and learn a specific action. Both solutions help users improve their independence in ADLs.
- We offer a detailed experimental validation of the online action detection module on a well-known action recognition dataset, reporting an accuracy of 72.4% over 101 different action categories.
- We consider this work a proof of concept that enables a future evaluation of the impact the developed technology will have on a group of children and youth with NDDs.
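As a minimal illustration of the online action detection idea behind these contributions, the loop below classifies each incoming frame and smooths the per-frame predictions over a short temporal window. This is only a sketch under assumptions: the paper's actual module is a deep network trained on an action recognition dataset, whereas here `classify_frame` is a hypothetical stand-in callable, and the majority-vote smoother is one simple choice among many.

```python
from collections import Counter, deque

class OnlineActionDetector:
    """Per-frame action classification with temporal smoothing.

    `classify_frame` is a hypothetical placeholder for the real model:
    any callable mapping a frame to an action label.
    """

    def __init__(self, classify_frame, window=16):
        self.classify_frame = classify_frame
        self.history = deque(maxlen=window)  # recent per-frame predictions

    def update(self, frame):
        self.history.append(self.classify_frame(frame))
        # Majority vote over the window suppresses single-frame flicker.
        label, _ = Counter(self.history).most_common(1)[0]
        return label

# Usage with a toy classifier that "detects" tooth brushing after frame 5.
detector = OnlineActionDetector(lambda f: "brushing_teeth" if f > 5 else "idle",
                                window=4)
labels = [detector.update(f) for f in range(10)]
print(labels[-1])  # stable label at the end of the stream
```

In a deployed system the smoothing window trades responsiveness for stability: a longer window reports actions later but flickers less, which matters when the robot's prompts depend on the detected action.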
2. Related Work
- Fixed home adaptations.
- Wheelchair solutions.
- Mobile platforms.
3. The Low-Cost, Assistive AI Robotic Platform
3.1. Description of the Hardware
3.2. Software Architecture
3.3. Online Action Detection
4. Monitoring ADLs
4.1. Teaching an OAD Module with Daily Living Activities: Experimental Validation
4.2. Assistance in ADLs with the Platform: Applications, Qualitative Results and Discussion
- How much time has the user spent on certain tasks or actions?
- What actions are carried out in certain time slots?
- What actions are the most frequent?
- How does the user sequence the different tasks or actions?
- Application 1—Help for correct sequencing. We implemented an application that monitors the sequencing of the activities. In other words, our platform recognizes in real time the actions being carried out and informs the user of what task should come next. For example, if the user has just finished shaving, the robot informs them that the next step is to brush their teeth. Further on, this help in sequencing actions could address more complex activities, such as "getting ready for school" or "going to the park with friends". In this context, our robotic platform could indicate what to do first and how to do it, and then accompany the child in the transfers from one task to another, working as a robot companion that facilitates independent mobility.
- Application 2—Aid in action learning. This application reinforces the learning of the correct performance of daily living activities. The robot monitors and recognizes the action being performed. Once the action has been identified, the robot can show the user videos of people performing the same type of action, with the intention of reinforcing how to carry out the action adequately.
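The sequencing logic of Application 1 can be sketched as a lookup from the detected action to the next prompt in a predefined routine. The routine and the prompt wording below are hypothetical examples (the action names merely echo UCF-101 classes), not the platform's actual implementation:

```python
# Hypothetical morning routine; action names echo UCF-101 class names.
ROUTINE = ["shaving_beard", "brushing_teeth", "blow_dry_hair"]

def next_step_prompt(detected_action):
    """Return the prompt the robot should give after `detected_action`."""
    if detected_action not in ROUTINE:
        return None  # action outside the routine: nothing to prompt
    idx = ROUTINE.index(detected_action)
    if idx + 1 < len(ROUTINE):
        return f"Next: please start {ROUTINE[idx + 1].replace('_', ' ')}."
    return "Routine complete, well done!"

print(next_step_prompt("shaving_beard"))  # → Next: please start brushing teeth.
```

A real deployment would also need to debounce the detector's output (only prompting once an action has been stable for some seconds) and to handle the user skipping or reordering steps.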
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
ADLs | Activities of Daily Living |
ICF | International Classification of Functioning, Disability and Health |
AT | Assistive Technology |
NDD | Neurodevelopmental Disorder |
ARD | Assistive Robotic Device |
CNN | Convolutional Neural Network |
OAD | Online Action Detection |
ROS | Robot Operating System |
References
- Ministerio de Sanidad, Política Social e Igualdad. Estrategia Española Sobre Discapacidad. 2012–2020; Real Patronato Sobre Discapacidad: Madrid, Spain, 2011.
- Ministerio de Sanidad, Política Social e Igualdad. Plan de Acción de la Estrategia Española sobre Discapacidad 2014–2020; Ministerio de Sanidad: Madrid, Spain, 2014.
- World Health Organization. International Classification of Functioning, Disability and Health (ICF); World Health Organization: Geneva, Switzerland, 2001.
- Groba, B.; Pousada, T.; Nieto, L. Assistive Technologies, Tools and Resources for the Access and Use of Information and Communication Technologies by People with Disabilities. In Handbook of Research on Personal Autonomy Technologies and Disability Informatics; IGI Global: Hershey, PA, USA, 2010; Volume 1, pp. 1–15.
- World Health Organization. International Classification of Functioning, Disability and Health-Child and Youth Version (ICF-CY); World Health Organization: Geneva, Switzerland, 2007.
- Lersilp, S.; Putthinoi, S.; Lersilp, T. Facilitators and Barriers of Assistive Technology and Learning Environment for Children with Special Needs. Occup. Ther. Int. 2018.
- Lin, S.C.; Gold, R.S. Assistive technology needs, functional difficulties, and services utilization and coordination of children with developmental disabilities in the United States. Assist. Technol. 2018, 30, 100–106.
- Zhang, H.B.; Zhang, Y.X.; Zhong, B.; Lei, Q.; Yang, L.; Du, J.X.; Chen, D.S. A Comprehensive Survey of Vision-Based Human Action Recognition Methods. Sensors 2019, 19, 1005.
- Van Niekerk, K.; Dada, S.; Tönsing, K.; Boshoff, K. Factors perceived by rehabilitation professionals to influence the provision of assistive technology to children: A systematic review. Phys. Occup. Ther. Pediatr. 2018, 38, 168–189.
- Pivetti, M.; Di Battista, S.; Agatolio, F.; Simaku, B.; Moro, M.; Menegatti, E. Educational Robotics for children with neurodevelopmental disorders: A systematic review. Heliyon 2020, 6, e05160.
- Hersh, M. Overcoming barriers and increasing independence–service robots for elderly and disabled people. Int. J. Adv. Robot. Syst. 2015, 12, 114.
- Dawe, J.; Sutherland, C.; Barco, A.; Broadbent, E. Can social robots help children in healthcare contexts? A scoping review. BMJ Paediatr. Open 2019, 3, e000371.
- Gelsomini, M.; Degiorgi, M.; Garzotto, F.; Leonardi, G.; Penati, S.; Ramuzat, N.; Silvestri, J.; Clasadonte, F. Designing a robot companion for children with neuro-developmental disorders. In Proceedings of the 2017 Conference on Interaction Design and Children, Stanford, CA, USA, 27–30 June 2017; pp. 451–457.
- Linner, T.; Guttler, J.; Bock, T.; Georgoulas, C. Assistive robotic micro-rooms for independent living. Autom. Constr. 2015, 51, 8–22.
- Hu, R.; Linner, T.; Guttler, J.; Kabouteh, A.; Langosch, K.; Bock, T. Developing a Smart Home Solution Based on Personalized Intelligent Interior Units to Promote Activity and Customized Healthcare for Aging Society. J. Popul. Ageing 2020, 13, 257–280.
- Manoel, F.; Nunes, P.; de Jesus, V.S.; Pantoja, C.; Viterbo, J. Managing natural resources in a smart bathroom using a ubiquitous multi-agent system. In Proceedings of the 11th Workshop-School on Agents, Environments and Applications, Sao Paulo, Brazil, 4 May 2017; pp. 101–112.
- Blasco, R.; Marco, A.; Casas, R.; Cirujano, D.; Picking, R. A Smart Kitchen for Ambient Assisted Living. Sensors 2014, 14, 1629–1653.
- Shishehgar, M.; Kerr, D.; Blake, J. A systematic review of research into how robotic technology can help older people. Smart Health 2018, 7–8, 1–18.
- Bien, Z.; Chung, M.J.; Chang, P.H.; Kwon, D.S.; Kim, D.J.; Han, J.S.; Kim, J.H.; Kim, D.H.; Park, H.S.; Kang, S.H.; et al. Integration of a Rehabilitation Robotic System (KARES II) with Human-Friendly Man-Machine Interaction Units. Auton. Robot. 2004, 16, 165–191.
- Bilyea, A.; Seth, N.; Nesathurai, S.; Abdullah, H. Robotic assistants in personal care: A scoping review. Med. Eng. Phys. 2017, 49, 1–6.
- Hu, B.; Chen, H.; Yu, H. Design and Simulation of a Wheelchair Mounted Lightweight Compliant Manipulator. In Proceedings of the i-CREATe 2017: 11th International Convention on Rehabilitation Engineering and Assistive Technology, Kobe, Japan, 22–24 August 2017.
- Huete, A.J.; Victores, J.G.; Martinez, S.; Gimenez, A.; Balaguer, C. Personal Autonomy Rehabilitation in Home Environments by a Portable Assistive Robot. IEEE Trans. Syst. Man Cybern. Part C Appl. Rev. 2012, 42, 561–570.
- Chi, M.; Yao, Y.; Liu, Y.; Zhong, M. Recent Advances on Human-Robot Interface of Wheelchair-Mounted Robotic Arm. Recent Patents Mech. Eng. 2019, 12, 45–54.
- Campeau-Lecours, A.; Lamontagne, H.; Latour, S.; Fauteux, P.; Maheu, V.; Boucher, F.; Deguire, C.; Lecuyer, L.J.C. Kinova Modular Robot Arms for Service Robotics Applications. In Rapid Automation: Concepts, Methodologies, Tools, and Applications; IGI Global: Hershey, PA, USA, 2019.
- Koceski, S.; Koceska, N. Evaluation of an Assistive Telepresence Robot for Elderly Healthcare. J. Med. Syst. 2016, 40, 121–128.
- Koceska, N.; Koceski, S.; Beomonte Zobel, P.; Trajkovik, V.; Garcia, N. A Telemedicine Robot System for Assisted and Independent Living. Sensors 2019, 19, 834.
- Cosar, S.; Fernandez-Carmona, M.; Agrigoroaie, R.; Pages, J.; Ferland, F.; Zhao, F.; Yue, S.; Bellotto, N.; Tapus, A. ENRICHME: Perception and Interaction of an Assistive Robot for the Elderly at Home. Int. J. Soc. Robot. 2020, 12, 779–805.
- Hossain, M.Y.; Zarif, S.; Rahman, M.M.; Ahmed, A.; Zishan, M.S.R. Design and Implementation of Assistive Robot for The Elderly and Impaired Person. In Proceedings of the 2021 2nd International Conference on Robotics, Electrical and Signal Processing Techniques (ICREST), American International University-Bangladesh, Dhaka, Bangladesh, 5–7 January 2021; pp. 535–538.
- Lamas, C.M.; Bellas, F.; Guijarro-Berdiñas, B. SARDAM: Service Assistant Robot for Daily Activity Monitoring. Proceedings 2020, 54, 3.
- Gambi, E.; Temperini, G.; Galassi, R.; Senigagliesi, L.; De Santis, A. ADL Recognition Through Machine Learning Algorithms on IoT Air Quality Sensor Dataset. IEEE Sens. J. 2020, 20, 13562–13570.
- Ferrari, A.; Micucci, D.; Mobilio, M.; Napoletano, P. On the Personalization of Classification Models for Human Activity Recognition. IEEE Access 2020, 8, 32066–32079.
- Nakagawa, E.; Moriya, K.; Suwa, H.; Fujimoto, M.; Arakawa, Y.; Yasumoto, K. Toward real-time in-home activity recognition using indoor positioning sensor and power meters. In Proceedings of the 2017 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops), Kona, HI, USA, 13–17 March 2017; pp. 539–544.
- Chernbumroong, S.; Cang, S.; Yu, H. A practical multi-sensor activity recognition system for home-based care. Decis. Support Syst. 2014, 66, 61–70.
- Wang, Y.; Cang, S.; Yu, H. A survey on wearable sensor modality centred human activity recognition in health care. Expert Syst. Appl. 2019, 137, 167–190.
- Martinez-Martin, E.; Costa, A.; Cazorla, M. PHAROS 2.0—A PHysical Assistant RObot System Improved. Sensors 2019, 19, 4531.
- Zlatintsi, A.; Dometios, A.; Kardaris, N.; Rodomagoulakis, I.; Koutras, P.; Papageorgiou, X.; Maragos, P.; Tzafestas, C.; Vartholomeos, P.; Hauer, K.; et al. I-Support: A robotic platform of an assistive bathing robot for the elderly population. Robot. Auton. Syst. 2020, 126, 103451.
- Kumar, T.; Kyrarini, M.; Gräser, A. Application of Reinforcement Learning to a Robotic Drinking Assistant. Robotics 2020, 9, 1.
- Rudigkeit, N.; Gebhard, M. AMiCUS 2.0—System Presentation and Demonstration of Adaptability to Personal Needs by the Example of an Individual with Progressed Multiple Sclerosis. Sensors 2020, 20, 1194.
- Lee, J.; Ahn, B. Real-Time Human Action Recognition with a Low-Cost RGB Camera and Mobile Robot Platform. Sensors 2020, 20, 2886.
- Kyrarini, M.; Lygerakis, F.; Rajavenkatanarayanan, A.; Sevastopoulos, C.; Nambiappan, H.R.; Chaitanya, K.K.; Babu, A.R.; Mathew, J.; Makedon, F. A Survey of Robots in Healthcare. Technologies 2021, 9, 8.
- Quigley, M.; Gerkey, B.; Conley, K.; Faust, J.; Foote, T.; Leibs, J.; Berger, E.; Wheeler, R.; Ng, A. ROS: An open-source Robot Operating System. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA) Workshop on Open Source Robotics, Kobe, Japan, 12–17 May 2009.
- Shou, Z.; Chan, J.; Zareian, A.; Miyazawa, K.; Chang, S.F. CDC: Convolutional-De-Convolutional Networks for Precise Temporal Action Localization in Untrimmed Videos. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017.
- Gao, J.; Yang, Z.; Nevatia, R. Cascaded Boundary Regression for Temporal Action Detection. In Proceedings of the British Machine Vision Conference 2017, London, UK, 4–7 September 2017.
- Yeung, S.; Russakovsky, O.; Mori, G.; Fei-Fei, L. End-to-end Learning of Action Detection from Frame Glimpses in Videos. arXiv 2015, arXiv:1511.06984.
- Buch, S.; Escorcia, V.; Ghanem, B.; Fei-Fei, L.; Niebles, J.C. End-to-End, Single-Stream Temporal Action Detection in Untrimmed Videos. In Proceedings of the BMVC 2017, London, UK, 4–7 September 2017.
- Dai, X.; Singh, B.; Zhang, G.; Davis, L.S.; Chen, Y.Q. Temporal Context Network for Activity Localization in Videos. In Proceedings of the ICCV 2017, Venice, Italy, 22–29 October 2017.
- Chao, Y.W.; Vijayanarasimhan, S.; Seybold, B.; Ross, D.A.; Deng, J.; Sukthankar, R. Rethinking the Faster R-CNN Architecture for Temporal Action Localization. In Proceedings of the CVPR 2018, Salt Lake City, UT, USA, 18–22 June 2018.
- Wu, Y.; Yin, J.; Wang, L.; Liu, H.; Dang, Q.; Li, Z.; Yin, Y. Temporal Action Detection Based on Action Temporal Semantic Continuity. IEEE Access 2018, 6, 31677–31684.
- Yang, X.; Liu, D.; Liu, J.; Yan, F.; Chen, P.; Niu, Q. Follower: A Novel Self-Deployable Action Recognition Framework. Sensors 2021, 21, 950.
- Patel, C.I.; Labana, D.; Pandya, S.; Modi, K.; Ghayvat, H.; Awais, M. Histogram of Oriented Gradient-Based Fusion of Features for Human Action Recognition in Action Video Sequences. Sensors 2020, 20, 7299.
- Zheng, Y.; Liu, Z.; Lu, T.; Wang, L. Dynamic Sampling Networks for Efficient Action Recognition in Videos. IEEE Trans. Image Process. 2020, 29, 7970–7983.
- De Geest, R.; Gavves, E.; Ghodrati, A.; Li, Z.; Snoek, C.; Tuytelaars, T. Online Action Detection. In Proceedings of the ECCV 2016, Amsterdam, The Netherlands, 11–14 October 2016.
- Gao, J.; Yang, Z.; Nevatia, R. RED: Reinforced Encoder-Decoder Networks for Action Anticipation. In Proceedings of the BMVC 2017, London, UK, 4–7 September 2017.
- De Geest, R.; Tuytelaars, T. Modeling Temporal Structure with LSTM for Online Action Detection. In Proceedings of the 2018 IEEE Winter Conference on Applications of Computer Vision, WACV 2018, Lake Tahoe, NV, USA, 12–15 March 2018; pp. 1549–1557.
- Baptista-Ríos, M.; López-Sastre, R.J.; Caba Heilbron, F.; Van Gemert, J.C.; Acevedo-Rodríguez, F.J.; Maldonado-Bascón, S. Rethinking Online Action Detection in Untrimmed Videos: A Novel Online Evaluation Protocol. IEEE Access 2019, 8, 5139–5146.
- Tran, D.; Bourdev, L.; Fergus, R.; Torresani, L.; Paluri, M. Learning Spatiotemporal Features with 3D Convolutional Networks. In Proceedings of the ICCV 2015, Santiago, Chile, 7–13 December 2015; pp. 4489–4497.
- Soomro, K.; Zamir, A.R.; Shah, M. UCF101: A Dataset of 101 Human Actions Classes From Videos in The Wild. arXiv 2012, arXiv:1212.0402.
- Paszke, A.; Gross, S.; Massa, F.; Lerer, A.; Bradbury, J.; Chanan, G.; Killeen, T.; Lin, Z.; Gimelshein, N.; Antiga, L.; et al. PyTorch: An Imperative Style, High-Performance Deep Learning Library. In Advances in Neural Information Processing Systems 32; Curran Associates, Inc.: Vancouver, BC, Canada, 2019; pp. 8024–8035.
- Caba-Heilbron, F.; Escorcia, V.; Ghanem, B.; Niebles, J.C. ActivityNet: A Large-Scale Video Benchmark for Human Activity Understanding. In Proceedings of the CVPR 2015, Boston, MA, USA, 7–12 June 2015; pp. 961–970.
- Jiang, Y.G.; Liu, J.; Roshan Zamir, A.; Toderici, G.; Laptev, I.; Shah, M.; Sukthankar, R. THUMOS Challenge: Action Recognition with a Large Number of Classes. 2014. Available online: http://crcv.ucf.edu/THUMOS14/ (accessed on 9 April 2021).
- Smaira, L.; Carreira, J.; Noland, E.; Clancy, E.; Wu, A.; Zisserman, A. A Short Note on the Kinetics-700-2020 Human Action Dataset. arXiv 2020, arXiv:2010.10864.
- Karpathy, A.; Toderici, G.; Shetty, S.; Leung, T.; Sukthankar, R.; Fei-Fei, L. Large-Scale Video Classification with Convolutional Neural Networks. In Proceedings of the CVPR 2014, Columbus, OH, USA, 24–27 June 2014.
- Hadidi, R.; Cao, J.; Xie, Y.; Asgari, B.; Krishna, T.; Kim, H. Characterizing the Deployment of Deep Neural Networks on Commercial Edge Devices. In Proceedings of the IEEE International Symposium on Workload Characterization, Orlando, FL, USA, 3–5 November 2019.
| Item | Estimated Price |
|---|---|
| Motor and encoders | 132 € |
| Arduino MEGA | 12 € |
| Battery | 24 € |
| Wheels | 66 € |
| Structure & Components | 100 € |
| Screen | 40 € |
| Jetson TX2 | 320 € |
| Camera | 20 € |
| LIDAR | 90 € |
| Total | 804 € |
| Category | UCF-101 Class Identifier | Number of Videos |
|---|---|---|
| Apply Eye Makeup | 1 | >130 |
| Apply Lipstick | 2 | >100 |
| Blow Dry Hair | 13 | >120 |
| Brushing Teeth | 20 | >120 |
| Cutting In Kitchen | 25 | >100 |
| Haircut | 28 | >120 |
| Mixing Batter | 35 | >130 |
| Mopping Floor | 55 | >100 |
| Shaving Beard | 56 | >150 |
| Typing | 95 | >120 |
| Walking With Dog | 96 | >120 |
| Writing On Board | 100 | >150 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
López-Sastre, R.J.; Baptista-Ríos, M.; Acevedo-Rodríguez, F.J.; Pacheco-da-Costa, S.; Maldonado-Bascón, S.; Lafuente-Arroyo, S. A Low-Cost Assistive Robot for Children with Neurodevelopmental Disorders to Aid in Daily Living Activities. Int. J. Environ. Res. Public Health 2021, 18, 3974. https://doi.org/10.3390/ijerph18083974