Robotics in Healthcare: Automation, Sensing and Application

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Sensors and Robotics".

Deadline for manuscript submissions: closed (30 June 2022) | Viewed by 72798
Please contact the Guest Editor or the Section Managing Editor at ([email protected]) for any queries.

Special Issue Editors


Prof. Dr. Cecilia Garcia
Guest Editor
Department of Electronics, Automation and Computer Science, Universidad Politécnica de Madrid, Madrid, Spain
Interests: bioinspired robotics; rehabilitation robots; dynamical control of robots; underwater robots

Prof. Dr. Changsheng Li
Guest Editor
School of Mechatronical Engineering, Beijing Institute of Technology, Beijing 100081, China
Interests: surgical robotics and navigation; human-robot interaction and intelligent control; mechanical design and system integration

Special Issue Information

Dear colleagues,

Global healthcare systems face challenges related to population aging, the rising prevalence of chronic diseases, and patients' understandable desire for more personalized medicine and closer care. This situation puts pressure on the system and, given budget constraints, compromises its sustainability: available resources are not enough to cover the increasing demand.

The introduction of innovative technology, new service models, and digitalization is needed to adapt our healthcare systems adequately, sustainably, and efficiently, so that they can respond to the current and future health needs of a more demanding population.

Robotics combined with sensors, smart communication, artificial intelligence, and easy-to-use medical interfaces is a promising approach to overcoming problems in many healthcare areas, such as care planning, procedure performance, diagnostics, infection control, medication management, and the improvement of patient experience and monitoring.

These technologies have the potential to dramatically transform the current concept of a healthcare system. Homes, schools, workplaces, gyms, and other environments can become part of the health system for patients who can be followed remotely, outside hospitals and clinics.

The deployment and integration of these technologies into digital healthcare systems, their logistics, and their management procedures address a wide range of healthcare applications, for example, in patient care, prevention, diagnosis, and treatment surveillance. Regulations covering medical devices, ethics, data protection, and cybersecurity are also key to the success of this novel concept.

In this Special Issue, contributions may include realistic clinical/medical applications of robots combined with sensors and other technologies (hardware and software) to transform current healthcare models and enhance personalized medicine.

Furthermore, contributions should demonstrate how the combination of digital and physical services or systems arises as a care solution for hospitals, clinics, primary care centers, rehabilitation centers, care homes, etc.

Novel contributions around the automation of any medical procedure including robots or sensor networks, data processing and analysis, and staff/patient–machine interfaces are welcome.

Prof. Dr. Cecilia Garcia
Prof. Dr. Changsheng Li
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • medical robots
  • autonomous medical sensors
  • intelligent systems
  • aging and chronic diseases

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (13 papers)


Research


19 pages, 3158 KiB  
Article
InjectMeAI—Software Module of an Autonomous Injection Humanoid
by Kwame Owusu Ampadu, Florian Rokohl, Safdar Mahmood, Marc Reichenbach and Michael Huebner
Sensors 2022, 22(14), 5315; https://doi.org/10.3390/s22145315 - 15 Jul 2022
Viewed by 2362
Abstract
The recent pandemic proved social distancing, along with the wearing of masks and gloves in hospitals and assisted living environments, effective in helping curb the spread of SARS-CoV-2 variants. Health delivery personnel trained in the handling of patients with coronavirus infection have been stretched thin. Administering injections involves unavoidable person-to-person contact; in these circumstances, the spread of bodily fluids, and consequently of the coronavirus, becomes imminent, leading to an upsurge of infection rates among nurses and doctors. This makes enforced home office practices and telepresence through humanoid robots a viable alternative. To help further reduce contact with patients during vaccinations, a software module has been designed, developed, and implemented on a Pepper robot that estimates the pose of a patient, identifies an injection spot, and raises an arm to deliver the vaccine dose on a bare shoulder. Implementation was done using the QiSDK in an Android integrated development environment with a custom Python wrapper. Tests yielded positive results in under 60 s with an 80% success rate and exposed some ambient lighting discrepancies. These discrepancies can be solved in the near future, paving a new way for humans to get vaccinated. Full article
(This article belongs to the Special Issue Robotics in Healthcare: Automation, Sensing and Application)

13 pages, 1140 KiB  
Article
Using Robot-Based Variables during Upper Limb Robot-Assisted Training in Subacute Stroke Patients to Quantify Treatment Dose
by Pascal Jamin, Christophe Duret, Emilie Hutin, Nicolas Bayle, Typhaine Koeppel, Jean-Michel Gracies and Ophélie Pila
Sensors 2022, 22(8), 2989; https://doi.org/10.3390/s22082989 - 13 Apr 2022
Cited by 9 | Viewed by 2168
Abstract
In post-stroke motor rehabilitation, the treatment dose is typically described only approximately. The aim of this retrospective study was to quantify the treatment dose using robot-measured variables during robot-assisted training in patients with subacute stroke. Thirty-six patients performed fifteen 60 min sessions (Session 1–Session 15) of planar, target-directed movements in addition to occupational therapy over 4 (SD 2) weeks. The Fugl–Meyer Assessment (FMA) was carried out pre- and post-treatment. The actual time practiced (percentage of a 60 min session), the number of repeated movements, and the total distance traveled were analyzed across sessions for each training modality: assist-as-needed, unassisted, and against resistance. The FMA score improved post-treatment by 11 (10) points (Session 1 vs. Session 15, p < 0.001). In Session 6, all modalities pooled, the number of repeated movements increased by 129 (252) (vs. Session 1, p = 0.043), the total distance traveled increased by 1743 (3345) cm (vs. Session 1, p = 0.045), and the actual time practiced remained unchanged. In Session 15, the actual time practiced showed changes only in the assist-as-needed modality: −13 (23) % (vs. Session 1, p = 0.013). This description of changes in quantitative practice-related variables under different robotic training modalities provides comprehensive information on the treatment dose in rehabilitation. The treatment dose intensity may be enhanced by increasing both the number of movements and the motor difficulty of performing each movement. Full article
(This article belongs to the Special Issue Robotics in Healthcare: Automation, Sensing and Application)

18 pages, 10934 KiB  
Article
Technology-Based Social Innovation: Smart City Inclusive System for Hearing Impairment and Visual Disability Citizens
by Ignacio Chang, Juan Castillo and Hector Montes
Sensors 2022, 22(3), 848; https://doi.org/10.3390/s22030848 - 22 Jan 2022
Cited by 7 | Viewed by 4979
Abstract
The multilayer integration of hardware and software technologies can reduce the social inclusion gap and improve emergency support for people with hearing and visual special needs. This research presents an Internet of Things (IoT)-based development to support people with visual disabilities (PwVD) in indoor and outdoor activities. Decision making takes place at the operational, tactical, and strategic levels, providing a safe framework in which people with visual and hearing special needs, their families, and government authorities can all make decisions in case of an emergency or even on a day-by-day basis. In the case of the authorities, smart visualization of the collected data facilitates Comprehensive Disaster Risk Management (CDRM) and Disaster Risk Reduction (DRR). The main findings point to the need to develop mobile, dashboard, and web applications that are responsive to people with visual or hearing disabilities, and to the need for a communication infrastructure backed by batteries and clean energy, independent of the current telecommunications system, to allow greater reliability. Full article
(This article belongs to the Special Issue Robotics in Healthcare: Automation, Sensing and Application)

12 pages, 2205 KiB  
Article
Estimation of 1-Repetition Maximum Using a Hydraulic Bench Press Machine Based on User’s Lifting Speed and Load Weight
by Jinyeol Yoo, Jihun Kim, Byunggon Hwang, Gyuseok Shim and Jaehyo Kim
Sensors 2022, 22(2), 698; https://doi.org/10.3390/s22020698 - 17 Jan 2022
Cited by 2 | Viewed by 3410
Abstract
The 1-repetition maximum (1RM), a representative index of an individual's weightlifting capacity, provides the basis for an organized workout guide, but measuring it requires several repetitions up to one's limit and carries a risk of injury, making it unsuitable for beginners, elders, or disabled people. This study suggests a simpler and safer 1RM measurement method using a hydraulic fitness machine. We asked twenty-five female subjects with less than a month of experience in weight training to repeat chest exercises using a conventional plate-loaded bench press machine and a hydraulic bench press machine, and measured their 1RMs. Repeated-measures ANOVA and a paired t-test showed the difference between the plate and hydraulic 1RMs to be insignificant (p = 0.082), confirming the generality of 1RM across the different types of fitness machines. We then derived several 1RM equations in terms of load weight W and lifting speed v during non-1RM exercise and reduced them to the first-order polynomial 1RM = 0.3908 + 0.8251W + 0.1054v, with an adjusted R² of 0.8849. A goodness-of-fit test and a comparison with 1RM equations from reference studies (v = −1.46 × W/1RM + 1.7035; W/1RM × 100 = 7.5786v² − 75.865v + 113.02) verified our formula. We finally simplified the 1RM measurement process to a maximum of three repetitions. Full article
(This article belongs to the Special Issue Robotics in Healthcare: Automation, Sensing and Application)
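The reduced first-order model quoted in the abstract above can be sketched as a small helper. The coefficients are taken verbatim from the abstract; the function name and the units of W (load weight) and v (lifting speed) are assumptions for illustration, since the abstract does not state them:

```python
def estimate_1rm(load_weight: float, lifting_speed: float) -> float:
    """Estimate the 1-repetition maximum from a single non-1RM set.

    Coefficients come from the abstract's reduced first-order model,
    1RM = 0.3908 + 0.8251*W + 0.1054*v (adjusted R^2 = 0.8849).
    Units are not stated in the abstract, so treat this as an
    illustrative sketch rather than a calibrated implementation.
    """
    return 0.3908 + 0.8251 * load_weight + 0.1054 * lifting_speed
```

For example, a set lifted at load 40 and speed 1.0 yields 0.3908 + 33.004 + 0.1054 ≈ 33.5 in the same weight units, illustrating how the load term dominates the estimate.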

28 pages, 10869 KiB  
Article
Automatic Robot-Driven 3D Reconstruction System for Chronic Wounds
by Damir Filko, Domagoj Marijanović and Emmanuel Karlo Nyarko
Sensors 2021, 21(24), 8308; https://doi.org/10.3390/s21248308 - 12 Dec 2021
Cited by 7 | Viewed by 4127
Abstract
Chronic wounds, or wounds that are not healing properly, are a worldwide health problem affecting the global economy and population. With an aging population and increasing numbers of patients with obesity and diabetes, the costs of chronic wound healing can be expected to rise further. Wound assessment should be fast and accurate in order to reduce possible complications and thereby shorten the wound healing process. Contact methods often used by medical experts have drawbacks that are easily overcome by non-contact methods such as image analysis, where wound analysis is fully or partially automated. This paper describes an automatic wound recording system built upon a 7-DoF robot arm with an attached RGB-D camera and a high-precision 3D scanner. The developed system presents a novel next best view (NBV) algorithm that utilizes a surface-based approach relying on surface point density and discontinuity detection. The system was evaluated on multiple wounds located on medical models as well as on real patients recorded in a clinical medical center. Full article
(This article belongs to the Special Issue Robotics in Healthcare: Automation, Sensing and Application)

15 pages, 1495 KiB  
Article
Robot-Assisted Gait Self-Training: Assessing the Level Achieved
by Andrea Scheidig, Benjamin Schütz, Thanh Quang Trinh, Alexander Vorndran, Anke Mayfarth, Christian Sternitzke, Eric Röhner and Horst-Michael Gross
Sensors 2021, 21(18), 6213; https://doi.org/10.3390/s21186213 - 16 Sep 2021
Cited by 5 | Viewed by 2911
Abstract
This paper presents the technological status of robot-assisted gait self-training under real clinical environment conditions. Successful rehabilitation after surgery in hip endoprosthetics includes self-training of the lessons taught by physiotherapists, during which immediate feedback to the patient about deviations from the expected physiological gait pattern is important. Hence, the Socially Assistive Robot (SAR) developed for this type of training employs task-specific, user-centered navigation and autonomous, real-time gait feature classification to enrich the self-training through companionship and timely corrective feedback. The system was evaluated during user tests in a hospital in terms of technical benchmarking, the therapists' and patients' views on training motivation, and initial findings on medical efficacy as a prerequisite for economic viability. The following research questions were primarily considered: Does the level of technology achieved enable autonomous use in everyday clinical practice? Has the gait pattern of patients who used additional robot-assisted gait self-training for several days changed or improved compared to patients without this training? How does the use of a SAR-based self-training robot affect the motivation of the patients? Full article
(This article belongs to the Special Issue Robotics in Healthcare: Automation, Sensing and Application)

16 pages, 8638 KiB  
Article
Assist-As-Needed Exoskeleton for Hand Joint Rehabilitation Based on Muscle Effort Detection
by Jenny Carolina Castiblanco, Ivan Fernando Mondragon, Catalina Alvarado-Rojas and Julian D. Colorado
Sensors 2021, 21(13), 4372; https://doi.org/10.3390/s21134372 - 26 Jun 2021
Cited by 23 | Viewed by 4845
Abstract
Robotic-assisted systems have gained significant traction in post-stroke therapies, since they can provide high-intensity and high-frequency treatment while allowing accurate motion control over the patient's progress. In this paper, we tackle how to provide active support through a robotic-assisted exoskeleton by developing a novel closed-loop architecture that continually measures electromyographic (EMG) signals in order to adjust the assistance given by the exoskeleton. We used EMG signals acquired from four patients with post-stroke hand impairments to train machine learning models that characterize muscle effort by classifying three muscular condition levels based on contraction strength, co-activation, and muscular activation measurements. The proposed closed-loop system takes the EMG muscle effort into account to modulate the exoskeleton velocity during the rehabilitation therapy. Experimental results indicate that the maximum variation in velocity was 0.7 mm/s, while the proposed control system effectively modulated the movements of the exoskeleton based on the EMG readings, keeping the reference tracking error below 5%. Full article
(This article belongs to the Special Issue Robotics in Healthcare: Automation, Sensing and Application)

18 pages, 1492 KiB  
Article
The Barriers of the Assistive Robotics Market—What Inhibits Health Innovation?
by Gabriel Aguiar Noury, Andreas Walmsley, Ray B. Jones and Swen E. Gaudl
Sensors 2021, 21(9), 3111; https://doi.org/10.3390/s21093111 - 29 Apr 2021
Cited by 11 | Viewed by 7112
Abstract
Demographic changes are putting the healthcare industry under pressure. Yet while other industries have been able to automate their operations through robotic and autonomous systems, the healthcare sector is still reluctant to change. What makes robotic innovation in healthcare so difficult? Despite offering more efficient and consumer-friendly care, the assistive robotics market has lacked penetration. To answer this question, we have broken down the development process from a market transformation perspective. By interviewing assistive robotics companies at different business stages in France and the UK, this paper offers new insights into the main barriers that are inhibiting the assistive robotics sector. Their impact is analysed during the different stages of development, exploring how these barriers affect the planning, conceptualisation, and adoption of these solutions. This research presents a foundation for understanding the innovation barriers that high-tech ventures face in the healthcare industry, and the need for public policy measures to support these technology-based firms. Full article
(This article belongs to the Special Issue Robotics in Healthcare: Automation, Sensing and Application)

25 pages, 7234 KiB  
Article
Smart Assistive Architecture for the Integration of IoT Devices, Robotic Systems, and Multimodal Interfaces in Healthcare Environments
by Alberto Brunete, Ernesto Gambao, Miguel Hernando and Raquel Cedazo
Sensors 2021, 21(6), 2212; https://doi.org/10.3390/s21062212 - 22 Mar 2021
Cited by 30 | Viewed by 6251
Abstract
This paper presents a new architecture that integrates Internet of Things (IoT) devices, service robots, and users in a smart assistive environment. A new intuitive and multimodal interaction system supporting people with disabilities and bedbound patients is presented. This interaction system allows the user to control service robots and devices inside the room in five different ways: touch control, eye control, gesture control, voice control, and augmented reality control. The interaction system comprises an assistive robotic arm holding a tablet PC, which the arm can place in front of the user. A demonstration of the developed technology, a prototype of a smart room equipped with home automation devices, and the robotic assistive arm are presented. The article reports the results obtained from the use of the various interfaces and technologies, including user preferences with regard to eye-based control (performing clicks, and using winks or gaze) and the use of mobile phones over augmented reality glasses, among others. Full article
(This article belongs to the Special Issue Robotics in Healthcare: Automation, Sensing and Application)

Review


21 pages, 1936 KiB  
Review
A Review of Exoskeletons Considering Nurses
by Esther Rayssiguie and Mustafa Suphi Erden
Sensors 2022, 22(18), 7035; https://doi.org/10.3390/s22187035 - 17 Sep 2022
Cited by 10 | Viewed by 5405
Abstract
The daily tasks of nurses include manual handling to assist patients. Repetitive manual handling leads to a high risk of injuries due to the loads on nurses' bodies. Nurses in hospitals and care homes can benefit from advances in exoskeleton technology assisting their manual handling tasks. Exoskeletons made to assist physical workers in handling heavy loads already exist, both on the market and in research. However, those exoskeletons are mostly designed for men, as most physical workers are men, whereas most nurses are women. Furthermore, nurses handle patients, a more delicate task than handling objects, and any such device used by nurses should be easy to disinfect. In this study, the needs of nurses are examined, and a review of state-of-the-art exoskeletons is conducted to assess to what extent existing technologies address those needs. Possible solutions and technologies, and particularly the needs that have not been addressed by existing technologies, are discussed. Full article
(This article belongs to the Special Issue Robotics in Healthcare: Automation, Sensing and Application)

21 pages, 3549 KiB  
Review
The Advances in Computer Vision That Are Enabling More Autonomous Actions in Surgery: A Systematic Review of the Literature
by Andrew A. Gumbs, Vincent Grasso, Nicolas Bourdel, Roland Croner, Gaya Spolverato, Isabella Frigerio, Alfredo Illanes, Mohammad Abu Hilal, Adrian Park and Eyad Elyan
Sensors 2022, 22(13), 4918; https://doi.org/10.3390/s22134918 - 29 Jun 2022
Cited by 34 | Viewed by 4074
Abstract
This review focuses on the advances and current limitations of computer vision (CV) and how CV can help us get to more autonomous actions in surgery. It is a follow-up to an article we previously published in Sensors entitled "Artificial Intelligence Surgery: How Do We Get to Autonomous Actions in Surgery?" Whereas that article also discussed machine learning, deep learning, and natural language processing, this review delves deeper into the field of CV. Additionally, non-visual forms of data that can aid computerized robots in performing more autonomous actions, such as instrument priors and audio haptics, are highlighted. Furthermore, the current existential crisis for surgeons, endoscopists, and interventional radiologists regarding more autonomy during procedures is discussed. In summary, this paper discusses how to harness the power of CV to keep the doctors who perform interventions in the loop. Full article
(This article belongs to the Special Issue Robotics in Healthcare: Automation, Sensing and Application)

18 pages, 1100 KiB  
Review
Artificial Intelligence Surgery: How Do We Get to Autonomous Actions in Surgery?
by Andrew A. Gumbs, Isabella Frigerio, Gaya Spolverato, Roland Croner, Alfredo Illanes, Elie Chouillard and Eyad Elyan
Sensors 2021, 21(16), 5526; https://doi.org/10.3390/s21165526 - 17 Aug 2021
Cited by 65 | Viewed by 12615
Abstract
Most surgeons are skeptical as to the feasibility of autonomous actions in surgery. Interestingly, many examples of autonomous actions already exist and have been around for years. Since the beginning of this millennium, the field of artificial intelligence (AI) has grown exponentially with the development of machine learning (ML), deep learning (DL), computer vision (CV), and natural language processing (NLP). All of these facets of AI will be fundamental to the development of more autonomous actions in surgery; unfortunately, only a limited number of surgeons have or seek expertise in this rapidly evolving field. As opposed to AI in medicine, AI surgery (AIS) involves autonomous movements. Fortuitously, as the field of robotics in surgery has improved, more surgeons are becoming interested in technology and the potential of autonomous actions in procedures such as interventional radiology, endoscopy, and surgery. The lack of haptics, or the sensation of touch, has hindered the wider adoption of robotics by many surgeons; however, now that the true potential of robotics can be comprehended, the embracing of AI by the surgical community is more important than ever before. Although current complete surgical systems are mainly examples of tele-manipulation, haptics is perhaps not the most important aspect of getting to more autonomously functioning robots. If the goal is for robots to ultimately become more and more independent, perhaps research should focus not on haptics as perceived by humans, but on haptics as perceived by robots/computers. This article discusses aspects of ML, DL, CV, and NLP as they pertain to the modern practice of surgery, with a focus on current AI issues and advances that will enable us to get to more autonomous actions in surgery. Ultimately, a paradigm shift may need to occur in the surgical community, as more surgeons with expertise in AI may be needed to fully unlock the potential of AIS in a safe, efficacious, and timely manner. Full article
(This article belongs to the Special Issue Robotics in Healthcare: Automation, Sensing and Application)

17 pages, 5478 KiB  
Review
Robot-Assisted Autism Therapy (RAAT). Criteria and Types of Experiments Using Anthropomorphic and Zoomorphic Robots. Review of the Research
by Barbara Szymona, Marcin Maciejewski, Robert Karpiński, Kamil Jonak, Elżbieta Radzikowska-Büchner, Konrad Niderla and Anna Prokopiak
Sensors 2021, 21(11), 3720; https://doi.org/10.3390/s21113720 - 27 May 2021
Cited by 23 | Viewed by 7462
Abstract
Supporting the development of a child with autism requires multi-profile therapeutic work on the disturbed areas, especially understanding and linguistic expression used in social communication and the development of social contacts. Previous studies show that it is possible to perform some of this therapy using a robot. This article is a synthesis review of the literature on research using robots in the therapy of children diagnosed with early childhood autism. The review covers scientific journals from 2005 to 2021. Using the descriptors ASD (autism spectrum disorder), social robots, and robot-based interventions, an analysis of available research in PubMed, Scopus, and Web of Science was done. The results showed that a robot appears to be a valuable tool that encourages contact and involvement in joint activities, and the literature indicates the potential value of robots in the therapy of people with autism as facilitators of social contact. Robot-Assisted Autism Therapy (RAAT) can encourage a child to talk or do exercises. In the second aspect (prompting during a conversation), a robot encourages eye contact and suggests possible answers, e.g., during free conversation with a peer. In the third aspect (teaching, entertainment), the robot can play with autistic children in games supporting the development of joint attention; these games stimulate the development of motor skills and orientation in the body schema. In future work, a validation test would be desirable to check whether children with ASD are able to transfer these skills to interaction with a real person, and whether they learn to distrust or cheat the robot. Full article
(This article belongs to the Special Issue Robotics in Healthcare: Automation, Sensing and Application)
