
Sensing and Imaging in Biomedical Robotics and Image-Guided Surgical Systems

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Sensing and Imaging".

Deadline for manuscript submissions: closed (31 October 2024) | Viewed by 8550

Special Issue Editors


Prof. Dr. Etsuko Kobayashi
Guest Editor
School of Engineering, the University of Tokyo, 7 Chome-3-1 Hongo, Bunkyo City, Tokyo 113-8654, Japan
Interests: medical robotics; surgical navigation systems; biomedical instrumentation

Dr. Liangjing Yang
Guest Editor
1. ZJU-UIUC Institute, Zhejiang University, 718 East Haizhou Road, Haining 314400, China
2. MechSE, University of Illinois at Urbana-Champaign, Champaign, IL, USA
Interests: robotics; computer vision; vision-guided micromanipulation

Dr. Chng Chin Boon
Guest Editor
Control and Mechatronics Lab, National University of Singapore, 21 Lower Kent Ridge Rd, Singapore 119077, Singapore
Interests: robotics; medical devices; deep learning; control engineering; mechatronics

Dr. Binh P. Nguyen
Guest Editor
School of Mathematics and Statistics, Victoria University of Wellington, Wellington 6012, New Zealand
Interests: machine learning; data science; deep learning; biomedical image analysis; health informatics; bioinformatics; drug discovery

Special Issue Information

Dear Colleagues,

Sensing and imaging technologies are fueling the rapid advancement of biomedical robotics and image-guided interventional procedures. In modern surgery, for instance, medical imaging and robotics are combined to enhance the effectiveness of surgical interventions. These procedures, referred to as image-guided robotic surgeries, rely on sensing and imaging technology as the basis for acquiring and interpreting information about the patient and the operating environment, with the aim of providing intuitive visual representation and intelligent machine perception during operations. As in many other robotic applications, navigation is an important aspect of surgical robots and addresses the problem of localization and mapping. Surgical navigation is guided by a combination of instrument sensing, models constructed from preoperative images, and intraoperative imaging. Combined with computational intelligence, sensing and imaging technologies empower modern biomedical robots with visual and/or haptic perception, enabling them to perform tasks alongside humans with varying degrees of autonomy or in a collaborative fashion.
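
As a concrete illustration of the navigation step described above, the sketch below shows rigid point-based (fiducial) registration, a standard building block for aligning a model constructed from preoperative images with landmarks tracked intraoperatively. It is a minimal, generic example using the SVD (Kabsch) solution, not a method drawn from this Special Issue; the function names and the fiducial data they expect are hypothetical.

    # Minimal sketch: rigid point-based registration mapping intraoperatively
    # tracked fiducials onto the same landmarks in a preoperative image model.
    import numpy as np

    def register_rigid(model_pts: np.ndarray, tracked_pts: np.ndarray):
        """Return rotation R and translation t mapping tracked_pts onto model_pts.

        model_pts, tracked_pts: (N, 3) arrays of corresponding fiducial positions,
        e.g. landmarks segmented from preoperative CT and the same landmarks
        touched with a tracked pointer in the operating room (hypothetical data).
        """
        mc = model_pts.mean(axis=0)
        tc = tracked_pts.mean(axis=0)
        H = (tracked_pts - tc).T @ (model_pts - mc)          # cross-covariance
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))               # guard against reflection
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = mc - R @ tc
        return R, t

    def fiducial_registration_error(model_pts, tracked_pts, R, t):
        """RMS distance between registered tracked points and model points."""
        mapped = tracked_pts @ R.T + t
        return float(np.sqrt(((mapped - model_pts) ** 2).sum(axis=1).mean()))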

This Special Issue seeks to showcase state-of-the-art research by inviting high-quality scientific papers on sensing and imaging in the domain of image-guided robotic surgical systems. We solicit original papers reporting completed, unpublished research that is not currently under review by any other conference, magazine, or journal. Topics of interest include, but are not limited to, the following:

  • Biomedical Imaging
  • Surgical Robotics and Instrumentation
  • Robot-Assisted Minimally Invasive Procedure
  • Surgical Navigation
  • Computer-Aided Surgery
  • Digital Operating Room
  • Cyberphysical Systems in Biomedicine
  • Collaborative Robot
  • Force Sensing for Tool–Tissue Interaction

Prof. Dr. Etsuko Kobayashi
Dr. Liangjing Yang
Dr. Chng Chin Boon
Dr. Binh P. Nguyen
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • biomedical imaging
  • medical robotics
  • cyberphysical system      
  • human–robot collaboration
  • human–machine interface
  • haptics and force sensing
  • surgical navigation
  • computer-aided surgery

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (3 papers)


Research


18 pages, 5344 KiB  
Article
Real-Time Tool Localization for Laparoscopic Surgery Using Convolutional Neural Network
by Diego Benavides, Ana Cisnal, Carlos Fontúrbel, Eusebio de la Fuente and Juan Carlos Fraile
Sensors 2024, 24(13), 4191; https://doi.org/10.3390/s24134191 - 27 Jun 2024
Viewed by 1034
Abstract
Partially automated robotic systems, such as camera holders, represent a pivotal step towards enhancing efficiency and precision in surgical procedures. Therefore, this paper introduces an approach for real-time tool localization in laparoscopy surgery using convolutional neural networks. The proposed model, based on two Hourglass modules in series, can localize up to two surgical tools simultaneously. This study utilized three datasets: the ITAP dataset, alongside two publicly available datasets, namely Atlas Dione and EndoVis Challenge. Three variations of the Hourglass-based models were proposed, with the best model achieving high accuracy (92.86%) and frame rates (27.64 FPS), suitable for integration into robotic systems. An evaluation on an independent test set yielded slightly lower accuracy, indicating limited generalizability. The model was further analyzed using the Grad-CAM technique to gain insights into its functionality. Overall, this work presents a promising solution for automating aspects of laparoscopic surgery, potentially enhancing surgical efficiency by reducing the need for manual endoscope manipulation.
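
For readers unfamiliar with the architecture mentioned in the abstract, the following is a hedged sketch of a two-stack hourglass heatmap model for tool-tip localization in PyTorch. It only illustrates the general idea of two Hourglass modules in series emitting per-tool heatmaps; the layer sizes, feedback connection, and class names are assumptions for illustration and do not reproduce the authors' implementation.

    # Generic two-stack hourglass heatmap regressor (illustrative sketch only).
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def conv_block(c_in, c_out):
        return nn.Sequential(nn.Conv2d(c_in, c_out, 3, padding=1),
                             nn.BatchNorm2d(c_out), nn.ReLU(inplace=True))

    class Hourglass(nn.Module):
        """Recursive encoder-decoder with skip connections (depth levels)."""
        def __init__(self, depth: int, ch: int):
            super().__init__()
            self.skip = conv_block(ch, ch)
            self.down = conv_block(ch, ch)
            self.inner = Hourglass(depth - 1, ch) if depth > 1 else conv_block(ch, ch)
            self.up = conv_block(ch, ch)

        def forward(self, x):
            skip = self.skip(x)
            y = F.max_pool2d(x, 2)                    # encode at half resolution
            y = self.up(self.inner(self.down(y)))
            y = F.interpolate(y, size=skip.shape[-2:], mode="nearest")
            return skip + y

    class TwoStackToolLocalizer(nn.Module):
        """Two hourglass stages in series; each emits one heatmap per tool tip."""
        def __init__(self, num_tools: int = 2, ch: int = 64):
            super().__init__()
            self.stem = conv_block(3, ch)
            self.hg1, self.hg2 = Hourglass(3, ch), Hourglass(3, ch)
            self.head1 = nn.Conv2d(ch, num_tools, 1)
            self.head2 = nn.Conv2d(ch, num_tools, 1)
            self.remap = nn.Conv2d(num_tools, ch, 1)  # feed stage-1 prediction back in

        def forward(self, x):
            f = self.stem(x)
            h1 = self.head1(self.hg1(f))              # intermediate supervision target
            h2 = self.head2(self.hg2(f + self.remap(h1)))
            return h1, h2

    # Tool positions are read out as the argmax of each predicted heatmap.
    model = TwoStackToolLocalizer()
    heat1, heat2 = model(torch.randn(1, 3, 256, 256)) # dummy laparoscopic frame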

13 pages, 2646 KiB  
Article
Suppression of Clothing-Induced Acoustic Attenuation in Robotic Auscultation
by Ryosuke Tsumura, Akihiro Umezawa, Yuko Morishima, Hiroyasu Iwata and Kiyoshi Yoshinaka
Sensors 2023, 23(4), 2260; https://doi.org/10.3390/s23042260 - 17 Feb 2023
Viewed by 1977
Abstract
For patients who are often embarrassed and uncomfortable when exposing their breasts and having them touched by physicians of different genders during auscultation, we are developing a robotic system that performs auscultation over clothing. As the technical issue, the sound obtained through the clothing is often attenuated. This study aims to investigate clothing-induced acoustic attenuation and develop a suppression method for it. Because the attenuation is due to the loss of energy as sound propagates through a medium with viscosity, we hypothesized that the attenuation is improved by compressing clothing and shortening the sound propagation distance. Then, the amplitude spectrum of the heart sound was obtained over clothes of different thicknesses and materials in a phantom study and human trial at varying contact forces with a developed passive-actuated end-effector. Our results demonstrate the feasibility of the attenuation suppression method by applying an optimum contact force, which varied according to the clothing condition. In the phantom experiments, the attenuation rate was improved maximumly by 48% when applying the optimal contact force (1 N). In human trials, the attenuation rate was under the acceptable attenuation (40%) when applying the optimal contact force in all combinations in each subject. The proposed method promises the potential of robotic auscultation toward eliminating gender bias.
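
The sketch below illustrates the core idea of the abstract in code: compute an attenuation rate from amplitude spectra recorded at different contact forces and select the force that minimizes it. The energy-based definition of the attenuation rate, the function names, and the data layout are assumptions for illustration, not the authors' implementation.

    # Illustrative sketch: pick the contact force with the lowest attenuation.
    import numpy as np

    def attenuation_rate(spectrum_bare: np.ndarray, spectrum_clothed: np.ndarray) -> float:
        """Fractional loss of spectral energy relative to bare-skin auscultation."""
        e_bare = np.sum(spectrum_bare ** 2)
        e_clothed = np.sum(spectrum_clothed ** 2)
        return 1.0 - e_clothed / e_bare

    def best_contact_force(spectrum_bare: np.ndarray, clothed_spectra_by_force: dict):
        """Return (force_in_newtons, attenuation) with the lowest attenuation rate.

        clothed_spectra_by_force maps a contact force (N) to the amplitude
        spectrum recorded over clothing at that force (hypothetical data).
        """
        rates = {f: attenuation_rate(spectrum_bare, s)
                 for f, s in clothed_spectra_by_force.items()}
        f_opt = min(rates, key=rates.get)
        return f_opt, rates[f_opt]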

Review


25 pages, 6012 KiB  
Review
Modern Image-Guided Surgery: A Narrative Review of Medical Image Processing and Visualization
by Zhefan Lin, Chen Lei and Liangjing Yang
Sensors 2023, 23(24), 9872; https://doi.org/10.3390/s23249872 - 16 Dec 2023
Cited by 5 | Viewed by 4026
Abstract
Medical image analysis forms the basis of image-guided surgery (IGS) and many of its fundamental tasks. Driven by the growing number of medical imaging modalities, the research community of medical imaging has developed methods and achieved functionality breakthroughs. However, with the overwhelming pool of information in the literature, it has become increasingly challenging for researchers to extract context-relevant information for specific applications, especially when many widely used methods exist in a variety of versions optimized for their respective application domains. By being further equipped with sophisticated three-dimensional (3D) medical image visualization and digital reality technology, medical experts could enhance their performance capabilities in IGS by multiple folds. The goal of this narrative review is to organize the key components of IGS in the aspects of medical image processing and visualization with a new perspective and insights. The literature search was conducted using mainstream academic search engines with a combination of keywords relevant to the field up until mid-2022. This survey systemically summarizes the basic, mainstream, and state-of-the-art medical image processing methods as well as how visualization technology like augmented/mixed/virtual reality (AR/MR/VR) are enhancing performance in IGS. Further, we hope that this survey will shed some light on the future of IGS in the face of challenges and opportunities for the research directions of medical image processing and visualization.