Review

A Systematic Review of New Imaging Technologies for Robotic Prostatectomy: From Molecular Imaging to Augmented Reality

by Severin Rodler 1,*, Marc Anwar Kidess 1, Thilo Westhofen 1, Karl-Friedrich Kowalewski 2, Ines Rivero Belenchon 3, Mark Taratkin 4, Stefano Puliatti 5, Juan Gómez Rivas 6, Alessandro Veccia 7, Pietro Piazza 8, Enrico Checcucci 9, Christian Georg Stief 1 and Giovanni Enrico Cacciamani 10

1 Department of Urology, University Hospital of Munich, 81377 Munich, Germany
2 Department of Urology, Klinikum Mannheim, 68167 Mannheim, Germany
3 Urology and Nephrology Department, Virgen del Rocío University Hospital, Manuel Siurot s/n, 41013 Seville, Spain
4 Institute for Urology and Reproductive Health, Sechenov University, 117418 Moscow, Russia
5 Department of Urology, University of Modena and Reggio Emilia, 42122 Modena, Italy
6 Department of Urology, Hospital Clinico San Carlos, 28040 Madrid, Spain
7 Urology Unit, Azienda Ospedaliera Universitaria Integrata Verona, 37126 Verona, Italy
8 Division of Urology, IRCCS Azienda Ospedaliero-Universitaria di Bologna, 40138 Bologna, Italy
9 Department of Surgery, Candiolo Cancer Institute, FPO-IRCCS, Candiolo, 10060 Turin, Italy
10 USC Institute of Urology, University of Southern California, Los Angeles, CA 90007, USA
* Author to whom correspondence should be addressed.
J. Clin. Med. 2023, 12(16), 5425; https://doi.org/10.3390/jcm12165425
Submission received: 4 June 2023 / Revised: 1 August 2023 / Accepted: 10 August 2023 / Published: 21 August 2023
(This article belongs to the Section Nephrology & Urology)

Abstract:
New imaging technologies play a pivotal role in the current management of patients with prostate cancer. Robotic-assisted radical prostatectomy (RARP) is a standard of care for localized disease and, through its inherently image-based console, a subject of research on combining new imaging technologies with RARP and on their impact on surgical outcomes. We therefore aimed to provide a comprehensive analysis of the currently available literature on new imaging technologies for RARP. On 24 January 2023, we performed a systematic review of the current literature on Pubmed, Scopus and Web of Science according to the PRISMA guidelines and the Oxford levels of evidence. A total of 46 studies were identified, of which 19 focus on imaging of the primary tumor, 12 on the intraoperative detection of lymph node metastases and 15 on the training of surgeons. While the feasibility of combined approaches using new imaging technologies, including MRI, PSMA-PET CT and intraoperatively applied radioactive and fluorescent dyes, has been demonstrated, the prospective confirmation of improvements in surgical outcomes is currently ongoing.

1. Introduction

Imaging technologies play a pivotal role in surgical interventions. Major applications include preoperative staging to assess local tumor invasion and metastatic spread, and planning of the intraoperative procedure, including 3D models for practice. Intraoperatively, orientation on anatomical landmarks provided by conventional imaging as well as virtual reality applications for real-time detection of tumors and metastatic spread may be applied [1,2].
Robotic assisted radical prostatectomy (RARP) is the standard of care for localized prostate cancer [1]. Due to the anatomical conditions of the pelvis, important nerve and vessel structures and difficulties in detecting lymph node metastases, RARP is a challenging procedure [3]. Imaging technologies are therefore warranted to support the surgeon and to improve patient outcomes. Notably, robotic procedures lend themselves to visual enhancement, as an endoscopic camera is already in use; augmented reality and simulations are therefore an obvious addition to implement [4].
During prostatectomy, preoperative planning is paramount, both for the primary tumor and its surrounding structures and for lymphatic spread into locoregional lymph nodes (LN). Here, 3D reconstructions and prostate models might help to visualize extraprostatic tumor growth and neurovascular invasion. As 3D models can aid the understanding of anatomical relationships, they might support surgical planning. Those reconstructions might then be overlaid on the video console intraoperatively to guide the surgeon. Furthermore, in conventional surgery, beta and gamma probes are used to detect metastases marked, for example, by radioligands; those technical solutions require adaptation for robotic surgery. Notably, RARP already relies on imaging through the video console, so the combination of this approach with modern imaging technologies might become one of the cornerstones of modern RARP.
The present study is a systematic review of new imaging technologies for RARP that focuses on the preoperative planning and intraoperative utilization of technology as well as teaching modalities specific for RARP.

2. Materials and Methods

A systematic literature analysis was conducted on 24 January 2023. The Pubmed, Web of Science and Scopus databases were systematically queried with a predefined search string defined as follows:
((robotic prostatectomy) AND (augmented reality)) OR ((robotic prostatectomy) AND (molecular imaging)) OR ((robotic prostatectomy) AND (neuronal imaging)) OR ((robotic prostatectomy) AND (virtual reality)) OR ((robotic prostatectomy) AND (new imaging technology)). All identified papers were considered for further analysis.
First, all duplicates originating from the three databases were removed. All identified studies were then analyzed for overall eligibility: only original articles were included, while replies, editorials, reviews and book chapters were removed. The remaining articles were then screened against the inclusion criteria, namely original articles covering any aspect of new technologies specific to RARP that either support imaging of the primary tumor or of lymph node metastases to improve surgical outcomes, or that focus on imaging and visualization for training modalities. Accordingly, we excluded articles that reported preoperative imaging without direct intraoperative utilization (screening via ultrasound or MRI, staging imaging including PSMA-PET CT), preclinical models or technology for salvage lymphadenectomy. The focus was put on research published within the last 5 years; however, the literature search was conducted without restriction of the publishing year. The analysis for eligibility was performed independently by two researchers. In cases of disagreement, a third researcher was involved to reach consensus. The protocol of this systematic review was not registered prior to initiation of the study.
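The deduplication and eligibility step described above can be sketched in code. This is an illustration only: the record fields, DOIs and filter values below are hypothetical and do not reflect the review's actual data handling.

```python
# Hypothetical sketch of cross-database deduplication followed by
# eligibility filtering (original articles only), as described in Methods.

def deduplicate(records):
    """Keep one record per DOI (or per normalized title when DOI is missing)."""
    seen, unique = set(), []
    for rec in records:
        key = rec.get("doi") or rec["title"].strip().lower()
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

# Publication types removed before screening, per the Methods section.
EXCLUDED_TYPES = {"reply", "editorial", "review", "book chapter"}

def eligible(rec):
    """Original articles only."""
    return rec.get("type", "original article") not in EXCLUDED_TYPES

records = [
    {"doi": "10.1000/a", "title": "AR-guided RARP", "type": "original article"},
    {"doi": "10.1000/a", "title": "AR-guided RARP", "type": "original article"},  # duplicate
    {"doi": "10.1000/b", "title": "Imaging overview", "type": "review"},
]
unique = deduplicate(records)
included = [r for r in unique if eligible(r)]
print(len(unique), len(included))  # 2 1
```

In practice, screening against the full inclusion criteria was performed manually by two independent researchers; only the mechanical deduplication and article-type filtering lend themselves to this kind of automation.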
Analyses were performed according to the PRISMA guideline for systematic reviews [5].

3. Results

The systematic search on Pubmed, Scopus and Web of Science with the described search string revealed 511 studies. After removing duplicates, 229 studies were screened, of which 95 were assessed for eligibility. Next, 49 studies were excluded for not meeting the inclusion criteria. A total of 46 articles were selected after qualitative analysis for this review (see Figure 1).
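The study-flow counts above are internally consistent, which can be verified with simple bookkeeping (counts taken from the text; this is an illustration, not part of the review's methodology):

```python
# Consistency check of the PRISMA-style study flow reported above.
identified = 511   # records retrieved from Pubmed, Scopus and Web of Science
screened = 229     # records remaining after duplicate removal
assessed = 95      # records assessed for eligibility
excluded = 49      # records not meeting the inclusion criteria

duplicates_removed = identified - screened
included = assessed - excluded
print(duplicates_removed, included)  # 282 46
```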
All 46 studies were analyzed and a level of evidence was determined based on the 2011 Oxford Centre for Evidence-Based Medicine levels of evidence [6]. All studies were then categorized by the assessed organ (prostate, LN, abdominal wall), the area of application (preoperative planning, visualization of the primary tumor, intraoperative diagnostics, intraoperative detection of LN, education/training and feedback) as well as by the applied imaging modality (see Table 1).
Below, we report our findings stratified by the imaging modalities for the primary tumor and for locoregional LN detection, as well as applications for training purposes.

3.1. Primary Tumor

Of the 46 identified studies, 19 focused on imaging of the primary tumor, either for preoperative planning, intraoperative tumor detection and real-time imaging, or to support intraoperative diagnostics.

3.1.1. Preoperative Planning

Four studies focused on preoperative planning. Shirk et al. conducted a randomized trial (n = 92) to evaluate the performance of surgeons when reviewing virtual models prior to RARP. The trial revealed improved oncological outcomes, with a significantly lower rate of detectable postoperative PSA (31% vs. 9%, p = 0.036) and a trend towards lower positive margin rates. Surgeons changed their surgical strategy in 32% of cases based on the reviewed model, leading to a trend towards bilateral nerve sparing [8]. In addition, a retrospective analysis by Checcucci et al. revealed the use of a 3D model to be a protective factor against positive surgical margins [10]. Similar results have been shown by Martini et al., who compared patients before and after the introduction of 3D models derived from 3T-MRI [9].
Regarding the patient perspective, Wake et al. revealed that patients gain a better understanding of their disease when their organ is 3D printed rather than visualized in augmented reality or viewed on a 3D or 2D screen [7].

3.1.2. Intraoperative Tumor Detection and Real Time Imaging

A total of 12 studies focused on intraoperative real-time monitoring or augmented reality regarding the primary tumor.
Samei et al. demonstrated the feasibility of real-time augmented reality-based motion tracking of the prostate using ultrasound [13]. The working group developed their system further and tested the combination of preoperative MRI and ultrasound guidance in twelve patients undergoing RARP. Thereby, the surgeon can navigate the transducer via the robotic instruments, and the imaging data are overlaid on the endoscopic image. An accuracy of 3.2 mm was achieved [19].
A phase I study by Kratiras et al. using a tablet-based image guidance system that mapped the preoperative MRI to the patient revealed that such solutions are mainly used during challenging steps of RARP at the bladder neck and apical dissection as well as during nerve sparing [14].
Mehralivand et al. presented a virtual reality imaging technology derived from preoperative MRI that can be overlaid at several time points of RARP. However, according to the authors, the system illustrates a limitation of VR imaging: as it is not integrated into the video console, it was not considered useful for challenging surgical situations [17].
Schiavina et al. used MRI derived 3D models of the prostate that are superimposed on the video stream of the robotic system. The research group aimed to evaluate the impact of this technological support on intraoperative nerve sparing planning during RARP. The initial surgical plan was changed in 38.5% of all patients with 11.5% of all patients presenting with positive surgical margins after surgery. The sensitivity of the model was 70%, the specificity was 100% and the accuracy 92% [20].
Similarly, Porpiglia et al. demonstrate the feasibility of augmented reality RARP with a model accuracy of 1–5 mm with 85% of mismatch being less than 3 mm [11]. This group further tested hyperaccuracy 3D reconstruction with similar results [16]. In another study, Porpiglia et al. used an elastic augmented reality model to detect areas of capsular involvement during the nerve-sparing phase of RARP. This model has been developed to superimpose images even during the dynamic phases of surgery with deformation of the prostate. The authors demonstrate a superiority compared to 2D cognitive RARP in terms of detection of capsular involvement [15].
Although most research groups use MRI as input data for the augmented or virtual reality models, Canda et al. also incorporate PSMA-PET-imaging data for their VR model. The model was used in five RARP cases and revealed the clinical feasibility of this approach [18].
Intraoperative real-time augmented reality assistance might be achieved by deep-learning approaches, which require considerable computing power. Therefore, Tanzi et al. investigated different algorithms and demonstrated the superiority of a new convolutional neural network with an intersection over union (IoU) of 0.894 [21]. Similarly, Padovan et al. achieved real-time 3D model alignment by semantic segmentation, using convolutional neural networks and motion analysis to compensate for rotation; here, IoU scores greater than 0.80 were achieved [22].
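For readers unfamiliar with the segmentation metric reported by these studies, intersection over union can be illustrated with a minimal sketch; the two masks below are hypothetical and not taken from the cited studies.

```python
# Intersection over union (IoU) for binary segmentation masks, represented
# here as sets of (row, col) pixel coordinates.

def iou(pred: set, target: set) -> float:
    """IoU = |A ∩ B| / |A ∪ B|; defined as 1.0 when both masks are empty."""
    union = pred | target
    if not union:
        return 1.0
    return len(pred & target) / len(union)

# Two 6-pixel rectangles shifted by one column: intersection 4 px, union 8 px.
pred = {(r, c) for r in (1, 2) for c in (1, 2, 3)}
target = {(r, c) for r in (1, 2) for c in (0, 1, 2)}
print(iou(pred, target))  # 0.5
```

An IoU of 0.894, as reported by Tanzi et al., therefore means that roughly 89% of the combined predicted and true prostate area overlaps between the two masks.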
When surgeons' views on this development were evaluated, they expressed a strongly positive opinion about augmented reality support for all evaluated critical steps of RARP, including bladder neck dissection, nerve sparing and apex dissection [12].
An example of the potential application of virtual reality superimposed on real-time video console imaging is provided in Figure 2.

3.1.3. Intraoperative Diagnostics

Three studies reported the use of new imaging modalities for intraoperative diagnostics.
Lopez et al. used confocal laser endomicroscopy to detect tumors as well as damage to the neurovascular bundle; the study revealed clinical feasibility with standard robotic instrumentation [23]. The frequency of abdominal wall hematoma caused by trocar insertion for RARP might be decreased through an infrared device that detects veins. In a study of 724 cases, the device led to a change in trocar placement in 65% of all cases and decreased the frequency of abdominal wall hematoma from 8.8% to 2.6% (p = 0.03) [24]. Bianchi et al. demonstrated the application of augmented reality to guide intraoperative frozen sections in 20 patients who were propensity score matched against 20 controls. Positive surgical margins at the level of the index lesion were significantly reduced in the augmented reality guided group (5% vs. 20%, p = 0.01) [25].

3.2. Intraoperative Detection of Lymph Node Metastases

Intraoperative detection of lymph node metastases via specialized imaging was analyzed in 12 studies identified by our literature search. Three technologies were used, namely fluorescence cameras and drop-in beta and gamma probes. In recent studies, PSMA has been used as a target for the fluorescent dye.
Van der Poel et al. revealed the feasibility of intraoperative fluorescent imaging to detect sentinel nodes (SN) during RARP. Hereby, the tracer indocyanine green (ICG)-99mTc was injected into the prostate under ultrasound guidance three hours prior to surgery. Two hours after injection, SPECT-CT was acquired to detect SN. Intraoperatively, a fluorescence laparoscope and a laparoscopic gamma probe were used to identify SN. In total, 11 patients underwent this procedure. Fluorescent imaging improved the detection of SN in this setting, especially in areas with high background radioactivity [26].
De Korne et al. analyzed whether the site of injection has an impact on detection of SN during surgery. In this study, 67 patients received an ICG-99mTc-nanocolloid injection into the prostate. Intratumoral tracer injection increased the chance of visualizing nodal metastases [29].
KleinJan et al. reported a combined approach using indocyanine green-99mTc-nanocolloid as a radioactive and fluorescent tracer; no improvement in detection rates of sentinel lymph nodes was observed, but the procedure was described as safe [27]. This tracer was further evaluated by van den Berg et al., who revealed that the combination of ICG-99mTc-nanocolloid with the lymphangiographic tracer fluorescein improves lymph node detection in patients undergoing RARP [28]. In a cohort of 50 patients, Özkan et al. revealed that of nine LN-positive patients, eight had fluorescent-positive LN, whereas six were detected by preoperative PSMA-PET CT [37].
Another study using indocyanine green-99mTc-nanocolloid revealed higher detection rates of positive lymph nodes in patients undergoing sentinel node biopsy during RARP [32]. A recent phase-II trial analyzed indocyanine green-99mTc-nanocolloid further and revealed that intratumoral application improves detection rates compared to intraprostatic application. However, metastatic spread from non-index tumors was not detected by the intratumoral application; therefore, the authors propose combining intratumoral and intraprostatic tracer injection to optimize sentinel lymph node detection [33]. In a retrospective study, Hinsenveld et al. revealed that the combination of preoperative PSMA PET-CT and 99mTc-nanocolloid for sentinel lymph node detection increased the overall detection rate in patients with PSMA-negative lymph node metastases [30].
Collamati et al. took a different approach and further developed SPECT-isotope guidance by using 68Ga-PSMA-11 and a DROP-IN beta particle detector [31]. A comparable approach was pursued by Gondoputro et al., who used a DROP-IN γ-probe and 99mTc-PSMA as a tracer to detect lymph node metastases. This prospective single-arm study (n = 12) revealed a high detection rate of positive lymph nodes outside the resection template; a total of 11 metastatic lymph nodes were detected that were not visible on PSMA-PET imaging [34].
Dell’Oglio et al. compared a DROP-IN gamma probe with traditional laparoscopic gamma probes as well as fluorescence guidance. In total, 47 sentinel lymph node procedures were conducted, with a 100% detection rate for the DROP-IN probe; 91% of the nodes were identified by fluorescence imaging and 76% by the laparoscopic gamma probe [35].
The sensitivity and specificity of the concept of PSMA-guided surgery is currently being tested in a phase II study by Gandaglia et al. In a planned interim analysis, a sensitivity of 67%, a specificity of 100%, a positive predictive value of 100% and a negative predictive value of 90% were observed. Despite an overall good performance, the authors raise the issue that the suboptimal sensitivity of the approach leads to missed micrometastases [36].
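The four metrics quoted from this interim analysis follow directly from a 2 × 2 confusion matrix. The sketch below uses hypothetical counts (tp = 2, fp = 0, fn = 1, tn = 9), chosen only so that the resulting values resemble those reported; they are not the trial's actual data.

```python
# Standard diagnostic accuracy metrics from a 2x2 confusion matrix:
# tp/fp = true/false positives, fn/tn = false/true negatives.

def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    return {
        "sensitivity": tp / (tp + fn),  # share of metastatic nodes detected
        "specificity": tn / (tn + fp),  # share of benign nodes correctly negative
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Hypothetical counts reproducing values similar to the interim analysis.
m = diagnostic_metrics(tp=2, fp=0, fn=1, tn=9)
print({k: round(v, 2) for k, v in m.items()})
# {'sensitivity': 0.67, 'specificity': 1.0, 'ppv': 1.0, 'npv': 0.9}
```

The pattern visible here mirrors the authors' concern: a specificity and PPV of 100% coexist with a lower sensitivity, because false negatives (missed micrometastases) reduce sensitivity without affecting the other metrics.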

3.3. Training

A total of 15 studies focused on training of surgeons for RARP.

3.3.1. Virtual Training

Various approaches using imaging for virtual- or simulation-based surgical training have been described in 12 studies.
Hung et al. report on one of the first simulators, which was still limited to basic skill training [38]. However, Aghazadeh et al. already reported a positive correlation between simulated robotic performance and clinical robotic performance [39]. Further virtual reality models are in use and have been demonstrated to improve the surgical skills of novice surgeons [41].
Shim et al. demonstrate in a study with 45 participants that educational videos are comparable to expert-guided training but are superior to unguided training to fulfil robotic surgical tasks [42]. Shim et al. also investigated the performance of procedure specific training modules in virtual simulators for vesicourethral anastomosis and revealed significant improvements in live surgery after undergoing the training module [43].
Papalois et al. discuss a mixed reality application to train surgical decision making and anatomical knowledge. Here, multi-rater agreement reached 70.0% for every step of the training, and significant improvement was achieved through the module [48].
Basic robotic skills acquired in the lab are transferable to the operating room according to Almarzouq et al., who observed a positive correlation between Global Evaluative Assessment of Robotic Skills (GEARS) scores for defined practice sessions on a simulator and GEARS scores during urethro-vesical anastomosis and bladder mobilization [44].
The effectiveness of simulation-based training might depend on trainees' level of experience. Hoogenes et al. revealed in a randomized trial that two different training programs led to different outcomes in junior trainees but not in more experienced trainees. The dV-Trainer (dV-T) (Mimic Technologies, Inc., Seattle, WA, USA) used in that trial has hand and foot controls similar to a da Vinci console, whereas the da Vinci Surgical Skills Simulator (dVSSS) is software integrated into the console itself, using its normal hand and foot controls [40]. The impact on surgeons' learning curves was further analyzed by Wang et al.: surgeons with VR training revealed shorter learning curves than surgeons without, leading to shorter procedure times and especially anastomosis times (25.1 ± 7.1 min versus 40.0 ± 12.4 min; p = 0.015) [45].
A new development is full procedure simulation. Ebbing et al. demonstrate the face and content validity of a full procedure simulation module [47].
Beyond improving surgical skills through simulation training, Olsen et al. addressed the question of when to proceed from simulation-based training to live surgery. This research group developed a simulator score based on performance during bladder neck dissection, neurovascular bundle dissection and ureterovesical anastomosis that predicts surgeons' experience levels. According to the authors, this score might be used to define which surgeons can proceed to supervised clinical training [46].
Further, training scores derived from simulation-based training not only correlate with scores in live surgery but can also impact clinical outcomes such as continence recovery rates. In a study by Sanford et al., high performance during VR needle driving was associated with a 24-month continence recovery rate of 98.5%, versus 84.9% for surgeons with lower scores (p = 0.028) [49].

3.3.2. Peer Review and Structured Feedback

Video review has been identified as an important element of training by van der Leun et al. In this study, students caused significantly fewer injuries to the urethra and performed sutures with higher accuracy when reviewing videos of their training [50].
As RARP can be recorded, including through augmented reality platforms, remote mentoring and teaching are possible, potentially improving the diffusion of robotic training beyond individual centers [51].
The future of this process might be artificial-intelligence-based video labeling. A preliminary study by Youssef et al. demonstrates that novices to surgical procedures can train themselves to perform segmentation of RARP videos [52].

4. Discussion

RARP is a challenging procedure that requires precise treatment planning and intraoperative visualization. Various imaging tools have been developed to optimize outcomes of patients with PC. Thereby, the combination of innovative imaging tools and intraoperative guidance on the video console are at the center of the current research. We provide a comprehensive overview of the current literature and insights into future developments.
New imaging technologies can provide assistance at every step of RARP. Treatment of the primary tumor as well as LN dissection can be improved by incorporating those technologies into the surgical workflow as outlined in several feasibility studies described in the results part of this manuscript. Ultimately, surgical training can be enhanced by those advances and might be standardized by the support of simulation-based training and standardized performance metrics in order to improve outcomes [53].
Guidelines have not yet incorporated most of the described techniques. The current EAU guideline describes MRI-guided and PSMA-PET-based nomograms that omit the necessity for LN dissection in certain patients. Our review has not covered this topic, as it concerns staging that is used for both RARP and conventional prostatectomy. Sentinel lymph node biopsy and the use of indocyanine green are discussed in the current guidelines, but insufficient evidence is seen as an obstacle to broad use of this technology, even though a meta-analysis revealed a sensitivity of 95.2% and an NPV of 98.0% for finding LN metastases in those patients regardless of the primary surgical approach [1,54].
Despite the currently low uptake in guidelines, the impact of augmented reality models in small case series is dramatic: around one out of three procedures is performed differently when using superimposed imaging, as described by Schiavina et al. [20]. As the input imaging modalities (MRI, PSMA-PET CT) and robotic systems for RARP become increasingly available [55], the combination of both technologies might profoundly shape the future management of patients with prostate cancer. This impact is not restricted to the surgical procedure, where preoperative imaging can help to guide surgeons; preoperative counseling of patients and the planning of surgeries can also benefit from virtual reality approaches. However, some adaptations must be made to implement all technologies. To address this, several groups have developed novel tools to provide fluorescence imaging or radioactivity detection on robotic consoles. Still, further efforts will be required to optimize the interfaces between surgeons and the currently developed tools. Notably, the addition of a variety of tools to conventional surgical platforms might change the market leadership of currently dominating companies; platforms that are open for development and quickly integrate new cost-effective tools might provide advantages for urologists.
Currently, urologic curricula have no defined structure for robotic assisted surgery. In the Netherlands, residents already participate in robotic surgery, mostly in their final year of residency; however, no criteria exist for when residents are allowed to take up training or surgery. At some institutions, residents are required to complete online training courses, while at others they must reach a certain threshold in simulator-based training [56]. Combining new imaging technologies with surgical curricula might help to structure the education of future surgeons. As demonstrated in our analysis, those curricula must be designed differently for less experienced and experienced surgeons. New biomarkers can be developed to predict surgical learning curves and to give feedback to surgeons. Notably, lifelong training might become possible under those conditions, as even experienced surgeons can receive feedback from algorithms or peer surgeons despite spatial distance.
Financial toxicity has to be considered when adding a new armamentarium to the diagnostic and treatment landscape of prostate cancer patients [57]. Adding more features and specialized instruments to a robotic system requires more resources in the healthcare system and might add to the direct costs of RARP, which can already be considerable with conventional tools [58]. RARP exceeds the costs of conventional surgery by approximately EUR 2000 [59]. Further equipment such as gamma probes or preoperative imaging is required for AR- and VR-augmented approaches. Notably, none of the studies cited in this manuscript report on the actual costs of their approach. In addition, not only the direct costs of the required technology are of note: when integrating these technologies into a complex metaverse, new infrastructure is required, which might not be affordable and thereby might not be available all over the world [60]. However, some technologies might not necessarily increase financial toxicity, as virtual reality models in particular rely mostly on software and might therefore profit from low material costs and potentially improved outcomes. Future studies will have to assess the impact of new technologies on the overall costs of RARP.
Ultimately, the emerging imaging tools can contribute to a complete change in urologic surgery, as they may allow for a step-by-step development towards autonomous surgery. Currently, RARP is performed entirely under the control of a surgeon. Tracking and localization techniques as outlined in this manuscript can help to determine organ boundaries or tumor location. In a first step, this information supports the surgeon in detecting important structures or provides feedback for training; further development of those techniques might leverage this information and lead to autonomous surgery [61].
The adoption of technology might be impacted by the age of surgeons. Similar to the use of technology in urological patient cohorts [62], age plays a role in physicians' overall technology adoption, although only very high age impacts uptake significantly [63]. Still, there is a need to further investigate the adoption of these new technologies amongst surgeons beyond the feasibility shown in the analyzed studies. In addition, the importance of training and a stepwise implementation of the described imaging modalities is paramount [64].
Most of the discussed studies are strongly limited by their study design: mostly, matched-pair analyses, retrospective cohort analyses or exploratory single-arm studies were performed. Therefore, a final conclusion regarding the impact of these new technologies on patient outcomes cannot be drawn and requires further research, especially in light of cost-effectiveness, as the authors do not report on the direct and indirect costs associated with introducing these technologies. Further, a variety of different instruments and programs are used throughout the studies. Especially for AI-based applications, precise reporting of the algorithms' functions will be paramount to facilitate clear explanation [65]. Standardization and use across consecutive studies might mitigate these issues in the future.

5. Conclusions

New imaging technologies have been increasingly tested to reduce complications and improve surgical outcomes of patients undergoing RARP. Currently, the feasibility of combined approaches using preoperative imaging or intraoperatively applied radioactive or fluorescent dyes has been demonstrated while a prospective confirmation of improvements is currently ongoing. Balancing improvements in surgical outcomes against financial toxicity of new imaging technologies for RARP will be a cornerstone for a broad clinical implementation.

Author Contributions

Conceptualization: S.R. and G.E.C.; methodology: S.R.; software: S.R.; validation: S.R.; formal analysis: S.R. and M.A.K.; investigation: S.R., M.A.K. and T.W.; resources: S.R.; data curation: S.R. and M.A.K.; writing—original draft preparation, S.R., M.A.K. and T.W.; writing—review and editing: K.-F.K., I.R.B., M.T., S.P., J.G.R., A.V., P.P., E.C., C.G.S. and G.E.C.; visualization, S.R. and E.C.; supervision: S.R. and G.E.C.; project administration: S.R. and G.E.C.; funding acquisition: S.R. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Mottet, N.; van den Bergh, R.C.N.; Briers, E.; Van den Broeck, T.; Cumberbatch, M.G.; De Santis, M.; Fanti, S.; Fossati, N.; Gandaglia, G.; Gillessen, S.; et al. EAU-EANM-ESTRO-ESUR-SIOG Guidelines on Prostate Cancer-2020 Update. Part 1: Screening, Diagnosis, and Local Treatment with Curative Intent. Eur. Urol. 2021, 79, 243–262. [Google Scholar] [CrossRef]
  2. Esperto, F.; Prata, F.; Autrán-Gómez, A.M.; Rivas, J.G.; Socarras, M.; Marchioni, M.; Albisinni, S.; Cataldo, R.; Scarpa, R.M.; Papalia, R. New Technologies for Kidney Surgery Planning 3D, Impression, Augmented Reality 3D, Reconstruction: Current Realities and Expectations. Curr. Urol. Rep. 2021, 22, 35. [Google Scholar] [CrossRef]
  3. Tewari, A.; Peabody, J.O.; Fischer, M.; Sarle, R.; Vallancien, G.; Delmas, V.; Hassan, M.; Bansal, A.; Hemal, A.K.; Guillonneau, B.; et al. An Operative and Anatomic Study to Help in Nerve Sparing during Laparoscopic and Robotic Radical Prostatectomy. Eur. Urol. 2003, 43, 444–454. [Google Scholar] [CrossRef]
  4. Amparore, D.; Pecoraro, A.; Checcucci, E.; De Cillis, S.; Piramide, F.; Volpi, G.; Piana, A.; Verri, P.; Granato, S.; Sica, M.; et al. 3D imaging technologies in minimally invasive kidney and prostate cancer surgery: Which is the urologists’ perception? Minerva Urol. Nephrol. 2022, 74, 178–185. [Google Scholar] [CrossRef]
  5. Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. BMJ 2021, 372, n71. [Google Scholar] [CrossRef]
  6. Howick, J.; Glasziou, P.; Greenhalgh, T.; Heneghan, C.; Liberati, A.; Moschetti, I.; Chalmers, I.; Phillips, B.; Thornton, H. The Oxford 2011 Levels of Evidence. CEBM. Available online: https://www.cebm.ox.ac.uk/resources/levels-of-evidence/ocebm-levels-of-evidence (accessed on 1 March 2023).
  7. Wake, N.; Rosenkrantz, A.B.; Huang, R.; Park, K.U.; Wysock, J.S.; Taneja, S.S.; Huang, W.C.; Sodickson, D.K.; Chandarana, H. Patient-specific 3D printed and augmented reality kidney and prostate cancer models: Impact on patient education. 3D Print. Med. 2019, 5, 4. [Google Scholar] [CrossRef]
  8. Shirk, J.D.; Reiter, R.; Wallen, E.M.; Pak, R.; Ahlering, T.; Badani, K.K.; Porter, J.R. Effect of 3-Dimensional, Virtual Reality Models for Surgical Planning of Robotic Prostatectomy on Trifecta Outcomes: A Randomized Clinical Trial. J. Urol. 2022, 208, 618–625. [Google Scholar] [CrossRef]
  9. Martini, A.; Falagario, U.G.; Cumarasamy, S.; Jambor, I.; Wagaskar, V.G.; Ratnani, P.; Haines, K.G., III; Tewari, A.K. The Role of 3D Models Obtained from Multiparametric Prostate MRI in Performing Robotic Prostatectomy. J. Endourol. 2022, 36, 387–393. [Google Scholar] [CrossRef]
  10. Checcucci, E.; Pecoraro, A.; Amparore, D.; De Cillis, S.; Granato, S.; Volpi, G.; Sica, M.; Verri, P.; Piana, A.; Piazzolla, P.; et al. The impact of 3D models on positive surgical margins after robot-assisted radical prostatectomy. World J. Urol. 2022, 40, 2221–2229. [Google Scholar] [CrossRef]
  11. Porpiglia, F.; Fiori, C.; Checcucci, E.; Amparore, D.; Bertolo, R. Augmented Reality Robot-assisted Radical Prostatectomy: Preliminary Experience. Urology 2018, 115, 184. [Google Scholar] [CrossRef]
  12. Porpiglia, F.; Bertolo, R.; Amparore, D.; Checcucci, E.; Artibani, W.; Dasgupta, P.; Montorsi, F.; Tewari, A.; Fiori, C. Augmented reality during robot-assisted radical prostatectomy: Expert robotic surgeons’ on-the-spot insights after live surgery. Minerva Urol. Nephrol. 2018, 70, 226–229. [Google Scholar] [CrossRef] [PubMed]
  13. Samei, G.; Goksel, O.; Lobo, J.; Mohareri, O.; Black, P.; Rohling, R.; Salcudean, S. Real-Time FEM-Based Registration of 3-D to 2.5-D Transrectal Ultrasound Images. IEEE Trans. Med. Imaging 2018, 37, 1877–1886. [Google Scholar] [CrossRef] [PubMed]
  14. Kratiras, Z.; Gavazzi, A.; Belba, A.; Willis, B.; Chew, S.; Allen, C.; Amoroso, P.; Dasgupta, P. Phase I study of a new tablet-based image guided surgical system in robot-assisted radical prostatectomy. Minerva Urol. Nephrol. 2019, 71, 92–95. [Google Scholar] [CrossRef]
  15. Porpiglia, F.; Checcucci, E.; Amparore, D.; Manfredi, M.; Massa, F.; Piazzolla, P.; Manfrin, D.; Piana, A.; Tota, D.; Bollito, E.; et al. Three-dimensional Elastic Augmented-reality Robot-assisted Radical Prostatectomy Using Hyperaccuracy Three-dimensional Reconstruction Technology: A Step Further in the Identification of Capsular Involvement. Eur. Urol. 2019, 76, 505–514. [Google Scholar] [CrossRef] [PubMed]
  16. Porpiglia, F.; Checcucci, E.; Amparore, D.; Autorino, R.; Piana, A.; Bellin, A.; Piazzolla, P.; Massa, F.; Bollito, E.; Gned, D.; et al. Augmented-reality robot-assisted radical prostatectomy using hyper-accuracy three-dimensional reconstruction (HA3D™) technology: A radiological and pathological study. BJU Int. 2019, 123, 834–845. [Google Scholar] [CrossRef]
  17. Mehralivand, S.; Kolagunda, A.; Hammerich, K.; Sabarwal, V.; Harmon, S.; Sanford, T.; Gold, S.; Hale, G.; Romero, V.V.; Bloom, J.; et al. A multiparametric magnetic resonance imaging-based virtual reality surgical navigation tool for robotic-assisted radical prostatectomy. Turk. J. Urol. 2019, 45, 357–365. [Google Scholar] [CrossRef] [PubMed]
  18. Canda, A.E.; Aksoy, S.F.; Altinmakas, E.; Koseoglu, E.; Falay, O.; Kordan, Y.; Çil, B.; Balbay, M.D.; Esen, T. Virtual reality tumor navigated robotic radical prostatectomy by using three-dimensional reconstructed multiparametric prostate MRI and (68)Ga-PSMA PET/CT images: A useful tool to guide the robotic surgery? BJUI Compass 2020, 1, 108–115. [Google Scholar] [CrossRef]
  19. Samei, G.; Tsang, K.; Kesch, C.; Lobo, J.; Hor, S.; Mohareri, O.; Chang, S.; Goldenberg, S.L.; Black, P.C.; Salcudean, S. A partial augmented reality system with live ultrasound and registered preoperative MRI for guiding robot-assisted radical prostatectomy. Med. Image Anal. 2020, 60, 101588. [Google Scholar] [CrossRef]
  20. Schiavina, R.; Bianchi, L.; Lodi, S.; Cercenelli, L.; Chessa, F.; Bortolani, B.; Gaudiano, C.; Casablanca, C.; Droghetti, M.; Porreca, A.; et al. Real-time Augmented Reality Three-dimensional Guided Robotic Radical Prostatectomy: Preliminary Experience and Evaluation of the Impact on Surgical Planning. Eur. Urol. Focus 2021, 7, 1260–1267. [Google Scholar] [CrossRef]
  21. Tanzi, L.; Piazzolla, P.; Porpiglia, F.; Vezzetti, E. Real-time deep learning semantic segmentation during intra-operative surgery for 3D augmented reality assistance. Int. J. Comput. Assist. Radiol. Surg. 2021, 16, 1435–1445. [Google Scholar] [CrossRef]
  22. Padovan, E.; Marullo, G.; Tanzi, L.; Piazzolla, P.; Moos, S.; Porpiglia, F.; Vezzetti, E. A deep learning framework for real-time 3D model registration in robot-assisted laparoscopic surgery. Int. J. Med. Robot. 2022, 18, e2387. [Google Scholar] [CrossRef] [PubMed]
  23. Lopez, A.; Zlatev, D.V.; Mach, K.E.; Bui, D.; Liu, J.J.; Rouse, R.V.; Harris, T.; Leppert, J.T.; Liao, J.C. Intraoperative Optical Biopsy during Robotic Assisted Radical Prostatectomy Using Confocal Endomicroscopy. J. Urol. 2016, 195, 1110–1117. [Google Scholar] [CrossRef] [PubMed]
  24. Law, K.W.; Ajib, K.; Couture, F.; Tholomier, C.; Bondarenko, H.D.; Preisser, F.; Karakiewicz, P.I.; Zorn, K.C. Use of the AccuVein AV400 during RARP: An infrared augmented reality device to help reduce abdominal wall hematoma. Can. J. Urol. 2018, 25, 9384–9388. [Google Scholar] [PubMed]
  25. Bianchi, L.; Chessa, F.; Angiolini, A.; Cercenelli, L.; Lodi, S.; Bortolani, B.; Molinaroli, E.; Casablanca, C.; Droghetti, M.; Gaudiano, C.; et al. The Use of Augmented Reality to Guide the Intraoperative Frozen Section During Robot-assisted Radical Prostatectomy. Eur. Urol. 2021, 80, 480–488. [Google Scholar] [CrossRef] [PubMed]
  26. van der Poel, H.G.; Buckle, T.; Brouwer, O.R.; Valdés Olmos, R.A.; van Leeuwen, F.W. Intraoperative laparoscopic fluorescence guidance to the sentinel lymph node in prostate cancer patients: Clinical proof of concept of an integrated functional imaging approach using a multimodal tracer. Eur. Urol. 2011, 60, 826–833. [Google Scholar] [CrossRef] [PubMed]
  27. KleinJan, G.H.; van den Berg, N.S.; de Jong, J.; Wit, E.M.; Thygessen, H.; Vegt, E.; van der Poel, H.G.; van Leeuwen, F.W. Multimodal hybrid imaging agents for sentinel node mapping as a means to (re)connect nuclear medicine to advances made in robot-assisted surgery. Eur. J. Nucl. Med. Mol. Imaging 2016, 43, 1278–1287. [Google Scholar] [CrossRef] [PubMed]
  28. van den Berg, N.S.; Buckle, T.; KleinJan, G.H.; van der Poel, H.G.; van Leeuwen, F.W.B. Multispectral Fluorescence Imaging During Robot-assisted Laparoscopic Sentinel Node Biopsy: A First Step Towards a Fluorescence-based Anatomic Roadmap. Eur. Urol. 2017, 72, 110–117. [Google Scholar] [CrossRef]
  29. de Korne, C.M.; Wit, E.M.; de Jong, J.; Valdés Olmos, R.A.; Buckle, T.; van Leeuwen, F.W.B.; van der Poel, H.G. Anatomical localization of radiocolloid tracer deposition affects outcome of sentinel node procedures in prostate cancer. Eur. J. Nucl. Med. Mol. Imaging 2019, 46, 2558–2568. [Google Scholar] [CrossRef]
  30. Hinsenveld, F.J.; Wit, E.M.K.; van Leeuwen, P.J.; Brouwer, O.R.; Donswijk, M.L.; Tillier, C.N.; Vegt, E.; van Muilekom, E.; van Oosterom, M.N.; van Leeuwen, F.W.B.; et al. Prostate-Specific Membrane Antigen PET/CT Combined with Sentinel Node Biopsy for Primary Lymph Node Staging in Prostate Cancer. J. Nucl. Med. 2020, 61, 540–545. [Google Scholar] [CrossRef]
  31. Collamati, F.; van Oosterom, M.N.; De Simoni, M.; Faccini, R.; Fischetti, M.; Mancini Terracciano, C.; Mirabelli, R.; Moretti, R.; Heuvel, J.O.; Solfaroli Camillocci, E.; et al. A DROP-IN beta probe for robot-assisted (68)Ga-PSMA radioguided surgery: First ex vivo technology evaluation using prostate cancer specimens. EJNMMI Res. 2020, 10, 92. [Google Scholar] [CrossRef]
  32. Mazzone, E.; Dell’Oglio, P.; Grivas, N.; Wit, E.; Donswijk, M.; Briganti, A.; Leeuwen, F.V.; Poel, H.V. Diagnostic Value, Oncologic Outcomes, and Safety Profile of Image-Guided Surgery Technologies During Robot-Assisted Lymph Node Dissection with Sentinel Node Biopsy for Prostate Cancer. J. Nucl. Med. 2021, 62, 1363–1371. [Google Scholar] [CrossRef] [PubMed]
  33. Wit, E.M.K.; van Beurden, F.; Kleinjan, G.H.; Grivas, N.; de Korne, C.M.; Buckle, T.; Donswijk, M.L.; Bekers, E.M.; van Leeuwen, F.W.B.; van der Poel, H.G. The impact of drainage pathways on the detection of nodal metastases in prostate cancer: A phase II randomized comparison of intratumoral vs intraprostatic tracer injection for sentinel node detection. Eur. J. Nucl. Med. Mol. Imaging 2022, 49, 1743–1753. [Google Scholar] [CrossRef] [PubMed]
  34. Gondoputro, W.; Scheltema, M.J.; Blazevski, A.; Doan, P.; Thompson, J.E.; Amin, A.; Geboers, B.; Agrawal, S.; Siriwardana, A.; Van Leeuwen, P.J.; et al. Robot-Assisted Prostate-Specific Membrane Antigen-Radioguided Surgery in Primary Diagnosed Prostate Cancer. J. Nucl. Med. 2022, 63, 1659–1664. [Google Scholar] [CrossRef] [PubMed]
  35. Dell’Oglio, P.; Meershoek, P.; Maurer, T.; Wit, E.M.K.; van Leeuwen, P.J.; van der Poel, H.G.; van Leeuwen, F.W.B.; van Oosterom, M.N. A DROP-IN Gamma Probe for Robot-assisted Radioguided Surgery of Lymph Nodes During Radical Prostatectomy. Eur. Urol. 2021, 79, 124–132. [Google Scholar] [CrossRef] [PubMed]
  36. Gandaglia, G.; Mazzone, E.; Stabile, A.; Pellegrino, A.; Cucchiara, V.; Barletta, F.; Scuderi, S.; Robesti, D.; Leni, R.; Samanes Gajate, A.M.; et al. Prostate-specific membrane antigen Radioguided Surgery to Detect Nodal Metastases in Primary Prostate Cancer Patients Undergoing Robot-assisted Radical Prostatectomy and Extended Pelvic Lymph Node Dissection: Results of a Planned Interim Analysis of a Prospective Phase 2 Study. Eur. Urol. 2022, 82, 411–418. [Google Scholar] [CrossRef] [PubMed]
  37. Özkan, A.; Köseoğlu, E.; Canda, A.E.; Çil, B.E.; Aykanat, C.İ.; Sarıkaya, A.F.; Tarım, K.; Armutlu, A.; Kulaç, İ.; Barçın, E.; et al. Fluorescence-guided extended pelvic lymphadenectomy during robotic radical prostatectomy. J. Robot. Surg. 2022, 17, 885–890. [Google Scholar] [CrossRef] [PubMed]
  38. Hung, A.J.; Zehnder, P.; Patil, M.B.; Cai, J.; Ng, C.K.; Aron, M.; Gill, I.S.; Desai, M.M. Face, content and construct validity of a novel robotic surgery simulator. J. Urol. 2011, 186, 1019–1024. [Google Scholar] [CrossRef]
  39. Aghazadeh, M.A.; Mercado, M.A.; Pan, M.M.; Miles, B.J.; Goh, A.C. Performance of robotic simulated skills tasks is positively associated with clinical robotic surgical performance. BJU Int. 2016, 118, 475–481. [Google Scholar] [CrossRef]
  40. Hoogenes, J.; Wong, N.; Al-Harbi, B.; Kim, K.S.; Vij, S.; Bolognone, E.; Quantz, M.; Guo, Y.; Shayegan, B.; Matsumoto, E.D. A Randomized Comparison of 2 Robotic Virtual Reality Simulators and Evaluation of Trainees’ Skills Transfer to a Simulated Robotic Urethrovesical Anastomosis Task. Urology 2018, 111, 110–115. [Google Scholar] [CrossRef]
  41. Harrison, P.; Raison, N.; Abe, T.; Watkinson, W.; Dar, F.; Challacombe, B.; Van Der Poel, H.; Khan, M.S.; Dasgupa, P.; Ahmed, K. The Validation of a Novel Robot-Assisted Radical Prostatectomy Virtual Reality Module. J. Surg. Educ. 2018, 75, 758–766. [Google Scholar] [CrossRef]
  42. Shim, J.S.; Kim, J.Y.; Pyun, J.H.; Cho, S.; Oh, M.M.; Kang, S.H.; Lee, J.G.; Kim, J.J.; Cheon, J.; Kang, S.G. Comparison of effective teaching methods to achieve skill acquisition using a robotic virtual reality simulator: Expert proctoring versus an educational video versus independent training. Medicine 2018, 97, e13569. [Google Scholar] [CrossRef] [PubMed]
  43. Shim, J.S.; Noh, T.I.; Kim, J.Y.; Pyun, J.H.; Cho, S.; Oh, M.M.; Kang, S.H.; Cheon, J.; Lee, J.G.; Kim, J.J.; et al. Predictive Validation of a Robotic Virtual Reality Simulator: The Tube 3 module for Practicing Vesicourethral Anastomosis in Robot-Assisted Radical Prostatectomy. Urology 2018, 122, 32–36. [Google Scholar] [CrossRef] [PubMed]
  44. Almarzouq, A.; Hu, J.; Noureldin, Y.A.; Yin, A.; Anidjar, M.; Bladou, F.; Tanguay, S.; Kassouf, W.; Aprikian, A.G.; Andonian, S. Are basic robotic surgical skills transferable from the simulator to the operating room? A randomized, prospective, educational study. Can. Urol. Assoc. J. 2020, 14, 416–422. [Google Scholar] [CrossRef] [PubMed]
  45. Wang, F.; Zhang, C.; Guo, F.; Sheng, X.; Ji, J.; Xu, Y.; Cao, Z.; Lyu, J.; Lu, X.; Yang, B. The application of virtual reality training for anastomosis during robot-assisted radical prostatectomy. Asian J. Urol. 2021, 8, 204–208. [Google Scholar] [CrossRef] [PubMed]
  46. Olsen, R.G.; Bjerrum, F.; Konge, L.; Jepsen, J.V.; Azawi, N.H.; Bube, S.H. Validation of a Novel Simulation-Based Test in Robot-Assisted Radical Prostatectomy. J. Endourol. 2021, 35, 1265–1272. [Google Scholar] [CrossRef] [PubMed]
  47. Ebbing, J.; Wiklund, P.N.; Akre, O.; Carlsson, S.; Olsson, M.J.; Höijer, J.; Heimer, M.; Collins, J.W. Development and validation of non-guided bladder-neck and neurovascular-bundle dissection modules of the RobotiX-Mentor® full-procedure robotic-assisted radical prostatectomy virtual reality simulation. Int. J. Med. Robot. 2021, 17, e2195. [Google Scholar] [CrossRef] [PubMed]
  48. Papalois, Z.A.; Aydın, A.; Khan, A.; Mazaris, E.; Rathnasamy Muthusamy, A.S.; Dor, F.; Dasgupta, P.; Ahmed, K. HoloMentor: A Novel Mixed Reality Surgical Anatomy Curriculum for Robot-Assisted Radical Prostatectomy. Eur. Surg. Res. 2022, 63, 40–45. [Google Scholar] [CrossRef]
  49. Sanford, D.I.; Ma, R.; Ghoreifi, A.; Haque, T.F.; Nguyen, J.H.; Hung, A.J. Association of Suturing Technical Skill Assessment Scores between Virtual Reality Simulation and Live Surgery. J. Endourol. 2022, 36, 1388–1394. [Google Scholar] [CrossRef]
  50. van der Leun, J.A.; Siem, G.; Meijer, R.P.; Brinkman, W.M. Improving Robotic Skills by Video Review. J. Endourol. 2022, 36, 1126–1135. [Google Scholar] [CrossRef]
  51. Noël, J.; Moschovas, M.C.; Patel, E.; Rogers, T.; Marquinez, J.; Rocco, B.; Mottrie, A.; Patel, V. Step-by-step optimisation of robotic-assisted radical prostatectomy using augmented reality. Int. Braz. J. Urol. 2022, 48, 600–601. [Google Scholar] [CrossRef]
  52. Cheikh Youssef, S.; Hachach-Haram, N.; Aydin, A.; Shah, T.T.; Sapre, N.; Nair, R.; Rai, S.; Dasgupta, P. Video labelling robot-assisted radical prostatectomy and the role of artificial intelligence (AI): Training a novice. J. Robot. Surg. 2022, 17, 695–701. [Google Scholar] [CrossRef] [PubMed]
  53. Hung, A.J.; Chen, J.; Jarc, A.; Hatcher, D.; Djaladat, H.; Gill, I.S. Development and Validation of Objective Performance Metrics for Robot-Assisted Radical Prostatectomy: A Pilot Study. J. Urol. 2018, 199, 296–304. [Google Scholar] [CrossRef] [PubMed]
  54. Wit, E.M.K.; Acar, C.; Grivas, N.; Yuan, C.; Horenblas, S.; Liedberg, F.; Valdes Olmos, R.A.; van Leeuwen, F.W.B.; van den Berg, N.S.; Winter, A.; et al. Sentinel Node Procedure in Prostate Cancer: A Systematic Review to Assess Diagnostic Accuracy. Eur. Urol. 2017, 71, 596–605. [Google Scholar] [CrossRef] [PubMed]
  55. Bravi, C.A.; Paciotti, M.; Sarchi, L.; Mottaran, A.; Nocera, L.; Farinha, R.; De Backer, P.; Vinckier, M.-H.; De Naeyer, G.; D’Hondt, F.; et al. Robot-assisted Radical Prostatectomy with the Novel Hugo Robotic System: Initial Experience and Optimal Surgical Set-up at a Tertiary Referral Robotic Center. Eur. Urol. 2022, 82, 233–237. [Google Scholar] [CrossRef] [PubMed]
  56. Beulens, A.J.W.; Vaartjes, L.; Tilli, S.; Brinkman, W.M.; Umari, P.; Puliatti, S.; Koldewijn, E.L.; Hendrikx, A.J.M.; van Basten, J.P.; van Merriënboer, J.J.G.; et al. Structured robot-assisted surgery training curriculum for residents in Urology and impact on future surgical activity. J. Robot. Surg. 2021, 15, 497–510. [Google Scholar] [CrossRef] [PubMed]
  57. Imber, B.S.; Varghese, M.; Ehdaie, B.; Gorovets, D. Financial toxicity associated with treatment of localized prostate cancer. Nat. Rev. Urol. 2020, 17, 28–40. [Google Scholar] [CrossRef] [PubMed]
  58. Özman, O.; Tillier, C.N.; van Muilekom, E.; van de Poll-Franse, L.V.; van der Poel, H.G. Financial Toxicity After Robot-Assisted Radical Prostatectomy and Its Relation with Oncologic, Functional Outcomes. J. Urol. 2022, 208, 978–986. [Google Scholar] [CrossRef]
  59. Bolenz, C.; Gupta, A.; Hotze, T.; Ho, R.; Cadeddu, J.A.; Roehrborn, C.G.; Lotan, Y. Cost comparison of robotic, laparoscopic, and open radical prostatectomy for prostate cancer. Eur. Urol. 2010, 57, 453–458. [Google Scholar] [CrossRef]
  60. Checcucci, E.; Verri, P.; Amparore, D.; Cacciamani, G.E.; Rivas, J.G.; Autorino, R.; Mottrie, A.; Breda, A.; Porpiglia, F. The future of robotic surgery in urology: From augmented reality to the advent of metaverse. Ther. Adv. Urol. 2023, 15, 17562872231151853. [Google Scholar] [CrossRef]
  61. Andras, I.; Mazzone, E.; van Leeuwen, F.W.B.; De Naeyer, G.; van Oosterom, M.N.; Beato, S.; Buckle, T.; O’Sullivan, S.; van Leeuwen, P.J.; Beulens, A.; et al. Artificial intelligence and robotics: A combination that is changing the operating room. World J. Urol. 2020, 38, 2359–2366. [Google Scholar] [CrossRef]
  62. Rodler, S.; Buchner, A.; Stief, C.G.; Heinemann, V.; Staehler, M.; Casuscelli, J. Patients’ Perspective on Digital Technologies in Advanced Genitourinary Cancers. Clin. Genitourin. Cancer 2021, 19, 76–82.e76. [Google Scholar] [CrossRef]
  63. Zachrison, K.S.; Yan, Z.; Samuels-Kalow, M.E.; Licurse, A.; Zuccotti, G.; Schwamm, L.H. Association of Physician Characteristics with Early Adoption of Virtual Health Care. JAMA Netw. Open 2021, 4, e2141625. [Google Scholar] [CrossRef]
  64. Gandaglia, G.; Schatteman, P.; De Naeyer, G.; D’Hondt, F.; Mottrie, A. Novel Technologies in Urologic Surgery: A Rapidly Changing Scenario. Curr. Urol. Rep. 2016, 17, 19. [Google Scholar] [CrossRef]
  65. Cacciamani, G.E.; Chu, T.N.; Sanford, D.I.; Abreu, A.; Duddalwar, V.; Oberai, A.; Kuo, C.C.J.; Liu, X.; Denniston, A.K.; Vasey, B.; et al. PRISMA AI reporting guidelines for systematic reviews and meta-analyses on AI in healthcare. Nat. Med. 2023, 29, 14–15. [Google Scholar] [CrossRef]
Figure 1. PRISMA flowchart.
Figure 2. Augmented reality application for RARP. (A) 3D model of the prostate with an intraprostatic lesion. (C) 3D model of the prostate with a lesion with capsular contact (bright green). (B,D) AI 3D-AR RARP: the 3D virtual model of the prostate was automatically overlaid onto the in vivo anatomy by the AI; a 3D-guided selective biopsy was then performed (images courtesy of Prof. Porpiglia).
Table 1. Evidence synthesis.
| Author [Ref.] | Year | Study Design | LoE [6] | Organ | Area of Application | Imaging Modality |
|---|---|---|---|---|---|---|
| Wake et al. [7] | 2019 | Prospective | IV | Prostate | Preoperative planning | MRI—virtual reality, 3D models |
| Shirk et al. [8] | 2022 | Prospective | II | Prostate | Preoperative planning | MRI—virtual reality |
| Martini et al. [9] | 2022 | Retrospective | IV | Prostate | Preoperative planning | MRI—3D models |
| Checcucci et al. [10] | 2022 | Retrospective | IV | Prostate | Preoperative planning | MRI—3D models |
| Porpiglia et al. [11] | 2018 | Prospective | IV | Prostate | Visualization of PT | MRI—console |
| Porpiglia et al. [12] | 2018 | Prospective | III | Prostate | Visualization of PT | MRI—console |
| Samei et al. [13] | 2018 | Prospective | IV | Prostate | Visualization of PT | Ultrasound |
| Kratiras et al. [14] | 2019 | Prospective | IV | Prostate | Visualization of PT | MRI—tablet |
| Porpiglia et al. [15] | 2019 | Prospective | III | Prostate | Visualization of PT | MRI—console |
| Porpiglia et al. [16] | 2019 | Prospective | III | Prostate | Visualization of PT | MRI—console |
| Mehralivand et al. [17] | 2019 | Prospective | IV | Prostate | Visualization of PT | MRI—separate display |
| Canda et al. [18] | 2020 | Prospective | IV | Prostate | Visualization of PT | MRI/PSMA-PET—console |
| Samei et al. [19] | 2020 | Prospective | IV | Prostate | Visualization of PT | MRI/ultrasound—console |
| Schiavina et al. [20] | 2021 | Prospective | IV | Prostate | Visualization of PT | MRI—console |
| Tanzi et al. [21] | 2021 | Retrospective | IV | Prostate | Visualization of PT | Console |
| Padovan et al. [22] | 2022 | Retrospective | IV | Prostate | Visualization of PT | Console |
| Lopez et al. [23] | 2016 | Prospective | IV | Prostate | Intraoperative diagnostics | Confocal |
| Law et al. [24] | 2018 | Retrospective | IV | Abdominal wall | Intraoperative diagnostics | Infrared |
| Bianchi et al. [25] | 2021 | Prospective | III | Prostate | Intraoperative diagnostics | Augmented reality—console |
| van der Poel et al. [26] | 2011 | Prospective | III | LN | Intraoperative detection of LN | Fluorescent camera |
| KleinJan et al. [27] | 2016 | Prospective | IV | LN | Intraoperative detection of LN | Fluorescent camera |
| van den Berg et al. [28] | 2017 | Prospective | IV | LN | Intraoperative detection of LN | Fluorescent camera |
| de Korne et al. [29] | 2019 | Retrospective | III | LN | Intraoperative detection of LN | Fluorescent camera/gamma probe |
| Hinsenveld et al. [30] | 2020 | Retrospective | III | LN | Intraoperative detection of LN | Fluorescent camera |
| Collamati et al. [31] | 2020 | Prospective | IV | LN | Intraoperative detection of LN | DROP-IN beta particle detector |
| Mazzone et al. [32] | 2021 | Retrospective | III | LN | Intraoperative detection of LN | Fluorescent camera |
| Wit et al. [33] | 2022 | Prospective | II | LN | Intraoperative detection of LN | Fluorescent camera |
| Gondoputro et al. [34] | 2022 | Prospective | IV | LN | Intraoperative detection of LN | DROP-IN gamma detector |
| Dell'Oglio et al. [35] | 2021 | Prospective | IV | LN | Intraoperative detection of LN | DROP-IN gamma probe, laparoscopic gamma probe, fluorescent camera |
| Gandaglia et al. [36] | 2022 | Prospective | III | LN | Intraoperative detection of LN | DROP-IN gamma probe |
| Özkan et al. [37] | 2022 | Retrospective | IV | LN | Intraoperative detection of LN | Fluorescent camera |
| Hung et al. [38] | 2011 | Prospective | IV | Prostate | Education/training | Simulator—basic skills |
| Aghazadeh et al. [39] | 2016 | Prospective | IV | Prostate | Education/training | Simulator—clinical skills |
| Hoogenes et al. [40] | 2018 | Prospective | III | Prostate | Education/training | Simulator |
| Harrison et al. [41] | 2018 | Prospective | III | Prostate | Education/training | Simulator—clinical skills |
| Shim et al. [42] | 2018 | Prospective | IV | Prostate | Education/training | Video instruction vs. guided training |
| Shim et al. [43] | 2018 | Prospective | IV | Prostate | Education/training | Simulator |
| Almarzouq et al. [44] | 2020 | Prospective | III | Prostate | Education/training | Simulator |
| Wang et al. [45] | 2021 | Prospective | III | Prostate | Education/training | Simulator |
| Olsen et al. [46] | 2021 | Prospective | III | Prostate | Education/training | Simulator |
| Ebbing et al. [47] | 2021 | Prospective | IV | Prostate | Education/training | Full-procedure simulator |
| Papalois et al. [48] | 2022 | Prospective | IV | Prostate | Education/training | Mixed reality/VR glasses |
| Sanford et al. [49] | 2022 | Prospective | IV | Prostate | Education/training | VR simulator |
| van der Leun et al. [50] | 2022 | Prospective | III | Prostate | Feedback | Simulator—video |
| Noël et al. [51] | 2022 | Prospective | IV | Prostate | Feedback | Remote teaching |
| Cheikh Youssef et al. [52] | 2022 | Retrospective | IV | Prostate | Feedback | Video labeling |

Abbreviations: LoE: level of evidence; LN: lymph node; VR: virtual reality.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
