Review

Augmented Reality: Mapping Methods and Tools for Enhancing the Human Role in Healthcare HMI

Department of Management, Production and Design, Politecnico di Torino, C.so Duca degli Abruzzi, 24, 10129 Torino, Italy
* Author to whom correspondence should be addressed.
Appl. Sci. 2022, 12(9), 4295; https://doi.org/10.3390/app12094295
Submission received: 25 March 2022 / Revised: 20 April 2022 / Accepted: 22 April 2022 / Published: 24 April 2022
(This article belongs to the Topic Human–Machine Interaction)

Abstract

Background: Augmented Reality (AR) is an innovative technology for improving data visualization and strengthening human perception. Among Human–Machine Interaction (HMI) domains, medicine stands to benefit most from the adoption of these digital technologies. From this perspective, the literature on AR-based orthopedic surgery techniques was evaluated, focusing on identifying the limitations and challenges of AR-based healthcare applications in order to support further research and development. Methods: Studies published from January 2018 to December 2021 were analyzed after a comprehensive search of the PubMed, Google Scholar, Scopus, IEEE Xplore, Science Direct, and Wiley Online Library databases. To improve the review reporting, the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) guidelines were followed. Results: The authors selected sixty-two articles meeting the inclusion criteria, which were categorized according to the purpose of the study (intraoperative, training, rehabilitation) and the surgical procedure involved. Conclusions: AR has the potential to improve orthopedic training and practice by providing an increasingly human-centered clinical approach. This review can direct further research toward problems related to hardware limitations, the lack of accurate registration and tracking systems, and the absence of security protocols.

1. Introduction

The human ability to control machines through intuitive and natural behaviors has been fostered by ever more user-friendly interfaces, including gesture-based systems [1]. Innovative solutions such as Augmented Reality (AR) and Virtual Reality (VR) play a key role in this respect, adapting interfaces to individual users and scenarios through the design of virtual environments driven by biofeedback such as human posture, motion [1], and emotion [2,3].
AR has been defined as “the concept of digitally superimposing virtual objects on physical objects in real space so that individuals can interact with both at the same time” [4]. As perceived by humans, AR is an innovative reality in which the perception of the real world is enriched and enhanced by new virtual sensory impressions, in such a way that they coexist with the real world [5]. This augmented information is provided by sensory stimuli, either visual or auditory, entirely generated by a computer and then transmitted to the user through specific tools. Among the most widely used are Head-Mounted Displays (HMDs), helmets equipped with a display capable of showing holograms over a real-time video stream in the user's field of view, such as smart glasses, or of showing a completely artificial 3D-generated world, such as VR headsets [6]. In this context, registration and tracking are essential operations for integrating virtual and real information. Registration is the ability to overlay the 3D model on the real surface of the acquired object (in the medical field, for instance, a surgical tool or an anatomical structure); tracking refers to anchoring the 3D model to the actual object and updating its position over time.
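In practice, marker-based registration often reduces to estimating a rigid transform from a small set of corresponding points, and tracking to re-estimating that transform frame by frame. The following minimal sketch (our illustration, not a method drawn from any reviewed study) shows the classic SVD-based Kabsch solution that underlies many such pipelines:

```python
import numpy as np

def register_rigid(model_pts, world_pts):
    """Estimate the rigid transform (R, t) mapping model points onto their
    measured world-space counterparts (Kabsch method).

    model_pts, world_pts: (N, 3) arrays of corresponding 3D points, e.g.
    fiducial markers on a surgical tool or anatomical landmarks.
    """
    mu_m, mu_w = model_pts.mean(axis=0), world_pts.mean(axis=0)
    H = (model_pts - mu_m).T @ (world_pts - mu_w)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_w - R @ mu_m
    return R, t

# Tracking then amounts to re-running this fit on every frame from the
# latest marker detections and re-posing the virtual model accordingly.
```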
The coexistence of virtual perceptual stimuli and the real world is the substantial difference that distinguishes AR from VR, in which, instead, desktop computers, CAVE systems, or head-mounted displays generate the entire experience, and the individual is fully embedded in it [7]. The term Mixed Reality (MR) was first used by Milgram et al. [8], who defined MR as a superset of AR and VR, in which virtual and real-world information are fused together and displayed simultaneously in a single display within the Reality–Virtuality continuum. However, with the introduction of a new class of wearable displays, its meaning has changed over time, to the point that MR is sometimes presented as a more sophisticated iteration of AR [9].
Nowadays, AR, VR, and MR technologies, also named “Extended Reality” technologies [10], are becoming increasingly common, and AR in particular is considered one of the leading advanced technologies. AR technology originated in the military field [5], where it was mostly used for training in combat situations, preparing soldiers for conditions that cannot be replicated outside actual combat; it was later used extensively in commercial gaming and entertainment [11]. The substantial interest in AR is also evidenced by the explosion in the number of scientific publications across research fields. In fact, the possibility of creating an ideal form of HMI is getting closer thanks to the rapid growth of AR technologies [12], which provide more natural and efficient ways for a user to interact with a real or virtual environment.
Papers show how AR has improved its efficacy and affected a significant number of application fields, such as design and planning [13], industrial implementation, education and training [14], healthcare management [15], and emergency services [16]. In particular, there is an exponential increase in scientific studies concerning AR applications in the area of orthopedics, on which this systematic review focuses.
Application usability is of paramount importance in sensitive fields such as medicine and healthcare and is central to the development of user-centered design systems [17]. AR represents a valuable solution for improving the transfer and use of information during surgery [18]. Orthopedic surgery is characterized by the frequent use of numerical and geometric information, as well as visual data such as pre- and intra-operatively acquired medical images. Additionally, typical mechanical orthopedic procedures, such as the insertion of screws, wires, or implants, and the correction of deformities, require precise knowledge of the positioning of insertion points in relation to the patient's anatomy [18]. Thus, AR applications seem particularly useful in supporting surgeons in precise, planned surgical procedures, enhancing their skills through intuitive augmentation of medical information. Because of the large number of AR studies related to medicine, some reviews focusing on specific aspects of the research have already been carried out in past years; for instance, Sakai et al. [19] investigated AR, VR, and MR solutions in spinal surgery, confirming the benefits introduced in minimally invasive interventions; Longo et al. [20] analyzed AR, VR, and Artificial Intelligence (AI) as support for preoperative diagnosis; and Ara et al. [21] surveyed AR technologies from the perspective of data-related security issues, proposing a solution to mitigate this kind of risk.
This review aims to investigate the impact of AR in the field of orthopedics, analyzing the wide range of AR-based orthopedic tools and methodologies developed in the last few years, involving different anatomical districts and referring to different steps of surgical procedures. Its core focus is identifying the limitations and challenges of AR-based healthcare applications, supporting further research and development by gathering the known issues encountered by the scientific community up to now.
The present article is structured as follows: Section 2 describes the methods used to select and organize the scientific papers relevant to the purpose of the review. Section 3 gives an overview of the selected studies. Section 4 addresses a wide range of challenges and open problems in the world of AR applications for orthopedics, and finally, Section 5 draws the conclusions.

2. Materials and Methods

2.1. Literature Search

The purpose of this review was to identify the challenges and issues found by previous research on AR-based tools applied to orthopedic surgery procedures.
To find the published research articles, a comprehensive search on the following electronic databases was performed: PubMed, Google Scholar, Scopus, IEEE Xplore, Science Direct, and Wiley Online Library. Two reviewers conducted the search in online digital libraries in January 2022.
For our work, only English-language articles published between January 2018 and December 2021 inclusive were considered. The choice of limiting the time span to the last four years stems from the desire to identify challenges and open questions regarding the state of the art of AR-based technologies in orthopedics, thus excluding solutions that rely on tools that are now outdated, or whose technical issues have already been solved.
Article searching was performed using the following keywords: Orthopedic; Orthopaedic; Orthopedics; Orthopaedics; Orthopedic Surgery; Augmented Reality; Mixed Reality; Extended Reality; Extended Medicine. All search terms were used both in isolation and in combination.
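As an illustration only (the exact strings submitted to each database were not reported), the combined terms correspond to a Boolean query of roughly the following shape:

```python
# Hypothetical reconstruction of the combined search string; real query
# syntax differs slightly between the databases listed above.
domain = ["Orthopedic", "Orthopaedic", "Orthopedics", "Orthopaedics",
          "Orthopedic Surgery"]
technology = ["Augmented Reality", "Mixed Reality", "Extended Reality",
              "Extended Medicine"]

def quoted(terms):
    return " OR ".join(f'"{t}"' for t in terms)

query = f"({quoted(domain)}) AND ({quoted(technology)})"
print(query)
# ("Orthopedic" OR ... OR "Orthopedic Surgery") AND ("Augmented Reality" OR ...)
```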
A PICOS approach [22] (Patient (P), Intervention (I), Comparison (C), Outcome (O), and Study design (S)) was used to formulate our research questions.
According to the aim of our review, only articles describing orthopedic surgical procedures (P) assisted by AR (I) were selected and subsequently analyzed in terms of AR utilization, in order to identify the most relevant challenges for the scientific community (O). To this end, randomized and non-randomized controlled studies, such as prospective, retrospective, cross-sectional, observational, case-series, pilot, and case-control studies, were included (S).
Our research questions were focused on identifying the most relevant aspects for our purpose:
RQ1: In which orthopedic surgical procedures has the use of AR technology been most studied?
RQ2: Which hardware equipment and visual tools have been most used in AR-supported orthopedic procedures?
RQ3: Which registration and tracking methods have been most used in the development of AR applications for AR-supported orthopedic procedures?

2.2. Selection Process

The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines were followed for the current review [23]. Inclusion and exclusion criteria were defined to select, from the chosen databases, the documents relevant to our work.
This first list shows the inclusion criteria:
  • Articles written in the English language;
  • AR was used in orthopedic surgery, training, and rehabilitation;
  • Studies that received a minimum of ten citations if published in 2018, 2019, or 2020 (this criterion was not applied to studies published in 2021);
  • Studies with the greatest impact on the scientific community. The authors chose the journal quartile metric to identify suitable studies; only works published in Q1 and Q2 journals were selected. This choice was made to identify papers whose reliability is recognized at the international level.
The second list below highlights the exclusion criteria:
  • Review papers, theses, book chapters, or oral presentations;
  • Non-English papers;
  • Articles without full-text available;
  • Articles in which the AR application field is different from orthopedic surgery.
Articles were screened for eligibility first by title and abstract, and then by a subsequent full-text review. Finally, the studies were further subgrouped according to the subject area in which each study was applied. The selection process is schematized in Figure 1.
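Taken together, the criteria above behave like a single filter applied to every candidate record; the sketch below illustrates this logic (the record structure and field names are hypothetical):

```python
def include(article):
    """Apply the review's inclusion/exclusion rules to one candidate record.

    `article` is assumed to look like:
    {"language": "en", "year": 2019, "citations": 14, "type": "article",
     "journal_quartile": "Q1", "field": "orthopedic surgery",
     "full_text": True}
    """
    if article["language"] != "en" or not article["full_text"]:
        return False
    if article["type"] in {"review", "thesis", "book chapter", "oral"}:
        return False
    if article["field"] != "orthopedic surgery":
        return False
    # The ten-citation threshold applies only to 2018-2020 publications.
    if article["year"] < 2021 and article["citations"] < 10:
        return False
    # Only works published in Q1/Q2 journals were retained.
    return article["journal_quartile"] in {"Q1", "Q2"}
```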

3. Results

Sixty-two studies were considered significant to this review. The collected data were synthesized using Microsoft Excel, and qualitative characteristics were reported. Because of the heterogeneity of the selected studies, quantitative characteristics were not described. Table 1 contains basic information on the included studies (author, year of publication, and reference number), as well as the visualization tools used, the anatomical parts involved, and the type of application to which each study relates.
The results from Table 1 are summarized by the two diagrams in Figure 2.
Table 1 shows that most of the reviewed studies concern AR-based applications used in the intraoperative phase of orthopedic surgery. In particular, most of them are related to procedures involving the spine. In fact, the anatomical shape of the spine and the proximity of the spinal bones to the spinal cord make the visualization of hidden anatomical structures and knowledge of the spatial relationships between them of primary importance. Next in number are studies on the use of AR in hip surgery, especially for the total hip arthroplasty (THA) procedure [86], which is one of the most widely performed orthopedic procedures worldwide.
As far as visualization tools are concerned, HMDs are the most widely used devices. They are an ergonomic and reliable solution, suitable for many demanding manual tasks such as image-guided surgery, where the absence of physical constraints, preservation of reality, and safety are crucial. Next come 2D displays such as desktops, smartphones, and tablets, and, only to a lesser extent, other devices such as Head-Up Displays (HUDs) and 3D displays.
The analyzed studies were subdivided according to their scope of application into physician and resident training (Section 3.1), patient rehabilitation (Section 3.2), and intraoperative use (Section 3.3). Studies related to intraoperative applications were further grouped according to the type of surgical procedure to which they refer.
Table 2 describes, for each application purpose, the main features of the AR-based application and the expected outcomes from its use.

3.1. Training

The training of new surgeons is a slow but essential process through which they gain the experience necessary to manage surgical procedures without jeopardizing patient safety. In this context, it is critical to develop surgical simulation platforms to improve outcomes and reduce the likelihood of potentially serious risks. AR can train new surgeons by providing guidance and optimizing the visualization of hidden hard and soft tissues while performing surgery.
In their study on AR-based applications for STEM subject learning, Mystakidis et al. [87] identified five instructional models, which were distinguished based on the instructional approach applied to subjects’ learning: experiential, activity-based, presentation, cooperative/collaborative, and discovery instructional strategies. According to this distinction, five of the studies reviewed [27,68,77,84,85] applied the experiential instructional design strategy, in which the subject takes advantage of his or her prior experience and applies it in a simulated environment, while only one study [42] followed the activity-based instructional design strategy, in which subjects were guided by an expert in carrying out the training activity.
Van Duren et al. [85] developed a proof-of-concept digital fluoroscopic imaging simulator for guidewire insertion in dynamic hip screw (DHS) surgery. With AR they simulated fluoroscopy, allowing the trainee to interact with real instrumentation and perform the procedure on workshop bone models. The developed device uses a visual tracking system consisting of two video cameras and image processing algorithms to superimpose the position of the guidewire on the corresponding fluoroscopic images, by tracking colored markers attached to the guidewire. To assess the accuracy of the guidewire tracking system, the tip–apex distance (TAD) was calculated and compared with that physically measured, obtaining an average root mean square error of 4.2 mm.
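For reference, the reported figure is the usual root mean square error between system-derived and physically measured distances; a minimal sketch (with illustrative values, not the study's data):

```python
import numpy as np

def rms_error(tracked_tad_mm, measured_tad_mm):
    """Root mean square error between the system-derived and physically
    measured tip-apex distances (TAD), both in millimetres."""
    tracked = np.asarray(tracked_tad_mm, dtype=float)
    measured = np.asarray(measured_tad_mm, dtype=float)
    return float(np.sqrt(np.mean((tracked - measured) ** 2)))

print(rms_error([24.1, 30.5, 27.8], [21.0, 26.9, 31.2]))  # error in mm
```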
The work of Logishetty et al. [68] involved the use of an HMD and an innovative AR platform. Using a MicronTracker camera (ClaroNav, Toronto, ON, Canada), the system can track the implant position and orientation in relation to the bony anatomy. During simulated THA, a clinically relevant level of accuracy was obtained. The platform allowed the use of real instruments and provided live feedback. No difference in accuracy was detected between learners trained with AR and those trained by the surgeon, showing that AR is a feasible and valuable training tool alongside the traditional guidance of more experienced surgeons in the operating room.
Condino et al. [77] investigated the feasibility of a patient-specific hybrid simulator for THA surgery, intended to train resident surgeons and trainees in a safe and comfortable environment. The system allows surgeons wearing a Microsoft HoloLens 1 (Microsoft, Redmond, WA, USA) head-mounted display (HMD) to perform the THA surgical procedure on patient-specific 3D models derived from five patients, interacting with them using voice commands, gestures, and movements. Virtual and physical content was registered using the Vuforia SDK. The simulator received positive feedback from the medical staff involved in the study, which evaluated visual/auditory perception and gesture/voice interaction.
Turini et al. conducted similar work [84], using Vuforia SDK functionalities as a real-time registration tool to detect and track the virtual content in the real scene. To ensure the smooth running of the sensing and tracking functions, the support for a Vuforia Image Target was anchored to the mannequin.
The same research group [27] also proposed a method to build a hybrid simulator for pedicle screw fixation based on 3D-printed patient-specific spine models and AR and virtual fluoroscopy visualization. A patient-specific spine phantom and surgical tools, both equipped with markers for optical tracking, and a calibration cube were used. The scene is rendered on a PC with a traditional monitor, where the torso mannequin with the overlaid virtual content is displayed to assist the implantation of the screws at the proper anatomical targets. The accuracy of the tracking method and the consistency of the visualization of the vertebral model and surgical instruments were evaluated in both AR and virtual fluoroscopic modes through quantitative tests. From the results obtained, it is clear that further research and clinical validation are needed to make the simulator a valuable training tool for the insertion of pedicle screws in lumbar vertebrae.
Veloso et al. [42] worked on a method that uses Dynamics 365 Remote Assist and Microsoft HoloLens 2 to provide the surgical team with all the training and information needed to properly implant prostheses and medical devices remotely from a manufacturer's support center, without a technician's presence. The collected data show that HoloLens could remove the need for the manufacturer's technicians in the operating room, improving the experience of both patients and physicians, reducing the duration of surgeries, and cutting some economic and environmental costs.

3.2. Rehabilitation

AR technology is becoming more popular in rehabilitation due to its ability to create controlled, user-specific environmental and perceptual stimuli while retaining the ability to interact with the real environment, allowing the patient to remain engaged and motivated. Condino et al. [62] presented the first wearable AR system to provide the patient with the prescribed therapy for improving the shoulder range of motion (ROM), obtaining real-time performance feedback and providing a stimulating, intense, and engaging activity. The ability of HoloLens to track the user's hands is used here to develop a system for indoor use, with no need for additional markers, arm sensors, wires, or cables. However, the results suggest that the integration of more technical features is still needed to achieve optimal usability of the application.

3.3. Intraoperative Solution

While AR has only begun to be adopted in simulators for preoperative planning and surgical training, it is an even more promising technology for solutions within the intraoperative phase. AR can support the surgeon in understanding and visualizing the anatomical regions of interest and the topographical relationships between surgical instruments, orthopedic implants, and anatomical landmarks. In addition, through AR, virtual images can be generated and projected onto the real environment, offering guidance for performing minimally invasive surgery.
Studies concerning AR-based intraoperative solutions represent the vast majority of the reviewed papers. Considering the wide variety of surgical interventions, and thus the different needs of orthopedic surgeons in performing these procedures, the intraoperative solutions were further subdivided according to the anatomical part treated and the type of surgical procedure to which AR was applied. Section 3.3.1 describes arthroplasty procedures involving hip (n = 7), knee (n = 5), elbow (n = 1), and shoulder (n = 4), and fixation procedures on femur and tibia (n = 2). Section 3.3.2 describes AR-based solutions applied to spine surgery procedures, and details pedicle screw placement (n = 17), percutaneous surgery (n = 3), lumbar facet joint injections (n = 1), rod bending implant (n = 1), and Sacral Alar Iliac (SAI) screw implantation (n = 1). Section 3.3.3 discusses AR-based applications for treating tumor pathologies (n = 4).

3.3.1. Lower and Upper Limbs

With the development of technology, AR has been adopted to support surgeons in challenging arthroplasty procedures, improving the outcome of this kind of surgery thanks to its peculiar advantages. Accurate positioning of the prosthetic components is one of the core factors contributing to fast recovery and a lower risk of revision.
Liu et al. [81] developed an AR-based 3D computer navigation tool for hip resurfacing applied to femoral preparation. They used the HoloLens, a 3D camera, and robotic assistance to automatically track the pose of the target femur without the use of any marker. In a phantom study, comparing the post-intervention drilling orientation with the pre-intervention plan gave an average error of approximately 2 mm.
In the work of Lei et al. [65], AR and 3D printing technologies were combined to accomplish a challenging THA surgery on a 59-year-old man. The real-time automatic registration process involved an HMD and landmark recognition on a reference registration tool. In this way, the virtual models of the patient's hard and soft tissues were superimposed on the real anatomy; a customized surgery was achieved, while the operation time was not significantly increased.
Fotouhi et al. [80] proposed an AR solution for intraoperative assistance in direct anterior approach THA, based on two C-arm X-ray images combined with 3D AR visualization to simplify planning-based impactor and cup placement. The developed methodology allows intraoperative planning of the acetabular cup position relying on two fluoroscopic images acquired after femoral head dislocation. Subsequently, the impactor color and depth information, namely a point cloud provided by the 3D camera, can be aligned with the planned virtual cup impactor displayed in the AR environment. The system was validated and compared with standard fluoroscopic guidance for acetabular component positioning using a radiopaque foam pelvis [47]. Compared to the fluoroscopic technique, the AR guidance system produced a more robust implantation of the acetabular component in terms of both inclination and anteversion. In addition, the AR guidance system was easier and faster. Although the radiation dose involved was similar to the fluoroscopic technique, the surgical team need not be exposed to the radiation.
Ogawa et al. [57] presented an AR-based device for acetabular cup positioning. The so-called AR-HIP system uses the smartphone camera to overlay the acetabular cup positioning angle and an image of the functional pelvic plane onto real-world images of the surgical field. In the registration process, the surgeon is required to register two anatomical landmarks, namely the right and left anterior superior iliac spines, in order to automatically determine the functional pelvic plane. In a randomized controlled trial [83], the group found that the absolute difference between the detected angles and the postoperative radiographic measurement was significantly lower in the AR-based navigation group than in the mechanical guidance group in terms of inclination, demonstrating that the AR-based navigation system provided better accuracy in cup positioning angle than the conventional technique using mechanical guidance.
In a retrospective study, the above-mentioned AR-HIP system was compared to an accelerometer-based portable navigation system, called HipAlign, consisting of a reference sensor and a disposable computer display unit [39]. With both the HipAlign and AR-HIP systems, the angles of acetabular cup placement were always known during THA, although the final acetabular cup placement depended on the surgeon's medical judgment in each patient. The most important finding of this study was that, in the THA procedure with the patient in the lateral decubitus position, the AR-HIP system provided more accurate acetabular positioning than the HipAlign system.
For Total Knee Arthroplasty (TKA), an AR-assisted system was developed by Pokhrel et al. [66]. This system uses an optical camera and a computer to project an image of the exposed bone in the surgical field onto the image obtained from the preoperative CT scan during surgery. In contrast, the system of Tsukada et al. [71] requires no preoperative images. Using a smartphone and 2D barcode markers, their AR-KNEE system overlays the tibial axis and the anteroposterior long axis onto the surgical field. In this way, the center of the femoral head and the femoral mechanical axis are seen directly on the surgical field, allowing the surgeon to recognize an incorrect registration by comparing the femoral mechanical axis to the tibial line. The AR-KNEE system was demonstrated to achieve an error < 1° in the resection of the proximal part of the tibia in both the coronal and sagittal planes; the accuracy of the resection of the distal part of the femur, evaluated by the same team in an experimental setting [40], was likewise found to be < 1° in both planes. In addition, they compared a distal femoral resection performed with conventional intramedullary guidance and with the AR-KNEE system in a clinical setting and found that the AR-KNEE system ensured considerably higher accuracy.
A different system was developed by Fucentese et al. [30], using AR as a guide for TKA. The system comprises smart glasses, two small disposable sensors, and a control unit that manages the tracking information collected by a pocket-sized wireless optical tracking system. A new feature enabled the intraoperative measurement of the alignment and positioning effects of the prosthesis on soft tissue balance, in order to assess whether the best implant position and limb axis are achieved.
Chen et al. [26] proposed an in-situ AR navigation tool based on the reconstruction of a 3D model of the knee from preoperative imaging. The preoperative 3D model of the knee is updated with tissue properties to account for the positions actually adopted intraoperatively. The resulting model is then shown on a glasses-free 3D display for navigation, a stereo display that, thanks to stereopsis, is capable of conveying depth perception to the viewer. In this way, an accurate 3D structure of the tissue was obtained for surgical AR navigation, giving the surgeon a more detailed view of the patient's 3D anatomy.
Ma et al. [82] used an AR-based surgical navigation (ARSN) system with hybrid electromagnetic and optical tracking that provided 3D images using an integral videography approach for intramedullary nail locking. They were able to correctly complete eighteen drillings in five tibia models and one leg phantom during their pre-clinical study.
A similar system was provided by Tu et al. [41], who proposed a hybrid HoloLens-to-world registration methodology exploiting a custom-made registration cube and an external electromagnetic (EM) tracker. The feasibility of this approach was proven in a cadaver experiment using an ARSN system based on Microsoft HoloLens 2 to support surgeons in completing distal interlocking, indicating that the system satisfies the accuracy requirements of distal interlocking while reducing the total time for intramedullary nailing.
Hu et al. [32] developed an HMD-based AR navigation platform to assist surgeons in orthopedic procedures involving the drilling of exposed bone structures. The application, deployed on HoloLens 1, includes markerless tracking of the target anatomy and supports visualization in both optical see-through (OST) and video see-through (VST) paradigms. The navigation performance of the system was evaluated quantitatively and qualitatively in a user study, finding that AR visualization can be a reliable tool for the manual guidance of femoral drilling.
Regarding the upper limb, Yamamoto et al. [45] reported the successful integration of AR into elbow arthroscopy. Their system uses AR technology to superimpose nerve data obtained from preoperative images, which are essential for successful elbow surgery, on an arthroscopy monitor. An optical pose tracking system was used to superimpose images onto the arthroscopic video view, and markers were placed to allow distinction between several different instruments. They believe that arthroscopic elbow surgery, in which many complications related to nerve damage can arise, will benefit from AR technology.
Schlueter-Brust et al. [38] presented a proof-of-concept AR-based guidance system for Kirschner wire placement for the glenoid component in total shoulder arthroplasty (TSA). In their solution, the orthopedic surgeon had to manually align the virtual hologram with the target anatomy using the Microsoft HoloLens 2, even though manual alignment is considered a highly subjective process prone to human error. The use of AR for assisted glenoid component placement in TSA had been demonstrated in only one previous study [34].
Some groups have sought markerless computer vision-based solutions to eliminate the need for rigidly fixed markers. These techniques require the acquisition of the patient’s anatomy preoperatively to create a 3D model and then obtain the registration intraoperatively by scanning the surface of the model with a probe [33]. Although the techniques were different, both studies [33,38] obtained comparable results in terms of input point accuracy, while Kriechling et al. [33] reported a lower mean orientation error, which could be justified by the automatic registration method used, based on the glenoid surface scanning.
Gu et al. [31], instead, provided a markerless image-based registration approach to guide TSA using the Microsoft HoloLens 1. They used the depth sensors of HoloLens 1 to obtain intraoperative alignment with the patient's anatomy via image-based registration. However, although their results were promising, the system did not yet meet surgical accuracy requirements.

3.3.2. Spine Surgery

The spinal cord, the emerging spinal nerves, and the associated blood vessels are particularly likely to be injured during orthopedic surgery, given their proximity to the bones of the spine. For this reason, many researchers have developed ARSN systems for pedicle screw instrumentation, since screw misplacement can cause severe vascular or neurological injury.
In a clinical study, Elmi-Terander et al. [63] evaluated, for the first time, the accuracy of pedicle screw placement using an ARSN system. A hybrid operating room was equipped with the ARSN system, a motorized flat-detector C-arm with 2D/3D intraoperative imaging capabilities, a surgical table, a noninvasive marker-based patient motion tracking system, and an integrated optical camera for AR-based navigation. For each screw, the insertion was planned, and the proper track for instrument navigation was displayed in AR by rotation of the C-arm. This clinical trial was the first prospective human study in a hybrid operating room equipped with an ARSN system, covering the placement of 253 screws in twenty consecutive patients, of which only three required intraoperative revision.
In a subsequent cadaver study [79], the same ARSN system was used by two neurosurgeons for the insertion of 66 Jamshidi needles and 18 cannulated pedicle screws into the thoracolumbar spine. The distance between the actual needle tip position and the corresponding planned path, as well as the angles between the needle and the desired path, were measured to assess the technical accuracy of the ARSN system. The results indicate that the ARSN system can be used for percutaneous pedicle screw placement without any fluoroscopy or X-ray imaging during the procedure. The screw placement accuracy and clinical aspects of the ARSN system were compared with those achievable using the freehand (FH) technique in twenty patients [54]. The same surgeon performed the surgical procedures both with ARSN and FH, resulting in greater accuracy for the former compared to the latter.
Edström et al. [53] presented a workflow for introducing into clinical practice the ARSN system described by Elmi-Terander et al. [63], installed in a hybrid operating room for anatomy visualization and instrument guidance during pedicle screw placement. The workflow encompasses all surgical steps, from pedicle screw path planning for instrument guidance, through screw placement and spinal fixation, to wound closure and intraoperative verification of treatment results. It also considers imaging time and automated 3D model creation. Special attention was paid to the radiation exposure of the staff, so that the need for lead aprons was avoided [52]. A study of twenty cases requiring pedicle screw placement (13/20 scoliosis) validated the navigation workflow. The results show that the ARSN system could decrease the risk of complications and surgical revision and minimize staff radiation exposure, demonstrating that it can be integrated into the clinical routine without disrupting it, as only 8% of the total time was used for intraoperative imaging and preparation for surgical navigation.
Auloge et al. [48] evaluated the safety, accuracy, technical feasibility, and patient radiation exposure of a new navigation system integrating artificial intelligence (AI) and AR during percutaneous vertebroplasty in patients with vertebral compression fractures (VCFs). Four cameras were integrated into the flat panel detector of a standard C-arm fluoroscopy machine for AR navigation. Spinal segmentation and optimal trajectory planning were performed automatically by integrated AI software, which also suggested a detector position for the initial targeting. The planned “virtual” trajectory is then superimposed on the “real-world” camera inputs, and the merged information is displayed on a 2D monitor without the need for fluoroscopy. In all cases, the AI software successfully identified landmarks and generated a safe trajectory, achieving an accuracy similar to that of standard fluoroscopic guidance.
The cadaveric study by Peh et al. [58] focused on determining the feasibility and clinical accuracy of an ARSN system for minimally invasive instrumentation of thoracic and lumbar pedicle screws compared to the standard fluoroscopy-guided minimally invasive technique. The ARSN system consists of four optical cameras placed in the frame of a C-arm detector and, for visual tracking, optical markers applied to the patient's skin. Minimally invasive screw placement with the ARSN system was shown to be as suitable and accurate as fluoroscopy, without requiring additional navigation time or X-ray imaging during the surgery. Additionally, the achieved clinical accuracy was comparable to values reported in the current literature for other navigation systems for minimally invasive procedures [59,63].
In a laboratory study on animal cadavers, Burström et al. [59] assessed the feasibility and accuracy of pedicle cannulation using automatic instrument tracking integrated into an ARSN system, to retrieve the instrument position with respect to deep anatomy. Cone-beam computed tomography (CBCT) was performed on two pig cadavers to generate a 3D model for planning the insertion of seventy-eight pedicle screws. CBCT was acquired after insertion, and the distance between the navigated device and the corresponding pre-planned path, as well as the angular deviations, were measured. ARSN with instrument tracking for minimally invasive spine surgery achieved clinical accuracy from 97.4% to 100%, depending on the screw size considered for placement, proving to be a feasible, radiation-free, and accurate navigation method.
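The two accuracy measures used here, distance from the planned path and angular deviation, are standard point-to-line and angle-between-vectors computations; a sketch (our illustration, assuming planned and measured trajectories are available as 3D points and direction vectors):

```python
import numpy as np

def path_deviation(planned_entry, planned_dir, tip, device_dir):
    """Deviation of an inserted device from its pre-planned path:
    perpendicular distance of the device tip from the planned line, and
    the angle between planned and actual trajectory directions."""
    planned_dir = planned_dir / np.linalg.norm(planned_dir)
    device_dir = device_dir / np.linalg.norm(device_dir)
    # Point-to-line distance: remove the component along the planned axis.
    offset = tip - planned_entry
    perp = offset - np.dot(offset, planned_dir) * planned_dir
    distance_mm = float(np.linalg.norm(perp))
    # Angular deviation between the two trajectories.
    cos_a = np.clip(np.dot(planned_dir, device_dir), -1.0, 1.0)
    angle_deg = float(np.degrees(np.arccos(cos_a)))
    return distance_mm, angle_deg
```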
Burström et al. [49] investigated the feasibility, efficacy, and accuracy of a novel AR tool for semi-automatic, minimally invasive pedicle screw placement with the addition of a robotic arm. A hybrid operating room was equipped with an ARSN system to plan screw paths and direct the customized robotic arm. Thanks to the robot guide, Jamshidi needles were placed in 113 pedicles of four cadavers with 100% clinical accuracy, demonstrating how pedicle screw positioning would greatly benefit from a semi-automatic surgical robot, achieving precision well above the clinically acceptable threshold.
Some works describe AR-based systems in which the superimposition of virtual images was performed intraoperatively using a separate screen or a microscope. Carl et al. [61] implemented support for degenerative spine surgery using an approach based on commercially available components, demonstrating that a microscope-based AR system is feasible. A HUD integrated into the operating microscope was used for AR visualization, showing AR data in the user's line of sight without the need for glasses. The operating microscope was calibrated by adjusting the superimposition of the 3D reference array representation on the actual reference array structure [60]. The automatic registration of AR was based on intraoperative CT, and its robustness was assessed by comparing the pointer tip position in relation to known structures, for instance, the retractor systems or fiducial markers placed on the skin. The results indicate that this kind of application has enormous potential and could be useful in complex anatomical situations and for training purposes.
In another study [50], Carl et al. also investigated whether microscope-based AR support can be applied to different spine surgery procedures, for pathologies such as tumors, intradural lesions, deformities, infections, and degenerative cases. AR proved feasible, especially in anatomical situations made complex by patient positioning; indeed, in all reintervention cases, surgical navigation made the approach to the spine easier. As an alternative to the microscope, AR-capable HMD devices were also studied in spine surgery procedures, particularly for percutaneous interventions such as pedicle screw placement, facet joint injections, and kyphoplasty.
In the study of Wei et al. [70], an effective AR-based method for navigation and positioning during pedicle puncture in percutaneous kyphoplasty surgery was developed. A Microsoft HoloLens, in combination with fluoroscopic C-arm images, was used to support orthopedic surgeons during surgery by providing accurate, real-time guidance in the intravertebral vacuum cleft area. The use of the HMD reduced the operation time and the number of C-arm fluoroscopies acquired, and increased the accuracy and efficacy of the intervention.
Wanivenhaus et al. [73] used Microsoft HoloLens to visualize a virtual model of the target rod. To navigate the bending of the rod implant ex situ, they estimated the 3D positions of the pedicle screw heads using a pointing device with an image-based marker, allowing the surgeon to shorten the rod to the desired length, bend it, and iteratively compare it with the target shape displayed by HoloLens. With this procedure, they were able to reduce re-bending maneuvers and operating time and to decrease the risk of infection. Instead of using a pointer-based approach, von Atzigen et al. [43] proposed a markerless approach for surgical navigation of rod implant bending, achieving a similar average localization error of the pedicle screw heads compared to the approach of Wanivenhaus et al. [73]. Their approach combines AR with machine learning so that, without touching the instrumented anatomy, a virtual model of the optimal rod shape is generated and displayed on the device. This method allows detection of the 3D positions of the implanted screw heads at interactive rates, without the need for manual acquisition and registration of the anatomy. Although rod bending navigation has demonstrated clinical and biomechanical advantages over the conventional method, it has not yet been fully adopted in spinal surgery.
In an in vitro study using a customized phantom, an optical see-through HMD was successfully tested by Deib et al. [78] for kyphoplasty. Augmented visualization was achieved by placing the display in a specific location in the environment to obtain a “world-anchored” view. In phantom studies, such a registration approach is always possible, because a clear outline is easily seen and the AR display can always be adjusted. The real world, on the other hand, presents less controlled conditions, for which more robust approaches must be developed, based, for example, on intraoperative imaging.
The study by Gibby et al. [64] assessed whether Microsoft HoloLens was suitable for percutaneous needle navigation to simulate pedicle screw placement. The HMD-AR technology projects reconstructed 3D and 2D CT images onto a phantom of the lumbar spine, wrapped in an opaque silicone block, without the use of fluoroscopy or an intraoperative CT arm. The OpenSight (Novarad) application was used to integrate these visual data into Microsoft HoloLens, allowing the superimposition of virtual trajectory guides and CT images on the phantom in 3D and 2D. For intraoperative registration, commercial software was used, allowing the user to manually refine the registration process. A limitation of this study is the registration process: the need for intraoperative CT and for a large, simple object to detect (e.g., a silicone phantom) makes it almost inapplicable under real surgical conditions.
The work of Urakov et al. [72] provides a further workflow demonstration of AR technology in spine surgery. The work consisted of projecting holographic images of the spine, visualized with OpenSight on Microsoft HoloLens, onto cadavers in the prone position. Surface anatomy was exploited to align the hologram, including previously planned trajectories for pedicle screw insertion, with the cadaver. Correct screw insertion was verified post-operatively with a CT scan and compared with the fluoroscopy technique. While the fluoroscopic technique produced no major violations, some screws inserted with AR suffered an initial positioning error, although their overall orientation turned out to be parallel to the planned trajectory, as expected.
Liebmann et al. [66] presented a Microsoft HoloLens-based navigation method that captures the intraoperatively reachable surface of the vertebrae and registers and tracks instruments in live views, without any imaging acquired during surgery. The developed system performs marker tracking and pose estimation, digitalizes intraoperative surfaces for registration purposes, and offers features designed for surgical navigation. Preoperative planning of 3D screw trajectories was performed on computed tomography (CT) images acquired from cadavers. The orthopedic procedure was performed on phantoms by a surgeon and checked postoperatively with CT acquisition, demonstrating that combining HoloLens with the proposed navigation method can lead to precise lumbar pedicle screw insertion.
Molina et al. [69] also compared AR-assisted insertion of pedicle screws with conventional methods. Accuracy was assessed by acquiring a CT following the insertion of 120 pedicle screws in five male cadaveric torsos. The AR system proved to have excellent usability, and the screw placement accuracy was not inferior to that reported for manual computer-guided pedicle insertion. Moreover, when evaluated using the Gertzbein-Robbins grading system, the accuracy results obtained with the AR system exceeded those achieved with FH insertion.
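The Gertzbein-Robbins system grades each screw by the extent of the pedicle cortical breach; as a quick reference, the standard scale translates into the following logic (a sketch of the scale itself, not code from the study):

```python
def gertzbein_robbins_grade(breach_mm):
    """Gertzbein-Robbins grade for a pedicle screw, from the magnitude of
    the cortical breach; grades A and B are usually deemed clinically
    accurate."""
    if breach_mm <= 0:
        return "A"   # screw fully contained within the pedicle
    if breach_mm < 2:
        return "B"
    if breach_mm < 4:
        return "C"
    if breach_mm < 6:
        return "D"
    return "E"
```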
In their study, Dennler et al. [51] evaluated whether a new AR-supported technique, using a cutting-edge AR headset, could improve pedicle screw insertion accuracy compared to the standard FH technique in groups of experienced and novice surgeons. Through manual alignment, the virtual information displayed by the AR headset can be overlaid on the real-world anatomy. As hypothesized, in a laboratory setting, the optimized visualization improved the accuracy of the pilot holes for the pedicle screws and reduced the effect of the surgeon's experience. Among novice surgeons, the group trained with AR showed a strong decrease in primary and secondary perforation of the screws, and greater accuracy in the craniocaudal tilt angle and mediolateral convergence, compared to the group trained with the traditional FH technique, while no difference was observed for experienced surgeons. Other studies, instead, demonstrated considerably greater pedicle screw placement accuracy with various ARSN systems providing 3D data to the surgeon than with the FH methodology. The same system was evaluated in a subsequent study for its feasibility in a clinical environment [28]. Surgeons received an introduction to Microsoft HoloLens functionality; a training phase then preceded the testing phase, during which they were asked to wear the HoloLens during surgery. For each operation, commercial software generated a 3D triangular surface model from the patients' readily available CT data. The results showed that the image quality of the head-mounted AR device was generally considered adequate, although some ergonomic and technical weaknesses were noted, such as inaccurate interaction with gestures and voice commands.
The same group [29] developed an AR-guided navigation device for the insertion of the Sacral Alar Iliac (SAI) screw, in which the 3D model, including optimized and previously planned screw trajectories, was loaded onto an AR-HMD (Microsoft HoloLens). Their proposal involves the use of an ArUco fiducial marker to track, register, and overlay the holographic image onto the pelvic model, theoretically avoiding the need for large and expensive infrastructure within the operating room. To verify its accuracy, the AR-based navigation system was compared with the traditional FH method, and it was found that, in a laboratory environment, the data provided by the AR headset and the overlaid operating plane increased the accuracy of pilot holes for SAI screws compared with the conventional FH method.
The study of Yahanda et al. [44] reports the first clinical case series evaluating the accuracy of sixty-three percutaneous pedicle screws inserted using an FDA-approved AR-HMD system. The system used a marker and intraoperatively acquired CT images for registration, and a marker for tracking. The pedicle screw placement procedure was performed in nine patients in both the lumbar and thoracic spine and achieved 100% accuracy regardless of the indications for the procedure. Comparable results were also obtained in other studies, in which other positioning systems (such as robotic systems) were used [25,55].
The study by Molina et al. [55] evaluated an AR-mediated percutaneous thoracic and lumbosacral pedicle screw placement system in a cadaveric study, with an HMD whose wireless optical tracking cameras are embedded directly in the headset. This technique has reported insertion accuracy rates ranging from 83.9% to 100% and is associated with varying amounts of radiation exposure for the patient and the medical team, depending on the workflow and their experience. Despite improvements in accuracy, drawbacks have commonly been reported with manual computer navigation that can result in inefficiencies and procedural errors.
Another AR system was used by Charles et al. [25] for placing percutaneous pedicle screws during lumbar transforaminal interbody fusion. The result was 94% accuracy; however, this system does not involve an AR HMD but visualizes the data on remote displays. Furthermore, the lack of device position tracking requires intraoperative fluoroscopy to verify screw insertion, especially in patients with obesity.
Yanni et al. [46] described and evaluated screw placement using SpineAR, a prototype commercial AR-HMD platform with real-time navigation guidance. SpineAR uses an AR HMD-based user interface to display real-time navigation images from intraoperative 3D imaging. In this work, the accuracy of screw placement was evaluated in a laboratory setting, where pedicle screws were placed into commercial 3D-printed models of the lumbar spine by surgeons and trainees with different levels of experience. The accuracy was compared with data reported in the literature for the FH screw placement technique, showing comparable accuracy rates.
In a cadaver study, Müller et al. [56] evaluated the accuracy of surgical navigation for pedicle screw insertion with an AR head-mounted device. The comparison was performed against a state-of-the-art pose-tracking system for navigation. Three-dimensional fluoroscopy was used as the intraoperative registration approach in both cases in order to maintain the same experimental settings and comparable outcomes. In addition, for the AR group, the positions of the markers were plotted in real time to account for instrument movement, and the preoperative 3D plan was superimposed on the actual anatomy as a hologram. Holographic navigation using an HMD achieved outcomes comparable with the state of the art.
Liu et al. [67] assessed, in experimental bone environments, the feasibility and accuracy of percutaneous lumbar pedicle screw placement using an AR-guided method based on CT images, compared to placement with a radiographically guided method. In this study, an automatic method for aligning the AR hologram was also compared with a manual one. Each of the twelve lumbar spine sawbones was fully embedded in hardened opaque agar and equipped with a cubic marker. An intraoperative CT scan was acquired for each phantom, and the generated virtual 3D model was uploaded to Microsoft HoloLens. Two experienced surgeons placed 120 pedicle screws simulated by Kirschner wires, including eighty under AR guidance; AR-guided placements proved more satisfactory and efficient than radiographically guided placements. Furthermore, manual alignment of the hologram was judged to perform worse than automatic alignment, although the accuracy of the two alignment methods was broadly similar.
Some of the studies on the use of AR for spinal procedures also involve feasibility studies for AR-guided injections into the lumbar facet joint. For example, in the work of Agten et al. [75], Microsoft HoloLens was used as an AR-based guidance system to perform lumbar facet joint injections on a phantom with two integrated vertebrae. The phantom was equipped with three virtual ribbon markers, which were used as templates for the manual alignment of the hologram. Their study provided convincing evidence encouraging AR-guided lumbar facet joint injections; nonetheless, some drawbacks were present: the study focused only on the target location, no phantom tracking system was present, and the needle used was larger than those typically used in clinical practice.
In a retrospective study, Li et al. [35] investigated a novel three-dimensional MR-based image-guided intraoperative surgical navigation system (MITINS) to support pedicle screw implantation in elderly patients. The authors reported for the first time the use of MR in surgery for lumbar fracture using the MITINS system, consisting of an electromagnetic transmitter, trakSTAR, and sensors. The MITINS planned the 3D position using intraoperative three-point registration and assisted the orthopedic surgeon in inserting the screws into the 3D-printed lumbar model. No screws ended up outside the pedicle, and no additional radiography was needed for position information.

3.3.3. Tumor Surgery

Orthopedic oncology deals with oncologic diseases affecting the skeleton and soft tissues, including sarcomas, bone metastases, and both malignant and benign tumors. Traditional therapy for these tumors is complete surgical resection, which has recently benefited from AR technology thanks to its advantages in medical training, surgical navigation, and improved patient relations. However, few published studies describe the use of AR technology in the treatment of oncological patients in orthopedics.
Cho et al. [76] developed an ARSN system for pelvic bone cancer surgery, running on a tablet PC, where radiological data are displayed overlaid with real-world data, conveying additional information about the tumor size. By optically tracking the target object and the markers placed on it, the ARSN system can support surgeons during pelvic bone tumor surgery, with the advantage of not being as bulky as traditional navigation systems. To validate the system, thirty-six bone tumor models were created for the simulation of tumor resection in pig skin. The models were divided into two groups, in which resection was performed either with AR guidance or with the traditional method. In comparison, the AR-based technique proved to be more accurate than the conventional approach.
Instead of using a tablet PC, Moreta-Martinez et al. [37] proposed a novel AR-based framework that assists surgical planning, improves patient communication, and provides support during an osteotomy procedure through a smartphone application. The patient's anatomy and tumor location were visualized on the smartphone display and tracked through an automated registration process based on a patient-specific 3D-printed reference marker. This solution was assessed on both 3D-printed patient-specific mannequins and patients, and feedback from physicians and patients was positive and encouraging.
Molina et al. [36] described the use of AR-mediated spine surgery (ARMSS) for guidance in a single osteotomy procedure to obtain a large en bloc marginal resection of an L1 chordoma through a posterior-only approach, avoiding a breach of the tumor capsule. ARMSS via an HMD with a monitoring device is a new technology approved by the FDA for neuronavigation during surgery that has the advantage of improved accuracy, framework robustness, and convenience. The procedure was completed by reconstruction using an allograft construct, obtaining excellent results during the patient’s rehabilitation.
In a different study, Ackermann et al. [24] evaluated the feasibility of navigated Ganz periacetabular osteotomy (PAO) on two cadaveric pelvises under realistic surgical conditions. Custom-developed software was used to conduct preoperative planning on 3D models reconstructed from computed tomography (CT). In this way, cutting plane objects for osteotomy planning and for the reorientation of the acetabular fragment were created. The registration process, motion compensation, guidance for osteotomies, and reorientation of the fragment were all included in the developed AR application. By deploying the AR application on Microsoft HoloLens, all cutting planes could be overlaid on the actual anatomy for cutting navigation, supported by a marker-based tracking system. Pre- and post-surgery pelvic bone models were registered through the Iterative Closest Point (ICP) algorithm for comparison. The results demonstrate that AR navigation is suitable for PAO surgery, since it improves on the accuracy of current cutting-edge navigators, although further development is needed.
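ICP, used above to compare the pre- and post-surgery bone models, alternates nearest-neighbour matching with the rigid fit sketched in the Introduction; a minimal illustrative version (not the authors' implementation):

```python
import numpy as np
from scipy.spatial import cKDTree

def icp(source, target, iterations=30):
    """Minimal Iterative Closest Point: repeatedly match each source point
    to its nearest target point, then solve the best rigid transform
    (Kabsch) and apply it, until the clouds are aligned."""
    src = source.copy()
    tree = cKDTree(target)
    for _ in range(iterations):
        matched = target[tree.query(src)[1]]      # nearest neighbours
        mu_s, mu_m = src.mean(axis=0), matched.mean(axis=0)
        U, _, Vt = np.linalg.svd((src - mu_s).T @ (matched - mu_m))
        d = np.sign(np.linalg.det(Vt.T @ U.T))    # avoid reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        src = (src - mu_s) @ R.T + mu_m           # apply the update
    return src
```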

4. Discussion

In this review, we examined various AR-based studies and applications in the field of orthopedics. What emerged from this work is that AR technology is a growing trend: its adoption in orthopedics has gained increasing attention and paved the way for new possibilities in the training and execution of surgical procedures. Many of the studies still concern preclinical and proof-of-concept results, while others deal with the incorporation of these technologies into daily clinical practice and with the study of adequate workflows to facilitate this process and assess their feasibility.
Overall, the results obtained from the studies describe AR technology as a new method capable of improving precision, reducing risks and radiation exposure for both the patient and the physicians, and optimizing surgical times in various fields of orthopedic surgery. Furthermore, in addition to the intraoperative aspects, AR is increasingly emerging as a valuable tool for training physicians and as an effective rehabilitation tool. Nonetheless, the clinical benefits of AR in terms of cost reductions and improvements in patient care will need to be carefully assessed in future research. Currently, the effects that AR and HMDs have on cognition, perception, and concentration are not completely clear, and this is the main reason why the technology is not yet commonly implemented in daily clinical routine. The reasons are manifold, and this review aims to highlight the issues that the scientific community has found to be relevant, shed light on the aspects that still need research and investigation, and emphasize the future challenges for this environment. In fact, with the computing power currently available and its expected growth, AR is destined to see increased usage in all augmented humanity scenarios, including orthopedics.
In response to RQ1, the results of the statistical analysis are reported in Table 1 and summarized in Figure 2; the answer to our first research question was discussed in Section 3.
The following subsections discuss the limitations and challenges found by the authors of the analyzed articles and answer the research questions posed in Section 2. In particular, Section 4.1 details the answer to RQ2, addressing the limitations of AR visualization and the related hardware; Section 4.2 considers the answer to RQ3, describing the types of registration and tracking used and their corresponding advantages and disadvantages. Finally, Section 4.3 discusses the implications for future research.

4.1. Visualization

The meticulous design of realistic perceptual hints is of paramount importance for the AR visualization of medical data since a wrong perception can lead to geometric ambiguities in object scale identification, make the interaction between virtual and real information more difficult, and thus hinder the usability of AR. Artificial information and real-world images are merged through video or optical display techniques. The three main solutions to implement these displays comprise HMDs, 2D monitors (mobile phones and tablets), and projectors.
Projector-based AR was not used in any of the reviewed studies to superimpose virtual content on physical space, although this alternative allows the user to observe the AR content with the naked eye, without additional instrumentation. In most of the reviewed applications, a 2D display was used, whether a smartphone [37,39,40,57,71,83], a tablet [76], or a PC monitor [27,45,47,48,49,50,52,53,54,58,59,60,61,63,69,70,79,80,82]. This choice is justified by the fact that the virtual content was not designed for real-time interaction with the user and/or real-world objects, but mostly to increase the amount of information available at the same time.
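As an illustration of how such video see-through overlays work on a 2D display, the following minimal Python/OpenCV sketch projects the vertices of a virtual object onto a camera frame using known camera intrinsics; all numeric parameters are illustrative assumptions, not values from any reviewed study.

```python
import cv2
import numpy as np

# Illustrative pinhole intrinsics and pose; a real system would use
# calibrated values and a tracked pose.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
dist = np.zeros(5)                    # assume no lens distortion
rvec = np.zeros(3)                    # virtual model orientation (camera frame)
tvec = np.array([0.0, 0.0, 0.4])      # model placed 40 cm in front of the camera

# Vertices of a 10 cm virtual cube standing in for an anatomical model.
cube = np.float32([[x, y, z] for x in (-0.05, 0.05)
                             for y in (-0.05, 0.05)
                             for z in (-0.05, 0.05)])

frame = np.zeros((480, 640, 3), np.uint8)   # placeholder for a live camera frame
pts, _ = cv2.projectPoints(cube, rvec, tvec, K, dist)
for p in pts.reshape(-1, 2):
    # Draw each projected vertex over the camera image.
    cv2.circle(frame, (int(p[0]), int(p[1])), 4, (0, 255, 0), -1)
```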
Many other studies [24,28,29,30,31,32,33,34,35,36,38,41,42,43,44,46,51,55,56,62,64,65,66,67,68,72,73,74,75,77,78,81,84], instead, benefit from an OST HMD to show information in the user's field of view. Their use as a display method is certainly favored by characteristics such as weight, size, and display resolution. Although some studies do not specify which HMD was used, almost all of the works used first- or second-generation Microsoft HoloLens. In these works, the AR content was directly superimposed on the real-world view, so that the user could interact with the holograms to visualize anatomical models for surgical planning, identify access points to internal anatomical structures, replace intraoperative fluoroscopy, guide the insertion of surgical instruments, needles, wires, and screws, or support the training of doctors and the rehabilitation of patients. However, Microsoft HoloLens, like many commercial AR devices, was originally produced as a multimedia entertainment device and is therefore barely suitable for clinical applications, since it was not designed to provide high-precision visualization [33,34]. Indeed, the authors reported several problems with the use of HMDs. One of the main problems concerns the spatial mapping of HoloLens [75]. The HoloLens sensor cameras acquire images relying on the surrounding light reflections [64] and on the variability of the brightness and contrast of the environment [67], but also on the color and reflectivity of the object surface [31] and on the distance from the object. Thus, the performance of its sensors depends on the quality and quantity of the visual features detected, along with the environment lighting, the real-world geometrical characteristics, and the behavior of the user [88]. This means that the sensors cannot maintain 3D spatial positioning in dark rooms [64] and that the depth detection accuracy may depend on environmental conditions, thus leading to measurement errors [31]. Furthermore, while the first generation of HoloLens lacks eye-tracking capability, making it unsuitable for surgical navigation [64], this functionality was incorporated in HoloLens 2, although its effectiveness highly depends on the user's behavior [31]. In support of these claims, some authors have noted that holograms tend to drift [24,66,81,89]. Since the virtual view depends on the position of the user's eyes, a slight shift in the position of the eyes relative to the HoloLens will cause the holograms to shift, so it is critically important that the HoloLens has a fixed position on the head [89]. Shifts during surgery may require recalibration, which would be unacceptable; in addition, the calibration must be repeated every time the HMD is used by a different user, potentially wasting time in the surgical workflow. For all these reasons, the hardware accuracy can only be defined for a very precise set of parameters and, hence, is not easily generalizable.
Other technical limitations of the device concern the two frontal environmental cameras, which have a very narrow field of view [28,43]. These limitations were reported by authors who also pointed out the discomfort associated with prolonged use of HoloLens, which causes eye strain, visual fatigue, and postural discomfort [26,77,90]. Discomfort can be reduced by designing user-friendly interfaces that improve the ergonomics of AR applications, since an unintuitive design can have a negative impact on the surgeon's experience when using AR technology in the clinical setting. It is of utmost importance that the needs of the key stakeholder, i.e., the surgeon, are brought to the forefront so that usability is optimized.
Developing applications with effective functionality is a challenge that could increase the acceptability of AR devices and their popularity among healthcare professionals, who currently have limited familiarity with the technology and limited resources.

4.2. Registration and Tracking

Most of the reviewed studies necessitate complex registration and tracking processes between the patient and the AR system. Registration enables the superimposition and orientation of computer-generated objects, such as X-ray images or navigation modes, in the correct position in situ. Tracking is the process that follows registration and allows the displayed virtual object to maintain the correct position, adapting over time to the movements of the user or of the 3D space. To correctly integrate virtual models with real objects, it is therefore essential to know the spatio-temporal relationships between the real and virtual worlds: AR applications need a quick and accurate estimate of the viewing position with respect to real objects in order to obtain a correct alignment between the real world and the virtual content. The most widely adopted tracking and registration methods can be grouped into the following categories: marker-based optical tracking systems, using elements that are easily detectable with computer vision and image segmentation techniques; markerless tracking systems; sensor-based tracking systems, e.g., electromagnetic, inertial, or acoustic; and hybrid tracking systems, combining two or more of the above.
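The interplay between the two operations can be summarized with homogeneous transformations: registration estimates a fixed model-to-anatomy transform once, while tracking updates the anatomy-to-world pose every frame. The following conceptual Python sketch, with purely illustrative values, shows how the two compose.

```python
import numpy as np

def pose(t):
    """Build a 4x4 homogeneous transform with identity rotation (illustrative)."""
    T = np.eye(4)
    T[:3, 3] = t
    return T

# Registration (performed once): virtual model -> patient anatomy,
# e.g., from paired-point or surface matching.
T_patient_model = pose([0.01, -0.02, 0.00])

# Tracking (performed every frame): patient anatomy -> world/display frame,
# e.g., from an optical marker or an electromagnetic sensor.
for frame_idx in range(3):
    T_world_patient = pose([0.0, 0.0, 0.50 + 0.001 * frame_idx])  # simulated motion
    # The renderer draws the model with the composed transform, so the
    # hologram follows the anatomy as it moves.
    T_world_model = T_world_patient @ T_patient_model
    print(np.round(T_world_model[:3, 3], 4))
```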
Depending on their application, some research groups did not include any automatic registration functionality for the virtual content, leaving the orthopedic surgeon to perform a manual alignment [28,38,72,74,75]. However, the authors reported that manual alignment can often be a tedious process [75]. Furthermore, the 3D alignment quality of virtual models with respect to the real world is highly subjective and prone to human error [38], and it relies on both the accuracy of the registration and the OST HMD calibration [28].
Although these may soon become obsolete, most studies rely on marker-based tracking systems. Leading marker-based tracking technologies, such as Vuforia Image Targets, primarily use printable planar markers [57,77,83,84], black-and-white square or cubic patterns [24,27,29,33,34,37,45,48,51,56,67,73,80,81], QR codes [39,40,71], and colored reference arrays [50,60,61]. Other systems employ non-planar markers, such as retroreflective markers [55,69] and colored markers [85].
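As a concrete, hedged example of this family of techniques, the sketch below detects a square fiducial marker and estimates its pose with OpenCV's ArUco module (OpenCV 4.7 or later is assumed; commercial SDKs such as Vuforia follow the same principle but expose different APIs). The camera intrinsics and marker size are illustrative assumptions, and a generated marker image stands in for a real camera frame.

```python
import cv2
import numpy as np

# Illustrative camera intrinsics (a real system would use calibrated values).
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
dist = np.zeros(5)          # assume no lens distortion
marker_len = 0.05           # marker side length in meters (assumed)

# 3D marker corners in the marker's own frame (TL, TR, BR, BL).
obj_pts = np.float32([[-1, 1, 0], [1, 1, 0], [1, -1, 0], [-1, -1, 0]]) * marker_len / 2

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary)

# Synthetic frame: a generated marker with a white quiet zone stands in
# for a camera image of the operating field.
marker_img = cv2.aruco.generateImageMarker(dictionary, 0, 200)
frame = cv2.copyMakeBorder(marker_img, 50, 50, 50, 50, cv2.BORDER_CONSTANT, value=255)

corners, ids, _ = detector.detectMarkers(frame)
if ids is not None:
    # Pose of the first detected marker relative to the camera.
    ok, rvec, tvec = cv2.solvePnP(obj_pts, corners[0].reshape(4, 2), K, dist)
    print("marker", int(ids[0][0]), "translation (m):", tvec.ravel())
```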
Even though marker-based and sensor-based registration and tracking methods have mainly been used for AR displays during orthopedic interventions, such technologies depend heavily on complicated calibration processes and often require additional preparation time and equipment [31]. Indeed, introducing markers into the surgical field of view (FOV) can be a difficult and impractical process in real operating scenarios [29], as markers often risk overcrowding the surgeon's FOV or require additional incisions. In addition, the accuracy of the registration process can depend strongly on the placement of the markers, so the system may need to be recalibrated after patient movement [63,80] or movement of soft-tissue anatomical parts [71], as might be expected in overweight subjects, where loss of the initial coupling between the anatomical structure and the markers can occur more frequently [48,58].
In other cases, it was observed that differences in lighting affect the computation of the marker positions [85] and that the selected tracking approach can fail due to marker occlusions, making navigation impossible until the occlusion is removed [27,55].
Due to the limitations of these techniques, markerless registration relying on anatomical landmarks is an interesting option to minimize the influence of the registration procedure on the surgical framework. Examples are methods based on stereoscopic vision that use different camera angles to visualize the intraoperative anatomy [47]. However, line-of-sight problems have a great impact on the performance of these registration techniques, limiting the surgical team in moving and adjusting the imaging device, such as the C-arm. Moreover, landmark acquisition requires a high degree of manual interaction to resolve the misregistration task, resulting in workflow disruption.
Lately, depth cameras [91] have been used for markerless registration. In particular, to overcome the limitation of camera stationarity, calibration methods were developed to enable these registration methods on HMDs [81], providing a more robust approach. For example, Hajek et al. [92] implemented an "inside-out" tracking system to allow markerless registration for intraoperative 3D visualization directly on the anatomy using a HoloLens 1, while Hu et al. [32] presented a HoloLens-to-world registration technique involving a customized registration cube and an external EM tracker, obtaining a more accurate method than Hajek's. Gibby et al. [64] and Urakov et al. [72] used the OpenSight application on the HoloLens to automatically align the real and virtual surface areas and thus register the target object to the real world. Gu et al. [31] directly used the ToF depth camera of HoloLens 1 in short-throw mode, making markerless image-based registration possible, while von Atzigen et al. [43] introduced an efficient method for object detection and position estimation using Machine Learning; their CNN-based method detects the implanted screw head positions in world space at interactive speeds, applying a clustering technique to remove outliers after the 3D model is reconstructed via stereoscopy. Although these registration methods are very appealing, the lower accuracy of markerless registration and motion tracking compared with marker-based approaches remains an issue. Indeed, as these methods are mostly based on preoperative CT or fluoroscopic images, the lesion location and size at the time of surgery could differ from the preoperative data, forcing the physician to perform a new CT acquisition or recalibrate the instrument [45,70].
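While none of the cited systems is reproduced here, the typical markerless pipeline (a coarse feature-based alignment refined by ICP) can be sketched with Open3D as follows; file names, voxel sizes, and distance thresholds are illustrative assumptions.

```python
import open3d as o3d

def preprocess(pcd, voxel=2.0):  # sizes in mm, illustrative
    down = pcd.voxel_down_sample(voxel)
    down.estimate_normals(o3d.geometry.KDTreeSearchParamHybrid(radius=2 * voxel, max_nn=30))
    fpfh = o3d.pipelines.registration.compute_fpfh_feature(
        down, o3d.geometry.KDTreeSearchParamHybrid(radius=5 * voxel, max_nn=100))
    return down, fpfh

ct = o3d.io.read_point_cloud("vertebra_from_ct.ply")      # hypothetical preoperative model
scan = o3d.io.read_point_cloud("depth_camera_scan.ply")   # hypothetical intraoperative scan
ct_d, ct_f = preprocess(ct)
scan_d, scan_f = preprocess(scan)

# Coarse alignment from FPFH feature correspondences (RANSAC), no markers needed.
coarse = o3d.pipelines.registration.registration_ransac_based_on_feature_matching(
    scan_d, ct_d, scan_f, ct_f, True, 3.0,
    o3d.pipelines.registration.TransformationEstimationPointToPoint(False), 3)

# ICP refinement of the coarse estimate (point-to-plane, on the downsampled clouds).
fine = o3d.pipelines.registration.registration_icp(
    scan_d, ct_d, 1.0, coarse.transformation,
    o3d.pipelines.registration.TransformationEstimationPointToPlane())
print(fine.transformation)   # scan -> CT-model transform used to anchor the overlay
```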
In addition, some of the proposed methods are partly constrained by the current technical limitations of HoloLens [31,64,66]. Access to the internal sensor data is only possible by enabling the HoloLens Research Mode [93], which makes navigation unstable due to the relatively limited computing power of the device [94]. Furthermore, regarding the depth sensor, higher uncertainty was observed along the depth axis than along the horizontal and vertical axes [64]. It can be assumed that the accuracy of the built-in tracking will improve in future versions of HoloLens as the quality of its sensors increases.

4.3. Implications for Future Research

The main purpose of our study is to assess, in the vast panorama of surgical applications of augmented reality in orthopedics, the major gaps that research has not yet been able to fill. This was done by analyzing the 62 articles that met the selection criteria imposed in the systematic search, paying particular attention to the methods and tools used in each AR-based solution and to the problems that the authors reported. Consequently, the first major practical contribution of the present research is a collection of some of the most recent developments of AR technologies in the orthopedic field, categorized by use case, tools involved, and surgical procedure supported. This information is of paramount importance considering the pace at which AR is developing and growing in all fields, which makes it increasingly important to collect and sort existing studies to provide practical guidance to researchers and end users.
The second important implication, directly related to the first, comes from the analysis of our results, which makes it possible to identify the open issues and explore how they can be solved in the near future. The technical issues that authors encountered in developing these systems can hinder or slow down the adoption of such solutions in orthopedics, so it is important to know a priori what they are in order to focus research on bridging those gaps. The analysis of the collected results raises several opportunities for future research, in terms of both software and hardware development and concept validation.
First, since AR is above all a visualization technology, researchers have to design AR-based solutions in which the geometric entities, the scale of objects, and the spatial relationships between real and virtual objects can be easily perceived. The development of AR-based applications must be end-user oriented from the design phase, i.e., it must include intuitive and easy-to-use human–machine interfaces, so as not to hinder the workflow of the medical team and to increase the realism of the surgical simulation. A good user interface design and perceptual stimuli that are as realistic as possible can minimize the physical and cognitive discomfort of the user, decreasing the effort needed to perform the activity.
In fact, the cognitive aspects of AR use are often underestimated, and although qualitative assessments of the mental effort required by AR applications have already been made, this is still not a common practice. In these studies [37,41,46,62,68,77,84], although the overall feedback was positive, contrasting judgments emerged in the self-assessment of users' performance. The lack of knowledge about AR technology [28], along with the slow learning curve of AR-based procedures [83], could be responsible for this contrasting feedback. To maximize surgical performance, it is therefore important to organize training phases for the use of AR and of gestural and voice interactions, increasing the medical team's awareness of the functionalities of each application. This will provide all the necessary tools to make the most of AR's potential and thus allow physicians to perform their activities with the necessary confidence and readiness.
Secondly, registration and tracking are among the main components of AR-based applications. What emerges from this review is that, at the moment, the major limitation to the introduction of AR in daily clinical practice and the surgical workflow lies in the lack of reliable tracking and registration systems. As discussed in Section 4.2, although several approaches were used, none of them yet seems completely able to guarantee accurate and stable navigation, resulting in poorly positioned virtual models and unreliable spatial relationships between virtual and real objects. It can be assumed that many of the hardware problems of current viewers and tracking systems will be solved by manufacturers in the future, and that devices with higher performance and built-in tracking capabilities suited to the surgical field will be developed. However, proper calibration methods, along with system architectures integrated with surgical procedures, must be developed in order to obtain robust and trustworthy surgical navigation.
Finally, this review brought out some common issues related to security: the absence of harmonized guidelines for medical AR applications makes the design and implementation process insecure and complex for researchers; the lack of a secure and specialized platform can hinder the management of code, classes, and documentation; and the launch of a standard, specific system of application programming interfaces (APIs), frameworks, and adequate add-ons is a challenge for current healthcare AR platforms.
Ensuring the security of healthcare data is no less important than the other aspects of AR healthcare applications considered in the previous sections. Nevertheless, this aspect has not yet been taken into account by researchers, as many existing applications are still in a proof-of-concept or clinical validation state. In order to make AR applications ready for the market, and hence usable in the future, it may be worth introducing strict protocols that ensure data security from the conceptualization and design stage. When dealing with sensitive data, such as a patient's medical data, it will be necessary to adopt solutions that guarantee the encrypted and protected transfer of data, as well as saving and storage in a form that prevents the identification of the subject without the use of additional information. Examples of methods to be adopted in this field are network connection encryption, pseudonymization, and data encryption.
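As a minimal sketch of these measures, assuming the third-party Python cryptography package, the example below pseudonymizes a patient identifier with a keyed hash and encrypts the clinical record before storage or transfer; the identifiers, keys, and record fields are illustrative, not taken from any reviewed system.

```python
import hashlib
import hmac
import json
from cryptography.fernet import Fernet

SECRET = b"site-specific-secret-kept-separately"  # illustrative key material
enc_key = Fernet.generate_key()                   # in practice, stored in a key vault
fernet = Fernet(enc_key)

def pseudonymize(patient_id: str) -> str:
    # Keyed hash: the pseudonym cannot be linked back to the patient
    # without the separately stored secret.
    return hmac.new(SECRET, patient_id.encode(), hashlib.sha256).hexdigest()

record = {
    "pseudonym": pseudonymize("PAT-00123"),       # hypothetical identifier
    "diagnosis": "L1 chordoma",
    "procedure": "en bloc resection",
}
token = fernet.encrypt(json.dumps(record).encode())  # encrypted at rest / in transit
print(json.loads(fernet.decrypt(token)))             # authorized access only
```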
Further research will be required to refine and elaborate on our findings in order to keep this work up to date with state-of-the-art technologies.

5. Conclusions

Augmented reality is a fast-growing technology, and the increasing focus on surgical AR applications is creating new opportunities for the training and execution of many orthopedic surgical procedures.
The literature reviewed demonstrates how AR has the potential to optimize clinical practice by improving the way physicians and medical technology interact, moving toward the concept of augmented humanity. In this regard, AR can provide user-centered systems in which advanced visualization of anatomical landmarks and other 3D data presents medical information to the surgeon in an optimal way. Further benefits include improved surgical accuracy, decreased surgical time and absorbed radiation dose, and increased time and cost efficiency when used as a training tool for residents.
However, several open issues emerged, and the discussion in this review focused on those currently most relevant to the scientific community, in order to direct future research on AR-based medical applications and services.
For diffusion and ultimate adoption in clinical practice, AR must be considered as part of the surgical framework. The absence of reliable registration and tracking processes, the main limitation of this technology, must be addressed to enable the adoption of AR solutions directly during surgery and to resolve the inaccuracies that may result in poorly positioned virtual models and untrustworthy navigation.
Nonetheless, the current technical limitations of the devices, as well as availability issues and excessive costs, are likely to be addressed by further research in this direction, supported by the increased interest in AR technology, which is evident in the constantly growing number of promising studies and scientific publications in recent years.

Author Contributions

Conceptualization, C.I., L.U., S.M. and E.V.; methodology, C.I. and L.U.; software, C.I.; validation, L.U. and S.M.; formal analysis, C.I. and L.U.; investigation, C.I. and L.U.; resources, E.V.; data curation, C.I. and S.M.; writing—original draft preparation, C.I. and L.U.; writing—review & editing, S.M. and E.V.; visualization, C.I.; supervision, S.M. and E.V.; project administration, E.V.; funding acquisition, E.V. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Ke, Q.; Liu, J.; Bennamoun, M.; An, S.; Sohel, F.; Boussaid, F. Computer Vision for Human-Machine Interaction. Comput. Vis. Assist. Healthc. 2018, 127–145. [Google Scholar] [CrossRef]
  2. Marcolin, F.; Scurati, G.W.; Ulrich, L.; Nonis, F.; Vezzetti, E.; Dozio, N.; Ferrise, F.; Stork, A.; Basole, R.C. Affective Virtual Reality: How to Design Artificial Experiences Impacting Human Emotions. IEEE Comput. Graph. Appl. 2021, 41, 171–178. [Google Scholar] [CrossRef]
  3. Ulrich, L.; Dugelay, J.L.; Vezzetti, E.; Moos, S.; Marcolin, F. Perspective Morphometric Criteria for Facial Beauty and Proportion Assessment. Appl. Sci. 2019, 10, 8. [Google Scholar] [CrossRef] [Green Version]
  4. Azuma, R.T. A Survey of Augmented Reality. Presence Teleoperators Virtual Environ. 1997, 6, 355–385. [Google Scholar] [CrossRef]
  5. Azuma, R.; Baillot, Y.; Behringer, R.; Feiner, S.; Julier, S.; MacIntyre, B. Recent Advances in Augmented Reality. IEEE Comput. Graph. Appl. 2001, 21, 34–47. [Google Scholar] [CrossRef] [Green Version]
  6. Dozio, N.; Marcolin, F.; Scurati, G.W.; Ulrich, L.; Nonis, F.; Vezzetti, E.; Marsocci, G.; la Rosa, A.; Ferrise, F. A Design Methodology for Affective Virtual Reality. Int. J. Hum. Comput. Stud. 2022, 162, 102791. [Google Scholar] [CrossRef]
  7. Pellas, N.; Mystakidis, S.; Kazanidis, I. Immersive Virtual Reality in K-12 and Higher Education: A Systematic Review of the Last Decade Scientific Literature. Virtual Real. 2021, 25, 835–861. [Google Scholar] [CrossRef]
  8. Milgram, P.; Takemura, H.; Utsumi, A.; Kishino, F. Augmented Reality: A Class of Displays on the Reality-Virtuality Continuum. In Proceedings of the Telemanipulator and telepresence technologies. International Society for Optics and Photonics, Bellingham, WA, USA, 31 October–1 November 1994; Volume 2351, pp. 282–292. [Google Scholar] [CrossRef]
  9. Mystakidis, S. Metaverse. Encyclopedia 2022, 2, 486–497. [Google Scholar] [CrossRef]
  10. Verhey, J.T.; Haglin, J.M.; Verhey, E.M.; Hartigan, D.E. Virtual, Augmented, and Mixed Reality Applications in Orthopedic Surgery. Int. J. Med. Robot. Comput. Assist. Surg. 2020, 16, e2067. [Google Scholar] [CrossRef]
  11. OPUS at UTS: Augmented Reality Games: A Review—Open Publications of UTS Scholars. Available online: https://opus.lib.uts.edu.au/handle/10453/23503 (accessed on 10 March 2022).
  12. Vávra, P.; Roman, J.; Zonča, P.; Ihnát, P.; Němec, M.; Kumar, J.; Habib, N.; El-Gendi, A. Recent Development of Augmented Reality in Surgery: A Review. J. Healthc. Eng. 2017, 2017, 4574172. [Google Scholar] [CrossRef]
  13. Riva, G.; Baños, R.M.; Botella, C.; Mantovani, F.; Gaggioli, A. Transforming Experience: The Potential of Augmented Reality and Virtual Reality for Enhancing Personal and Clinical Change. Front. Psychiatry 2016, 7, 164. [Google Scholar] [CrossRef] [PubMed]
  14. Uppot, R.N.; Laguna, B.; McCarthy, C.J.; de Novi, G.; Phelps, A.; Siegel, E.; Courtier, J. Implementing Virtual and Augmented Reality Tools for Radiology Education and Training, Communication, and Clinical Care. Radiology 2019, 291, 570–580. [Google Scholar] [CrossRef] [PubMed]
  15. Sutherland, J.; Belec, J.; Sheikh, A.; Chepelev, L.; Althobaity, W.; Chow, B.J.W.; Mitsouras, D.; Christensen, A.; Rybicki, F.J.; Russa, D.J.L. Applying Modern Virtual and Augmented Reality Technologies to Medical Images and Models. J. Digit. Imaging 2019, 32, 38–53. [Google Scholar] [CrossRef]
  16. Jeong, B.; Yoon, J. Competitive Intelligence Analysis of Augmented Reality Technology Using Patent Information. Sustainability 2017, 9, 497. [Google Scholar] [CrossRef] [Green Version]
  17. Ulrich, L.; Baldassarre, F.; Marcolin, F.; Moos, S.; Tornincasa, S.; Vezzetti, E.; Speranza, D.; Ramieri, G.; Zavattero, E. A Procedure for Cutting Guides Design in Maxillofacial Surgery: A Case-Study. Lect. Notes Mech. Eng. 2019, 301–310. [Google Scholar] [CrossRef]
  18. Casari, F.A.; Navab, N.; Hruby, L.A.; Kriechling, P.; Nakamura, R.; Tori, R.; de Lourdes dos Santos Nunes, F.; Queiroz, M.C.; Fürnstahl, P.; Farshad, M. Augmented Reality in Orthopedic Surgery Is Emerging from Proof of Concept Towards Clinical Studies: A Literature Review Explaining the Technology and Current State of the Art. Curr. Rev. Musculoskelet. Med. 2021, 14, 192–203. [Google Scholar] [CrossRef] [PubMed]
  19. Sakai, D.; Joyce, K.; Sugimoto, M.; Horikita, N.; Hiyama, A.; Sato, M.; Devitt, A.; Watanabe, M. Augmented, Virtual and Mixed Reality in Spinal Surgery: A Real-World Experience. J. Orthop. Surg. 2020, 28, 2309499020952698. [Google Scholar] [CrossRef]
  20. Longo, U.G.; de Salvatore, S.; Candela, V.; Zollo, G.; Calabrese, G.; Fioravanti, S.; Giannone, L.; Marchetti, A.; de Marinis, M.G.; Denaro, V. Augmented Reality, Virtual Reality and Artificial Intelligence in Orthopedic Surgery: A Systematic Review. Appl. Sci. 2021, 11, 3253. [Google Scholar] [CrossRef]
  21. Ara, J.; Benta Karim, F.; Saud Alsubaie, M.A.; Arafat Bhuiyan, Y.; Ismail Bhuiyan, M.; Begum Bhyan, S.; Bhuiyan, H. Comprehensive Analysis of Augmented Reality Technology in Modern Healthcare System. Int. J. Adv. Comput. Sci. Appl. 2021, 12, 840–849. [Google Scholar] [CrossRef]
  22. PICO Process—Wikipedia. Available online: https://en.wikipedia.org/wiki/PICO_process (accessed on 14 March 2022).
  23. PRISMA. Available online: http://www.prisma-statement.org/ (accessed on 14 March 2022).
  24. Ackermann, J.; Liebmann, F.; Hoch, A.; Snedeker, J.G.; Farshad, M.; Rahm, S.; Zingg, P.O.; Fürnstahl, P. Augmented Reality Based Surgical Navigation of Complex Pelvic Osteotomies—A Feasibility Study on Cadavers. Appl. Sci. 2021, 11, 312. [Google Scholar] [CrossRef]
  25. Charles, Y.P.; Cazzato, R.L.; Nachabe, R.; Chatterjea, A.; Steib, J.P.; Gangi, A. Minimally Invasive Transforaminal Lumbar Interbody Fusion Using Augmented Reality Surgical Navigation for Percutaneous Pedicle Screw Placement. Clin. Spine Surg. 2021, 34, E415–E424. [Google Scholar] [CrossRef] [PubMed]
  26. Chen, F.; Cui, X.; Han, B.; Liu, J.; Zhang, X.; Liao, H. Augmented Reality Navigation for Minimally Invasive Knee Surgery Using Enhanced Arthroscopy. Comput. Methods Programs Biomed. 2021, 201, 105952. [Google Scholar] [CrossRef] [PubMed]
  27. Condino, S.; Turini, G.; Mamone, V.; Parchi, P.D.; Ferrari, V. Hybrid Spine Simulator Prototype for X-Ray Free Pedicle Screws Fixation Training. Appl. Sci. 2021, 11, 1038. [Google Scholar] [CrossRef]
  28. Dennler, C.; Bauer, D.E.; Scheibler, A.G.; Spirig, J.; Götschi, T.; Fürnstahl, P.; Farshad, M. Augmented Reality in the Operating Room: A Clinical Feasibility Study. BMC Musculoskelet. Disord. 2021, 22, 451. [Google Scholar] [CrossRef] [PubMed]
  29. Dennler, C.; Safa, N.A.; Bauer, D.E.; Wanivenhaus, F.; Liebmann, F.; Götschi, T.; Farshad, M. Augmented Reality Navigated Sacral-Alar-Iliac Screw Insertion. Int. J. Spine Surg. 2021, 15, 161–168. [Google Scholar] [CrossRef] [PubMed]
  30. Fucentese, S.F.; Koch, P.P. A Novel Augmented Reality-Based Surgical Guidance System for Total Knee Arthroplasty. Arch. Orthop. Trauma Surg. 2021, 141, 2227–2233. [Google Scholar] [CrossRef] [PubMed]
  31. Gu, W.; Shah, K.; Knopf, J.; Navab, N.; Unberath, M. Feasibility of Image-Based Augmented Reality Guidance of Total Shoulder Arthroplasty Using Microsoft HoloLens 1. Comput. Methods Biomech. Biomed. Eng. Imaging Vis. 2021, 9, 261–270. [Google Scholar] [CrossRef]
  32. Hu, X.; Rodriguez y Baena, F.; Cutolo, F. Head-Mounted Augmented Reality Platform for Markerless Orthopaedic Navigation. IEEE J. Biomed. Health Inform. 2021, 26, 910–921. [Google Scholar] [CrossRef]
  33. Kriechling, P.; Loucas, R.; Loucas, M.; Casari, F.; Fürnstahl, P.; Wieser, K. Augmented Reality through Head-Mounted Display for Navigation of Baseplate Component Placement in Reverse Total Shoulder Arthroplasty: A Cadaveric Study. Arch. Orthop. Trauma Surg. 2021, 1, 1447–1453. [Google Scholar] [CrossRef]
  34. Kriechling, P.; Roner, S.; Liebmann, F.; Casari, F.; Fürnstahl, P.; Wieser, K. Augmented Reality for Base Plate Component Placement in Reverse Total Shoulder Arthroplasty: A Feasibility Study. Arch. Orthop. Trauma Surg. 2021, 141, 1447–1453. [Google Scholar] [CrossRef]
  35. Li, J.; Zhang, H.; Li, Q.; Yu, S.; Chen, W.; Wan, S.; Chen, D.; Liu, R.; Ding, F. Treating Lumbar Fracture Using the Mixed Reality Technique. BioMed Res. Int. 2021, 2021, 6620746–6620751. [Google Scholar] [CrossRef] [PubMed]
  36. Molina, C.A.; Dibble, C.F.; Larry Lo, S.F.; Witham, T.; Sciubba, D.M. Augmented Reality–Mediated Stereotactic Navigation for Execution of En Bloc Lumbar Spondylectomy Osteotomies. J. Neurosurg. Spine 2021, 34, 700–705. [Google Scholar] [CrossRef] [PubMed]
  37. Moreta-Martinez, R.; Pose-Díez-De-la-lastra, A.; Calvo-Haro, J.A.; Mediavilla-Santos, L.; Pérez-Mañanes, R.; Pascau, J. Combining Augmented Reality and 3d Printing to Improve Surgical Workflows in Orthopedic Oncology: Smartphone Application and Clinical Evaluation. Sensors 2021, 21, 1370. [Google Scholar] [CrossRef] [PubMed]
  38. Schlueter-Brust, K.; Henckel, J.; Katinakis, F.; Buken, C.; Opt-Eynde, J.; Pofahl, T.; Rodriguez Y Baena, F.; Tatti, F. Augmented-Reality-Assisted K-Wire Placement for Glenoid Component Positioning in Reversed Shoulder Arthroplasty: A Proof-of-Concept Study. J. Pers. Med. 2021, 11, 777. [Google Scholar] [CrossRef] [PubMed]
  39. Tsukada, S.; Ogawa, H.; Hirasawa, N.; Nishino, M.; Aoyama, H.; Kurosaka, K. Augmented Reality- vs Accelerometer-Based Portable Navigation System to Improve the Accuracy of Acetabular Cup Placement During Total Hip Arthroplasty in the Lateral Decubitus Position. J. Arthroplast. 2021, 37, 488–494. [Google Scholar] [CrossRef] [PubMed]
  40. Tsukada, S.; Ogawa, H.; Nishino, M.; Kurosaka, K.; Hirasawa, N. Augmented Reality-Assisted Femoral Bone Resection in Total Knee Arthroplasty. JBJS Open Access 2021, 6, e21. [Google Scholar] [CrossRef] [PubMed]
  41. Tu, P.; Gao, Y.; Lungu, A.J.; Li, D.; Wang, H.; Chen, X. Augmented Reality Based Navigation for Distal Interlocking of Intramedullary Nails Utilizing Microsoft HoloLens 2. Comput. Biol. Med. 2021, 133, 104402. [Google Scholar] [CrossRef]
  42. Veloso, R.; Magalhães, R.; Marques, A.; Gomes, P.V.; Pereira, J.; González, M.A.; Penedo, M.G. Mixed Reality in an Operating Room Using Hololens 2—The Use of the Remote Assistance from Manufacturers Techinicians during the Surgeries. Eng. Proc. 2021, 7, 54. [Google Scholar]
  43. von Atzigen, M.; Liebmann, F.; Hoch, A.; Bauer, D.E.; Snedeker, J.G.; Farshad, M.; Fürnstahl, P. HoloYolo: A Proof-of-Concept Study for Marker-Less Surgical Navigation of Spinal Rod Implants with Augmented Reality and on-Device Machine Learning. Int. J. Med. Robot. Comput. Assist. Surg. 2021, 17, e2184. [Google Scholar] [CrossRef]
  44. Yahanda, A.T.; Moore, E.; Ray, W.Z.; Pennicooke, B.; Jennings, J.W.; Molina, C.A. First In-Human Report of the Clinical Accuracy of Thoracolumbar Percutaneous Pedicle Screw Placement Using Augmented Reality Guidance. Neurosurg. Focus 2021, 51, E10. [Google Scholar] [CrossRef]
  45. Yamamoto, M.; Oyama, S.; Otsuka, S.; Murakami, Y.; Yokota, H.; Hirata, H. Experimental Pilot Study for Augmented Reality-Enhanced Elbow Arthroscopy. Sci. Rep. 2021, 11, 4650–4659. [Google Scholar] [CrossRef] [PubMed]
  46. Yanni, D.S.; Ozgur, B.M.; Louis, R.G.; Shekhtman, Y.; Iyer, R.R.; Boddapati, V.; Iyer, A.; Patel, P.D.; Jani, R.; Cummock, M.; et al. Real-Time Navigation Guidance with Intraoperative CT Imaging for Pedicle Screw Placement Using an Augmented Reality Head-Mounted Display: A Proof-of-Concept Study. Neurosurg. Focus 2021, 51, E11. [Google Scholar] [CrossRef] [PubMed]
  47. Alexander, C.; Loeb, A.E.; Fotouhi, J.; Navab, N.; Armand, M.; Khanuja, H.S. Augmented Reality for Acetabular Component Placement in Direct Anterior Total Hip Arthroplasty. J. Arthroplast. 2020, 35, 1636–1641. [Google Scholar] [CrossRef] [PubMed]
  48. Auloge, P.; Luigi Cazzato, R.; Ramamurthy, N.; de Marini, P.; Rousseau, C.; Garnon, J.; Philippe Charles, Y.; Steib, J.-P.; Gangi, A.; Marini, D.P.; et al. Augmented Reality and Artificial Intelligence-Based Navigation during Percutaneous Vertebroplasty: A Pilot Randomised Clinical Trial. Eur. Spine J. 2020, 29, 1580–1589. [Google Scholar] [CrossRef]
  49. Burström, G.; Balicki, M.; Patriciu, A.; Kyne, S.; Popovic, A.; Holthuizen, R.; Homan, R.; Skulason, H.; Persson, O.; Edström, E.; et al. Feasibility and Accuracy of a Robotic Guidance System for Navigated Spine Surgery in a Hybrid Operating Room: A Cadaver Study. Sci. Rep. 2020, 10, 7522–7530. [Google Scholar] [CrossRef]
  50. Carl, B.; Bopp, M.; Saß, B.; Pojskic, M.; Voellger, B.; Nimsky, C. Spine Surgery Supported by Augmented Reality. Glob. Spine J. 2020, 10, 41S–55S. [Google Scholar] [CrossRef]
  51. Dennler, C.; Jaberg, L.; Spirig, J.; Agten, C.; Götschi, T.; Fürnstahl, P.; Farshad, M. Augmented Reality-Based Navigation Increases Precision of Pedicle Screw Insertion. J. Orthop. Surg. Res. 2020, 15, 174. [Google Scholar] [CrossRef]
  52. Edström, E.; Burström, G.; Omar, A.; Nachabe, R.; Söderman, M.; Persson, O.; Gerdhem, P.; Elmi-Terander, A. Augmented Reality Surgical Navigation in Spine Surgery to Minimize Staff Radiation Exposure. Spine 2020, 45, E45–E53. [Google Scholar] [CrossRef]
  53. Edström, E.; Burström, G.; Nachabe, R.; Gerdhem, P.; Terander, A.E. A Novel Augmented-Reality-Based Surgical Navigation System for Spine Surgery in a Hybrid Operating Room: Design, Workflow, and Clinical Applications. Oper. Neurosurg. 2020, 18, 496–502. [Google Scholar] [CrossRef] [Green Version]
  54. Elmi-Terander, A.; Burström, G.; Nachabé, R.; Fagerlund, M.; Ståhl, F.; Charalampidis, A.; Edström, E.; Gerdhem, P. Augmented Reality Navigation with Intraoperative 3D Imaging vs Fluoroscopy-Assisted Free-Hand Surgery for Spine Fixation Surgery: A Matched-Control Study Comparing Accuracy. Sci. Rep. 2020, 10, 707–714. [Google Scholar] [CrossRef] [Green Version]
  55. Molina, C.A.; Phillips, F.M.; Colman, M.W.; Ray, W.Z.; Khan, M.; Orru’, E.; Poelstra, K.; Khoo, L. A Cadaveric Precision and Accuracy Analysis of Augmented Reality–Mediated Percutaneous Pedicle Implant Insertion. J. Neurosurg. Spine 2021, 34, 316–324. [Google Scholar] [CrossRef] [PubMed]
  56. Müller, F.; Roner, S.; Liebmann, F.; Spirig, J.M.; Fürnstahl, P.; Farshad, M. Augmented Reality Navigation for Spinal Pedicle Screw Instrumentation Using Intraoperative 3D Imaging. Spine J. 2020, 20, 621–628. [Google Scholar] [CrossRef] [PubMed]
  57. Ogawa, H.; Kurosaka, K.; Sato, A.; Hirasawa, N.; Matsubara, M.; Tsukada, S. Does An Augmented Reality-Based Portable Navigation System Improve the Accuracy of Acetabular Component Orientation during THA? A Randomized Controlled Trial. Clin. Orthop. Relat. Res. 2020, 478, 935–943. [Google Scholar] [CrossRef]
  58. Peh, S.; Chatterjea, A.; Pfarr, J.; Schäfer, J.P.; Weuster, M.; Klüter, T.; Seekamp, A.; Lippross, S. Accuracy of Augmented Reality Surgical Navigation for Minimally Invasive Pedicle Screw Insertion in the Thoracic and Lumbar Spine with a New Tracking Device. Spine J. 2020, 20, 629–637. [Google Scholar] [CrossRef] [PubMed]
  59. Burström, G.; Nachabe, R.; Persson, O.; Edström, E.; Elmi Terander, A. Augmented and Virtual Reality Instrument Tracking for Minimally Invasive Spine Surgery: A Feasibility and Accuracy Study. Spine 2019, 44, 1097–1104. [Google Scholar] [CrossRef] [PubMed]
  60. Carl, B.; Bopp, M.; Saß, B.; Nimsky, C. Microscope-Based Augmented Reality in Degenerative Spine Surgery: Initial Experience. World Neurosurg. 2019, 128, e541–e551. [Google Scholar] [CrossRef]
  61. Carl, B.; Bopp, M.; Saß, B.; Voellger, B.; Nimsky, C. Implementation of Augmented Reality Support in Spine Surgery. Eur. Spine J. 2019, 28, 1697–1711. [Google Scholar] [CrossRef]
  62. Condino, S.; Turini, G.; Viglialoro, R.; Gesi, M.; Ferrari, V. Wearable Augmented Reality Application for Shoulder Rehabilitation. Electronics 2019, 8, 1178. [Google Scholar] [CrossRef] [Green Version]
  63. Elmi-Terander, A.; Burström, G.; Nachabe, R.; Skulason, H.; Pedersen, K.; Fagerlund, M.; Ståhl, F.; Charalampidis, A.; Söderman, M.; Holmin, S.; et al. Pedicle Screw Placement Using Augmented Reality Surgical Navigation With Intraoperative 3D Imaging: A First In-Human Prospective Cohort Study. Spine 2019, 44, 517–525. [Google Scholar] [CrossRef]
  64. Gibby, J.T.; Swenson, S.A.; Cvetko, S.; Rao, R.; Javan, R. Head-Mounted Display Augmented Reality to Guide Pedicle Screw Placement Utilizing Computed Tomography. Int. J. Comput. Assist. Radiol. Surg. 2019, 14, 525–535. [Google Scholar] [CrossRef]
  65. Lei, P.-F.; Su, S.-L.; Kong, L.-Y.; Wang, C.-G.; Zhong, D.; Hu, Y.-H. Mixed Reality Combined with Three-Dimensional Printing Technology in Total Hip Arthroplasty: An Updated Review with a Preliminary Case Presentation. Orthop. Surg. 2019, 11, 914. [Google Scholar] [CrossRef] [PubMed]
  66. Liebmann, F.; Roner, S.; von Atzigen, M.; Scaramuzza, D.; Sutter, R.; Snedeker, J.; Farshad, M.; Fürnstahl, P. Pedicle Screw Navigation Using Surface Digitization on the Microsoft HoloLens. Int. J. Comput. Assist. Radiol. Surg. 2019, 14, 1157–1165. [Google Scholar] [CrossRef] [PubMed]
  67. Liu, H.; Wu, J.; Tang, Y.; Li, H.; Wang, W.; Li, C.; Zhou, Y. Percutaneous Placement of Lumbar Pedicle Screws via Intraoperative CT Image-Based Augmented Reality-Guided Technology. J. Neurosurg. Spine 2020, 32, 542–547. [Google Scholar] [CrossRef] [PubMed]
  68. Logishetty, K.; Western, L.; Morgan, R.; Iranpour, F.; Cobb, J.P.; Auvinet, E. Can an Augmented Reality Headset Improve Accuracy of Acetabular Cup Orientation in Simulated THA? A Randomized Trial. Clin. Orthop. Relat. Res. 2019, 477, 1190. [Google Scholar] [CrossRef] [PubMed]
  69. Molina, C.A.; Theodore, N.; Karim Ahmed, A.; Westbroek, E.M.; Mirovsky, Y.; Harel, R.; Orru, E.; Khan, M.; Witham, T.; Sciubba, D.M. Augmented Reality–Assisted Pedicle Screw Insertion: A Cadaveric Proof-of-Concept Study. J. Neurosurg. Spine 2019, 31, 139–146. [Google Scholar] [CrossRef]
  70. Pokhrel, S.; Alsadoon, A.; Prasad, P.W.C.; Paul, M. A Novel Augmented Reality (AR) Scheme for Knee Replacement Surgery by Considering Cutting Error Accuracy. Int. J. Med. Robot. Comput. Assist. Surg. 2019, 15, e1958. [Google Scholar] [CrossRef] [Green Version]
  71. Tsukada, S.; Ogawa, H.; Nishino, M.; Kurosaka, K.; Hirasawa, N. Augmented Reality-Based Navigation System Applied to Tibial Bone Resection in Total Knee Arthroplasty. J. Exp. Orthop. 2019, 6, 44–50. [Google Scholar] [CrossRef]
  72. Urakov, T.M.; Wang, M.Y.; Levi, A.D. Workflow Caveats in Augmented Reality–Assisted Pedicle Instrumentation: Cadaver Lab. World Neurosurg. 2019, 126, e1449–e1455. [Google Scholar] [CrossRef]
  73. Wanivenhaus, F.; Neuhaus, C.; Liebmann, F.; Roner, S.; Spirig, J.M.; Farshad, M. Augmented Reality-Assisted Rod Bending in Spinal Surgery. Spine J. 2019, 19, 1687–1689. [Google Scholar] [CrossRef]
  74. Wei, P.; Yao, Q.; Xu, Y.; Zhang, H.; Gu, Y.; Wang, L. Percutaneous Kyphoplasty Assisted with/without Mixed Reality Technology in Treatment of OVCF with IVC: A Prospective Study. J. Orthop. Surg. Res. 2019, 14, 255–263. [Google Scholar] [CrossRef] [Green Version]
  75. Agten, C.A.; Dennler, C.; Rosskopf, A.B.; Jaberg, L.; Pfirrmann, C.W.A.; Farshad, M. Augmented Reality-Guided Lumbar Facet Joint Injections. Investig. Radiol. 2018, 53, 495–498. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  76. Cho, H.S.; Park, M.S.; Gupta, S.; Han, I.; Kim, H.S.; Choi, H.; Hong, J. Can Augmented Reality Be Helpful in Pelvic Bone Cancer Surgery? An In Vitro Study. Clin. Orthop. Relat. Res. 2018, 476, 1719–1725. [Google Scholar] [CrossRef] [PubMed]
  77. Condino, S.; Turini, G.; Parchi, P.D.; Viglialoro, R.M.; Piolanti, N.; Gesi, M.; Ferrari, M.; Ferrari, V. How to Build a Patient-Specific Hybrid Simulator for Orthopaedic Open Surgery: Benefits and Limits of Mixed-Reality Using the Microsoft Hololens. J. Healthc. Eng. 2018, 2018, 5435097. [Google Scholar] [CrossRef] [PubMed]
  78. Deib, G.; Johnson, A.; Unberath, M.; Yu, K.; Andress, S.; Qian, L.; Osgood, G.; Navab, N.; Hui, F.; Gailloud, P. Image Guided Percutaneous Spine Procedures Using an Optical See-through Head Mounted Display: Proof of Concept and Rationale. J. NeuroInterv. Surg. 2018, 10, 1187–1191. [Google Scholar] [CrossRef] [PubMed]
  79. Elmi-Terander, A.; Nachabe, R.; Skulason, H.; Pedersen, K.; Söderman, M.; Racadio, J.; Babic, D.; Gerdhem, P.; Edström, E. Feasibility and Accuracy of Thoracolumbar Minimally Invasive Pedicle Screw Placement With Augmented Reality Navigation Technology. Spine 2018, 43, 1018–1023. [Google Scholar] [CrossRef]
  80. Fotouhi, J.; Alexander, C.P.; Unberath, M.; Taylor, G.; Lee, S.C.; Fuerst, B.; Johnson, A.; Osgood, G.; Taylor, R.H.; Khanuja, H.; et al. Plan in 2-D, Execute in 3-D: An Augmented Reality Solution for Cup Placement in Total Hip Arthroplasty. J. Med. Imaging 2018, 5, 21205. [Google Scholar] [CrossRef] [Green Version]
  81. Liu, H.; Auvinet, E.; Giles, J.; Rodriguez y Baena, F. Augmented Reality Based Navigation for Computer Assisted Hip Resurfacing: A Proof of Concept Study. Ann. Biomed. Eng. 2018, 46, 1595–1605. [Google Scholar] [CrossRef] [Green Version]
  82. Ma, L.; Zhao, Z.; Zhang, B.; Jiang, W.; Fu, L.; Zhang, X.; Liao, H. Three-Dimensional Augmented Reality Surgical Navigation with Hybrid Optical and Electromagnetic Tracking for Distal Intramedullary Nail Interlocking. Int. J. Med. Robot. Comput. Assist. Surg. 2018, 14, e1909. [Google Scholar] [CrossRef]
  83. Ogawa, H.; Hasegawa, S.; Tsukada, S.; Matsubara, M. A Pilot Study of Augmented Reality Technology Applied to the Acetabular Cup Placement During Total Hip Arthroplasty. J. Arthroplast. 2018, 33, 1833–1837. [Google Scholar] [CrossRef]
  84. Turini, G.; Condino, S.; Parchi, P.D.; Viglialoro, M.R.; Piolanti, N.; Gesi, M.; Ferrari, M.; Ferrari, V. A Microsoft HoloLens Mixed Reality Surgical Simulator for Patient-Specific Hip Arthroplasty Training. In Proceedings of the 5th International Conference, AVR 2018, Otranto, Italy, 24–27 June 2018; pp. 201–210. [Google Scholar]
  85. van Duren, B.H.; Sugand, K.; Wescott, R.; Carrington, R.; Hart, A. Augmented Reality Fluoroscopy Simulation of the Guide-Wire Insertion in DHS Surgery: A Proof of Concept Study. Med. Eng. Phys. 2018, 55, 52–59. [Google Scholar] [CrossRef]
  86. Giachino, M.; Aprato, A.; Ulrich, L.; Revetria, T.A.; Tanzi, L.; Vezzetti, E.; Massè, A. Dynamic Evaluation of THA Components by Prosthesis Impingement Software (PIS). Acta Bio Med. Atenei Parm. 2021, 92, 2021295. [Google Scholar] [CrossRef]
  87. Mystakidis, S.; Christopoulos, A.; Pellas, N. A Systematic Mapping Review of Augmented Reality Applications to Support STEM Learning in Higher Education. Educ. Inf. Technol. 2022, 27, 1883–1927. [Google Scholar] [CrossRef]
  88. de Oliveira, M.E.; Debarba, H.G.; Lädermann, A.; Chagué, S.; Charbonnier, C.A. Hand-Eye Calibration Method for Augmented Reality Applied to Computer-Assisted Orthopedic Surgery. Rev. Chir. Orthop. Traumatol. 2019, 104, S110. [Google Scholar] [CrossRef]
  89. Sun, Q.; Mai, Y.; Yang, R.; Ji, T.; Jiang, X.; Chen, X. Fast and Accurate Online Calibration of Optical See-through Head-Mounted Display for AR-Based Surgical Navigation Using Microsoft HoloLens. Int. J. Comput. Assist. Radiol. Surg. 2020, 15, 1907–1919. [Google Scholar] [CrossRef]
  90. Andress, S.; Johnson, A.; Unberath, M.; Winkler, A.F.; Yu, K.; Fotouhi, J.; Weidert, S.; Osgood, G.; Navab, N. On-the-Fly Augmented Reality for Orthopedic Surgery Using a Multimodal Fiducial. J. Med. Imaging 2018, 5, 21209. [Google Scholar] [CrossRef]
  91. Ulrich, L.; Vezzetti, E.; Moos, S.; Marcolin, F. Analysis of RGB-D Camera Technologies for Supporting Different Facial Usage Scenarios. Multimed. Tools Appl. 2020, 79, 29375–29398. [Google Scholar] [CrossRef]
  92. Hajek, J.; Unberath, M.; Fotouhi, J.; Bier, B.; Lee, S.C.; Osgood, G.; Maier, A.; Armand, M.; Navab, N. Closing the Calibration Loop: An Inside-out-Tracking Paradigm for Augmented Reality in Orthopedic Surgery. In Proceedings of the Medical Image Computing and Computer Assisted Intervention—MICCAI 2018, Granada, Spain, 16–20 September 2018; pp. 299–306. [Google Scholar]
  93. HoloLens Research Mode—Mixed Reality Microsoft Docs. Available online: https://docs.microsoft.com/en-us/windows/mixed-reality/develop/advanced-concepts/research-mode (accessed on 10 March 2022).
  94. von der Heide, A.M.; Fallavollita, P.; Wang, L.; Sandner, P.; Navab, N.; Weidert, S.; Euler, E. Camera-Augmented Mobile C-Arm (CamC): A Feasibility Study of Augmented Reality Imaging in the Operating Room. Int. J. Med. Robot. Comput. Assist. Surg. 2018, 14, e1885. [Google Scholar] [CrossRef]
Figure 1. Flowchart representing the screening process of the papers. The flowchart is structured according to the available PRISMA template.
Figure 2. Diagrams summarizing the percentages of anatomical districts of interest considered in the reviewed studies (a) and the types of visualization tools used (b).
Table 1. Analyzed studies presenting AR applications in orthopedic surgery.
| Authors | Publication Year | Purpose | Category | Visualization Tool |
| --- | --- | --- | --- | --- |
| Ackermann et al. [24] | 2021 | Intraoperative | Tumor | HMD ¹ (HoloLens) |
| Charles et al. [25] | 2021 | Intraoperative | Spine | PC Monitor |
| Chen et al. [26] | 2021 | Intraoperative | Knee | 3D display |
| Condino et al. [27] | 2021 | Training | Spine | PC Monitor |
| Dennler et al. [28] | 2021 | Intraoperative | Spine | HMD (HoloLens) |
| Dennler et al. [29] | 2021 | Intraoperative | Spine | HMD (HoloLens) |
| Fucentese et al. [30] | 2021 | Intraoperative | Knee | HMD (HoloLens) |
| Gu et al. [31] | 2021 | Intraoperative | Shoulder | HMD (HoloLens) |
| Hu et al. [32] | 2021 | Intraoperative | Femur | HMD (HoloLens) |
| Kriechling et al. [33] | 2021 | Intraoperative | Shoulder | HMD (HoloLens) |
| Kriechling et al. [34] | 2021 | Intraoperative | Shoulder | HMD (HoloLens) |
| Li et al. [35] | 2021 | Intraoperative | Spine | HMD |
| Molina et al. [36] | 2021 | Intraoperative | Tumor | HMD |
| Moreta-Martinez et al. [37] | 2021 | Intraoperative | Tumor | Smartphone |
| Schlueter-Brust et al. [38] | 2021 | Intraoperative | Shoulder | HMD (HoloLens) |
| Tsukada et al. [39] | 2021 | Intraoperative | Hip | Smartphone |
| Tsukada et al. [40] | 2021 | Intraoperative | Knee | Smartphone |
| Tu et al. [41] | 2021 | Intraoperative | Lower limb | HMD (HoloLens) |
| Veloso et al. [42] | 2021 | Training | Hip | HMD (HoloLens) |
| von Atzigen et al. [43] | 2021 | Intraoperative | Spine | HMD (HoloLens) |
| Yahanda et al. [44] | 2021 | Intraoperative | Spine | HMD |
| Yamamoto et al. [45] | 2021 | Intraoperative | Elbow | PC Monitor |
| Yanni et al. [46] | 2021 | Intraoperative | Spine | HMD |
| Alexander et al. [47] | 2020 | Intraoperative | Hip | PC Monitor |
| Auloge et al. [48] | 2020 | Intraoperative | Spine | PC Monitor |
| Burström et al. [49] | 2020 | Intraoperative | Spine | PC Monitor |
| Carl et al. [50] | 2020 | Intraoperative | Spine | HUD ² |
| Dennler et al. [51] | 2020 | Intraoperative | Spine | HMD (HoloLens) |
| Edström et al. [52] | 2020 | Intraoperative | Spine | PC Monitor |
| Edström et al. [53] | 2020 | Intraoperative | Spine | PC Monitor |
| Elmi-Terander et al. [54] | 2020 | Intraoperative | Spine | PC Monitor |
| Molina et al. [55] | 2020 | Intraoperative | Spine | HMD |
| Müller et al. [56] | 2020 | Intraoperative | Spine | HMD (HoloLens) |
| Ogawa et al. [57] | 2020 | Intraoperative | Hip | Smartphone |
| Peh et al. [58] | 2020 | Intraoperative | Spine | PC Monitor |
| Burström et al. [59] | 2019 | Intraoperative | Spine | PC Monitor |
| Carl et al. [60] | 2019 | Intraoperative | Spine | HUD |
| Carl et al. [61] | 2019 | Intraoperative | Spine | HUD |
| Condino et al. [62] | 2019 | Rehabilitation | Shoulder | HMD (HoloLens) |
| Elmi-Terander et al. [63] | 2019 | Intraoperative | Spine | PC Monitor |
| Gibby et al. [64] | 2019 | Intraoperative | Spine | HMD (HoloLens) |
| Lei et al. [65] | 2019 | Intraoperative | Hip | HMD (HoloLens) |
| Liebmann et al. [66] | 2019 | Intraoperative | Spine | HMD (HoloLens) |
| Liu et al. [67] | 2019 | Intraoperative | Spine | HMD (HoloLens) |
| Logishetty et al. [68] | 2019 | Training | Hip | HMD (HoloLens) |
| Molina et al. [69] | 2019 | Intraoperative | Spine | HMD |
| Pokhrel et al. [70] | 2019 | Intraoperative | Knee | PC Monitor |
| Tsukada et al. [71] | 2019 | Intraoperative | Knee | Smartphone |
| Urakov et al. [72] | 2019 | Intraoperative | Spine | HMD (HoloLens) |
| Wanivenhaus et al. [73] | 2019 | Intraoperative | Spine | HMD (HoloLens) |
| Wei et al. [74] | 2019 | Intraoperative | Spine | HMD (HoloLens) |
| Agten et al. [75] | 2018 | Intraoperative | Spine | HMD (HoloLens) |
| Cho et al. [76] | 2018 | Intraoperative | Tumor | Tablet |
| Condino et al. [77] | 2018 | Training | Hip | HMD (HoloLens) |
| Deib et al. [78] | 2018 | Intraoperative | Spine | HMD (HoloLens) |
| Elmi-Terander et al. [79] | 2018 | Intraoperative | Spine | PC Monitor |
| Fotouhi et al. [80] | 2018 | Intraoperative | Hip | PC Monitor |
| Liu et al. [81] | 2018 | Intraoperative | Hip | HMD (HoloLens) |
| Ma et al. [82] | 2018 | Intraoperative | Tibia | PC Monitor |
| Ogawa et al. [83] | 2018 | Intraoperative | Hip | Smartphone |
| Turini et al. [84] | 2018 | Training | Hip | HMD (HoloLens) |
| van Duren et al. [85] | 2018 | Training | Hip | PC Monitor |

¹ Head-Mounted Display. ² Head-Up Display.
Table 2. Overview of the main features and expected outcomes of AR-based applications.
| Purpose | Features | Outcomes |
| --- | --- | --- |
| Training | Hybrid environment simulators that recreate surgical conditions using AR display and phantoms, 3D models, and surgical instruments | Trainees are provided guidance and advanced visualization; surgical simulation can be repeated multiple times in a safe and controlled environment; live feedback is obtained |
| Rehabilitation | AR-based platforms that involve controlled, user-specific environmental and perceptual stimuli to achieve functional recovery, improve range of motion, and reach patient autonomy | Patient activity is constantly monitored; real-time feedback is obtained; tasks are engaging and motivating |
| Intraoperative | AR-based applications or surgical navigation systems that project additional virtual information and images directly on the site of intervention, giving the surgeon improved sensory perception | Optimized visualization and better understanding of the spatial relationship between surgical instruments and anatomical structures; decreased surgery time and radiation exposure |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
