Microsoft HoloLens 2 in Medical and Healthcare Context: State of the Art and Future Prospects
Abstract
1. Introduction
2. Research Methodology
2.1. Search Strategy
2.2. Inclusion and Exclusion Criteria
2.3. Study Selection
3. HoloLens 2 versus Other Commercially Available Optical See-Through Head-Mounted Displays
HoloLens First and Second Generation Comparison: A Detailed Study of Features, Functionalities and Performances
4. MS HoloLens 2 Applications in Medical and Healthcare Context: Literature Results
4.1. Surgical Navigation
4.2. Human Computer Interaction and AR-BCI Systems Integration
4.3. Gait Analysis and Rehabilitation
4.4. Medical Education and Training/Virtual Teaching/Tele-Mentoring/Tele-Consulting
4.5. Other Applications
No. | Reference | Year | Aim of Study | Sub-Field Application | Methodology | Device Functionality | HoloLens 2 Natively Integrated and Other Software Used | Study Validation Type (n Participants)
---|---|---|---|---|---|---|---|---
1 | Wang et al. [58] | 2022 | To establish a 3-dimensional visualization model of percutaneous nephrolithotomy, apply it to guiding intraoperative puncture in a mixed reality environment, and evaluate its accuracy and clinical value. | Surgical navigation | MR | 3D visualization—Preoperative Planning | Vuforia Engine | 3D group (Pz: n = 21) Control group (Pz: n = 40) |
2 | Liu et al. [59] | 2021 | To evaluate the use of MixR technology using OST-HMDs during TPED. | MR | To navigate the four procedures of marking, needle insertion, foraminoplasty, and positioning of the working sheath. | Mimics software version 20.0 (Interactive Medical Image Control System, Materialise, Leuven, Belgium) Scene Editing System (Midivi, Changzhou, China, https://www.midivi.cn (accessed on 16 February 2021)) MixR system (Midivi, Changzhou, China) | Patients treated with MixR-assisted TPED through OST-HMDs (n = 44) were compared with matched patients treated with conventional TPED (n = 43). | |
3 | Eom et al. [60] | 2022 | To present an AR assisted surgical guidance system that aims to improve the accuracy of catheter placement in ventriculostomy. | AR | AR-assisted surgical guidance | n.d. | On phantom model | |
4 | Kitagawa et al. [61] | 2022 | To assess the safety and efficacy of laparoscopic cholecystectomy using a holography-guided navigation system as an intraoperative support image. | MR | Intraoperative imaging support | HoloeyesMD system (Holoeyes, Inc., Tokyo, Japan) | (Pz: n = 27) | |
5 | Doughty et al. [62] | 2022 | To compare the perceptual accuracy of several visualization paradigms involving an adjacent monitor, or the Microsoft HoloLens 2 OST-HMD, in a targeted task and to assess the feasibility of displaying imaging-derived virtual models aligned with the injured porcine heart. | AR | Display of virtual models for guidance | Unity (https://unity.com/) (accessed on 22 September 2022). Eigen (https://eigen.tuxfamily.org/) (accessed on 22 September 2022) ArUco library | On MRI-based anatomical models, aligned with the surgically exposed heart in a motion-arrested open-chest porcine model. | |
6 | Torabinia et al. [63] | 2022 | To present the use of a mixed reality headset (i.e., Microsoft HoloLens 2) as a tool for intra-procedural image guidance during a mock myomectomy of an ex vivo animal uterus. | MR | Intra-procedural image guidance | Materialise Mimics Research software 21.0 SolidWorks 3D Viewer app | On custom-made uterine fibroid animal model | |
7 | Gsaxner et al. [64] | 2022 | To present an AR-SNS for a commercial OST-HMD, the HoloLens 2. | AR | Tracking | n.d. | On 3D-printed patient Phantom | |
8 | García-Sevilla et al. [65] | 2022 | To propose the use of augmented reality to guide and verify PSI placement in pelvic tumor resections. | AR | Surgical guided navigation | Unity platform (version 2019.3) Vuforia development kit | On plastic 3D-printed phantom | |
9 | Amiras et al. [66] | 2021 | To present a simulator for CT-guided biopsies with haptic feedback using the HoloLens 2 and a bespoke software application. | AR | Real-time 3D mapping and tracking | HoloLens application (Microsoft Visual Studio 2019, the DirectX SDK, and the ChArUco implementation in OpenCV) | n = 16 users (CTR) trialled the application on 3D model of a torso | |
10 | Park et al. [67] | 2020 | To describe the design of a 3D AR-assisted navigation system using the next-generation HoloLens 2 headset device. | AR | 3D guidance to assist CT-guided targeting. | Unity 2019.2.21 Mixed Reality Toolkit Foundation 2.3.0 Vuforia 9.0.12 | A prospective trial was performed assessing CT-guided lesion targeting on an abdominal phantom with and without AR guidance using HoloLens 2. (HC: n = 8) | |
11 | Benmahdjoub et al. [68] | 2021 | To investigate the effect of instrument visualization/non-visualization on alignment tasks, and to compare it with virtual extensions approach which augments the realistic representation of the instrument with simple 3D objects. | AR | AR device | Unity | (HC: n = 18 volunteers) | |
12 | Benmahdjoub et al. [69] | 2022 | To develop and assess a generic approach which aligns an AR device, such as the HoloLens 2, with existing navigation systems. | AR | AR device and tracking system | Vuforia 2020 MevisLab 2020 | (HC: n = 10 volunteers) | |
13 | Farshad et al. [70] | 2021 | To prove operator independent reliability and accuracy of both AR assisted pedicle screw navigation and AR assisted rod bending in a cadaver setting. | AR | AR-based surgical navigation | Mimics 19.0, Materialise NV, Leuven, Belgium Preoperative planning software (CASPA, University Hospital Balgrist, Zurich, Switzerland) | Experiments performed in human cadavers (HC: n = 2 biomedical engineers) | |
14 | Doughty et al. [71] | 2021 | To present SurgeonAssist-Net: a lightweight framework making action-and-workflow-driven virtual assistance, for a set of predefined surgical tasks, accessible to commercially available OST-HMDs. | AR | Surgical Guidance | PyTorch | Online simulated surgical scenario and proprietary dataset for training the SurgeonAssist-Net framework | |
15 | Nagayo et al. [72] | 2022 | To evaluate the effectiveness and usability of the suture training system for novices to learn a suture skill in open surgery, subcuticular interrupted suture, in comparison with the existing self-training system which uses instructional videos. | AR | AR training | n.d. | (HC: n = 43 medical students) | |
16 | Nagayo et al. [73] | 2021 | To develop a new suture training system for open surgery that provides trainees with the three-dimensional information of exemplary procedures performed by experts and allows them to observe and imitate the procedures during self-practice. | AR | A 3D replication system of surgical procedures | Vuforia Engine (PTC, Inc., Boston, MA), Unity (Unity Technologies, San Francisco, CA), MRTK (Microsoft, Inc.). | (HC: n = 2) | |
17 | von Haxthausen et al. [74] | 2021 | To propose an approach to automatically register a hologram to the corresponding real-world object (RWO). | AR | Visual guidance | Unity 2019.4.15f1 | Quantification of the displacements between known positions of the virtual object and the RWO on a torso phantom. | |
18 | Wierzbicki et al. [75] | 2022 | To investigate the potential of 3D mixed-reality visualization of medical images using the CarnaLife Holo system as a supporting tool for innovative, minimally invasive surgery (MIS), irreversible electroporation (IRE), and microwave ablation (MWA) for advanced gastrointestinal tumors. | MR | Mixed Reality Consultation | CarnaLife Holo | (Pz: n = 8) | |
19 | Brunzini et al. [76] | 2022 | The proposed work aims to develop and test an AR application for different maxillofacial surgeries. | AR | AR surgical guides | Unity 2020.1.17f1 Visual Studio 2019 | Preliminary laboratory validation (HC: n = 7) | |
20 | Thabit et al. [77] | 2022 | To develop an AR-based system to visualize cranial sutures, and to assess the accuracy and usability of using AR-based navigation for surgical guidance in minimally invasive spring-assisted craniectomy. | AR | AR-based navigation | Vuforia (version 9.3, https://developer.vuforia.com/) (accessed on 22 September 2022) | (HC: n = 20) | |
21 | Cercenelli et al. [78] | 2022 | To describe the AR-based protocol for assisting skin paddle harvesting in osteomyocutaneous fibular flap reconstructive procedure, usable both with a handheld device, such as a tablet, and with a HMD, such as Microsoft HoloLens 2 smart glasses. | AR | Unity 3D software (Unity Technologies, San Francisco, CA, USA) Vuforia Engine package, PTC, Inc., Boston, MA, USA | Experimental tests on phantom | ||
22 | Felix et al. [79] | 2022 | To determine the accuracy of pedicle screw placement using VisAR for open spine and MISS procedures. | AR | AR guidance | VisAR (Novarad, Provo, UT) | 7 cadavers were instrumented with 124 thoracolumbar pedicle screws using VisAR augmented reality guidance. | |
23 | Tu et al. [80] | 2021 | To develop an augmented reality-based navigation system for distal interlocking of an intramedullary nail using the Microsoft HoloLens 2. | AR | AR-based navigation system | Atamai Image Guided Surgery (AIGS) toolkit (https://github.com/dgobbi/AIGS) (accessed on 22 September 2022) Unity and C# Mixed Reality Toolkit (MRTK) | Phantom experiment (HC: n = 1 senior orthopedic surgeon) | |
24 | Zhou et al. [81] | 2022 | To present a mixed reality surgical navigation system for glioma resection | MR | MR device (Surgical Navigation & Spatial Markers) | n.d. | Phantom experiments in an ideal environment in an operating room conducted by experienced surgeons (n = 20) Clinical trial (Pz: n = 16) | |
25 | Ivanov et al. [82] | 2021 | To develop an approach that would allow surgeons to conduct operations using MR smart glasses MS HoloLens 2 on a large scale, reducing the preparation time required for the procedure and without having to create custom solutions for each patient. | MR | Visualization | Unity Vuforia SDK | 3 clinical cases | |
26 | Heinrich et al. [83] | 2022 | To compare three state-of-the-art navigation concepts displayed by an optical see-through head-mounted display and a stereoscopic projection system. | AR | Visualization | Unity Vuforia AR SDK (PTC Inc, USA). | (HC: n = 24) | |
27 | Morita et al. [84] | 2022 | To develop and assess the accuracy of a MR needle guidance application on smartglasses. | MR | MR needle guidance | Unity 2019.4.9 MR toolkit (MRTK v2.4.0, Microsoft) MR Needle Guide | Phantom experiment: the needle placement errors from 12 different entry points in a phantom by 7 operators (HC) were compared between the MR guidance and conventional methods | |
28 | Mitani et al. [85] | 2021 | To use a case-specific 3D hologram for tumor resection in otolaryngology and show the proof of concept. | MR | See-through head-mounted displays | ZIOSTATION Holoeyes XR system (Holoeyes Inc., Tokyo, Japan) | HMD experience evaluation using a questionnaire: (HC: n = 18) | |
29 | Kosmyna et al. [95] | 2021 | To integrate an EEG-BCI system with an AR headset, design a simple 3D game and couple the prototype with a real-time attention classifier. | AR | EEG-based BCI | Unity 3D | (HC: n = 14) | |
30 | Kosmyna et al. [94] | 2020 | To propose a prototype which combines an existing AR headset, the Microsoft HoloLens 2, with EEG BCI system based on CVSA—a process of focusing attention on different regions of the visual field without overt eye movements. | Human computer interaction and AR-BCI systems integration | AR | EEG-based BCI | Unity 3D | (HC: n = 14) |
31 | Wolf et al. [96] | 2021 | To analyze hand-eye coordination in real-time to predict hand actions during target selection and warn users of potential errors before they occur. | AR | AR-Supported Manual Tasks | Unity’s 3D Game engine (2019.4.14f1) Mixed Reality Toolkit (MRTK 2.4.0). | Study 1: patterns in hand-eye coordination (HC: n = 11) Study 2: validating closed-loop user support (HC: n = 12) | |
32 | Wolf et al. [97] | 2022 | To develop a holographic AR mirror system for investigating presence, avatar embodiment, and body weight perception in AR. | AR | Holographic AR mirror system | Unity 2020.3.11f1 LTS | (HC: n = 27) | |
33 | Koop et al. [57] | 2022 | To determine whether the data derived from the HoloLens 2 characterizing lower extremity function during continuous walking and the TUG were equivalent to the outcomes derived using the gold standard MoCap system. | AR | Motion and biomechanical outcomes capture system | n.d. | (HC: n = 66) | |
34 | Held et al. [98] | 2020 | (1) To investigate manipulation of the gait pattern of persons who have had a stroke based on virtual augmentation during overground walking compared to walking without AR performance feedback (2) To investigate the usability of the AR system. | Gait analysis and Rehabilitation | AR | AR parkour course visual system | n.d. | (Pz: n = 1) |
35 | Wolf et al. [99] | 2021 | The present study investigates the potential benefits of AR-based, contextual instructions for ECMO cannulation training as compared to instructions used during conventional training at a university hospital. | AR | AR guide system | Unity 3D Game Engine (Unity Technologies, San Francisco, California). | Comparison between conventional and AR-based instructions for ECMO cannulation training (HC: n = 21) | |
36 | Mill et al. [100] | 2021 | To explore the feasibility of using a wearable headset to live stream teaching ward rounds to remotely based medical students. | Medical Training/Virtual teaching/Tele-mentoring and Tele-consulting systems | AR | Live streamed and remote teaching | Microsoft Teams | Live streamed teaching (HC: n = 53) |
37 | Levy et al. [101] | 2021 | To investigate the value and acceptability of using the Microsoft HoloLens 2 MR headset in a COVID-19 renal medicine ward. | MR | (HC: n = 16: 9 patients and 7 staff) | |||
38 | Sivananthan et al. [102] | 2022 | To assess the feasibility of using a MR headset to deliver remote bedside teaching. | MR | (HC: n = 24: 19 junior doctors and 4 specialist trainees) | |||
39 | Rafi et al. [103] | 2021 | To utilize a new AR technology (the Microsoft HoloLens 2) to deliver students a remote bedside teaching experience. | AR | (HC: n = 30: students) | |||
40 | Dolega-Dolegowski et al. [104] | 2022 | To describe the development of a Microsoft HoloLens 2-based application enabling 3D display of the internal anatomy of dental roots to facilitate the learning process. | AR | AR system | Autodesk Maya Unity software | (HC: n = 12: 6 Dental students 6 Dentists) | |
41 | Bui et al. [105] | 2022 | To evaluate the usability of AR technology in tele-mentorship for managing clinical scenarios. | AR | Entirely hands-free operations, real-time annotations in 3D space, and document sharing | n.d. | (HC: n = 24: 4 mentors 12 mentees) | |
42 | Mentis et al. [107] | 2022 | To introduce the use of AR HMDs for remote instruction in healthcare and present the challenges the authors' team has faced in achieving this application in two contexts: surgical telementoring and paramedic teleconsulting. | AR | Tele-mentoring and tele-consulting | Dynamics 365 Remote Assist https://dynamics.microsoft.com/it-it/mixed-reality/remote-assist/ (accessed on 22 September 2022) Microsoft Teams | n.d. | |
43 | Bala et al. [106] | 2021 | To conduct a proof-of-concept study at a hospital using mixed reality technology (HoloLens 2™) to deliver a remote access teaching ward round. | MR | Live-streaming, remote access, interactive teaching ward round for medical students. | Dynamics 365 Remote Assist https://dynamics.microsoft.com/it-it/mixed-reality/remote-assist/ (accessed on 22 September 2022) Microsoft Teams | (HC: n = 11) (Pz: n = 2) | |
44 | Onishi et al. [108] | 2022 | To propose a combined gaze and breath input system. | MR | Gaze pointing function | Unity version 2020.2.2f1 Holographic Remoting Player 2.2.1 | (HC: n = 10) | |
45 | Johnson et al. [109] | 2021 | To develop and preliminarily test a radiotherapy system for patient posture correction and alignment using MixR visualization. | Other applications | MR | Live visual reference system, enabling real-time feedback and on-line patient posture correction and alignment | Unity v2019.2.21f1 (Unity Technologies, San Francisco, CA) Mixed Reality Toolkit v2.4 (MRTK2.4) Visual Studio v2019 (Microsoft, Redmond, WA) 3D Slicer (www.slicer.org) (accessed on 22 September 2022) Vuforia SDK v9.2.8 https://developer.vuforia.com/ (accessed on 22 September 2022) | Preliminary estimation of registration accuracy (Phantom testing)
46 | Kurazume et al. [110] | 2022 | To present a new prototype (HEARTS 2) consisting of Microsoft HoloLens 2 as well as realistic and animated CG models of older women. | AR | AR training device | Mixed Reality Toolkit v2 | 4 experiments | |
47 | Matyash et al. [111] | 2021 | To investigate the accuracy and precision of the HoloLens 2 position finding capabilities, quantify the pose repeatability and the deviation of the device from a known trajectory. | AR | Position and motion tracking | Unity Visual Studio 2019 | Measurements of pose repeatability and path deviation during motion. |
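A recurring implementation pattern across the Unity-based navigation prototypes listed above (the entries combining Unity, Vuforia, and the Mixed Reality Toolkit) is that a tracking layer supplies the pose of a physical reference (an image target, a tracked instrument, or a registered phantom), and the patient-specific hologram is re-anchored to that pose every frame after applying a rigid registration offset. The Unity C# sketch below is a minimal, hypothetical illustration of that pattern only; it is not taken from any of the cited studies, and the names markerTransform, positionOffset, and rotationOffset are assumptions introduced for the example.

```csharp
using UnityEngine;

// Hypothetical sketch (not from any cited study): keeps a patient-specific hologram
// aligned with a tracked physical reference, the common pattern behind the
// Unity-based navigation prototypes listed above. "markerTransform" is assumed to be
// driven by an external tracking layer (e.g., an image target or tool tracker).
public class ModelToMarkerAlignment : MonoBehaviour
{
    [Tooltip("Transform updated every frame by the tracking layer.")]
    public Transform markerTransform;

    [Tooltip("Rigid offset from the marker to the anatomy, obtained from registration (assumed).")]
    public Vector3 positionOffset = Vector3.zero;
    public Quaternion rotationOffset = Quaternion.identity;

    void LateUpdate()
    {
        if (markerTransform == null) return;

        // Compose the tracked marker pose with the registration offset so the
        // hologram follows the physical reference in real time.
        transform.SetPositionAndRotation(
            markerTransform.TransformPoint(positionOffset),
            markerTransform.rotation * rotationOffset);
    }
}
```

In such systems the registration offset would typically come from fiducial- or surface-based registration against preoperative CT/MRI data, and the residual overlay error of that step is what the phantom and cadaver accuracy evaluations cited above quantify.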
5. Discussion
6. Conclusions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Demystifying the Virtual Reality Landscape. Available online: https://www.intel.com/content/www/us/en/tech-tips-and-tricks/virtual-reality-vs-augmented-reality.html (accessed on 22 September 2021).
- Makhataeva, Z.; Varol, H.A. Augmented reality for robotics: A review. Robotics 2020, 9, 21. [Google Scholar] [CrossRef] [Green Version]
- Hu, H.Z.; Feng, X.B.; Shao, Z.W.; Xie, M.; Xu, S.; Wu, X.H.; Ye, Z.W. Application and Prospect of Mixed Reality Technology in Medical Field. Curr. Med. Sci. 2019, 39, 1–6. [Google Scholar] [CrossRef]
- Morimoto, T.; Kobayashi, T.; Hirata, H.; Otani, K.; Sugimoto, M.; Tsukamoto, M.; Yoshihara, T.; Ueno, M.; Mawatari, M. XR (Extended Reality: Virtual Reality, Augmented Reality, Mixed Reality) Technology in Spine Medicine: Status Quo and Quo Vadis. J. Clin. Med. 2022, 11, 470. [Google Scholar] [CrossRef] [PubMed]
- Morimoto, T.; Hirata, H.; Ueno, M.; Fukumori, N.; Sakai, T.; Sugimoto, M.; Kobayashi, T.; Tsukamoto, M.; Yoshihara, T.; Toda, Y.; et al. Digital Transformation Will Change Medical Education and Rehabilitation in Spine Surgery. Medicina 2022, 58, 508. [Google Scholar] [CrossRef] [PubMed]
- Liu, Y.; Dong, H.; Zhang, L.; El Saddik, A. Technical evaluation of HoloLens for multimedia: A first look. IEEE Multimed. 2018, 25, 8–18. [Google Scholar] [CrossRef]
- Microsoft HoloLens Docs. Available online: https://docs.microsoft.com/en-us/hololens/ (accessed on 22 September 2022).
- Microsoft HoloLens (1st Gen) Docs. Available online: https://docs.microsoft.com/en-us/hololens/hololens1-basic-usage (accessed on 22 September 2022).
- Microsoft HoloLens2 Docs. Available online: https://www.microsoft.com/it-it/hololens (accessed on 22 September 2022).
- Microsoft HoloLens vs Microsoft HoloLens 2. Available online: https://versus.com/en/microsoft-hololens-vs-microsoft-hololens-2#group_features (accessed on 22 September 2022).
- Gasmi, A.; Benlamri, R. Augmented reality, virtual reality and new age technologies demand escalates amid COVID-19. In Novel AI and Data Science Advancements for Sustainability in the Era of COVID-19; Academic Press: Cambridge, MA, USA, 2022; pp. 89–111. [Google Scholar]
- Cartucho, J.; Shapira, D.; Ashrafian, H.; Giannarou, S. Multimodal mixed reality visualisation for intraoperative surgical guidance. Int. J. Comput. Assist. Radiol. Surg. 2020, 15, 819–826. [Google Scholar] [CrossRef] [Green Version]
- Zafar, S.; Zachar, J.J. Evaluation of HoloHuman augmented reality application as a novel educational tool in dentistry. Eur. J. Dent. Educ. 2020, 24, 259–265. [Google Scholar] [CrossRef]
- Hanna, M.G.; Ahmed, I.; Nine, J.; Prajapati, S.; Pantanowitz, L. Augmented reality technology using microsoft hololens in anatomic pathology. Arch. Pathol. Lab. Med. 2018, 142, 638–644. [Google Scholar] [CrossRef] [Green Version]
- Moher, D.; Liberati, A.; Tetzlaff, J.; Altman, D.G. Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. BMJ 2009, 339, b2535. [Google Scholar] [CrossRef] [Green Version]
- Zhu, T.; Jiang, S.; Yang, Z.; Zhou, Z.; Li, Y.; Ma, S.; Zhuo, J. A neuroendoscopic navigation system based on dual-mode augmented reality for minimally invasive surgical treatment of hypertensive intracerebral hemorrhage. Comput. Biol. Med. 2022, 140, 105091. [Google Scholar] [CrossRef] [PubMed]
- Wolf, E.; Dollinger, N.; Mal, D.; Wienrich, C.; Botsch, M.; Latoschik, M.E. Body Weight Perception of Females using Photorealistic Avatars in Virtual and Augmented Reality. In Proceedings of the 2020 IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2020, Recife/Porto de Galinhas, Brazil, 9–13 November 2020; pp. 462–473. [Google Scholar]
- van Lopik, K.; Sinclair, M.; Sharpe, R.; Conway, P.; West, A. Developing augmented reality capabilities for industry 4.0 small enterprises: Lessons learnt from a content authoring case study. Comput. Ind. 2020, 117, 103208. [Google Scholar] [CrossRef]
- Velazco-Garcia, J.D.; Navkar, N.V.; Balakrishnan, S.; Younes, G.; Abi-Nahed, J.; Al-Rumaihi, K.; Darweesh, A.; Elakkad, M.S.M.; Al-Ansari, A.; Christoforou, E.G.; et al. Evaluation of how users interface with holographic augmented reality surgical scenes: Interactive planning MR-Guided prostate biopsies. Int. J. Med. Robot. Comput. Assist. Surg. 2021, 17, e2290. [Google Scholar] [CrossRef] [PubMed]
- Pezzera, M.; Chitti, E.; Borghese, N.A. MIRARTS: A mixed reality application to support postural rehabilitation. In Proceedings of the 2020 IEEE 8th International Conference on Serious Games and Applications for Health, SeGAH 2020, Vancouver, BC, Canada, 12–14 August 2020. [Google Scholar]
- Kumar, K.; Groom, K.; Martin, L.; Russell, G.K.; Elkin, S.L. Educational opportunities for postgraduate medical trainees during the COVID-19 pandemic: Deriving value from old, new and emerging ways of learning. Postgrad. Med. J. 2022, 98, 328–330. [Google Scholar] [CrossRef] [PubMed]
- Yamazaki, A.; Ito, T.; Sugimoto, M.; Yoshida, S.; Honda, K.; Kawashima, Y.; Fujikawa, T.; Fujii, Y.; Tsutsumi, T. Patient-specific virtual and mixed reality for immersive, experiential anatomy education and for surgical planning in temporal bone surgery. Auris Nasus Larynx 2021, 48, 1081–1091. [Google Scholar] [CrossRef]
- Koyachi, M.; Sugahara, K.; Odaka, K.; Matsunaga, S.; Abe, S.; Sugimoto, M.; Katakura, A. Accuracy of Le Fort I osteotomy with combined computer-aided design/computer-aided manufacturing technology and mixed reality. Int. J. Oral Maxillofac. Surg. 2021, 50, 782–790. [Google Scholar] [CrossRef] [PubMed]
- Sugahara, K.; Koyachi, M.; Koyama, Y.; Sugimoto, M.; Matsunaga, S.; Odaka, K.; Abe, S.; Katakura, A. Mixed reality and three dimensional printed models for resection of maxillary tumor: A case report. Quant Imaging Med. Surg. 2021, 11, 2187–2194. [Google Scholar] [CrossRef] [PubMed]
- Aoki, T.; Koizumi, T.; Sugimoto, M.; Murakami, M. Holography-guided percutaneous puncture technique for selective near-infrared fluorescence-guided laparoscopic liver resection using mixed-reality wearable spatial computer. Surg. Oncol. 2020, 35, 476–477. [Google Scholar] [CrossRef] [PubMed]
- Kostov, G.; Wolfartsberger, J. Designing a Framework for Collaborative Mixed Reality Training. Procedia Comput. Sci. 2022, 200, 896–903. [Google Scholar] [CrossRef]
- Sato, Y.; Sugimoto, M.; Tanaka, Y.; Suetsugu, T.; Imai, T.; Hatanaka, Y.; Matsuhashi, N.; Takahashi, T.; Yamaguchi, K.; Yoshida, K. Holographic image-guided thoracoscopic surgery: Possibility of usefulness for esophageal cancer patients with abnormal artery. Esophagus 2020, 17, 508–511. [Google Scholar] [CrossRef] [PubMed]
- Yoshida, S.; Sugimoto, M.; Fukuda, S.; Taniguchi, N.; Saito, K.; Fujii, Y. Mixed reality computed tomography-based surgical planning for partial nephrectomy using a head-mounted holographic computer. Int. J. Urol. 2019, 26, 681–682. [Google Scholar] [CrossRef]
- Shimada, M.; Kurihara, K.; Tsujii, T. Prototype of an Augmented Reality System to Support Animal Surgery using HoloLens 2. In Proceedings of the LifeTech 2022—IEEE 4th Global Conference on Life Sciences and Technologies, Osaka, Japan, 7–9 March 2022; pp. 335–337. [Google Scholar]
- Matsuhashi, K.; Kanamoto, T.; Kurokawa, A. Thermal model and countermeasures for future smart glasses. Sensors 2020, 20, 1446. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Vaz De Carvalho, C. Virtual Experiential Learning in Engineering Education. In Proceedings of the Frontiers in Education Conference, FIE, Covington, KY, USA, 16–19 October 2019. [Google Scholar]
- Hammady, R.; Ma, M.; Strathearn, C. User experience design for mixed reality: A case study of HoloLens in museum. Int. J. Technol. Mark. 2019, 13, 354–375. [Google Scholar] [CrossRef] [Green Version]
- Walko, C.; Maibach, M.J. Flying a helicopter with the HoloLens as head-mounted display. Opt. Eng. 2021, 60, 103103. [Google Scholar] [CrossRef]
- Dan, Y.; Shen, Z.; Xiao, J.; Zhu, Y.; Huang, L.; Zhou, J. HoloDesigner: A mixed reality tool for on-site design. Autom. Constr. 2021, 129, 103808. [Google Scholar] [CrossRef]
- Hertel, J.; Steinicke, F. Augmented reality for maritime navigation assistance—Egocentric depth perception in large distance outdoor environments. In Proceedings of the 2021 IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2021, Lisboa, Portugal, 27 March–1 April 2021; pp. 122–130. [Google Scholar]
- Harborth, D.; Kümpers, K. Intelligence augmentation: Rethinking the future of work by leveraging human performance and abilities. Virtual Real. 2021, 26, 849–870. [Google Scholar] [CrossRef]
- de Boeck, M.; Vaes, K. Structuring human augmentation within product design. Proc. Des. Soc. 2021, 1, 2731–2740. [Google Scholar] [CrossRef]
- Shao, Q.; Sniffen, A.; Blanchet, J.; Hillis, M.E.; Shi, X.; Haris, T.K.; Liu, J.; Lamberton, J.; Malzkuhn, M.; Quandt, L.C.; et al. Teaching American Sign Language in Mixed Reality. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2020, 4, 152. [Google Scholar] [CrossRef]
- Jin, Y.; Ma, M.; Liu, Y. Interactive Narrative in Augmented Reality: An Extended Reality of the Holocaust. In Virtual, Augmented and Mixed Reality. Industrial and Everyday Life Applications; Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Springer: Cham, Switzerland, 2020; pp. 249–269. [Google Scholar]
- Nguyen, V.; Rupavatharam, S.; Liu, L.; Howard, R.; Gruteser, M. HandSense: Capacitive coupling-based dynamic, micro finger gesture recognition. In Proceedings of the SenSys 2019—17th Conference on Embedded Networked Sensor Systems, New York, NY, USA, 10–13 November 2019; pp. 285–297. [Google Scholar]
- Lopez, M.A.; Terron, S.; Lombardo, J.M.; Gonzalez-Crespo, R. Towards a solution to create, test and publish mixed reality experiences for occupational safety and health learning: Training-MR. Int. J. Interact. Multimed. Artif. Intell. 2021, 7, 212–223. [Google Scholar] [CrossRef]
- Moghaddam, M.; Wilson, N.C.; Modestino, A.S.; Jona, K.; Marsella, S.C. Exploring augmented reality for worker assistance versus training. Adv. Eng. Inform. 2021, 50, 101410. [Google Scholar] [CrossRef]
- Maier, W.; Rothmund, J.; Möhring, H.-C.; Dang, P.-D.; Hoffarth, E.; Zinn, B.; Wyrwal, M. Experiencing the structure and features of a machine tool with mixed reality. Procedia CIRP 2022, 106, 244–249. [Google Scholar] [CrossRef]
- De Paolis, L.T.; De Luca, V. The effects of touchless interaction on usability and sense of presence in a virtual environment. Virtual Real. 2022. [Google Scholar] [CrossRef]
- Liao, H.; Dong, W.; Zhan, Z. Identifying map users with eye movement data from map-based spatial tasks: User privacy concerns. Cartogr. Geogr. Inf. Sci. 2022, 49, 50–69. [Google Scholar] [CrossRef]
- Nowak, A.; Zhang, Y.; Romanowski, A.; Fjeld, M. Augmented Reality with Industrial Process Tomography: To Support Complex Data Analysis in 3D Space. In Proceedings of the UbiComp/ISWC 2021—Adjunct Proceedings of the 2021 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2021 ACM International Symposium on Wearable Computers, Virtual, 21–26 September 2021; pp. 56–58. [Google Scholar]
- Woodward, J.; Alemu, F.; López Adames, N.E.; Anthony, L.; Yip, J.C.; Ruiz, J. “It Would Be Cool to Get Stampeded by Dinosaurs”: Analyzing Children’s Conceptual Model of AR Headsets through Co-Design. In Proceedings of the Conference on Human Factors in Computing Systems, New Orleans, LA, USA, 29 April–5 May 2022. [Google Scholar]
- Cetinsaya, B.; Neumann, C.; Reiners, D. Using Direct Volume Rendering for Augmented Reality in Resource-constrained Platforms. In Proceedings of the 2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops, VRW 2022, Christchurch, New Zealand, 12–16 March 2022; pp. 768–769. [Google Scholar]
- Nishi, K.; Fujibuchi, T.; Yoshinaga, T. Development and evaluation of the effectiveness of educational material for radiological protection that uses augmented reality and virtual reality to visualise the behaviour of scattered radiation. J. Radiol. Prot. 2022, 42, 011506. [Google Scholar] [CrossRef]
- Ito, K.; Sugimoto, M.; Tsunoyama, T.; Nagao, T.; Kondo, H.; Nakazawa, K.; Tomonaga, A.; Miyake, Y.; Sakamoto, T. A trauma patient care simulation using extended reality technology in the hybrid emergency room system. J. Trauma Acute Care Surg. 2021, 90, e108–e112. [Google Scholar] [CrossRef]
- Iizuka, K.; Sato, Y.; Imaizumi, Y.; Mizutani, T. Potential Efficacy of Multimodal Mixed Reality in Epilepsy Surgery. Oper. Neurosurg. 2021, 20, 276–281. [Google Scholar] [CrossRef]
- Doughty, M.; Ghugre, N.R.; Wright, G.A. Augmenting Performance: A Systematic Review of Optical See-Through Head-Mounted Displays in Surgery. J. Imaging 2022, 8, 203. [Google Scholar] [CrossRef]
- Glass Enterprise Edition 2. Available online: https://www.google.com/glass/tech-specs/ (accessed on 22 September 2022).
- Magic Leap 1. Available online: https://www.magicleap.com/device (accessed on 22 September 2022).
- Magic Leap 2. Available online: https://ml1-developer.magicleap.com/en-us/home (accessed on 22 September 2022).
- Birlo, M.; Edwards, P.J.E.; Clarkson, M.; Stoyanov, D. Utility of optical see-through head mounted displays in augmented reality-assisted surgery: A systematic review. Med. Image Anal. 2022, 77, 102361. [Google Scholar] [CrossRef]
- Koop, M.M.; Rosenfeldt, A.B.; Owen, K.; Penko, A.L.; Streicher, M.C.; Albright, A.; Alberts, J.L. The Microsoft HoloLens 2 Provides Accurate Measures of Gait, Turning, and Functional Mobility in Healthy Adults. Sensors 2022, 22, 2009. [Google Scholar] [CrossRef]
- Wang, L.; Zhao, Z.; Wang, G.; Zhou, J.; Zhu, H.; Guo, H.; Huang, H.; Yu, M.; Zhu, G.; Li, N.; et al. Application of a three-dimensional visualization model in intraoperative guidance of percutaneous nephrolithotomy. Int. J. Urol. 2022, 29, 838–844. [Google Scholar] [CrossRef]
- Liu, X.; Sun, J.; Zheng, M.; Cui, X. Application of Mixed Reality Using Optical See-Through Head-Mounted Displays in Transforaminal Percutaneous Endoscopic Lumbar Discectomy. BioMed Res. Int. 2021, 2021, 9717184. [Google Scholar] [CrossRef]
- Eom, S.; Kim, S.; Rahimpour, S.; Gorlatova, M. AR-Assisted Surgical Guidance System for Ventriculostomy. In Proceedings of the 2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops, VRW 2022, Christchurch, New Zealand, 12–16 March 2022; pp. 402–405. [Google Scholar]
- Kitagawa, M.; Sugimoto, M.; Haruta, H.; Umezawa, A.; Kurokawa, Y. Intraoperative holography navigation using a mixed-reality wearable computer during laparoscopic cholecystectomy. Surgery 2022, 171, 1006–1013. [Google Scholar] [CrossRef] [PubMed]
- Doughty, M.; Ghugre, N.R. Head-Mounted Display-Based Augmented Reality for Image-Guided Media Delivery to the Heart: A Preliminary Investigation of Perceptual Accuracy. J. Imaging 2022, 8, 33. [Google Scholar] [CrossRef]
- Torabinia, M.; Caprio, A.; Fenster, T.B.; Mosadegh, B. Single Evaluation of Use of a Mixed Reality Headset for Intra-Procedural Image-Guidance during a Mock Laparoscopic Myomectomy on an Ex-Vivo Fibroid Model. Appl. Sci. 2022, 12, 563. [Google Scholar] [CrossRef]
- Gsaxner, C.; Pepe, A.; Schmalstieg, D.; Li, J.; Egger, J. Inside-out instrument tracking for surgical navigation in augmented reality. In Proceedings of the ACM Symposium on Virtual Reality Software and Technology, VRST, Osaka, Japan, 8–10 December 2021. [Google Scholar]
- García-sevilla, M.; Moreta-martinez, R.; García-mato, D.; Pose-diez-de-la-lastra, A.; Pérez-mañanes, R.; Calvo-haro, J.A.; Pascau, J. Augmented reality as a tool to guide psi placement in pelvic tumor resections. Sensors 2021, 21, 7824. [Google Scholar] [CrossRef]
- Amiras, D.; Hurkxkens, T.J.; Figueroa, D.; Pratt, P.J.; Pitrola, B.; Watura, C.; Rostampour, S.; Shimshon, G.J.; Hamady, M. Augmented reality simulator for CT-guided interventions. Eur. Radiol. 2021, 31, 8897–8902. [Google Scholar] [CrossRef] [PubMed]
- Park, B.J.; Hunt, S.J.; Nadolski, G.J.; Gade, T.P. Augmented reality improves procedural efficiency and reduces radiation dose for CT-guided lesion targeting: A phantom study using HoloLens 2. Sci. Rep. 2020, 10, 18620. [Google Scholar] [CrossRef] [PubMed]
- Benmahdjoub, M.; Niessen, W.J.; Wolvius, E.B.; Van Walsum, T. Virtual extensions improve perception-based instrument alignment using optical see-through devices. IEEE Trans. Vis. Comput. Graph. 2021, 27, 4332–4341. [Google Scholar] [CrossRef]
- Benmahdjoub, M.; Niessen, W.J.; Wolvius, E.B.; van Walsum, T. Multimodal markers for technology-independent integration of augmented reality devices and surgical navigation systems. Virtual Real. 2022. [Google Scholar] [CrossRef]
- Farshad, M.; Spirig, J.M.; Suter, D.; Hoch, A.; Burkhard, M.D.; Liebmann, F.; Farshad-Amacker, N.A.; Fürnstahl, P. Operator independent reliability of direct augmented reality navigated pedicle screw placement and rod bending. N. Am. Spine Soc. J. 2021, 8, 100084. [Google Scholar] [CrossRef]
- Doughty, M.; Singh, K.; Ghugre, N.R. SurgeonAssist-Net: Towards Context-Aware Head-Mounted Display-Based Augmented Reality for Surgical Guidance. In Medical Image Computing and Computer Assisted Intervention; Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Springer: Cham, Switzerland, 2021; pp. 667–677. [Google Scholar]
- Nagayo, Y.; Saito, T.; Oyama, H. Augmented reality self-training system for suturing in open surgery: A randomized controlled trial. Int. J. Surg. 2022, 102, 106650. [Google Scholar] [CrossRef] [PubMed]
- Nagayo, Y.; Saito, T.; Oyama, H. A Novel Suture Training System for Open Surgery Replicating Procedures Performed by Experts Using Augmented Reality. J. Med. Syst. 2021, 45, 60. [Google Scholar] [CrossRef]
- Haxthausen, F.V.; Chen, Y.; Ernst, F. Superimposing holograms on real world objects using HoloLens 2 and its depth camera. Curr. Dir. Biomed. Eng. 2021, 7, 20211126. [Google Scholar] [CrossRef]
- Wierzbicki, R.; Pawłowicz, M.; Job, J.; Balawender, R.; Kostarczyk, W.; Stanuch, M.; Janc, K.; Skalski, A. 3D mixed-reality visualization of medical imaging data as a supporting tool for innovative, minimally invasive surgery for gastrointestinal tumors and systemic treatment as a new path in personalized treatment of advanced cancer diseases. J. Cancer Res. Clin. Oncol. 2022, 148, 237–243. [Google Scholar] [CrossRef] [PubMed]
- Brunzini, A.; Mandolini, M.; Caragiuli, M.; Germani, M.; Mazzoli, A.; Pagnoni, M. HoloLens 2 for Maxillofacial Surgery: A Preliminary Study. In Design Tools and Methods in Industrial Engineering II; Lecture Notes in Mechanical Engineering; Springer: Cham, Switzerland, 2022; pp. 133–140. [Google Scholar]
- Thabit, A.; Benmahdjoub, M.; van Veelen, M.L.C.; Niessen, W.J.; Wolvius, E.B.; van Walsum, T. Augmented reality navigation for minimally invasive craniosynostosis surgery: A phantom study. Int. J. Comput. Assist. Radiol. Surg. 2022, 17, 1453–1460. [Google Scholar] [CrossRef]
- Cercenelli, L.; Babini, F.; Badiali, G.; Battaglia, S.; Tarsitano, A.; Marchetti, C.; Marcelli, E. Augmented Reality to Assist Skin Paddle Harvesting in Osteomyocutaneous Fibular Flap Reconstructive Surgery: A Pilot Evaluation on a 3D-Printed Leg Phantom. Front. Oncol. 2021, 11, 804748. [Google Scholar] [CrossRef] [PubMed]
- Felix, B.; Kalatar, S.B.; Moatz, B.; Hofstetter, C.; Karsy, M.; Parr, R.; Gibby, W. Augmented Reality Spine Surgery Navigation: Increasing Pedicle Screw Insertion Accuracy for Both Open and Minimally Invasive Spine Surgeries. Spine 2022, 47, 865–872. [Google Scholar] [CrossRef]
- Tu, P.; Gao, Y.; Lungu, A.J.; Li, D.; Wang, H.; Chen, X. Augmented reality based navigation for distal interlocking of intramedullary nails utilizing Microsoft HoloLens 2. Comput. Biol. Med. 2021, 133, 104402. [Google Scholar] [CrossRef]
- Zhou, Z.; Yang, Z.; Jiang, S.; Zhuo, J.; Zhu, T.; Ma, S. Augmented reality surgical navigation system based on the spatial drift compensation method for glioma resection surgery. Med. Phys. 2022, 49, 3963–3979. [Google Scholar] [CrossRef] [PubMed]
- Ivanov, V.M.; Krivtsov, A.M.; Strelkov, S.V.; Kalakutskiy, N.V.; Yaremenko, A.I.; Petropavlovskaya, M.Y.; Portnova, M.N.; Lukina, O.V.; Litvinov, A.P. Intraoperative use of mixed reality technology in median neck and branchial cyst excision. Future Internet 2021, 13, 214. [Google Scholar] [CrossRef]
- Heinrich, F.; Schwenderling, L.; Joeres, F.; Hansen, C. 2D versus 3D: A Comparison of Needle Navigation Concepts between Augmented Reality Display Devices. In Proceedings of the 2022 IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2022, Christchurch, New Zealand, 12–16 March 2022; pp. 260–269. [Google Scholar]
- Morita, S.; Suzuki, K.; Yamamoto, T.; Kunihara, M.; Hashimoto, H.; Ito, K.; Fujii, S.; Ohya, J.; Masamune, K.; Sakai, S. Mixed Reality Needle Guidance Application on Smartglasses Without Pre-procedural CT Image Import with Manually Matching Coordinate Systems. CardioVascular Interv. Radiol. 2022, 45, 349–356. [Google Scholar] [CrossRef] [PubMed]
- Mitani, S.; Sato, E.; Kawaguchi, N.; Sawada, S.; Sakamoto, K.; Kitani, T.; Sanada, T.; Yamada, H.; Hato, N. Case-specific three-dimensional hologram with a mixed reality technique for tumor resection in otolaryngology. Laryngoscope Investig. Otolaryngol. 2021, 6, 432–437. [Google Scholar] [CrossRef]
- Vávra, P.; Roman, J.; Zonča, P.; Ihnát, P.; Němec, M.; Kumar, J.; Habib, N.; El-Gendi, A. Recent Development of Augmented Reality in Surgery: A Review. J. Healthc. Eng. 2017, 2017, 4574172. [Google Scholar] [CrossRef] [Green Version]
- Zabcikova, M.; Koudelkova, Z.; Jasek, R.; Lorenzo Navarro, J.J. Recent advances and current trends in brain-computer interface research and their applications. Int. J. Dev. Neurosci. 2022, 82, 107–123. [Google Scholar] [CrossRef]
- van Dokkum, L.E.H.; Ward, T.; Laffont, I. Brain computer interfaces for neurorehabilitation-its current status as a rehabilitation strategy post-stroke. Ann. Phys. Rehabil. Med. 2015, 58, 3–8. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Palumbo, A.; Gramigna, V.; Calabrese, B.; Ielpo, N. Motor-imagery EEG-based BCIs in wheelchair movement and control: A systematic literature review. Sensors 2021, 21, 6285. [Google Scholar] [CrossRef] [PubMed]
- Kos’Myna, N.; Tarpin-Bernard, F. Evaluation and comparison of a multimodal combination of BCI paradigms and eye tracking with affordable consumer-grade hardware in a gaming context. IEEE Trans. Comput. Intell. AI Games 2013, 5, 150–154. [Google Scholar] [CrossRef]
- Amores, J.; Richer, R.; Zhao, N.; Maes, P.; Eskofier, B.M. Promoting relaxation using virtual reality, olfactory interfaces and wearable EEG. In Proceedings of the 2018 IEEE 15th International Conference on Wearable and Implantable Body Sensor Networks, BSN 2018, Las Vegas, NV, USA, 4–7 March 2018; pp. 98–101. [Google Scholar]
- Semertzidis, N.; Scary, M.; Andres, J.; Dwivedi, B.; Kulwe, Y.C.; Zambetta, F.; Mueller, F.F. Neo-Noumena: Augmenting Emotion Communication. In Proceedings of the Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020. [Google Scholar]
- Kohli, V.; Tripathi, U.; Chamola, V.; Rout, B.K.; Kanhere, S.S. A review on Virtual Reality and Augmented Reality use-cases of Brain Computer Interface based applications for smart cities. Microprocess. Microsyst. 2022, 88, 104392. [Google Scholar] [CrossRef]
- Kosmyna, N.; Hu, C.Y.; Wang, Y.; Wu, Q.; Scheirer, C.; Maes, P. A Pilot Study using Covert Visuospatial Attention as an EEG-based Brain Computer Interface to Enhance AR Interaction. In Proceedings of the International Symposium on Wearable Computers, ISWC, Cancun, Mexico, 12–16 September 2020; pp. 43–47. [Google Scholar]
- Kosmyna, N.; Wu, Q.; Hu, C.Y.; Wang, Y.; Scheirer, C.; Maes, P. Assessing Internal and External Attention in AR using Brain Computer Interfaces: A Pilot Study. In Proceedings of the 2021 IEEE 17th International Conference on Wearable and Implantable Body Sensor Networks, BSN 2021, Athens, Greece, 27–30 July 2021. [Google Scholar]
- Wolf, J.; Lohmeyer, Q.; Holz, C.; Meboldt, M. Gaze comes in Handy: Predicting and preventing erroneous hand actions in ar-supported manual tasks. In Proceedings of the 2021 IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2021, Bari, Italy, 4–8 October 2021; pp. 166–175. [Google Scholar]
- Wolf, E.; Fiedler, M.L.; Dollinger, N.; Wienrich, C.; Latoschik, M.E. Exploring Presence, Avatar Embodiment, and Body Perception with a Holographic Augmented Reality Mirror. In Proceedings of the 2022 IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2022, Christchurch, New Zealand, 12–16 March 2022; pp. 350–359. [Google Scholar]
- Held, J.P.O.; Yu, K.; Pyles, C.; Bork, F.; Heining, S.M.; Navab, N.; Luft, A.R. Augmented reality-based rehabilitation of gait impairments: Case report. JMIR mHealth uHealth 2020, 8, e17804. [Google Scholar] [CrossRef] [PubMed]
- Wolf, J.; Wolfer, V.; Halbe, M.; Maisano, F.; Lohmeyer, Q.; Meboldt, M. Comparing the effectiveness of augmented reality-based and conventional instructions during single ECMO cannulation training. Int. J. Comput. Assist. Radiol. Surg. 2021, 16, 1171–1180. [Google Scholar] [CrossRef] [PubMed]
- Mill, T.; Parikh, S.; Allen, A.; Dart, G.; Lee, D.; Richardson, C.; Howell, K.; Lewington, A. Live streaming ward rounds using wearable technology to teach medical students: A pilot study. BMJ Simul. Technol. Enhanc. Learn. 2021, 7, 494–500. [Google Scholar] [CrossRef]
- Levy, J.B.; Kong, E.; Johnson, N.; Khetarpal, A.; Tomlinson, J.; Martin, G.F.; Tanna, A. The mixed reality medical ward round with the MS HoloLens 2: Innovation in reducing COVID-19 transmission and PPE usage. Future Healthc. J. 2021, 8, e127–e130. [Google Scholar] [CrossRef]
- Sivananthan, A.; Gueroult, A.; Zijlstra, G.; Martin, G.; Baheerathan, A.; Pratt, P.; Darzi, A.; Patel, N.; Kinross, J. Using Mixed Reality Headsets to Deliver Remote Bedside Teaching during the COVID-19 Pandemic: Feasibility Trial of HoloLens 2. JMIR Form. Res. 2022, 6, e35674. [Google Scholar] [CrossRef]
- Rafi, D.; Stackhouse, A.A.; Walls, R.; Dani, M.; Cowell, A.; Hughes, E.; Sam, A.H. A new reality: Bedside geriatric teaching in an age of remote learning. Future Healthc. J. 2021, 8, e714–e716. [Google Scholar] [CrossRef]
- Dolega-Dolegowski, D.; Proniewska, K.; Dolega-Dolegowska, M.; Pregowska, A.; Hajto-Bryk, J.; Trojak, M.; Chmiel, J.; Walecki, P.; Fudalej, P.S. Application of holography and augmented reality based technology to visualize the internal structure of the dental root—A proof of concept. Head Face Med. 2022, 18, 12. [Google Scholar] [CrossRef] [PubMed]
- Bui, D.T.; Barnett, T.; Hoang, H.; Chinthammit, W. Usability of augmented reality technology in tele-mentorship for managing clinical scenarios-A study protocol. PLoS ONE 2022, 17, e0266255. [Google Scholar] [CrossRef] [PubMed]
- Bala, L.; Kinross, J.; Martin, G.; Koizia, L.J.; Kooner, A.S.; Shimshon, G.J.; Hurkxkens, T.J.; Pratt, P.J.; Sam, A.H. A remote access mixed reality teaching ward round. Clin. Teach. 2021, 18, 386–390. [Google Scholar] [CrossRef] [PubMed]
- Mentis, H.M.; Avellino, I.; Seo, J. AR HMD for Remote Instruction in Healthcare. In Proceedings of the 2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops, VRW 2022, Christchurch, New Zealand, 12–16 March 2022; pp. 437–440. [Google Scholar]
- Onishi, R.; Morisaki, T.; Suzuki, S.; Mizutani, S.; Kamigaki, T.; Fujiwara, M.; Makino, Y.; Shinoda, H. GazeBreath: Input Method Using Gaze Pointing and Breath Selection. In Proceedings of the Augmented Humans 2022, Kashiwa, Chiba, Japan, 13–15 March 2022; pp. 1–9. [Google Scholar]
- Johnson, P.B.; Jackson, A.; Saki, M.; Feldman, E.; Bradley, J. Patient posture correction and alignment using mixed reality visualization and the HoloLens 2. Med. Phys. 2022, 49, 15–22. [Google Scholar] [CrossRef]
- Kurazume, R.; Hiramatsu, T.; Kamei, M.; Inoue, D.; Kawamura, A.; Miyauchi, S.; An, Q. Development of AR training systems for Humanitude dementia care. Adv. Robot. 2022, 36, 344–358. [Google Scholar] [CrossRef]
- Matyash, I.; Kutzner, R.; Neumuth, T.; Rockstroh, M. Accuracy measurement of HoloLens2 IMUs in medical environments. Curr. Dir. Biomed. Eng. 2021, 7, 633–636. [Google Scholar] [CrossRef]
- Xu, X.; Mangina, E.; Campbell, A.G. HMD-Based Virtual and Augmented Reality in Medical Education: A Systematic Review. Front. Virtual Real. 2021, 2, 692103. [Google Scholar] [CrossRef]
- D’Amato, R.; Cutolo, F.; Badiali, G.; Carbone, M.; Lu, H.; Hogenbirk, H.; Ferrari, V. Key Ergonomics Requirements and Possible Mechanical Solutions for Augmented Reality Head-Mounted Displays in Surgery. Multimodal Technol. Interact. 2022, 6, 15. [Google Scholar] [CrossRef]
- Weech, S.; Kenny, S.; Barnett-Cowan, M. Presence and Cybersickness in Virtual Reality Are Negatively Related: A Review. Front. Psychol. 2019, 10, 158. [Google Scholar] [CrossRef] [Green Version]
- Rebenitsch, L.; Owen, C. Review on cybersickness in applications and visual displays. Virtual Real. 2016, 20, 101–125. [Google Scholar] [CrossRef]
- Hughes, C.L.; Fidopiastis, C.; Stanney, K.M.; Bailey, P.S.; Ruiz, E. The Psychometrics of Cybersickness in Augmented Reality. Front. Virtual Real. 2020, 1, 602954. [Google Scholar] [CrossRef]
- Vovk, A.; Wild, F.; Guest, W.; Kuula, T. Simulator Sickness in Augmented Reality Training Using the Microsoft HoloLens. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal, QC, Canada, 21–26 April 2018; pp. 1–9. [Google Scholar]
- McCauley, M.E.; Sharkey, T.J. Cybersickness: Perception of self-motion in virtual environments. Presence: Teleoper. Virtual Environ. 1992, 1, 311–318. [Google Scholar]
- Moro, C.; Štromberga, Z.; Raikos, A.; Stirling, A. The effectiveness of virtual and augmented reality in health sciences and medical anatomy. Anat. Sci. Educ. 2017, 10, 549–559. [Google Scholar] [CrossRef] [Green Version]
- Saredakis, D.; Szpak, A.; Birckhead, B.; Keage, H.A.D.; Rizzo, A.; Loetscher, T. Factors Associated With Virtual Reality Sickness in Head-Mounted Displays: A Systematic Review and Meta-Analysis. Front. Hum. Neurosci. 2020, 14, 96. [Google Scholar] [CrossRef] [Green Version]
- Dilanchian, A.T.; Andringa, R.; Boot, W.R. A Pilot Study Exploring Age Differences in Presence, Workload, and Cybersickness in the Experience of Immersive Virtual Reality Environments. Front. Virtual Real. 2021, 2, 736793. [Google Scholar] [CrossRef]
- Haptics for Virtual Reality (VR) and Mixed Reality (MR). Available online: https://www.interhaptics.com/products/haptics-for-vr-and-mr (accessed on 22 September 2021).
- Alberts, J.L.; Modic, M.T.; Udeh, B.L.; Zimmerman, N.; Cherian, K.; Lu, X.; Gray, R.; Figler, R.; Russman, A.; Linder, S.M. A Technology-Enabled Concussion Care Pathway Reduces Costs and Enhances Care. Phys. Ther. 2020, 100, 136–148. [Google Scholar] [CrossRef]
Database Name | URL | Access Date |
---|---|---|
Pubmed | https://pubmed.ncbi.nlm.nih.gov/ | 19 April 2022 |
IEEEXplore | https://www.ieee.org/ | 19 April 2022 |
Science Direct | https://www.sciencedirect.com/ | 19 April 2022 |
Scopus | https://www.scopus.com/ | 19 April 2022 |
Feature | Google Glass 2 | HoloLens 1 | HoloLens 2 | Magic Leap 1 | Magic Leap 2 |
---|---|---|---|---|---|
Specifications | |||||
Release Date | 2019 | 2016 | 2019 | 2018 | 2022 |
Price | $999 | $3000 | $3500 | $2295 | $3299 |
Status | Available | Discontinued | Available | Available | Upcoming |
Design | Glasses-like | Hat-like | Hat-like | Glasses-like | Glasses-like |
Weight | 46 g | 579 g | 566 g | 345 g | 260 g |
Battery life | 8-h | 2.5-h | 3-h | 3/3.5-h | 3.5-h continuous use 7-h sleep mode |
Interaction | Touchpad | Head, hand, voice | Head, hand, voice | Controller | Eye, controller |
Eye Tracking | No | No | Yes | Yes | Yes |
Computing | On-board | On-board | On-board | On-board | External pad |
Field of View | 30° diagonal | 30 × 17.5° | 43 × 29° | 40 × 30° | 44 × 53° |
Focal Planes | Single Fixed | Single Fixed | Single Fixed | Two Fixed | Single Fixed |
Optics | Beam Splitter | Waveguide | Waveguide | Waveguide | Waveguide |
SLAM | 6 DoF | 6 DoF | 6 DoF | 6 DoF | 6 DoF |
PRO | Super lightweight and very unobtrusive; battery life | Comfortable; easy to use; support for Microsoft platforms | Comfortable; easy to use; very elegant device high-quality materials; navigation with hand gestures and voice; excellent positional tracking | Large FoV | Largest FoV |
CONS | Intended for developers, only a few applications available natively | Small field of view; text can be difficult to read | Battery life; less suitable for industry | Price; battery life | Less suitable for use in heavy industry |
Category | Specification | HoloLens 2 | HoloLens 1 |
---|---|---|---|---|
COMPUTE SPECIFICATIONS | CPU Model | Qualcomm Snapdragon 850 Compute Platform | Intel Atom x5-Z8100P @ 1.04 GHz | |
Core Architecture | ARM Cortex-A75 | Intel Airmont | ||
Logical CPU Cores | 8 | 4 | ||
Instruction Set | ARMv8 | 32-bit X86 | ||
Memory | 4 GB LPDDR4× DRAM | 1 GB LPDDR3 | ||
Storage | 64 GB UFS 2.1 | 64 GB | ||
HPU | Model | 2nd generation custom-built holographic processing unit | 1st generation custom-built holographic processing unit | |
HPU Memory | Not specified | 1 GB LPDDR3 RAM | ||
WIRELESS CONNECTIVITY | Wifi | WiFi 5 (802.11ac 2 × 2) | WiFi 5 (802.11ac) | |
Bluetooth | Bluetooth LE 5.0 | Bluetooth 4.1 + BLE | ||
USB | USB Type-C | Micro USB 2.0 | ||
DISPLAY | Optics | See-through holographic Lenses (waveguides) | See-through holographic lenses (waveguides) | |
Resolution | 2k 3:2 light engines (screen aspect ratio) | 2 HD 16:9 light engines (screen aspect ratio) | ||
Holographic density | >2.5k radiants (light points per radian) | 2.5k radiants (light points per radian) | ||
Eye-based rendering | Display optimization for 3D eye position | Automatic pupillary distance calibration | ||
Visible FoV | 43° horizontal 29° vertical 52° diagonal | 30° horizontal 17° vertical | ||
AUDIO | Microphone array | 5 channels | 4 channels | |
SENSORS | CAMERA | Resolution | 8-MP stills | 2.4 MP (2048 × 1152)
Video Resolution | 1080 p30 | 1.1 MP (1408 × 792) | ||
Video Speed | 24 fps | 30 fps | ||
IMU | Accelerometer, gyroscope, magnetometer | 1 | 1 | |
AUDIO | Speakers | Built-in, spatial audio | Built-in, spatial audio |
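To relate the display rows above to perceived image sharpness, a rough angular pixel density can be computed by dividing the horizontal pixel count of the light engines by the horizontal field of view. The short C# snippet below performs this back-of-the-envelope calculation; the 2048 px and 1280 px horizontal resolutions are assumptions introduced purely for illustration (the table only specifies "2k 3:2" and "2 HD 16:9" light engines), so the outputs are indicative estimates rather than official specifications.

```csharp
using System;

// Back-of-the-envelope angular resolution estimate from the display rows above.
// The horizontal pixel counts (2048 px, 1280 px) are ASSUMED values for illustration:
// the table only specifies "2k 3:2" and "2 HD 16:9" light engines.
class AngularResolutionEstimate
{
    // Pixels per degree = horizontal pixels / horizontal field of view (degrees).
    static double PixelsPerDegree(double horizontalPixels, double horizontalFovDegrees)
        => horizontalPixels / horizontalFovDegrees;

    static void Main()
    {
        // HoloLens 2: assumed ~2048 px horizontal, 43° horizontal FoV (from the table).
        Console.WriteLine($"HoloLens 2 ~ {PixelsPerDegree(2048, 43):F1} px/deg");

        // HoloLens 1: assumed ~1280 px horizontal, 30° horizontal FoV (from the table).
        Console.WriteLine($"HoloLens 1 ~ {PixelsPerDegree(1280, 30):F1} px/deg");
    }
}
```

Under these assumptions the estimate lands near 47–48 px/deg for HoloLens 2 versus roughly 43 px/deg for HoloLens 1, broadly consistent with the ">2.5k radiants" versus "2.5k radiants" holographic density entries in the table.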
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations. |
© 2022 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).