Evaluation of HMDs by QFD for Augmented Reality Applications in the Maxillofacial Surgery Domain
Abstract
1. Introduction
2. Materials and Methods
2.1. Desk Research on Maxillofacial Practices
2.1.1. Oncological and Reconstructive Surgery
2.1.2. Orthognathic Surgery
2.1.3. Maxillofacial Trauma Surgery
2.2. Desk Research on Augmented Reality Technologies
- Head-up displays (HUD) are mostly used in vehicles (e.g., airplanes and cars) to give additional information without taking the driver’s eyes off the road. This kind of device has a fixed transparent screen, in the line of sight of the pilot/driver, onto which information is projected.
- Holographic displays use light diffraction to generate 3D objects in the real space, or to give a depth to displayed images without using 3D glasses or other tools.
- Smart glasses or wearable devices may take many forms, ranging from glasses-type devices to more futuristic visors. Visors are also known as head-mounted displays (HMD), and they include two families: optical see-through and video see-through devices. This classification derives from the technological solution adopted to show the real-world image to the user: an optical see-through device lets the user view reality directly through transparent displays, while a video see-through device shows reality through camera images.
- Handheld devices are tools such as smartphones or tablets, where AR is realised as video see-through by capturing reality with the device’s camera.
Comparison of Market HMDs
- Computing power: the horsepower of a device in terms of hardware and software capabilities.
- Display: device core specification in AR applications.
- Audio: built-in microphone and earphones.
- Sensors: device features necessary to get information from the surroundings.
- File formats: the multimedia documents the device can handle, and hence the kinds of information the user may consult.
- Connectivity: ports and technologies to connect with other devices.
- User inputs: ways for the users to control the device.
- Power: how the device is powered.
- General features: miscellaneous information about the device.
2.3. Quality Function Deployment
- Points assignment—the non-measurable specifications, such as the CPU or the camera, will be ranked from worst to best and receive points from one to five. For a feature that is simply present or absent (e.g., the depth camera or hand tracking), a single plus/minus one point will be used.
- Adjustment—the assigned points will be tuned using each scenario’s requirement weights.
- Standardisation—an upper limit of five will be imposed on the weighted points, which will then be proportionally re-calculated. This operation is necessary to avoid distorted information when comparing the three scenarios.
- Aggregation—the standardised scores will be aggregated to summarise device performance in each scenario.
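As a minimal sketch, the four scoring steps above can be expressed in Python. All device names, raw points, and requirement weights below are hypothetical placeholders, not the values used in the study:

```python
# Sketch of the four QFD scoring steps (hypothetical data, not the
# study's actual devices, points, or weights).

def standardise(points, upper=5.0):
    """Proportionally rescale a column of weighted points so its maximum equals `upper`."""
    peak = max(points)
    return [p * upper / peak for p in points]

# Step 1 - points assignment: ordinal specifications ranked worst (1) to
# best (5); present/absent features scored +1 or -1.
raw = {
    "Device A": {"cpu": 5, "display": 3, "depth_camera": +1},
    "Device B": {"cpu": 2, "display": 4, "depth_camera": -1},
}

# Step 2 - adjustment: tune the points with the scenario's requirement weights.
weights = {"cpu": 0.8, "display": 1.0, "depth_camera": 0.5}
adjusted = {
    dev: {spec: pts * weights[spec] for spec, pts in points.items()}
    for dev, points in raw.items()
}

# Step 3 - standardisation: force an upper limit of five on each column so
# the three scenarios remain comparable.
specs = list(weights)
columns = {s: standardise([adjusted[d][s] for d in adjusted]) for s in specs}

# Step 4 - aggregation: sum the standardised scores per device.
totals = {
    dev: sum(columns[s][i] for s in specs)
    for i, dev in enumerate(adjusted)
}
print(totals)
```

Each scenario supplies its own `weights` dictionary, so the same raw points yield one comparable ranking per scenario.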
3. Results
3.1. Scenario n. 1—Oncological and Reconstructive Surgery
- Simple and smart user interface: smooth navigation through menus and an intuitive layout for tools and information.
- Custom user interface: custom interface layout for user comfort.
- Lightweight and comfortable: the device must not hinder or weary the surgeon.
- Ample and unobstructed field of view: the device must not hinder the surgeon’s field of view. The AR is an addition to, and not a substitution for, human view.
- Real-world colour fidelity: the device must not alter the aspect of real objects.
- User feedback: using visual (colour maps, vector maps, etc.) or audio messaging for real-time surgical assistance.
- Robust voice control: the number of people in the operating room should not affect the voice-command functionality.
- Robust to occlusions: the device must keep track of the operation site, surgical tools, and the surgeon’s hands regardless of any visual hindrance.
- Robust to contrast: the device must not be influenced by any light sources (e.g., surgical lights, headlights) present in the operating room.
- Robust face tracking: the device must never lose the operation site tracking. The face is a sensitive site in case of head repositioning due to the number of joints in the head/neck area.
- Robust leg tracking: the device must never lose the tracking of the leg operation site irrespective of how much soft tissue is involved.
- Display and navigate face DICOM: the face is important as the site of the implant; therefore, all available medical data must be displayed at the surgeon’s request.
- Display and navigate leg DICOM: the leg, as the donor site for the fibula-free flap procedure, needs all available medical data to be displayed at the surgeon’s request.
- Display and handle 3D models of bones: it is fundamental that bone resection models are displayed.
- Display and handle soft tissue 3D models: in oncologic and reconstructive surgery, soft tissue may be included in the collected graft.
- Display and handle non-standard tools 3D models (guides, splints, plates): tailored surgical tools must be easily managed.
- Display osteotomy cut lines: virtual guides to help the surgeon to identify the position and orientation of the resection planes.
- Choice of display style for every 3D model: solid, wireframe, or hidden. Many objects are visualised in mixed reality; choosing the best display style for each one is, therefore, necessary.
- Share live images on other devices: viewing the operation site from a user’s point-of-view aids teamwork and is useful as a training method.
- Record images for training and evaluation: recorded images from the surgeon’s point-of-view are useful for training and postoperative checks.
- Device autonomy must be adequate for the required task: in battery-powered devices, the autonomy must be adequate to last the duration of the operation.
- Cordless device: many people and devices are present in the operating room; therefore, easy-to-manoeuvre solutions are preferable.
- Optical zoom: some structures and vessels are too small to be seen by the human eye, while a digital zoom may alter the images.
- Add-on accessories: a modular design allows the device to be customised.
3.2. Scenario n. 2—Orthognathic Surgery
- Simple and smart user interface: smooth navigation through menus and an intuitive layout for tools and information.
- Custom user interface: custom interface layout for user comfort.
- Lightweight and comfortable: the device must not hinder or weary the surgeon.
- Ample and unobstructed field of view: the device must not hinder the surgeon’s field of view. The AR is an addition to, and not a substitution for, normal view.
- Real-world colour fidelity: the device must not alter the aspect of real objects.
- User feedback: using visual (colour maps, vector maps, etc.) or audio messages for real-time surgical assistance.
- Robust voice control: the number of people in the operating room should not affect the voice-command functionality.
- Robust to occlusions: the device must keep track of the operation site, surgical tools, and the surgeon’s hands regardless of any visual hindrance.
- Robust to contrast: the device must not be influenced by any light sources (e.g., surgical lights, headlights) present in the operating room.
- Robust face tracking: the device must never lose the operation site tracking. The face is a sensitive site in case of head repositioning due to the number of joints in the head/neck area.
- Display and navigate face DICOM: the face is important as the site of the implant; therefore, all available medical data must be displayed at the surgeon’s request.
- Display and handle 3D models of bones: it is fundamental that bone resection models are displayed.
- Display and handle standard tools 3D models (distractors, locators): a library of standard tools used during operations must be managed.
- Display and handle non-standard tools 3D models (guides, splints, plates): tailored surgical tools must be easily managed.
- Display osteotomy cut lines: virtual guides to assist the surgeon to identify the position and orientation of the resection planes.
- Choice of display style for every 3D model: solid, wireframe, or hidden. Many objects are visualised in mixed reality; choosing the best display style for each one is, therefore, necessary.
- Share live images on other devices: viewing the operation site from the user’s point-of-view assists teamwork and is useful as a means of training.
- Record images for training and evaluation: recorded images from the surgeon’s point-of-view are useful for training and postoperative checks.
- Device autonomy must be adequate for the required task: in battery-powered devices, the autonomy must be adequate to last the duration of the operation.
- Cordless device: many people and devices are present in the operating room; therefore, easy-to-manoeuvre solutions are preferable.
- Optical zoom: some structures and vessels are too small to be seen by the human eye, while a digital zoom may alter the images.
- Add-on accessories: a modular design allows the device to be customised.
3.3. Scenario n. 3—Maxillofacial Trauma Surgery
- Simple and smart user interface: smooth navigation through menus and an intuitive layout for tools and information.
- Custom user interface: custom interface layout for user comfort.
- Lightweight and comfortable: the device must not hinder or weary the surgeon.
- Ample and unobstructed field of view: the device must not hinder the surgeon’s field of view. The AR is an addition to, and not a substitution for, human view.
- Real-world colour fidelity: the device must not alter the aspect of real objects.
- User feedback: using visual (colour maps, vector maps, etc.) or audio messages for real-time surgical assistance.
- Robust voice control: the number of people in the operating room should not affect the voice-command functionality.
- Robust to occlusions: the device must keep track of the operation site, surgical tools, and the surgeon’s hands regardless of any visual hindrance.
- Robust to contrast: the device must not be influenced by any light sources (e.g., surgical lights, headlights) present in the operating room.
- Robust face tracking: the device must never lose the operation site tracking. The face is a sensitive site in case of head repositioning due to the number of joints in the head/neck area.
- Display and navigate face DICOM: the face is important as the site of the implant; therefore, all available medical data must be displayed at the surgeon’s request.
- Display and handle 3D models of bones: it is fundamental that bone resection models are displayed.
- Display and handle non-standard tools 3D models (guides, splints, plates): tailored surgical tools must be easily managed.
- Choice of display style for every 3D model: solid, wireframe, or hidden. Many objects are visualised in mixed reality; choosing the best display style for each one is, therefore, necessary.
- Share live images on other devices: viewing the operation site from a user’s point-of-view aids teamwork and is useful as a training method.
- Record images for training and evaluation: recorded images from the surgeon’s point-of-view are useful for training and postoperative checks.
- Device autonomy must be adequate for the required task: in battery-powered devices, the autonomy must be adequate to last the duration of the operation.
- Cordless device: many people and devices are present in the operating room; therefore, easy-to-manoeuvre solutions are preferable.
- Optical zoom: some structures and vessels are too small to be seen by the human eye, while a digital zoom may alter the images.
- Add-on accessories: a modular design allows the device to be customised.
4. Discussion
5. Conclusions
- Optical zoom to look at small structures during the vascular anastomosis in oncological surgery. Optical see-through devices struggle to overlay a digital magnification on the scene; video see-through devices manage it more easily, but digital magnification suffers from resolution issues.
- Live blood-flow view to verify, during surgery, the proper vascularisation of the implanted graft.
- A non-optical landmark reader to overcome the need for a clear, unobstructed scene is essential so as not to lose sight of landmarks and, consequently, the registration between reality and virtual elements.
Supplementary Materials
Author Contributions
Funding
Conflicts of Interest
References
| Computing Power | Display | Audio | Sensors | File Formats | Connectivity | User Inputs | Power | General Features |
|---|---|---|---|---|---|---|---|---|
| CPU | Display technology | Earphones | Camera | Videos | Bluetooth | Physical commands | Battery | Exterior dimensions |
| Operating system | Resolution | Microphone | Depth camera | Images | Wi-Fi | Voice commands | Battery duration | Mass |
| RAM | Refresh rate | | Position | 3D | Ports | Eye tracking | | Modularity |
| Storage | Angle of view | | Light | | | Hand tracking | | Cordless |
| | Colour reproduction | | | | | | | Wearability |
Purpose | Device |
---|---|
Industrial | Epson MOVERIO PRO BT-2000, Vuzix M4000 |
Professional | Microsoft HoloLens 2, Magic Leap One, Varjo XR-3 |
Entertainment | Epson MOVERIO BT-300, Epson MOVERIO BT-350 |
Specification | Best | Worst |
---|---|---|
Computing power | 8 cores @ 2.5 GHz, 6 GB RAM, 64 GB memory, Android 9 | 2 cores @ 1.25 GHz, 1 GB RAM, 40 GB memory, Android 4
Display | optical see-through—960 × 540 @ 23° diagonal | optical see-through—845 × 480 @ 28° diagonal |
Audio | 3× microphones + stereo audio | microphone + stereo audio |
Sensors | 5 MP camera + 640 × 480 depth camera + 2× 9-axis IMU + ALS | 12.8 MP camera + 9-axis IMU |
Connectivity | Bluetooth ver. 5 + dual-band Wi-Fi + USB-C 3.1 port | Bluetooth ver. 4 + dual-band Wi-Fi + micro-USB port
User inputs | buttons + voice controls | buttons + voice controls |
Power | battery powered up to 12 h autonomy | battery powered 4 h autonomy |
Certification | IP 67 | shock resistance |
Specification | Best | Worst |
---|---|---|
Computing power | NVIDIA Parker SoC + Pascal GPU, 4 GB RAM, Lumin OS | no computing power on-board
Display | video see-through—1920 × 1920 (70PPD) @ 115° diagonal | optical see-through—1280 × 960 @ 50° diagonal |
Audio | 5-channel microphone + spatial audio | 3.5 mm audio jack |
Sensors | 2× 12 MP camera + LiDAR and RGB depth camera | 2 MP camera + 9-axis IMU |
Connectivity | Bluetooth ver. 5 + dual band Wi-Fi USB-C port | cabled USB and display ports connections |
User inputs | voice control + 2× IR sensors eye track + two-hands tracking | IR sensor for eye and hand tracking |
Power | battery powered 3.5 h autonomy | cable powered
Certification | none | none |
Specification | Best | Worst |
---|---|---|
Computing power | 4 cores @ 1.44 GHz | 4 cores @ 1.44 GHz
Display | optical see-through—640 × 720 @ 23° diagonal | optical see-through—640 × 720 @ 23° diagonal |
Audio | microphone + stereo audio | stereo audio |
Sensors | 5 MP camera + 2× 9-axis IMU + ALS | 5 MP camera + 2× 9-axis IMU |
Connectivity | Bluetooth ver. 4.1 + dual band Wi-Fi | Bluetooth ver. 4.1 + dual band Wi-Fi |
User inputs | buttons + track pad | buttons |
Power | battery powered 6 h autonomy | battery powered 6 h autonomy
Certification | none | none |
Scenario Requirements vs. Device Specifications | Computing Power | Display | Audio | Sensors | File Formats | Connectivity | User Inputs | Power | General Features |
---|---|---|---|---|---|---|---|---|---|
Visual cutting guides | |||||||||
Multiple sites DICOM | |||||||||
Preoperatory planning information | |||||||||
Custom and standard tools models | |||||||||
Live feedback |
Scenario | PRO BT-2000 | BT-350 | BT-300 | XR-3 | M4000 | HoloLens 2 | Magic Leap 1 | Best Case |
---|---|---|---|---|---|---|---|---|
Oncological and reconstructive surgery | 5.59 | 5.99 | 5.60 | 5.74 | 8.15 | 7.72 | 8.81 | 11.73 |
Orthognathic surgery | 5.61 | 6.00 | 5.63 | 5.70 | 8.10 | 7.64 | 8.80 | 11.67 |
Maxillofacial trauma surgery | 5.20 | 5.73 | 5.30 | 5.37 | 7.83 | 7.28 | 8.37 | 11.03 |
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Carpinello, A.; Vezzetti, E.; Ramieri, G.; Moos, S.; Novaresio, A.; Zavattero, E.; Borbon, C. Evaluation of HMDs by QFD for Augmented Reality Applications in the Maxillofacial Surgery Domain. Appl. Sci. 2021, 11, 11053. https://doi.org/10.3390/app112211053