Case Report

Virtual Reality (VR) Simulation and Augmented Reality (AR) Navigation in Orthognathic Surgery: A Case Report

1 Department of Oral and Maxillofacial Surgery, Chosun University Dental Hospital, Gwangju 61452, Korea
2 Department of Oral and Maxillofacial Surgery, College of Dentistry, Chosun University, Gwangju 61452, Korea
* Author to whom correspondence should be addressed.
Appl. Sci. 2021, 11(12), 5673; https://doi.org/10.3390/app11125673
Submission received: 25 May 2021 / Revised: 12 June 2021 / Accepted: 14 June 2021 / Published: 18 June 2021
(This article belongs to the Section Applied Dentistry and Oral Sciences)

Abstract

VR and AR technologies have gradually developed to the point where they can assist operators in the surgical field. In this study, we present a case of VR simulation for preoperative planning and AR navigation applied to orthognathic surgery. The mean difference between the preplanned data and the postoperative results was 3.00 mm, with a standard deviation of 1.44 mm. VR simulation can provide great advantages for 3D medical simulation, with accurate manipulation and immersiveness. AR navigation has great potential in medical applications; its advantages include displaying real-time augmented 3D models of the patient. Moreover, it is easily applied in the surgical field, without complicated 3D simulations or 3D-printed surgical guides.


1. Introduction

Orthognathic surgery is a common procedure performed by oral and maxillofacial surgeons. New dental technologies focus on the early detection of pathologic lesions through noninvasive screening and on surgical applications using three-dimensional (3D) simulation and navigation [1].
Nowadays, most orthognathic surgery is computer-assisted, involving surgical planning with simulation software and the fabrication of customized surgical guides by 3D printing. However, the process of manufacturing surgical guides is time-consuming and involves additional costs. Furthermore, surgical guides cannot respond to unexpected clinical situations [2].
A virtual reality (VR) medical simulation is generally composed of a virtual patient in a medical 3D model, with a reconstructed pathological lesion, surgical instruments, and tools for a VR interface [3]. Surgical simulation can play an important role in how we approach surgical training and prepare for challenging operations [4]. VR simulation has many advantages, such as immersiveness, easy simulation without learning dedicated software, and direct handling of 3D models. These benefits can provide the clinician with fast and easy simulation for preoperative planning.
Augmented reality (AR) is a technology that superimposes a virtual three-dimensional model on the operator's field of view, displaying a 3D computer-generated model overlaid onto real objects in real time [5]. When used as a navigation system, AR can provide immediate guidance by intuitively presenting the information the surgeon needs, such as deep anatomical structures or accurate surgical guidance. As a result, the accuracy of the operation is expected to improve and the operation time to decrease. AR can provide clearer information, improve safety, and lower risks [6]. As image recognition and tracking technology gradually improve, the real patient can be tracked by combining a camera system with optical markers [7].
This study presents the application of VR planning and an AR-guided system for orthognathic surgery in a patient with a Class III dentofacial deformity.

2. Case Presentation

This study protocol was approved by the Institutional Ethics Committee (CUDHIRB 2007003). Written informed consent was obtained from the patient.
A 24-year-old male visited the department of oral and maxillofacial surgery for orthognathic surgery. Radiographic optical markers (Marker, Dio Co., Ltd., Busan, Korea) were attached to the patient's face, cone beam CT (CBCT) images were acquired (Hitachi, Marunouchi, Japan), and facial scan data were obtained with a handheld scanner (Artec Space Spider, Artec Group, Luxembourg). Skin and skull 3D models were reconstructed with Mimics software (18.0, Materialise, Leuven, Belgium), and the facial scan and oral scan 3D models were imported. Using the registration tool in Mimics, the facial scan model was registered to the 3D skin model generated from the CBCT, based on the optical markers (Figure 1).
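Conceptually, this marker-based step is a rigid registration of paired landmark coordinates. The sketch below shows one standard way to compute such a transform (the Kabsch/SVD solution) from corresponding marker points; all coordinates are hypothetical, and the actual Mimics registration pipeline may work differently.

```python
# Hedged sketch: landmark-based rigid registration of the facial scan to the
# CBCT skin model from paired optical-marker coordinates (Kabsch/SVD solution).
# All coordinates are hypothetical; Mimics' own registration may differ.
import numpy as np

def rigid_register(source_pts, target_pts):
    """Return rotation R and translation t mapping source_pts onto target_pts.

    Both arguments are (N, 3) arrays of corresponding marker coordinates in mm.
    """
    src_centroid = source_pts.mean(axis=0)
    tgt_centroid = target_pts.mean(axis=0)
    src_c = source_pts - src_centroid
    tgt_c = target_pts - tgt_centroid

    # Cross-covariance and SVD (Kabsch algorithm), with a reflection guard.
    H = src_c.T @ tgt_c
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt_centroid - R @ src_centroid
    return R, t

# Hypothetical marker coordinates (mm) on the facial scan and the CT skin model.
scan_markers = np.array([[10.0, 42.1, 3.3], [55.2, 40.8, 2.9],
                         [32.7, 70.5, 8.1], [33.1, 15.4, 6.0]])
ct_markers = np.array([[12.4, 44.0, 5.1], [57.6, 42.5, 4.8],
                       [35.0, 72.3, 10.2], [35.5, 17.1, 7.9]])

R, t = rigid_register(scan_markers, ct_markers)
registered = scan_markers @ R.T + t
rmse = np.sqrt(np.mean(np.sum((registered - ct_markers) ** 2, axis=1)))
print(f"marker registration RMSE: {rmse:.2f} mm")
```

The residual RMSE over the markers gives a quick sanity check on how well the facial scan and the CT skin model agree after registration.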
VR simulation was performed in virtual space with the segmented and registered 3D models, which were manipulated with a handheld controller (Vive Pro, HTC, Taipei, Taiwan) while wearing a head-mounted display (HMD). The maxillary position was adjusted for posterior impaction and advancement, and the distal segment of the mandible was then moved to the optimal position, matched with the maxillary position (Figure 2).
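The VR adjustment above amounts to applying a rigid transform to the segmented maxillary model. As a rough illustration, the numpy sketch below applies a hypothetical advancement plus a small rotation about a transverse axis (standing in for posterior impaction) to a few example vertices; in the actual workflow this transform comes from the controller interaction in the headset, not fixed numbers.

```python
# Hedged sketch: applying a planned maxillary move (advancement plus posterior
# impaction) as a rigid transform to segment vertices. Pivot point, axis
# convention, and magnitudes are purely illustrative.
import numpy as np

def rotation_about_x(angle_deg):
    """4x4 homogeneous rotation about the transverse (x) axis."""
    a = np.radians(angle_deg)
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0, 0.0],
                     [0.0,   c,  -s, 0.0],
                     [0.0,   s,   c, 0.0],
                     [0.0, 0.0, 0.0, 1.0]])

def translation(tx, ty, tz):
    T = np.eye(4)
    T[:3, 3] = [tx, ty, tz]
    return T

# A few hypothetical maxillary segment vertices (mm), e.g. from an STL mesh.
maxilla_vertices = np.array([[0.0, 60.0, 20.0],
                             [10.0, 58.0, 22.0],
                             [-10.0, 58.0, 22.0]])

# Plan: 4 mm advancement along +y, then a 3-degree rotation about a transverse
# axis through a pivot near the anterior maxilla (sign convention illustrative).
pivot = np.array([0.0, 62.0, 25.0])
M = (translation(*pivot) @ rotation_about_x(-3.0) @ translation(*(-pivot))
     @ translation(0.0, 4.0, 0.0))

homogeneous = np.hstack([maxilla_vertices, np.ones((len(maxilla_vertices), 1))])
moved_vertices = (homogeneous @ M.T)[:, :3]
print(moved_vertices)
```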
In addition, the intermediate and final splints were designed for surgical application and exported as STL files for 3D printing (Figure 3).
The 3D models were integrated into an AR guidance system built with Unity 3D software. We developed a four-point tracking algorithm, which was tested with a dental cast to evaluate its tracking accuracy; the tracking accuracy was 0.55 mm on average. The four-point auto-tracking algorithm was applied for the intraoperative AR display of the pre-planned 3D model with a 4K RGB camera (Figure 4).
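The four-point algorithm itself is not published, but a common way to recover a model-to-camera pose from four known markers seen by a single calibrated RGB camera is a perspective-n-point (PnP) solve. The sketch below uses OpenCV's solvePnP with entirely hypothetical marker coordinates, pixel detections, and camera intrinsics; the authors' Unity-based implementation may differ.

```python
# Hedged sketch: estimating the model-to-camera pose from four coplanar optical
# markers seen by one calibrated RGB camera, via OpenCV's solvePnP. Marker
# coordinates, pixel detections, and intrinsics are all placeholders.
import numpy as np
import cv2

# Known marker positions in the model (CT) coordinate system, in mm (coplanar).
object_points = np.array([[-40.0,  30.0, 0.0],
                          [ 40.0,  30.0, 0.0],
                          [ 40.0, -30.0, 0.0],
                          [-40.0, -30.0, 0.0]], dtype=np.float32)

# The same four markers detected in the 4K camera frame (pixel coordinates).
image_points = np.array([[1620.0,  910.0],
                         [2240.0,  905.0],
                         [2235.0, 1380.0],
                         [1630.0, 1400.0]], dtype=np.float32)

# Hypothetical pinhole intrinsics from a prior camera calibration (4K sensor).
camera_matrix = np.array([[3200.0,    0.0, 1920.0],
                          [   0.0, 3200.0, 1080.0],
                          [   0.0,    0.0,    1.0]], dtype=np.float32)
dist_coeffs = np.zeros((5, 1), dtype=np.float32)  # assume negligible distortion

ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                              camera_matrix, dist_coeffs,
                              flags=cv2.SOLVEPNP_ITERATIVE)
if ok:
    R, _ = cv2.Rodrigues(rvec)         # 3x3 model-to-camera rotation
    print("rotation:\n", R)
    print("translation (mm):", tvec.ravel())
```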
To apply the AR navigation system, the optical markers were first set onto the patient's face, and CT and face scanning were then carried out. The 3D models were registered to the patient's face manually during the operation using the four-point tracking algorithm. Finally, the AR navigation system automatically tracked the markers and projected the 3D model onto the patient's operating field. A 4K RGB camera acquired live video of the patient, and a 4K monitor showed the augmented 3D model in real time (Figure 5).
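Given the pose from the tracking step, the display side reduces to projecting the pre-planned model into each camera frame and drawing it over the live video. The sketch below illustrates this with OpenCV's projectPoints and a simple wireframe overlay; the variable names and capture loop are illustrative, whereas the actual system renders the full model in Unity on a 4K monitor.

```python
# Hedged sketch: projecting the pre-planned 3D model into the live camera frame
# once a pose (rvec, tvec) is available, and drawing a simple wireframe on top
# of the video. Illustrative only; the actual system renders in Unity.
import numpy as np
import cv2

def overlay_model(frame, vertices_mm, edges, rvec, tvec, camera_matrix, dist_coeffs):
    """Draw a wireframe of the planned model onto a BGR video frame."""
    projected, _ = cv2.projectPoints(vertices_mm.astype(np.float32),
                                     rvec, tvec, camera_matrix, dist_coeffs)
    pts = projected.reshape(-1, 2).astype(int)
    for i, j in edges:
        p1 = (int(pts[i][0]), int(pts[i][1]))
        p2 = (int(pts[j][0]), int(pts[j][1]))
        cv2.line(frame, p1, p2, (0, 255, 0), 2)   # green wireframe overlay
    return frame

# Example usage inside a capture loop (pose and intrinsics from the step above):
# cap = cv2.VideoCapture(0)
# while True:
#     ok, frame = cap.read()
#     if not ok:
#         break
#     frame = overlay_model(frame, planned_vertices, planned_edges,
#                           rvec, tvec, camera_matrix, dist_coeffs)
#     cv2.imshow("AR navigation", frame)
#     if cv2.waitKey(1) == 27:      # Esc to quit
#         break
# cap.release()
```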
For accurate surgery, a Le Fort I osteotomy was performed according to the pre-planned osteotomy line under AR navigation. After the osteotomy was completed, the intermediate splint was applied, and the maxillary position was confirmed with AR navigation. After intermaxillary fixation, the maxilla was fixed with miniplates, and the final splint was then applied to set the mandibular position. Finally, the positions of the repositioned maxilla and mandible were confirmed with AR navigation. Two weeks after the operation, CT and facial scans were taken for postoperative evaluation. The postoperative CT and facial scan data were registered to the preoperative 3D models using Mimics software. For comparison with the preoperative plan, reference points were defined: the anterior nasal spine (ANS), posterior nasal spine (PNS), right maxillary first molar mesiobuccal cusp (Right MxM1), left maxillary first molar mesiobuccal cusp (Left MxM1), pogonion (Pog), right mandibular first molar mesiobuccal cusp (Right MnM1), and left mandibular first molar mesiobuccal cusp (Left MnM1). The differences at the reference points were measured as three-dimensional linear distances (Figure 6).
The differences between the preoperative plan and postoperative 3D model were 1.18 mm at ANS, 4.22 mm at PNS, 2.05 mm at Right MxM1, 2.02 mm at Left MxM1, 2.34 mm at Pog, 4.32 mm at Right MnM1, and 4.88 mm at Left MnM1. The mean value was 3.00 mm and the standard deviation was 1.44 mm. The maximum distance was 4.88 mm at Left MnM1 and the minimum distance was 1.18 mm at ANS (Table 1).
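For reference, the reported summary statistics can be reproduced directly from the seven landmark deviations in Table 1, each deviation being the 3D Euclidean distance between the planned and postoperative landmark positions; the following is a minimal sketch of that arithmetic.

```python
# Hedged sketch: reproducing the summary statistics from the seven landmark
# deviations reported in Table 1 (three-dimensional linear distances, mm).
import numpy as np

deviations = {"ANS": 1.18, "PNS": 4.22, "Right MxM1": 2.05, "Left MxM1": 2.02,
              "Pog": 2.34, "Right MnM1": 4.32, "Left MnM1": 4.88}
d = np.array(list(deviations.values()))

# Each value is itself a Euclidean distance between a planned and a
# postoperative landmark position, i.e. np.linalg.norm(planned - postop).
print(f"mean: {d.mean():.2f} mm")        # 3.00 mm
print(f"SD:   {d.std(ddof=1):.2f} mm")   # 1.44 mm (sample standard deviation)
print(f"max:  {d.max():.2f} mm  min: {d.min():.2f} mm")
```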

3. Discussion

Digital technology is now used to perform operations with greater accuracy. Since the introduction of computer-assisted surgery, CBCT has been used to create surgical guides for implant placement, tumor resection and reconstruction, and orthognathic surgery performed by oral and maxillofacial surgeons. Recently, orthognathic surgery has been performed as computer-assisted surgery, including preoperative planning by computer simulation and the fabrication of surgical guides by 3D printing. Previous studies showed a clinically acceptable error in the range of 0.18 to 1.7 mm between preoperative planning data and postoperative results [8,9,10,11,12].
VR simulation is an attractive modality because it has been shown to improve operative accuracy, efficiency, and outcomes [13]. Fushima and Kobayashi proposed a mixed reality-based system that synchronizes the movement of dental models in the real world with a 3D mesh model in the virtual world for orthognathic surgery [14]. Wang et al. applied a VR-based simulation system for mandibular angle reduction surgery [15]. In this study, we could easily make a plan via VR simulation. VR simulation showed great potential for surgical planning in terms of convenience, intuitiveness, and immersiveness compared to a 2D monitor display and mouse input interface.
AR navigation surgery has been introduced as AR tracking accuracy has improved with technological development. An AR guide for implant placement showed errors of 0.53, 0.50, 0.46, and 0.48 mm [16]. In another study, implant placement based on the surgeon's experience showed an error of 1.63 mm compared to 1.25 mm, and the error was smaller when the AR guide was used [17]. Marker tracking with AR technology showed a maximum error of 1.03 mm, with an average of 0.71 mm and a standard deviation of 0.27 mm [18]. Studies suggest that AR systems are becoming comparable to traditional navigation techniques in precision, with sufficient safety for routine clinical practice [6].
The deviation in our findings was somewhat higher than in other AR-based navigation surgeries, especially for the mandibular position. This deviation is considered a simulation error, in that the preoperative simulation could not reproduce the exact mandibular movement in terms of the condylar position. In addition, since we deliberately introduced AR navigation as a simple device in the surgical workflow, more accurate results could be obtained if depth cameras and dedicated marker trackers were used.
When the AR navigation system was applied during the operation, we experienced a small delay compared to a traditional navigation system. This likely happens because (i) the 4K camera captures the scene and sends a large amount of data to the computer, and (ii) the computer must process these data and integrate the 3D models for display on the monitor. This problem can be mitigated by using a higher-performance GPU.
There are many important structures in the oral and maxillofacial areas, including nerves and blood vessels [19]. To preserve these anatomical structures, it is necessary to improve accuracy in the surgical field. When AR navigation is applied to the face with optical markers, metal objects such as oral and nose piercings can influence the tracking system [20]; therefore, it is necessary to remove all facial piercings during the operation. AR technology is well-suited to preserving important structures and to the currently preferred minimally invasive philosophy in maxillofacial surgery [19]. Applications of AR navigation systems in maxillofacial surgery have been extended to orthognathic surgery, tumor surgery, temporomandibular joint motion analysis, foreign body removal, osteotomy, minimally invasive biopsy, prosthetic surgery, and dental implantation [21]. One of its chief attractions is providing information on deep-tissue structures during the operation, allowing surgery to be less invasive [5]. For these reasons, orthognathic surgery is one of the fields in which AR is most widely applied [19].
Recently, VR and AR have created a synergistic effect, producing excellent therapeutic tools [22]. VR-based surgical simulation is often combined with AR navigation to guide the operation with useful information, such as the patient's anatomy and/or the preoperative plan [23]. In the future, VR and AR will develop to implement more realistic, immersive, and interactive simulations [23].
AR in the medical field is quite useful for displaying dynamic navigation, despite some software and hardware limitations [16]. Simulated images are generated faster and more realistically thanks to the development of computing power [24]. In the future, AR will likely serve as a navigation system for surgeries, working in symbiosis with surgeons and allowing them to perform accurate surgeries [6]. Although this is only a case report, VR simulation and AR navigation systems showed potential for application in orthognathic surgery. This means that we may be able to make surgical plans using VR and perform more convenient and accurate surgeries using AR navigation systems. This study has some limitations. First, as a single case report, it does not allow statistical analysis. Second, the equipment, such as the tracking camera and computer, consisted of common office items; a higher-quality tracking system and computer would likely provide better performance and accuracy.

4. Conclusions

In this study, preoperative planning for orthognathic surgery was performed in a virtual space. VR simulation provided easy handling of the 3D models for intuitively adjusting the maxilla and mandible. Orthognathic surgery was performed with an AR navigation system, which provided the operator with the preoperatively planned 3D model superimposed onto the actual surgical site. The differences between the preoperative plan and the postoperative 3D model were 1.18 mm at ANS, 4.22 mm at PNS, 2.05 mm at Right MxM1, 2.02 mm at Left MxM1, 2.34 mm at Pog, 4.32 mm at Right MnM1, and 4.88 mm at Left MnM1. The mean value was 3.00 mm and the standard deviation was 1.44 mm. The deviation in our findings was somewhat high. This difference is considered a depth-related problem; more accurate results could be obtained if depth cameras and optical marker trackers were used. In the future, haptic feedback should be incorporated into VR simulation; this would provide great advantages for 3D medical simulation, with accurate manipulation and immersiveness. AR navigation has great potential for medical applications; its advantages include displaying a real-time augmented 3D model of the patient, and it is easily applied in the surgical field, without complicated 3D simulations or 3D-printed surgical guides.

Author Contributions

Conceptualization, S.-Y.M. and Y.-J.J.; methodology, S.-Y.M., Y.-J.J.; software, S.-Y.M. and Y.-J.J.; validation, S.-Y.M., H.-J.K. and Y.-J.J.; formal analysis, S.-Y.M. and Y.-J.J.; investigation, S.-Y.M., J.-S.C. and Y.-J.J.; resources, S.-Y.M.; data curation, J.-S.C. and Y.-J.J.; writing—original draft preparation, S.-Y.M. and Y.-J.J.; writing—review and editing, S.-Y.M., J.K. and Y.-J.J.; visualization, S.-Y.M. and H.-J.K.; supervision, S.-Y.M. and J.K.; project administration, S.-Y.M.; funding acquisition, S.-Y.M. All authors have read and agreed to the published version of the manuscript.

Funding

This study was supported by a research fund from Chosun University Dental Hospital, 2020.

Institutional Review Board Statement

The study was conducted according to the guidelines of the Declaration of Helsinki and approved by the Institutional Review Board of Chosun University Dental Hospital (CUDHIRB 2007003, approved on 26 November 2020).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Tatullo, M.; Marrelli, M.; Amantea, M.; Paduano, F.; Santacroce, L.; Gentile, S.; Scacco, S. Bioimpedance detection of oral lichen planus used as preneoplastic model. J. Cancer 2015, 6, 976.
2. Kim, H.-J.; Jo, Y.-J.; Choi, J.-S.; Kim, H.-J.; Park, I.-S.; You, J.-S.; Oh, J.-S.; Moon, S.-Y. Virtual Reality Simulation and Augmented Reality-Guided Surgery for Total Maxillectomy: A Case Report. Appl. Sci. 2020, 10, 6288.
3. Nowinski, W.L. Virtual reality in brain intervention. Int. J. Artif. Intell. Tools 2006, 15, 741–752.
4. Chan, S.; Conti, F.; Salisbury, K.; Blevins, N.H. Virtual reality simulation in neurosurgery: Technologies and evolution. Neurosurgery 2013, 72, A154–A164.
5. Shuhaiber, J.H. Augmented reality in surgery. Arch. Surg. 2004, 139, 170–174.
6. Vávra, P.; Roman, J.; Zonča, P.; Ihnát, P.; Němec, M.; Kumar, J.; Habib, N.; El-Gendi, A. Recent development of augmented reality in surgery: A review. J. Healthc. Eng. 2017, 2017.
7. Huang, T.-K.; Yang, C.-H.; Hsieh, Y.-H.; Wang, J.-C.; Hung, C.-C. Augmented reality (AR) and virtual reality (VR) applied in dentistry. Kaohsiung J. Med. Sci. 2018, 34, 243–248.
8. Kang, S.-H.; Lee, J.-W.; Lim, S.-H.; Kim, Y.-H.; Kim, M.-K. Validation of mandibular genioplasty using a stereolithographic surgical guide: In vitro comparison with a manual measurement method based on preoperative surgical simulation. J. Oral Maxillofac. Surg. 2014, 72, 2032–2042.
9. Li, B.; Shen, S.; Yu, H.; Li, J.; Xia, J.; Wang, X. A new design of CAD/CAM surgical template system for two-piece narrowing genioplasty. Int. J. Oral Maxillofac. Surg. 2016, 45, 560–566.
10. Lin, H.-H.; Chang, H.-W.; Lo, L.-J. Development of customized positioning guides using computer-aided design and manufacturing technology for orthognathic surgery. Int. J. Comput. Assist. Radiol. Surg. 2015, 10, 2021–2033.
11. Li, B.; Zhang, L.; Sun, H.; Yuan, J.; Shen, S.G.; Wang, X. A novel method of computer aided orthognathic surgery using individual CAD/CAM templates: A combination of osteotomy and repositioning guides. Br. J. Oral Maxillofac. Surg. 2013, 51, e239–e244.
12. Lin, H.-H.; Lonic, D.; Lo, L.-J. 3D printing in orthognathic surgery – A literature review. J. Formos. Med. Assoc. 2018, 117, 547–558.
13. Timonen, T.; Iso-Mustajärvi, M.; Linder, P.; Lehtimäki, A.; Löppönen, H.; Elomaa, A.-P.; Dietz, A. Virtual reality improves the accuracy of simulated preoperative planning in temporal bones: A feasibility and validation study. Eur. Arch. Oto-Rhino-Laryngol. 2020, 1–12.
14. Fushima, K.; Kobayashi, M. Mixed-reality simulation for orthognathic surgery. Maxillofac. Plast. Reconstr. Surg. 2016, 38, 13.
15. Wang, Q.; Chen, H.; Wu, W.; Jin, H.-Y.; Heng, P.-A. Real-time mandibular angle reduction surgical simulation with haptic rendering. IEEE Trans. Inf. Technol. Biomed. 2012, 16, 1105–1114.
16. Pellegrino, G.; Mangano, C.; Mangano, R.; Ferri, A.; Taraschi, V.; Marchetti, C. Augmented reality for dental implantology: A pilot clinical report of two cases. BMC Oral Health 2019, 19, 158.
17. Ma, L.; Jiang, W.; Zhang, B.; Qu, X.; Ning, G.; Zhang, X.; Liao, H. Augmented reality surgical navigation with accurate CBCT-patient registration for dental implant placement. Med. Biol. Eng. Comput. 2019, 57, 47–57.
18. Wang, J.; Suenaga, H.; Hoshi, K.; Yang, L.; Kobayashi, E.; Sakuma, I.; Liao, H. Augmented reality navigation with automatic marker-free image registration using 3-D image overlay for dental surgery. IEEE Trans. Biomed. Eng. 2014, 61, 1295–1304.
19. Kwon, H.-B.; Park, Y.-S.; Han, J.-S. Augmented reality in dentistry: A current perspective. Acta Odontol. Scand. 2018, 76, 497–503.
20. Inchingolo, F.; Tatullo, M.; Abenavoli, F.M.; Marrelli, M.; Inchingolo, A.D.; Palladino, A.; Inchingolo, A.M.; Dipalma, G. Oral piercing and oral diseases: A short time retrospective study. Int. J. Med. Sci. 2011, 8, 649.
21. Enislidis, G.; Wagner, A.; Ploder, O.; Truppe, M.; Ewers, R. Augmented reality in oral and maxillofacial surgery. J. Med. Virt. Real. 1995, 22–24.
22. Cervino, G.; Fiorillo, L.; Arzukanyan, A.V.; Spagnuolo, G.; Cicciù, M. Dental restorative digital workflow: Digital smile design from aesthetic to function. Dent. J. 2019, 7, 30.
23. Kim, Y.; Kim, H.; Kim, Y.O. Virtual reality and augmented reality in plastic surgery: A review. Arch. Plast. Surg. 2017, 44, 179.
24. Ayoub, A.; Pulijala, Y. The application of virtual reality and augmented reality in Oral & Maxillofacial Surgery. BMC Oral Health 2019, 19, 238.
Figure 1. The facial scan model and the CT skin model were fused by registration based on the optical markers in virtual space.
Figure 2. The maxilla was moved to the optimal position with the controller (left), and the mandible was moved to the appropriate position with the controller (right).
Figure 3. The intermediate splint and the final splint were designed after simulation.
Figure 4. The four-point registration system was tested with a dental cast to evaluate its accuracy and was applied to track the 3D model onto a dummy model for verification. (A): dental 3D model and dental cast; (B): 3D dental model augmented onto the dental cast with the tracking algorithm; (C): dummy and patient 3D model with four points; (D): 3D patient model augmented onto the dummy; (E): pre-planned 3D models displayed on the dummy.
Figure 5. The repositioned maxilla was displayed to confirm the maxillary position (left), and the intermediate splint was displayed on the patient (right) with the 4K RGB camera.
Figure 6. The pre-planned 3D model and the postoperative 3D model were fused, and the differences at bony landmarks were measured.
Table 1. Differences between the preoperatively planned 3D model and the postoperative 3D model.
Bony Landmarks    Distance (mm)
ANS               1.18
PNS               4.22
Right MxM1        2.05
Left MxM1         2.02
Pog               2.34
Right MnM1        4.32
Left MnM1         4.88
