CAL-Tutor: A HoloLens 2 Application for Training in Obstetric Sonography and User Motion Data Recording
Abstract
1. Introduction
- A view of both the foetal anatomy and the ultrasound slice in their correct physical location;
- Mixed reality guidance during US probe navigation to the three standard planes—HC, AC and FL.
2. State of the Art
2.1. AR-Assisted Ultrasound Training
2.2. Deep Learning-Based Standard Plane Navigation Methods
3. Proposed Method
- Unity game engine v2020.3.14 (https://unity3d.com/unity/whats-new/2020.3.14, accessed on 27 December 2022);
- Mixed Reality Toolkit (MRTK) v2.7.2.0 (https://github.com/Microsoft/MixedRealityToolkit-Unity/releases, accessed on 27 December 2022);
- HoloLensARToolKit: Unity-based marker-tracking software for the HoloLens 2 that uses its front-facing camera [16];
- Aurora electromagnetic tracker (https://www.ndigital.com/electromagnetic-tracking-technology/aurora/, accessed on 27 December 2022).
3.1. Design of the Mixed Reality Concept
3.1.1. Ultrasound Probe Tracking
3.1.2. Holographic Guidance during Standard Plane Navigation
- Instruction card: The card is a 2D plane with an example image of the standard plane and text explaining how to find it. The card can be scaled and positioned anywhere in the scene via the MRTK’s hand gesture-based object interaction.
- Guidance arrows: Four pink arrows emanating from the edges of the US plane attached to the holographic Voluson probe point to the four edges of the standard plane positioned at the respective baby location. The guide arrows are intended to enable the user to navigate to the standard planes more efficiently.
- Numeric offset between the source and target US plane: The relative distance between the US plane attached to the probe and the standard plane is displayed in the upper right corner of the user’s field of view as six numbers: position offset x, y, z and rotation offset x, y, z (see the sketch after this list). These numbers are intended to help trainees verify that the standard plane is positioned precisely.
- Directional indicator: The indicator is a standard MRTK asset consisting of a chevron symbol pointing to the standard plane, helping trainees maintain a broader sense of direction when needed.
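The numeric offset described above amounts to a simple relative-pose computation between the probe's US plane and the stored standard plane. The following Python sketch is illustrative only; the function name, the use of SciPy and the Euler-angle convention are assumptions for exposition, not the application's actual Unity implementation.

```python
# Illustrative sketch (not the paper's code): deriving the six displayed numbers
# (position offset x, y, z and rotation offset x, y, z) from two tracked poses.
import numpy as np
from scipy.spatial.transform import Rotation as R

def plane_offset(probe_pos, probe_euler_deg, target_pos, target_euler_deg):
    """Return (position_offset_xyz, rotation_offset_xyz_deg) between two poses."""
    # Positional offset: component-wise difference in the shared world frame.
    pos_offset = np.asarray(target_pos) - np.asarray(probe_pos)

    # Rotational offset: relative rotation from the probe plane to the target
    # plane, expressed again as x, y, z Euler angles in degrees.
    r_probe = R.from_euler("xyz", probe_euler_deg, degrees=True)
    r_target = R.from_euler("xyz", target_euler_deg, degrees=True)
    rot_offset = (r_target * r_probe.inv()).as_euler("xyz", degrees=True)
    return pos_offset, rot_offset

# Example: probe a couple of centimetres and a few degrees away from the target.
pos_off, rot_off = plane_offset([0.10, 0.02, 0.30], [5, 0, 90],
                                [0.12, 0.02, 0.31], [0, 0, 95])
print(pos_off, rot_off)
```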
3.2. User Workflow
3.2.1. Manual Registration of the Baby Model
3.2.2. Standard Plane Definition
3.2.3. Trainee Navigation to Standard Plane
3.3. User Data Recording
3.4. User Study
- NASA Task Load Index (TLX)-based workload assessment via six rating scales with 21 gradations each (from very low to very high);
- Product assessment (user experience) via twenty-six seven-point scales covering different product characteristics.
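For reference, the per-participant "Mean" workload reported in the results table is consistent with the raw (unweighted) NASA TLX score, i.e., the average of the six subscale ratings on the 0–100 scale. A minimal sketch with illustrative function and variable names (not taken from the study software):

```python
# Raw (unweighted) NASA TLX: mean of the six subscale ratings on a 0-100 scale.
def raw_tlx(mental, physical, temporal, performance, effort, frustration):
    subscales = [mental, physical, temporal, performance, effort, frustration]
    return sum(subscales) / len(subscales)

# Participant 1, condition A from the results table: (40+30+50+50+80+20)/6 = 45
print(round(raw_tlx(40, 30, 50, 50, 80, 20)))  # 45
```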
4. Results
4.1. Workload Assessment
4.2. Product Assessment
4.3. HoloLens 2 User Motion Data
5. Discussion
Accuracy of Hologram Alignment and Tracking
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
US | Ultrasound
MR | Mixed Reality
HC | Head Circumference
AC | Abdomen Circumference
FL | Femur Length
References
- Todsen, T.; Jensen, M.L.; Tolsgaard, M.G.; Olsen, B.H.; Henriksen, B.M.; Hillingsø, J.G.; Konge, L.; Ringsted, C. Transfer from point-of-care ultrasonography training to diagnostic performance on patients—A randomized controlled trial. Am. J. Surg. 2016, 211, 40–45.
- Rahmatullah, B.; Sarris, I.; Papageorghiou, A.; Noble, J.A. Quality control of fetal ultrasound images: Detection of abdomen anatomical landmarks using adaboost. In Proceedings of the 2011 IEEE International Symposium on Biomedical Imaging: From Nano to Macro, Chicago, IL, USA, 30 March–2 April 2011; pp. 6–9.
- Chen, H.; Wu, L.; Dou, Q.; Qin, J.; Li, S.; Cheng, J.Z.; Ni, D.; Heng, P.A. Ultrasound standard plane detection using a composite neural network framework. IEEE Trans. Cybern. 2017, 47, 1576–1586.
- State, A.; Livingston, M.A.; Garrett, W.F.; Hirota, G.; Whitton, M.C.; Pisano, E.D.; Fuchs, H. Technologies for augmented reality systems: Realizing ultrasound-guided needle biopsies. In Proceedings of the 23rd Annual Conference on Computer Graphics and Interactive Techniques, New Orleans, LA, USA, 4–9 August 1996; pp. 439–446.
- Farshad-Amacker, N.A.; Bay, T.; Rosskopf, A.B.; Spirig, J.M.; Wanivenhaus, F.; Pfirrmann, C.W.; Farshad, M. Ultrasound-guided interventions with augmented reality in situ visualisation: A proof-of-mechanism phantom study. Eur. Radiol. Exp. 2020, 4, 1–7.
- Birlo, M.; Edwards, P.E.; Clarkson, M.; Stoyanov, D. Utility of optical see-through head mounted displays in augmented reality-assisted surgery: A systematic review. Med. Image Anal. 2022, 77, 102361.
- Dromey, B.P.; Peebles, D.M.; Stoyanov, D.V. A systematic review and meta-analysis of the use of high-fidelity simulation in obstetric ultrasound. Simul. Healthc. J. Soc. Simul. Healthc. 2020, 16, 52–59.
- Shao, M.Y.; Vagg, T.; Seibold, M.; Doughty, M. Towards a low-cost monitor-based augmented reality training platform for at-home ultrasound skill development. J. Imaging 2022, 8, 305.
- Costa, J.N.; Gomes-Fonseca, J.; Valente, S.; Ferreira, L.; Oliveira, B.; Torres, H.R.; Morais, P.; Alves, V.; Vilaça, J.L. Ultrasound training simulator using augmented reality glasses: An accuracy and precision assessment study. In Proceedings of the 2022 44th Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), Glasgow, UK, 11–15 July 2022; pp. 4461–4464.
- Burden, C.; Preshaw, J.; White, P.; Draycott, T.J.; Grant, S.; Fox, R. Usability of virtual-reality simulation training in obstetric ultrasonography: A prospective cohort study. Ultrasound Obstet. Gynecol. 2013, 42, 213–217.
- Blum, T.; Heining, S.M.; Kutter, O.; Navab, N. Advanced training methods using an augmented reality ultrasound simulator. In Proceedings of the 2009 8th IEEE International Symposium on Mixed and Augmented Reality, Washington, DC, USA, 19–22 October 2009; pp. 177–178.
- Mahmood, F.; Mahmood, E.; Dorfman, R.G.; Mitchell, J.; Mahmood, F.U.; Jones, S.B.; Matyal, R. Augmented reality and ultrasound education: Initial experience. J. Cardiothorac. Vasc. Anesth. 2018, 32, 1363–1367.
- Cai, Y.; Droste, R.; Sharma, H.; Chatelain, P.; Drukker, L.; Papageorghiou, A.T.; Noble, J.A. Spatio-temporal visual attention modelling of standard biometry plane-finding navigation. Med. Image Anal. 2020, 65, 101762.
- Wang, Y.; Yang, Q.; Drukker, L.; Papageorghiou, A.; Hu, Y.; Noble, J.A. Task model-specific operator skill assessment in routine fetal ultrasound scanning. Int. J. Comput. Assist. Radiol. Surg. 2022, 17, 1437–1444.
- Li, K.; Wang, J.; Xu, Y.; Qin, H.; Liu, D.; Liu, L.; Meng, M.Q.H. Autonomous navigation of an ultrasound probe towards standard scan planes with deep reinforcement learning. In Proceedings of the 2021 IEEE International Conference on Robotics and Automation (ICRA), Xi’an, China, 30 May–5 June 2021; pp. 8302–8308.
- Qian, L.; Deguet, A.; Kazanzides, P. ARssist: Augmented reality on a head-mounted display for the first assistant in robotic surgery. Healthc. Technol. Lett. 2018, 5, 194–200.
- Condino, S.; Carbone, M.; Piazza, R.; Ferrari, M.; Ferrari, V. Perceptual limits of optical see-through visors for augmented reality guidance of manual tasks. IEEE Trans. Biomed. Eng. 2020, 67, 411–419.
Data Source | Recorded Data
---|---
ARToolkit (HoloLens 2 front camera) | ProbePosition x, y, z; ProbeRotation x, y, z
HoloLens 2 | EyeGaze: game object hit position x, y, z; EyeGaze: name of game object the user is looking at; HandPalmPosition x, y, z; HandWristPosition x, y, z; HeadPosition x, y, z; HeadRotation x, y, z
NDI Aurora | ProbePosition x, y, z; ProbeRotation x, y, z
Voluson US scanner | US video
External camera | External camera video of the overall scene
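As a rough illustration of how the per-frame HoloLens 2 motion data listed above might be serialized, the following Python sketch writes one row per tracking update to a CSV file. The column names mirror the table, but the file name and layout are assumptions for illustration, not the study's actual recording format.

```python
# Illustrative per-frame logger for the recorded motion data (assumed CSV layout).
import csv, time

FIELDS = [
    "timestamp",
    "probe_pos_x", "probe_pos_y", "probe_pos_z",
    "probe_rot_x", "probe_rot_y", "probe_rot_z",
    "eyegaze_hit_x", "eyegaze_hit_y", "eyegaze_hit_z", "eyegaze_object",
    "hand_palm_x", "hand_palm_y", "hand_palm_z",
    "hand_wrist_x", "hand_wrist_y", "hand_wrist_z",
    "head_pos_x", "head_pos_y", "head_pos_z",
    "head_rot_x", "head_rot_y", "head_rot_z",
]

with open("user_motion_log.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    # One example frame; in practice a row would be written per tracking update,
    # and fields not available in that frame are left empty.
    writer.writerow({"timestamp": time.time(), "eyegaze_object": "BabyModel",
                     "probe_pos_x": 0.10, "probe_pos_y": 0.02, "probe_pos_z": 0.30})
```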
Experimental Condition
- Condition 1 (Baseline): Probe navigation without mixed reality assistance. The participant wears the HoloLens 2 during standard plane navigation because user data are recorded, but no holographic information is displayed.
- Condition 2 (MR guidance): Probe navigation with mixed reality assistance. The user performs the standard plane navigation with holographic guidance, which includes the instruction card, the guidance arrows, the directional indicator, the elapsed time and the numerical offset between the probe’s US plane and the target standard plane, as described in Section 3.1.2.
User Number | Condition | Mental | Physical | Temporal | Performance | Effort | Frustration | Mean |
---|---|---|---|---|---|---|---|---|
1 | A | 40 | 30 | 50 | 50 | 80 | 20 | 45 |
1 | B | 70 | 60 | 80 | 50 | 95 | 80 | 73 |
2 | A | 40 | 65 | 5 | 5 | 10 | 10 | 23 |
2 | B | 75 | 90 | 40 | 25 | 60 | 30 | 53 |
3 | A | 45 | 35 | 25 | 75 | 45 | 30 | 43 |
3 | B | 65 | 25 | 50 | 55 | 60 | 50 | 51 |
4 | A | 25 | 25 | 30 | 35 | 20 | 20 | 26 |
4 | B | 60 | 35 | 50 | 65 | 65 | 45 | 53 |
6 | A | 95 | 40 | 55 | 40 | 70 | 75 | 63 |
6 | B | 35 | 25 | 30 | 25 | 20 | 40 | 29 |
Workload Rating | Value
---|---
Low | 0–9
Medium | 10–29
Somewhat high | 30–49
High | 50–79
Very high | 80–100

Workload Component | With MR Guidance | Without MR Guidance
---|---|---
Mental | 53 | 62
Physical | 39 | 46
Temporal | 33 | 46
Performance | 44 | 42
Effort | 50 | 63
Frustration | 31 | 43
Scale | Condition | Mean | STD | N | Confidence | Confidence Interval (Lower) | Confidence Interval (Upper)
---|---|---|---|---|---|---|---
Attractiveness | A | 2.00 | 0.85 | 6 | 0.68 | 1.32 | 2.68
Attractiveness | B | 0.47 | 0.68 | 6 | 0.54 | −0.07 | 1.02
Perspicuity | A | 1.79 | 0.86 | 6 | 0.69 | 1.11 | 2.48
Perspicuity | B | −0.42 | 1.37 | 6 | 1.09 | −1.51 | 0.68
Efficiency | A | 1.88 | 0.59 | 6 | 0.47 | 1.41 | 2.34
Efficiency | B | 0.29 | 0.95 | 6 | 0.76 | −0.47 | 1.06
Dependability | A | 1.71 | 0.87 | 6 | 0.70 | 1.01 | 2.41
Dependability | B | 0.50 | 1.00 | 6 | 0.80 | −0.30 | 1.30
Stimulation | A | 2.13 | 0.68 | 6 | 0.55 | 1.58 | 2.67
Stimulation | B | 0.96 | 0.89 | 6 | 0.71 | 0.25 | 1.67
Novelty | A | 2.21 | 0.25 | 6 | 0.20 | 2.01 | 2.41
Novelty | B | −0.13 | 2.08 | 6 | 1.66 | −1.79 | 1.54
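The confidence columns in the table above are consistent with the usual UEQ analysis convention of a 95% interval based on the normal approximation, which is assumed in the sketch below: confidence = 1.96 × STD / √N, with the interval bounds given by mean ± confidence.

```python
# Sketch reproducing the confidence columns of the UEQ table, assuming a 95%
# normal-approximation interval (half-width = 1.96 * STD / sqrt(N)).
from math import sqrt

def ueq_confidence_interval(mean, std, n, z=1.96):
    half_width = z * std / sqrt(n)
    return half_width, mean - half_width, mean + half_width

# Attractiveness, condition A: mean 2.00, STD 0.85, N = 6
conf, lo, hi = ueq_confidence_interval(2.00, 0.85, 6)
print(round(conf, 2), round(lo, 2), round(hi, 2))  # ~0.68, 1.32, 2.68
```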