Clinical Utility of Breast Ultrasound Images Synthesized by a Generative Adversarial Network
Abstract
1. Introduction
2. Materials and Methods
2.1. Patients
2.2. Ultrasound Examinations
2.3. Data Set
2.4. Image Synthesis
2.5. Image-Evaluation Method
3. Results
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
Histologic Type | Number of Images [n] | Long Diameter, Range [mm] | Long Diameter, Average [mm] | Patient Age, Range [y] | Patient Age, Average [y]
---|---|---|---|---|---
Cyst | 202 | 4–11 | 6.84 | 28–84 | 50.2
Fibroadenoma | 203 | 5–12 | 7.79 | 20–85 | 50.4
IDC (scirrhous type) | 201 | 5–12 | 8.57 | 40–89 | 63.3
IDC (solid type) | 200 | 5–14 | 8.87 | 33–90 | 63.9
IDC (tubule-forming type) | 202 | 5–13 | 9.41 | 33–89 | 58.4
Correct Diagnosis Rate | Synthetic Image, Reader 1 | Synthetic Image, Reader 2 | Original Image, Reader 1 | Original Image, Reader 2
---|---|---|---|---
Benign or malignant | 50/50 (100%) | 47/50 (94.0%) | 50/50 (100%) | 48/50 (96.0%)
All histological types | 43/50 (86.0%) | 39/50 (78.0%) | 44/50 (88.0%) | 39/50 (78.0%)
Cyst | 9/10 (90.0%) | 9/10 (90.0%) | 9/10 (90.0%) | 7/10 (70.0%)
Fibroadenoma | 9/10 (90.0%) | 10/10 (100%) | 9/10 (90.0%) | 10/10 (100%)
IDC (scirrhous type) | 10/10 (100%) | 6/10 (60.0%) | 9/10 (90.0%) | 9/10 (90.0%)
IDC (solid type) | 9/10 (90.0%) | 7/10 (70.0%) | 9/10 (90.0%) | 6/10 (60.0%)
IDC (tubule-forming type) | 6/10 (60.0%) | 7/10 (70.0%) | 8/10 (80.0%) | 7/10 (70.0%)
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).