Machine Learning in Prostate MRI for Prostate Cancer: Current Status and Future Opportunities
Abstract
1. Introduction
2. Machine Learning Applications to Enhance Utility of Prostate MRI: Current Status
2.1. Related Reviews
2.2. Segmentation
Publication Year | Method | Prostate Zone | Input Image Dimension (Pixel/Voxel/mm) | Data Source | MRI Sequence(s) | Train | Val | Test | CV | Acc (%) | DSC (%) | Refs.
---|---|---|---|---|---|---|---|---|---|---|---|---
2008 | Nonrigid registration of prelabelled atlas images | WG | 512 × 512 × 90, 271 × 333 × 86 | Pv | T2w | 38 | - | 50 | No | - | 85 | [27]
2009 | Level set | WG | - | Pv | DWI | 10 | - | 10 | No | - | 91 | [28]
2012 | AAM | WG | 0.54 × 0.54 × 3 mm | Pv | T2w | 86 | - | 22 | 5-fold | - | 88 | [29]
2007 | Organ model-based, region-growing | WG | 3D | Pv | T1w, T2w | 15 | - | 24 | No | - | 94.75 | [30]
2014 | RF and graph cuts | WG | 512 × 512 or 320 × 320 | PRO12 | T2w | 50 | - | 30 | 10-fold | - | >91 (training), >81 (test) | [31]
2014 | Atlas-based AAM and SVM | WG | 512 × 512 | Pv | T2w | 100 | - | 40 | Leave-one-out | 90 | 87 | [32]
2016 | Atlas and C-Means classifier | WG, PZ, TZ | Varying sizes | PRO12, Pv | T2w | 30 | - | 35 | No | - | 81 (WG), 70 (TZ), 62 (PZ) | [33]
2016 | Volumetric CNN | WG | 128 × 128 × 64 | PRO12 | T2w | 50 | - | 31 | No | - | 86.9 | [34]
2017 | FCN | WG, TZ | 0.625 × 0.625 × 1.5 mm | PRO12 | T2w | 50 | - | 30 | 10-fold | - | 89.43 | [35]
2021 | V-Net using bicubic interpolation | WG | 1024 × 1024 × 3 × 16 | PRO12, Pv | T2w | 106 | - | 30 | Yes | - | 96.13 | [36]
2019 | Cascade dense-UNet | WG | 256 × 256 | PRO12 | T2w | 40 | - | 10 | 5-fold | - | 85.6 | [37]
2021 | 3D-2D UNet | WG | - | Pv | T2w | 299 | - | - | 5-fold | - | 89.8 | [38]
2020 | convLSTM and GGNN | WG | 28 × 28 × 128 | PRO12, ISBI13, Pv | T2w | 140 | - | 30 | No | - | 91.78 | [39]
2020 | Transfer learning, data augmentation, fine-tuning | WG, TZ | - | Pv | T2w | 684 | - | 406 | 10-fold | - | 91.5 (WG), 89.7 (TZ) | [40]
2021 | Federated learning with AutoML | WG | 160 × 160 × 32 | MSD-Pro, PRO12, ISBI13, PROx | T2w | 344 | 46 | 96 | No | - | 89.06 | [41]
2020 | Anisotropic 3D multi-stream CNN | WG | 144 × 144 × 144 | PRO12, Pv | T2w | 87 | 30 | 19 | 4-fold | - | 90.6 (base), 90.1 (apex) | [42]
2020 | MS-Net | WG | 384 × 384 | Pv | T2w | 63 | - | 16 | No | - | 91.66 | [43]
2017 | FCN | WG, TZ | 144 × 144 × 26 | PRO12, Pv | DWI | 141 | - | 13 | 4-fold | 97 | 93 (WG), 88 (TZ) | [44]
2020 | Transfer learning | WG, TZ | 1.46 × 1.46 × 3 mm | Pv | DWI | 291 | 97 | 145 | No | - | 65 (WG), 51 (TZ) | [45]
2019 | Cascaded U-Net | WG, PZ | 192 × 192 | Pv | DWI | 76 | 36 | 51 | No | - | 92.7 (WG), 79.3 (PZ) | [46]
2021 | Three 3D/2D UNet pipeline | WG, PZ, TZ | 256 × 256 × (3 mm) | Pv | T2w | 145 | 48 | 48 | No | - | 94 (WG), 91.4 (TZ), 77.6 (PZ) | [47]
2021 | U-Net, ENet, ERFNet | WG, PZ, TZ | 512 × 512 | PROx | T2w | 99 | - | 105 | 5-fold | - | ENet (best): 91 (WG), 87 (TZ), 71 (PZ) | [48]
2021 | Transfer learning, aggregated learning, U-Net | WG, PZ, CG | 192 × 192, 192 × 192 × 192 | ISBI13 | T2w | 5–40 | - | 20 | 5-fold | - | 73 (PZ), 83 (CG), 88 (WG) | [49]
2018 | PSNet | WG | 320 × 320, 512 × 512 | PRO12, ISBI13 | T2w | 112 | - | 28 | 5-fold | - | 85 | [50]
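Most of the segmentation studies tabulated above report performance as the Dice similarity coefficient (DSC), i.e., the spatial overlap between the predicted mask and the reference delineation [26]. For orientation, the following is a minimal NumPy sketch of the DSC computation on binary masks; the function name and the toy masks are illustrative and are not taken from any of the cited works.

```python
import numpy as np

def dice_coefficient(pred_mask: np.ndarray, gt_mask: np.ndarray, eps: float = 1e-8) -> float:
    """Dice similarity coefficient: 2|A ∩ B| / (|A| + |B|) for two binary masks."""
    pred = pred_mask.astype(bool)
    gt = gt_mask.astype(bool)
    intersection = np.logical_and(pred, gt).sum()
    return float(2.0 * intersection / (pred.sum() + gt.sum() + eps))

# Toy 2D example: two partially overlapping square masks (16 voxels each, 9 shared)
a = np.zeros((8, 8), dtype=bool); a[2:6, 2:6] = True
b = np.zeros((8, 8), dtype=bool); b[3:7, 3:7] = True
print(f"DSC = {100 * dice_coefficient(a, b):.1f}%")  # 2*9/(16+16) ≈ 56.2%
```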
2.2.1. Traditional Machine Learning Methods
2.2.2. Deep Learning-Based Methods
2.2.3. Zonal Segmentation
2.3. Image Registration
2.3.1. MRI–US
2.3.2. MRI–Histopathology
2.3.3. MRI–CT
2.4. Lesion Detection and Characterization
2.4.1. Lesion Detection
2.4.2. Lesion Scoring (PI-RADS and Gleason Score)
2.5. Treatment Decision Support
2.5.1. EPE Prediction
2.5.2. Biochemical Recurrence Prediction
2.5.3. Histological and Outcome Predictions
3. Machine Learning Applications to Enhance Utility of Prostate MRI: Limitations
4. Future Opportunities
Federated Learning
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Mottet, N.; van den Bergh, R.C.N.; Briers, E.; Van den Broeck, T.; Cumberbatch, M.G.; De Santis, M.; Fanti, S.; Fossati, N.; Gandaglia, G.; Gillessen, S.; et al. EAU-EANM-ESTRO-ESUR-SIOG Guidelines on Prostate Cancer-2020 Update. Part 1: Screening, Diagnosis, and Local Treatment with Curative Intent. Eur. Urol. 2021, 79, 243–262. [Google Scholar] [CrossRef] [PubMed]
- Barentsz, J.O.; Richenberg, J.; Clements, R.; Choyke, P.; Verma, S.; Villeirs, G.; Rouviere, O.; Logager, V.; Fütterer, J.J. ESUR prostate MR guidelines 2012. Eur. Radiol. 2012, 22, 746–757. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Kasivisvanathan, V.; Rannikko, A.S.; Borghi, M.; Panebianco, V.; Mynderse, L.A.; Vaarala, M.H.; Briganti, A.; Budäus, L.; Hellawell, G.; Hindley, R.G.; et al. MRI-Targeted or Standard Biopsy for Prostate-Cancer Diagnosis. N. Engl. J. Med. 2018, 378, 1767–1777. [Google Scholar] [CrossRef] [PubMed]
- Ahmed, H.U.; Bosaily, A.E.-S.; Brown, L.C.; Gabe, R.; Kaplan, R.; Parmar, M.K.; Collaco-Moraes, Y.; Ward, K.; Hindley, R.G.; Freeman, A.; et al. Diagnostic accuracy of multi-parametric MRI and TRUS biopsy in prostate cancer (PROMIS): A paired validating confirmatory study. Lancet 2017, 389, 815–822. [Google Scholar] [CrossRef] [Green Version]
- Connor, M.J.; Gorin, M.A.; Ahmed, H.U.; Nigam, R. Focal therapy for localized prostate cancer in the era of routine multi-parametric MRI. Prostate Cancer Prostatic Dis. 2020, 23, 232–243. [Google Scholar] [CrossRef]
- De Visschere, P.J.L.; Vral, A.; Perletti, G.; Pattyn, E.; Praet, M.; Magri, V.; Villeirs, G.M. Multiparametric magnetic resonance imaging characteristics of normal, benign and malignant conditions in the prostate. Eur. Radiol. 2017, 27, 2095–2109. [Google Scholar] [CrossRef]
- Chesnais, A.L.; Niaf, E.; Bratan, F.; Mège-Lechevallier, F.; Roche, S.; Rabilloud, M.; Colombel, M.; Rouvière, O. Differentiation of transitional zone prostate cancer from benign hyperplasia nodules: Evaluation of discriminant criteria at multiparametric MRI. Clin. Radiol. 2013, 68, e323–e330. [Google Scholar] [CrossRef]
- Brembilla, G.; Dell’Oglio, P.; Stabile, A.; Damascelli, A.; Brunetti, L.; Ravelli, S.; Cristel, G.; Schiani, E.; Venturini, E.; Grippaldi, D.; et al. Interreader variability in prostate MRI reporting using Prostate Imaging Reporting and Data System version 2.1. Eur. Radiol. 2020, 30, 3383–3392. [Google Scholar] [CrossRef]
- Park, K.J.; Choi, S.H.; Lee, J.S.; Kim, J.K.; Kim, M.-H. Interreader Agreement with Prostate Imaging Reporting and Data System Version 2 for Prostate Cancer Detection: A Systematic Review and Meta-Analysis. J. Urol. 2020, 204, 661–670. [Google Scholar] [CrossRef]
- Leake, J.L.; Hardman, R.; Ojili, V.; Thompson, I.; Shanbhogue, A.; Hernandez, J.; Barentsz, J. Prostate MRI: Access to and current practice of prostate MRI in the United States. J. Am. Coll. Radiol. 2014, 11, 156–160. [Google Scholar] [CrossRef] [Green Version]
- Shinmoto, H.; Tamura, C.; Soga, S.; Shiomi, E.; Yoshihara, N.; Kaji, T.; Mulkern, R.V. An intravoxel incoherent motion diffusion-weighted imaging study of prostate cancer. Am. J. Roentgenol. 2012, 199, W496–W500. [Google Scholar] [CrossRef]
- Tamura, C.; Shinmoto, H.; Soga, S.; Okamura, T.; Sato, H.; Okuaki, T.; Pang, Y.; Kosuda, S.; Kaji, T. Diffusion kurtosis imaging study of prostate cancer: Preliminary findings. J. Magn. Reson. Imaging 2014, 40, 723–729. [Google Scholar] [CrossRef]
- Fei, B. Computer-aided diagnosis of prostate cancer with MRI. Curr. Opin. Biomed. Eng. 2017, 3, 20–27. [Google Scholar] [CrossRef]
- Greer, M.D.; Lay, N.; Shih, J.H.; Barrett, T.; Bittencourt, L.K.; Borofsky, S.; Kabakus, I.; Law, Y.M.; Marko, J.; Shebel, H.; et al. Computer-aided diagnosis prior to conventional interpretation of prostate mpMRI: An international multi-reader study. Eur. Radiol. 2018, 28, 4407–4417. [Google Scholar] [CrossRef]
- Armato, S.G.; Huisman, H.; Drukker, K.; Hadjiiski, L.; Kirby, J.S.; Petrick, N.; Redmond, G.; Giger, M.L.; Cha, K.; Mamonov, A.; et al. PROSTATEx Challenges for computerized classification of prostate lesions from multiparametric magnetic resonance images. J. Med. Imaging 2018, 5, 44501. [Google Scholar] [CrossRef]
- Cuocolo, R.; Cipullo, M.B.; Stanzione, A.; Ugga, L.; Romeo, V.; Radice, L.; Brunetti, A.; Imbriaco, M. Machine learning applications in prostate cancer magnetic resonance imaging. Eur. Radiol. Exp. 2019, 3, 1–8. [Google Scholar] [CrossRef]
- Sanford, T.; Harmon, S.A.; Turkbey, E.B.; Kesani, D.; Tuncer, S.; Madariaga, M.; Yang, C.; Sackett, J.; Mehralivand, S.; Yan, P.; et al. Deep-Learning-Based Artificial Intelligence for PI-RADS Classification to Assist Multiparametric Prostate MRI Interpretation: A Development Study. J. Magn. Reson. Imaging 2020, 52, 1499–1507. [Google Scholar] [CrossRef]
- Schelb, P.; Kohl, S.; Radtke, J.P.; Wiesenfarth, M.; Kickingereder, P.; Bickelhaupt, S.; Kuder, T.A.; Stenzinger, A.; Hohenfellner, M.; Schlemmer, H.P.; et al. Classification of cancer at prostate MRI: Deep Learning versus Clinical PI-RADS Assessment. Radiology 2019, 293, 607–617. [Google Scholar] [CrossRef]
- Goldenberg, S.L.; Nir, G.; Salcudean, S.E. A new era: Artificial intelligence and machine learning in prostate cancer. Nat. Rev. Urol. 2019, 16, 391–403. [Google Scholar] [CrossRef]
- van Sloun, R.J.G.; Wildeboer, R.R.; Mannaerts, C.K.; Postema, A.W.; Gayet, M.; Beerlage, H.P.; Salomon, G.; Wijkstra, H.; Mischi, M. Deep Learning for Real-time, Automatic, and Scanner-adapted Prostate (Zone) Segmentation of Transrectal Ultrasound, for Example, Magnetic Resonance Imaging-transrectal Ultrasound Fusion Prostate Biopsy. Eur. Urol. Focus 2021, 7, 78–85. [Google Scholar] [CrossRef]
- Padhani, A.R.; Turkbey, B. Detecting Prostate Cancer with Deep Learning for MRI: A Small Step Forward. Radiology 2019, 293, 618–619. [Google Scholar] [CrossRef]
- Gaziev, G.; Wadhwa, K.; Barrett, T.; Koo, B.C.; Gallagher, F.A.; Serrao, E.; Frey, J.; Seidenader, J.; Carmona, L.; Warren, A.; et al. Defining the learning curve for multiparametric magnetic resonance imaging (MRI) of the prostate using MRI-transrectal ultrasonography (TRUS) fusion-guided transperineal prostate biopsies as a validation tool. BJU Int. 2016, 117, 80–86. [Google Scholar] [CrossRef]
- Chaddad, A.; Kucharczyk, M.J.; Cheddad, A.; Clarke, S.E.; Hassan, L.; Ding, S.; Rathore, S.; Zhang, M.; Katib, Y.; Bahoric, B.; et al. Magnetic resonance imaging based radiomic models of prostate cancer: A narrative review. Cancers 2021, 13, 552. [Google Scholar] [CrossRef]
- Zeeshan Hameed, B.M.; Aiswarya Dhavileswarapu, V.L.S.; Raza, S.Z.; Karimi, H.; Khanuja, H.S.; Shetty, D.K.; Ibrahim, S.; Shah, M.J.; Naik, N.; Paul, R.; et al. Artificial intelligence and its impact on urological diseases and management: A comprehensive review of the literature. J. Clin. Med. 2021, 10, 1864. [Google Scholar] [CrossRef]
- Khan, Z.; Yahya, N.; Isam Al-Hiyali, M.; Meriaudeau, F. Recent Automatic Segmentation Algorithms of MRI Prostate Regions: A Review. IEEE Access 2021, 9, 97878–97905. [Google Scholar] [CrossRef]
- Zou, K.H.; Warfield, S.K.; Bharatha, A.; Tempany, C.M.C.; Kaus, M.R.; Haker, S.J.; Wells, W.M., 3rd; Jolesz, F.A.; Kikinis, R. Statistical validation of image segmentation quality based on a spatial overlap index. Acad. Radiol. 2004, 11, 178–189. [Google Scholar] [CrossRef] [Green Version]
- Klein, S.; Van Der Heide, U.A.; Lips, I.M.; Van Vulpen, M.; Staring, M.; Pluim, J.P.W. Automatic segmentation of the prostate in 3D MR images by atlas matching using localized mutual information. Med. Phys. 2008, 35, 1407–1417. [Google Scholar] [CrossRef]
- Liu, X.; Langer, D.L.; Haider, M.A.; Van Der Kwast, T.H.; Evans, A.J.; Wernick, M.N.; Yetik, I.S. Unsupervised segmentation of the prostate using MR images based on level set with a shape prior. In Proceedings of the 2009 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Minneapolis, MN, USA, 2–6 September 2009; pp. 3613–3616. [Google Scholar] [CrossRef]
- Toth, R.; Madabhushi, A. Multifeature landmark-free active appearance models: Application to prostate MRI segmentation. IEEE Trans. Med. Imaging 2012, 31, 1638–1650. [Google Scholar] [CrossRef]
- Pasquier, D.; Lacornerie, T.; Vermandel, M.; Rousseau, J.; Lartigau, E.; Betrouni, N. Automatic Segmentation of Pelvic Structures From Magnetic Resonance Images for Prostate Cancer Radiotherapy. Int. J. Radiat. Oncol. Biol. Phys. 2007, 68, 592–600. [Google Scholar] [CrossRef]
- Mahapatra, D.; Buhmann, J.M. Prostate MRI segmentation using learned semantic knowledge and graph cuts. IEEE Trans. Biomed. Eng. 2014, 61, 756–764. [Google Scholar] [CrossRef] [PubMed]
- Cheng, R.; Turkbey, B.; Gandler, W.; Agarwal, H.K.; Shah, V.P.; Bokinsky, A.; McCreedy, E.; Wang, S.; Sankineni, S.; Bernardo, M.; et al. Atlas based AAM and SVM model for fully automatic MRI prostate segmentation. Annu. Int. Conf. IEEE Eng. Med. Biol. Soc. 2014, 2014, 2881–2885. [Google Scholar] [CrossRef]
- Chilali, O.; Puech, P.; Lakroum, S.; Diaf, M.; Mordon, S.; Betrouni, N. Gland and Zonal Segmentation of Prostate on T2W MR Images. J. Digit. Imaging 2016, 29, 730–736. [Google Scholar] [CrossRef] [Green Version]
- Milletari, F.; Navab, N.; Ahmadi, S.A. V-Net: Fully convolutional neural networks for volumetric medical image segmentation. In Proceedings of the 2016 Fourth International Conference on 3D Vision (3DV), Stanford, CA, USA, 25–28 October 2016; pp. 565–571. [Google Scholar] [CrossRef] [Green Version]
- Yu, L.; Yang, X.; Chen, H.; Qin, J.; Heng, P.A. Volumetric convnets with mixed residual connections for automated prostate segmentation from 3d MR images. In Proceedings of the AAAI’17: Proceedings of the Thirty-First AAAI Conference on Artificial Intelligence, San Francisco, CA, USA, 4–9 February 2017; pp. 66–72. [Google Scholar]
- Jin, Y.; Yang, G.; Fang, Y.; Li, R.; Xu, X.; Liu, Y.; Lai, X. 3D PBV-Net: An automated prostate MRI data segmentation method. Comput. Biol. Med. 2021, 128, 104160. [Google Scholar] [CrossRef]
- Li, S.; Chen, Y.; Yang, S.; Luo, W. Cascade Dense-Unet for Prostate Segmentation in MR Images. In Intelligent Computing Theories and Application; Springer: Cham, Switzerland, 2019; pp. 481–490. [Google Scholar] [CrossRef]
- Ushinsky, A.; Bardis, M.; Glavis-Bloom, J.; Uchio, E.; Chantaduly, C.; Nguyentat, M.; Chow, D.; Chang, P.D.; Houshyar, R. A 3d-2d hybrid u-net convolutional neural network approach to prostate organ segmentation of multiparametric MRI. Am. J. Roentgenol. 2021, 216, 111–116. [Google Scholar] [CrossRef]
- Tian, Z.; Li, X.; Chen, Z.; Zheng, Y.; Fan, H.; Li, Z.; Li, C.; Du, S. Interactive prostate MR image segmentation based on ConvLSTMs and GGNN. Neurocomputing 2021, 438, 84–93. [Google Scholar] [CrossRef]
- Sanford, T.H.; Harmon, S.A.; Sackett, J.; Barrett, T.; Wood, B.J.; Choyke, P.L.; Zhang, L.; et al. Data Augmentation and Transfer Learning to Improve Generalizability of an Automated Prostate Segmentation Model. Am. J. Roentgenol. 2020, 215, 1403–1410. [Google Scholar] [CrossRef]
- Roth, H.R.; Yang, D.; Li, W.; Myronenko, A.; Zhu, W.; Xu, Z.; Wang, X.; Xu, D. Federated Whole Prostate Segmentation in MRI with Personalized Neural Architectures. In International Conference on Medical Image Computing and Computer-Assisted Intervention; Springer: Cham, Switzerland, 2021; pp. 357–366. [Google Scholar] [CrossRef]
- Meyer, A.; Chlebus, G.; Rak, M.; Schindele, D.; Schostak, M.; van Ginneken, B.; Schenk, A.; Meine, H.; Hahn, H.K.; Schreiber, A.; et al. Anisotropic 3D Multi-Stream CNN for Accurate Prostate Segmentation from Multi-Planar MRI. Comput. Methods Programs Biomed. 2021, 200, 105821. [Google Scholar] [CrossRef]
- Liu, Q.; Dou, Q.; Yu, L.; Heng, P.A. MS-Net: Multi-Site Network for Improving Prostate Segmentation with Heterogeneous MRI Data. IEEE Trans. Med. Imaging 2020, 39, 2713–2724. [Google Scholar] [CrossRef] [Green Version]
- Clark, T.; Zhang, J.; Baig, S.; Wong, A.; Haider, M.A.; Khalvati, F. Fully automated segmentation of prostate whole gland and transition zone in diffusion-weighted MRI using convolutional neural networks. J. Med. Imaging 2017, 4, 1. [Google Scholar] [CrossRef]
- Motamed, S.; Gujrathi, I.; Deniffel, D.; Oentoro, A.; Haider, M.A.; Khalvati, F. Transfer Learning for Automated Segmentation of Prostate Whole Gland and Transition Zone in Diffusion Weighted MRI. arXiv 2020, arXiv:1909.09541. [Google Scholar]
- Zhu, Y.; Wei, R.; Gao, G.; Ding, L.; Zhang, X.; Wang, X.; Zhang, J. Fully automatic segmentation on prostate MR images based on cascaded fully convolution network. J. Magn. Reson. Imaging 2019, 49, 1149–1156. [Google Scholar] [CrossRef]
- Bardis, M.; Houshyar, R.; Chantaduly, C.; Tran-Harding, K.; Ushinsky, A.; Chahine, C.; Rupasinghe, M.; Chow, D.; Chang, P. Segmentation of the Prostate Transition Zone and Peripheral Zone on MR Images with Deep Learning. Radiol. Imaging Cancer 2021, 3, e200024. [Google Scholar] [CrossRef]
- Cuocolo, R.; Comelli, A.; Stefano, A.; Benfante, V.; Dahiya, N.; Stanzione, A.; Castaldo, A.; De Lucia, D.R.; Yezzi, A.; Imbriaco, M. Deep Learning Whole-Gland and Zonal Prostate Segmentation on a Public MRI Dataset. J. Magn. Reson. Imaging 2021, 54, 452–459. [Google Scholar] [CrossRef]
- Saunders, S.L.; Leng, E.; Spilseth, B.; Wasserman, N.; Metzger, G.J.; Bolan, P.J. Training Convolutional Networks for Prostate Segmentation with Limited Data. IEEE Access 2021, 9, 109214–109223. [Google Scholar] [CrossRef]
- Tian, Z.; Liu, L.; Zhang, Z.; Fei, B. PSNet: Prostate segmentation on MRI based on a convolutional neural network. J. Med. Imaging 2018, 5, 1. [Google Scholar] [CrossRef]
- Litjens, G.; Toth, R.; van de Ven, W.; Hoeks, C.; Kerkstra, S.; van Ginneken, B.; Vincent, G.; Guillard, G.; Birbeck, N.; Zhang, J.; et al. Evaluation of prostate segmentation algorithms for MRI: The PROMISE12 challenge. Med. Image Anal. 2014, 18, 359–373. [Google Scholar] [CrossRef] [Green Version]
- NCI-ISBI 2013 Challenge—Automated Segmentation of Prostate Structures. Available online: https://wiki.cancerimagingarchive.net/display/Public/NCI-ISBI+2013+Challenge+-+Automated+Segmentation+of+Prostate+Structures (accessed on 23 November 2021).
- Simpson, A.L.; Antonelli, M.; Bakas, S.; Bilello, M.; Farahani, K.; Van Ginneken, B.; Kopp-Schneider, A.; Landman, B.A.; Litjens, G.; Menze, B.; et al. A large annotated medical image dataset for the development and evaluation of segmentation algorithms. arXiv 2019, arXiv:1902.09063. [Google Scholar]
- Hoar, D.; Lee, P.Q.; Guida, A.; Patterson, S.; Bowen, C.V.; Merrimen, J.; Wang, C.; Rendon, R.; Beyea, S.D.; Clarke, S.E. Combined Transfer Learning and Test-Time Augmentation Improves Convolutional Neural Network-Based Semantic Segmentation of Prostate Cancer from Multi-Parametric MR Images. Comput. Methods Programs Biomed. 2021, 210, 106375. [Google Scholar] [CrossRef]
- Almeida, G.; Tavares, J.M.R.S. Deep Learning in Radiation Oncology Treatment Planning for Prostate Cancer: A Systematic Review. J. Med. Syst. 2020, 44, 1–15. [Google Scholar] [CrossRef]
- Long, J.; Shelhamer, E.; Darrell, T. Fully Convolutional Networks for Semantic Segmentation. IEEE Trans. Pattern Anal. Mach. Intell. 2017, 39, 640–651. [Google Scholar] [CrossRef]
- Ronneberger, O.; Fischer, P.; Brox, T. U-Net: Convolutional Networks for Biomedical Image Segmentation. Available online: http://lmb.informatik.uni-freiburg.de/ (accessed on 27 October 2021).
- He, X.; Zhao, K.; Chu, X. AutoML: A survey of the state-of-the-art. Knowl.-Based Syst. 2021, 212, 106622. [Google Scholar] [CrossRef]
- Simonyan, K.; Zisserman, A. Very Deep Convolutional Networks for Large-Scale Image Recognition. arXiv 2014, arXiv:1409.1556. [Google Scholar]
- Paszke, A.; Chaurasia, A.; Kim, S.; Culurciello, E. ENet: A Deep Neural Network Architecture for Real-Time Semantic Segmentation. arXiv 2016, arXiv:1606.02147. [Google Scholar]
- Romera, E.; Álvarez, J.M.; Bergasa, L.M.; Arroyo, R. ERFNet: Efficient Residual Factorized ConvNet for Real-Time Semantic Segmentation. IEEE Trans. Intell. Transp. Syst. 2018, 19, 263–272. [Google Scholar] [CrossRef]
- Nicolae, A.M.; Venugopal, N.; Ravi, A. Trends in targeted prostate brachytherapy: From multiparametric MRI to nanomolecular radiosensitizers. Cancer Nanotechnol. 2016, 7, 6. [Google Scholar] [CrossRef] [Green Version]
- Humphrey, P.A. Histopathology of Prostate Cancer. Cold Spring Harb. Perspect. Med. 2017, 7, a030411. [Google Scholar] [CrossRef] [Green Version]
- Cool, D.W.; Zhang, X.; Romagnoli, C.; Izawa, J.I.; Romano, W.M.; Fenster, A. Evaluation of MRI-TRUS Fusion Versus Cognitive Registration Accuracy for MRI-Targeted, TRUS-Guided Prostate Biopsy. Am. J. Roentgenol. 2015, 204, 83–91. [Google Scholar] [CrossRef]
- Sun, Y.; Reynolds, H.M.; Parameswaran, B.; Wraith, D.; Finnegan, M.E.; Williams, S.; Haworth, A. Multiparametric MRI and radiomics in prostate cancer: A review. Australas Phys. Eng. Sci. Med. 2019, 42, 3–25. [Google Scholar] [CrossRef]
- Mohamed, A.; Davatzikos, C.; Taylor, R. A combined statistical and biomechanical model for estimation of intra-operative prostate deformation. In International Conference on Medical Image Computing and Computer-Assisted Intervention; Springer: Berlin/Heidelberg, Germany, 2002; Volume 2489, pp. 452–460. [Google Scholar] [CrossRef] [Green Version]
- Hu, Y.; Carter, T.J.; Ahmed, H.U.; Emberton, M.; Allen, C.; Hawkes, D.J.; Barratt, D.C. Modelling prostate motion for data fusion during image-guided interventions. IEEE Trans. Med. Imaging 2011, 30, 1887–1900. [Google Scholar] [CrossRef]
- Hu, Y.; Ahmed, H.U.; Taylor, Z.; Allen, C.; Emberton, M.; Hawkes, D.; Barratt, D. MR to ultrasound registration for image-guided prostate interventions. Med. Image Anal. 2012, 16, 687–703. [Google Scholar] [CrossRef]
- Wang, Y.; Cheng, J.Z.; Ni, D.; Lin, M.; Qin, J.; Luo, X.; Xu, M.; Xie, X.; Heng, P.A. Towards personalized statistical deformable model and hybrid point matching for robust MR-TRUS registration. IEEE Trans. Med. Imaging 2016, 35, 589–604. [Google Scholar] [CrossRef]
- Hu, Y.; Modat, M.; Gibson, E.; Ghavami, N.; Bonmati, E.; Moore, C.M.; Emberton, M.; Noble, J.A.; Barratt, D.C.; Vercauteren, T. Label-driven weakly-supervised learning for multimodal deformable image registration. Proc.—Int. Symp. Biomed. Imaging 2018, 2018, 1070–1074. [Google Scholar] [CrossRef] [Green Version]
- Hu, Y.; Modat, M.; Gibson, E.; Li, W.; Ghavami, N.; Bonmati, E.; Wang, G.; Bandula, S.; Moore, C.M.; Emberton, M.; et al. Weakly-supervised convolutional neural networks for multimodal image registration. Med. Image Anal. 2018, 49, 1–13. [Google Scholar] [CrossRef]
- Yan, P.; Xu, S.; Rastinehad, A.R.; Wood, B.J. Adversarial Image registration with application for MR and TRUS image fusion. In Machine Learning in Medical Imaging; Springer: Cham, Switzerland, 2018. [Google Scholar] [CrossRef] [Green Version]
- Zeng, Q.; Fu, Y.; Tian, Z.; Lei, Y.; Zhang, Y.; Wang, T.; Mao, H.; Liu, T.; Curran, W.J.; Jani, A.B.; et al. Label-driven magnetic resonance imaging (MRI)-transrectal ultrasound (TRUS) registration using weakly supervised learning for MRI-guided prostate radiotherapy. Phys. Med. Biol. 2020, 65, 135002. [Google Scholar] [CrossRef]
- Chen, Y.; Xing, L.; Yu, L.; Liu, W.; Fahimian, B.P.; Niedermayr, T.; Bagshaw, H.P.; Buyyounouski, M.; Han, B. MR to ultrasound image registration with segmentation-based learning for HDR prostate brachytherapy. Med. Phys. 2021, 48, 3074–3083. [Google Scholar] [CrossRef]
- Bhardwaj, A.; Park, J.-S.; Mukhopadhyay, S.; Sharda, S.; Son, Y.; Ajani, B.N.; Kudavelly, S.R. Rigid and deformable corrections in real-time using deep learning for prostate fusion biopsy. In Proceedings of the Medical Imaging 2020: Image-Guided Procedures, Robotic Interventions, and Modeling, Houston, TX, USA, 15–20 February 2020. [Google Scholar] [CrossRef]
- Hu, Y.; Gibson, E.; Ghavami, N.; Bonmati, E.; Moore, C.M.; Emberton, M.; Vercauteren, T.; Noble, J.A.; Barratt, D.C. Adversarial Deformation Regularization for Training Image Registration Neural Networks; Springer International Publishing: New York, NY, USA, 2018; Volume 11070, ISBN 9783030009274. [Google Scholar]
- Yang, X.; Fu, Y.; Lei, Y.; Tian, S.; Wang, T.; Shelton, J.W.; Jani, A.; Curran, W.J.; Patel, P.R.; Liu, T. Deformable MRI-TRUS Registration Using Biomechanically Constrained Deep Learning Model for Tumor-Targeted Prostate Brachytherapy. Int. J. Radiat. Oncol. 2020, 108, e339. [Google Scholar] [CrossRef]
- Shafai-Erfani, G.; Wang, T.; Lei, Y.; Tian, S.; Patel, P.; Jani, A.B.; Curran, W.J.; Liu, T.; Yang, X. Dose Evaluation of MRI-Based Synthetic CT Generated Using a Machine Learning Method for Prostate Cancer Radiotherapy. Med. Dosim. 2019, 44, e64–e70. [Google Scholar] [CrossRef]
- Rusu, M.; Shao, W.; Kunder, C.A.; Wang, J.B.; Soerensen, S.J.C.; Teslovich, N.C.; Sood, R.R.; Chen, L.C.; Fan, R.E.; Ghanouni, P.; et al. Registration of presurgical MRI and histopathology images from radical prostatectomy via RAPSODI. Med. Phys. 2020, 47, 4177–4188. [Google Scholar] [CrossRef] [PubMed]
- Fu, Y.; Wang, T.; Lei, Y.; Patel, P.; Jani, A.B.; Curran, W.J.; Liu, T.; Yang, X. Deformable MR-CBCT prostate registration using biomechanically constrained deep learning networks. Med. Phys. 2021, 48, 253–263. [Google Scholar] [CrossRef]
- Shao, W.; Banh, L.; Kunder, C.A.; Fan, R.E.; Soerensen, S.J.C.; Wang, J.B.; Teslovich, N.C.; Madhuripan, N.; Jawahar, A.; Ghanouni, P.; et al. ProsRegNet: A deep learning framework for registration of MRI and histopathology images of the prostate. Med. Image Anal. 2021, 68, 101919. [Google Scholar] [CrossRef]
- Sood, R.R.; Shao, W.; Kunder, C.; Teslovich, N.C.; Wang, J.B.; Soerensen, S.J.C.; Madhuripan, N.; Jawahar, A.; Brooks, J.D.; Ghanouni, P.; et al. 3D Registration of pre-surgical prostate MRI and histopathology images via super-resolution volume reconstruction. Med. Image Anal. 2021, 69, 101957. [Google Scholar] [CrossRef] [PubMed]
- Steiger, P.; Thoeny, H.C. Prostate MRI based on PI-RADS version 2: How we review and report. Cancer Imaging 2016, 16, 1–9. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Woo, S.; Suh, C.H.; Kim, S.Y.; Cho, J.Y.; Kim, S.H. Diagnostic Performance of Prostate Imaging Reporting and Data System Version 2 for Detection of Prostate Cancer: A Systematic Review and Diagnostic Meta-analysis. Eur. Urol. 2017, 72, 177–188. [Google Scholar] [CrossRef]
- Smith, C.P.; Harmon, S.A.; Barrett, T.; Bittencourt, L.K.; Law, Y.M.; Shebel, H.; An, J.Y.; Czarniecki, M.; Mehralivand, S.; Coskun, M.; et al. Intra- and interreader reproducibility of PI-RADSv2: A multireader study. J. Magn. Reson. Imaging 2019, 49, 1694–1703. [Google Scholar] [CrossRef]
- Twilt, J.J.; van Leeuwen, K.G.; Huisman, H.J.; Fütterer, J.J.; de Rooij, M. Artificial intelligence based algorithms for prostate cancer classification and detection on magnetic resonance imaging: A narrative review. Diagnostics 2021, 11, 959. [Google Scholar] [CrossRef]
- Algohary, A.; Viswanath, S.; Shiradkar, R.; Ghose, S.; Pahwa, S.; Moses, D.; Jambor, I.; Shnier, R.; Böhm, M.; Haynes, A.M.; et al. Radiomic features on MRI enable risk categorization of prostate cancer patients on active surveillance: Preliminary findings. J. Magn. Reson. Imaging 2018, 48, 818–828. [Google Scholar] [CrossRef]
- Min, X.; Li, M.; Dong, D.; Feng, Z.; Zhang, P.; Ke, Z.; You, H.; Han, F.; Ma, H.; Tian, J.; et al. Multi-parametric MRI-based radiomics signature for discriminating between clinically significant and insignificant prostate cancer: Cross-validation of a machine learning method. Eur. J. Radiol. 2019, 115, 16–21. [Google Scholar] [CrossRef] [Green Version]
- Wu, M.; Krishna, S.; Thornhill, R.E.; Flood, T.A.; McInnes, M.D.F.; Schieda, N. Transition zone prostate cancer: Logistic regression and machine-learning models of quantitative ADC, shape and texture features are highly accurate for diagnosis. J. Magn. Reson. Imaging 2019, 50, 940–950. [Google Scholar] [CrossRef]
- Liu, Y.; Zheng, H.; Liang, Z.; Qi, M.; Brisbane, W.; Marks, L.; Raman, S.; Reiter, R.; Yang, G.; Sung, K. Textured-Based Deep Learning in Prostate Cancer Classification with 3T Multiparametric MRI: Comparison with PI-RADS-Based Classification. Diagnostics 2021, 11, 1785. [Google Scholar] [CrossRef]
- Aldoj, N.; Lukas, S.; Dewey, M.; Penzkofer, T. Semi-automatic classification of prostate cancer on multi-parametric MR imaging using a multi-channel 3D convolutional neural network. Eur. Radiol. 2020, 30, 1243–1253. [Google Scholar] [CrossRef]
- Chen, Q.; Xu, X.; Hu, S.; Li, X.; Zou, Q.; Li, Y. A transfer learning approach for classification of clinical significant prostate cancers from mpMRI scans. Proc. SPIE 2017, 10134, 1154–1157. [Google Scholar] [CrossRef]
- Yuan, Y.; Qin, W.; Buyyounouski, M.; Ibragimov, B.; Hancock, S.; Han, B.; Xing, L. Prostate cancer classification with multiparametric MRI transfer learning model. Med. Phys. 2019, 46, 756–765. [Google Scholar] [CrossRef]
- Zhong, X.; Cao, R.; Shakeri, S.; Scalzo, F.; Lee, Y.; Enzmann, D.R.; Wu, H.H.; Raman, S.S.; Sung, K. Deep transfer learning-based prostate cancer classification using 3 Tesla multi-parametric MRI. Abdom. Radiol. 2019, 44, 2030–2039. [Google Scholar] [CrossRef]
- Giannini, V.; Mazzetti, S.; Vignati, A.; Russo, F.; Bollito, E.; Porpiglia, F.; Stasi, M.; Regge, D. A fully automatic computer aided diagnosis system for peripheral zone prostate cancer detection using multi-parametric magnetic resonance imaging. Comput. Med. Imaging Graph. 2015, 46, 219–226. [Google Scholar] [CrossRef]
- Mcgarry, S.D.; Hurrell, S.L.; Iczkowski, K.A.; Hall, W.; Kaczmarowski, A.L.; Banerjee, A.; Keuter, T.; Jacobsohn, K.; Bukowy, J.D.; Nevalainen, M.T.; et al. Radio-pathomic Maps of Epithelium and Lumen Density Predict the Location of High-Grade Prostate Cancer. Int. J. Radiat. Oncol. Biol. Phys. 2018, 101, 1179–1187. [Google Scholar] [CrossRef] [Green Version]
- Zhang, L.; Li, L.; Tang, M.; Huan, Y.; Zhang, X.; Zhe, X. A new approach to diagnosing prostate cancer through magnetic resonance imaging. Alex. Eng. J. 2021, 60, 897–904. [Google Scholar] [CrossRef]
- Arif, M.; Schoots, I.G.; Castillo Tovar, J.; Bangma, C.H.; Krestin, G.P.; Roobol, M.J.; Niessen, W.; Veenland, J.F. Clinically significant prostate cancer detection and segmentation in low-risk patients using a convolutional neural network on multi-parametric MRI. Eur. Radiol. 2020, 30, 6582–6592. [Google Scholar] [CrossRef]
- Seetharaman, A.; Bhattacharya, I.; Chen, L.C.; Kunder, C.A.; Shao, W.; Soerensen, S.J.C.; Wang, J.B.; Teslovich, N.C.; Fan, R.E.; Ghanouni, P.; et al. Automated detection of aggressive and indolent prostate cancer on magnetic resonance imaging. Med. Phys. 2021, 48, 2960–2972. [Google Scholar] [CrossRef]
- Mehralivand, S.; Yang, D.; Harmon, S.A.; Xu, D.; Xu, Z.; Roth, H.; Masoudi, S.; Sanford, T.H.; Kesani, D.; Lay, N.S.; et al. A Cascaded Deep Learning–Based Artificial Intelligence Algorithm for Automated Lesion Detection and Classification on Biparametric Prostate Magnetic Resonance Imaging. Acad. Radiol. 2021. [Google Scholar] [CrossRef]
- Alkadi, R.; Taher, F.; El-baz, A.; Werghi, N. A Deep Learning-Based Approach for the Detection and Localization of Prostate Cancer in T2 Magnetic Resonance Images. J. Digit. Imaging 2018, 32, 793–807. [Google Scholar] [CrossRef]
- Wang, J.; Wu, C.J.; Bao, M.L.; Zhang, J.; Wang, X.N.; Zhang, Y.D. Machine learning-based analysis of MR radiomics can help to improve the diagnostic performance of PI-RADS v2 in clinically relevant prostate cancer. Eur. Radiol. 2017, 27, 4082–4090. [Google Scholar] [CrossRef] [PubMed]
- De Vente, C.; Vos, P.; Pluim, J.; Veta, M. Simultaneous Detection and Grading of Prostate Cancer in Multi-Parametric MRI. Med. Imaging Deep. Learn. 2019, 2019, 1–5. [Google Scholar]
- Cao, R.; Mohammadian Bajgiran, A.; Afshari Mirak, S.; Shakeri, S.; Zhong, X.; Enzmann, D.; Raman, S.; Sung, K. Joint Prostate Cancer Detection and Gleason Score Prediction in mp-MRI via FocalNet. IEEE Trans. Med. Imaging 2019, 38, 2496–2506. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- de Vente, C.; Vos, P.; Hosseinzadeh, M.; Pluim, J.; Veta, M. Deep Learning Regression for Prostate Cancer Detection and Grading in Bi-Parametric MRI. IEEE Trans. Biomed. Eng. 2021, 68, 374–383. [Google Scholar] [CrossRef] [PubMed]
- Jensen, C.; Carl, J.; Boesen, L.; Langkilde, N.C.; Østergaard, L.R. Assessment of prostate cancer prognostic Gleason grade group using zonal-specific features extracted from biparametric MRI using a KNN classifier. J. Appl. Clin. Med. Phys. 2019, 20, 146–153. [Google Scholar] [CrossRef]
- Abraham, B.; Nair, M.S. Computer-aided classification of prostate cancer grade groups from MRI images using texture features and stacked sparse autoencoder. Comput. Med. Imaging Graph. 2018, 69, 60–68. [Google Scholar] [CrossRef] [PubMed]
- Initiative for Collaborative Computer Vision Benchmarking. Available online: https://i2cvb.github.io/ (accessed on 24 November 2021).
- Gillies, R.J.; Kinahan, P.E.; Hricak, H. Radiomics: Images Are More than Pictures, They Are Data. Radiology 2015, 278, 563–577. [Google Scholar] [CrossRef] [Green Version]
- Tibshirani, R. Regression Shrinkage and Selection via the Lasso. J. R. Stat. Soc. Ser. B 1996, 58, 267–288. [Google Scholar] [CrossRef]
- Gatenby, R.A.; Grove, O.; Gillies, R.J. Quantitative imaging in cancer evolution and ecology. Radiology 2013, 269, 8–15. [Google Scholar] [CrossRef] [Green Version]
- Gutiérrez, P.A.; Pérez-Ortiz, M.; Sánchez-Monedero, J.; Fernández-Navarro, F.; Hervás-Martínez, C. Ordinal Regression Methods: Survey and Experimental Study. IEEE Trans. Knowl. Data Eng. 2016, 28, 127–146. [Google Scholar] [CrossRef] [Green Version]
- Stanzione, A.; Cuocolo, R.; Cocozza, S.; Romeo, V.; Persico, F.; Fusco, F.; Longo, N.; Brunetti, A.; Imbriaco, M. Detection of Extraprostatic Extension of Cancer on Biparametric MRI Combining Texture Analysis and Machine Learning: Preliminary Results. Acad. Radiol. 2019, 26, 1338–1344. [Google Scholar] [CrossRef] [PubMed]
- Ma, S.; Xie, H.; Wang, H.; Yang, J.; Han, C.; Wang, X.; Zhang, X. Preoperative Prediction of Extracapsular Extension: Radiomics Signature Based on Magnetic Resonance Imaging to Stage Prostate Cancer. Mol. Imaging Biol. 2020, 22, 711–721. [Google Scholar] [CrossRef] [PubMed]
- Xu, L.; Zhang, G.; Zhao, L.; Mao, L.; Li, X.; Yan, W.; Xiao, Y.; Lei, J.; Sun, H.; Jin, Z. Radiomics Based on Multiparametric Magnetic Resonance Imaging to Predict Extraprostatic Extension of Prostate Cancer. Front. Oncol. 2020, 10, 940. [Google Scholar] [CrossRef]
- Losnegård, A.; Reisæter, L.A.R.; Halvorsen, O.J.; Jurek, J.; Assmus, J.; Arnes, J.B.; Honoré, A.; Monssen, J.A.; Andersen, E.; Haldorsen, I.S.; et al. Magnetic resonance radiomics for prediction of extraprostatic extension in non-favorable intermediate- and high-risk prostate cancer patients. Acta Radiol. 2020, 61, 1570–1579. [Google Scholar] [CrossRef]
- Cuocolo, R.; Stanzione, A.; Faletti, R.; Gatti, M.; Calleris, G.; Fornari, A.; Gentile, F.; Motta, A.; Dell’Aversana, S.; Creta, M.; et al. MRI index lesion radiomics and machine learning for detection of extraprostatic extension of disease: A multicenter study. Eur. Radiol. 2021, 31, 7575–7583. [Google Scholar] [CrossRef]
- Fuchsjäger, M.H.; Shukla-Dave, A.; Hricak, H.; Wang, L.; Touijer, K.; Donohue, J.F.; Eastham, J.A.; Kattan, M.W. Magnetic resonance imaging in the prediction of biochemical recurrence of prostate cancer after radical prostatectomy. BJU Int. 2009, 104, 315–320. [Google Scholar] [CrossRef]
- Park, S.Y.; Oh, Y.T.; Jung, D.C.; Cho, N.H.; Choi, Y.D.; Rha, K.H.; Hong, S.J. Prediction of biochemical recurrence after radical prostatectomy with PI-RADS version 2 in prostate cancers: Initial results. Eur. Radiol. 2015, 26, 2502–2509. [Google Scholar] [CrossRef]
- Capogrosso, P.; Vertosick, E.A.; Benfante, N.E.; Sjoberg, D.D.; Vickers, A.J.; Eastham, J.A. Can We Improve the Preoperative Prediction of Prostate Cancer Recurrence With Multiparametric MRI? Clin. Genitourin. Cancer 2019, 17, e745–e750. [Google Scholar] [CrossRef]
- Park, S.Y.; Kim, C.K.; Park, B.K.; Lee, H.M.; Lee, K.S. Prediction of biochemical recurrence following radical prostatectomy in men with prostate cancer by diffusion-weighted magnetic resonance imaging: Initial results. Eur. Radiol. 2010, 21, 1111–1118. [Google Scholar] [CrossRef]
- Bourbonne, V.; Vallières, M.; Lucia, F.; Doucet, L.; Visvikis, D.; Tissot, V.; Pradier, O.; Hatt, M.; Schick, U. MRI-Derived Radiomics to Guide Post-operative Management for High-Risk Prostate Cancer. Front. Oncol. 2019, 9, 807. [Google Scholar] [CrossRef]
- Zhang, Y.D.; Wang, J.; Wu, C.J.; Bao, M.L.; Li, H.; Wang, X.N.; Tao, J.; Shi, H.-B. An imaging-based approach predicts clinical outcomes in prostate cancer through a novel support vector machine classification. Oncotarget 2016, 7, 78140–78151. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Shiradkar, R.; Ghose, S.; Jambor, I.; Taimen, P.; Ettala, O.; Purysko, A.S.; Madabhushi, A. Radiomic features from pretreatment biparametric MRI predict prostate cancer biochemical recurrence: Preliminary findings. J. Magn. Reson. Imaging 2018, 48, 1626–1636. [Google Scholar] [CrossRef] [PubMed]
- Yan, Y.; Shao, L.; Liu, Z.; He, W.; Yang, G.; Liu, J.; Xia, H.; Zhang, Y.; Chen, H.; Liu, C.; et al. Deep learning with quantitative features of magnetic resonance images to predict biochemical recurrence of radical prostatectomy: A multi-center study. Cancers 2021, 13, 3098. [Google Scholar] [CrossRef]
- Kang, J.; Doucette, C.W.; El Naqa, I.; Zhang, H. Comparing the Kattan Nomogram to a Random Forest Model to Predict Post-Prostatectomy Pathology. Int. J. Radiat. Oncol. 2018, 102, S61–S62. [Google Scholar] [CrossRef]
- Abdollahi, H.; Mofid, B.; Shiri, I.; Razzaghdoust, A.; Saadipoor, A.; Mahdavi, A.; Galandooz, H.M.; Mahdavi, S.R. Machine learning-based radiomic models to predict intensity-modulated radiation therapy response, Gleason score and stage in prostate cancer. La Radiol. Med. 2019, 124, 555–567. [Google Scholar] [CrossRef] [PubMed]
- Poulakis, V.; Witzsch, U.; de Vries, R.; Emmerlich, V.; Meves, B.; Altmannsberger, H.-M.; Becht, E. Preoperative neural network using combined magnetic resonance imaging variables, prostate specific antigen, and Gleason score to predict prostate cancer recurrence after radical prostatectomy. Eur. Urol. 2004, 46, 571–578. [Google Scholar] [CrossRef] [PubMed]
- Harrell, F.E.J.; Califf, R.M.; Pryor, D.B.; Lee, K.L.; Rosati, R.A. Evaluating the yield of medical tests. JAMA 1982, 247, 2543–2546. [Google Scholar] [CrossRef] [PubMed]
- de Rooij, M.; Hamoen, E.H.J.; Witjes, J.A.; Barentsz, J.O.; Rovers, M.M. Accuracy of Magnetic Resonance Imaging for Local Staging of Prostate Cancer: A Diagnostic Meta-analysis. Eur. Urol. 2016, 70, 233–245. [Google Scholar] [CrossRef]
- Heidenreich, A. Consensus Criteria for the Use of Magnetic Resonance Imaging in the Diagnosis and Staging of Prostate Cancer: Not Ready for Routine Use. Eur. Urol. 2011, 59, 495–497. [Google Scholar] [CrossRef]
- Stephenson, A.J.; Kattan, M.W.; Eastham, J.A.; Dotan, Z.A.; Bianco, F.J., Jr.; Lilja, H.; Scardino, P.T. Defining biochemical recurrence of prostate cancer after radical prostatectomy: A proposal for a standardized definition. J. Clin. Oncol. 2006, 24, 3973–3978. [Google Scholar] [CrossRef] [PubMed]
- Kattan, M.W.; Stapleton, A.M.; Wheeler, T.M.; Scardino, P.T. Evaluation of a nomogram used to predict the pathologic stage of clinically localized prostate carcinoma. Cancer 1997, 79, 528–537. [Google Scholar] [CrossRef]
- Shariat, S.F.; Karakiewicz, P.I.; Roehrborn, C.G.; Kattan, M.W. An updated catalog of prostate cancer predictive tools. Cancer 2008, 113, 3075–3099. [Google Scholar] [CrossRef] [PubMed]
- Zwanenburg, A.; Leger, S.; Vallières, M.; Löck, S. Image biomarker standardisation initiative. arXiv 2016, arXiv:1612.07003. [Google Scholar]
- Moore, C.M.; Giganti, F.; Albertsen, P.; Allen, C.; Bangma, C.; Briganti, A.; Carroll, P.; Haider, M.; Kasivisvanathan, V.; Kirkham, A.; et al. Reporting Magnetic Resonance Imaging in Men on Active Surveillance for Prostate Cancer: The PRECISE Recommendations-A Report of a European School of Oncology Task Force. Eur. Urol. 2017, 71, 648–655. [Google Scholar] [CrossRef] [Green Version]
- Nayan, M.; Salari, K.; Bozzo, A.; Ganglberger, W.; Lu, G.; Carvalho, F.; Gusev, A.; Schneider, A.; Westover, B.M.; Feldman, A.S. A machine learning approach to predict progression on active surveillance for prostate cancer. Urol. Oncol. 2021. [Google Scholar] [CrossRef] [PubMed]
- McMahan, B.; Moore, E.; Ramage, D.; Hampson, S.; Arcas, B.A. Communication-Efficient Learning of Deep Networks from Decentralized Data. In Proceedings of the 20th International Conference on Artificial Intelligence and Statistics, Fort Lauderdale, FL, USA, 20–22 April 2017; Volume 54, pp. 1273–1282. [Google Scholar]
- Xu, J.; Glicksberg, B.S.; Su, C.; Walker, P.; Bian, J.; Wang, F. Federated Learning for Healthcare Informatics. J. Healthc. Inform. Res. 2021, 5, 1–19. [Google Scholar] [CrossRef] [PubMed]
- Zhang, C.; Xie, Y.; Bai, H.; Yu, B.; Li, W.; Gao, Y. A survey on federated learning. Knowl.-Based Syst. 2021, 216, 106775. [Google Scholar] [CrossRef]
- Sarma, K.V.; Harmon, S.; Sanford, T.; Roth, H.R.; Xu, Z.; Tetreault, J.; Xu, D.; Flores, M.G.; Raman, A.G.; Kulkarni, R.; et al. Federated learning improves site performance in multicenter deep learning without data sharing. J. Am. Med. Inform. Assoc. 2021, 28, 1259–1264. [Google Scholar] [CrossRef]
- Dayan, I.; Roth, H.R.; Zhong, A.; Harouni, A.; Gentili, A.; Abidin, A.Z.; Liu, A.; Costa, A.B.; Wood, B.J.; Tsai, C.-S.; et al. Federated learning for predicting clinical outcomes in patients with COVID-19. Nat. Med. 2021, 27, 1735–1743. [Google Scholar] [CrossRef] [PubMed]
- NVIDIA Clara|NVIDIA Developer. Available online: https://developer.nvidia.com/clara (accessed on 10 November 2021).
- TensorFlow Federated: Machine Learning on Decentralized Data. Available online: https://www.tensorflow.org/federated (accessed on 10 November 2021).
- IBM Federated Learning. Available online: https://ibmfl.mybluemix.net/ (accessed on 10 November 2021).
- GitHub—Intel/Openfl: An Open Framework for Federated Learning. Available online: https://github.com/intel/openfl (accessed on 10 November 2021).
- An Industrial Grade Federated Learning Framework. Available online: https://fate.fedai.org/ (accessed on 10 November 2021).
- XayNet|Open Source Federated Learning Framework for Edge AI. Available online: https://www.xaynet.dev/ (accessed on 10 November 2021).
- GitHub—PaddlePaddle/PaddleFL: Federated Deep Learning in PaddlePaddle. Available online: https://github.com/PaddlePaddle/PaddleFL (accessed on 10 November 2021).
Publication Year | Approach | Registration Type | Registration Modalities | ML/DL Method | Auto-Seg | Sample Size | CV | TRE (mm) | DSC (%) | MSD (mm) | HD (mm) | Error (%) | Refs.
---|---|---|---|---|---|---|---|---|---|---|---|---|---
2002 | Knowledge-based | Deformable | MRI–TRUS | Homogeneous Mooney–Rivlin model, linear least-squares fitting | N | 25 simulations of TRUS | No | - | - | - | - | 26.7 | [66]
2011 | Knowledge-based | Non-rigid, deformable | MRI–TRUS | PCA | N | 5 patients | Leave-one-out | 5.8 | - | - | - | - | [67]
2012 | Knowledge-based | Non-rigid, deformable | MRI–TRUS | PCA | N | 8 patients | Leave-one-out | 2.4 | - | - | - | - | [68]
2016 | Knowledge-based | Non-rigid, deformable | MRI–TRUS | PCA, surface point matching | N | 1 MRI dataset and 60 TRUS datasets | Leave-one-out | 1.44 | - | - | - | - | [69]
2018 | Weakly supervised | Deformable | MRI–TRUS | CNN | N | 111 pairs | 10-fold | 9.4 | 73 | - | - | - | [70]
2018 | Weakly supervised | Non-rigid, deformable | MRI–TRUS | CNN | N | 76 patients | 12-fold | 3.6 | 87 | - | - | - | [71]
2018 | Unsupervised | Affine | MRI–TRUS | GAN, CNN | N | 763 pairs | No | 3.48 | - | - | - | - | [72]
2020 | Weakly supervised | Affine and non-rigid, deformable | MRI–TRUS | FCN, 3D UNet | Y | 36 pairs | Leave-one-out | 2.53 | 91 | 0.88 | 4.41 | - | [73]
2021 | Weakly supervised | Deformable | MRI–TRUS | 3D UNet | Y | 288 patients | No | - | 87 | - | 7.21 | - | [74]
2020 | Supervised | Rigid, deformable | MRI–TRUS | UNet, CNN | Y | 12 patients | No | 2.99 | - | - | - | - | [75]
2018 | Knowledge-based and DL | Non-rigid, deformable | MRI–TRUS | 3D encoder–decoder | N | 108 pairs | 12-fold | 6.3 | 82 | - | - | - | [76]
2020 | Knowledge-based and DL | Non-rigid, deformable | MRI–TRUS | CNN, 3D point cloud | Y | 50 patients | Leave-one-out | 1.57 | 94 | 0.90 | 2.96 | - | [77]
2019 | Supervised | Rigid, deformable | MRI–CT | RF based on an auto-context model | N | 17 treatment plans from 10 patients | No | - | - | - | - | <1 | [78]
2020 | Knowledge-based | Rigid, affine, and deformable | MRI–histology images | - | N | 157 patients | No | - | 97 | - | 1.99 | - | [79]
2021 | Knowledge-based | Rigid, deformable | MRI–CBCT | CNN, 3D point cloud | Y | 50 patients | 5-fold | 2.68 | 93 | 1.66 | - | - | [80]
2021 | Unsupervised | Affine, deformable | MRI–histology images | CNN | N | 99 patients (training), 53 patients (test) | No | - | 97.5, 96.1, 96.7 | - | 1.72, 1.98, 1.96 | - | [81]
2017 | Unsupervised | Rigid, affine, deformable | MRI–histology images | Multi-image super-resolution GAN | N | 533 patients | 5-fold | - | 95 (prostate), 68 (cancer) | - | - | - | [59]
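The registration studies above are compared using target registration error (TRE), DSC, mean surface distance (MSD), and Hausdorff distance (HD). As a point of reference, the sketch below shows how these distance metrics are commonly computed from landmark pairs and surface point sets; the function names and toy coordinates (in mm) are illustrative and not drawn from any of the cited works.

```python
import numpy as np
from scipy.spatial.distance import cdist

def target_registration_error(fixed_pts: np.ndarray, warped_pts: np.ndarray) -> float:
    """Mean TRE (mm): Euclidean distance between corresponding landmarks after registration."""
    return float(np.linalg.norm(fixed_pts - warped_pts, axis=1).mean())

def surface_distances(surf_a: np.ndarray, surf_b: np.ndarray):
    """Mean surface distance (MSD) and symmetric Hausdorff distance (HD), both in mm,
    for two surfaces given as (N, 3) arrays of points in physical coordinates."""
    d = cdist(surf_a, surf_b)   # all pairwise Euclidean distances
    a_to_b = d.min(axis=1)      # each point of A to its nearest point of B
    b_to_a = d.min(axis=0)      # each point of B to its nearest point of A
    msd = 0.5 * (a_to_b.mean() + b_to_a.mean())
    hd = max(a_to_b.max(), b_to_a.max())
    return msd, hd

# Toy landmark pairs (coordinates in mm): residual misalignments of 1 mm and 2 mm
fixed = np.array([[10.0, 20.0, 30.0], [12.0, 22.0, 31.0]])
warped = fixed + np.array([[1.0, 0.0, 0.0], [0.0, 2.0, 0.0]])
print(f"TRE = {target_registration_error(fixed, warped):.2f} mm")  # 1.50 mm
```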
Publication Year | Application | Method | Serum PSA (ng/mL) | Prostate Zone | Data Source | MRI Sequence(s) | Train | Val | Test | CV | Ground Truth | Non-MRI Data Features (If Any) | Acc, AUC (%) | Ssv, Spc (%) | Kap, DSC (%) | Refs.
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
2018 | Detecting csPCa in AS patients | MRMR, QDA, RF, SVM | 6.96 ± 5.8 | WG | Pv | T2w, ADC | 31 | - | 25 | 3-fold (training) | PI-RADS score and biopsy | - | 72, - | - | - | [87]
2019 | Differentiating csPCa and non-csPCa | MRMR and LASSO algorithm | >10 | WG | Pv | T1w, T2w, DWI, ADC | 187 | - | 93 | 10-fold | Gleason score | - | -, 82.3 | 84.1, 72.7 | - | [88]
2019 | Differentiating TZ PCa from BPH | Logistic regression and SVM | - | TZ | Pv | T2w, ADC | 105 | - | - | No | - | - | -, 98.9 | 93.2, 98.4 | 84 (tumour), 87 (BPH) | [89]
2021 | Prediction of csPCa (PI-RADS ≥ 4) | Textured-DL and CNN | 4.7–8.7 | WG | Pv | T2w, ADC | 239 | 42 | 121 | No | PI-RADS score | - | -, 85 | -, 70 | - | [90]
2020 | Differentiating csPCa and non-csPCa | 3D CNN | - | WG | PROx | ADC, DWI, K-trans (from DCE) | 175 | - | 25 | 8-fold | PI-RADS score | Location of lesion center | -, 89.7 | 81.9, 86.1 | - | [91]
2017 | Differentiating csPCa and non-csPCa | Transfer learning, ImageNet | - | WG | PROx | T2w, DWI, ADC, DCE | 330 | - | 208 | k-fold | PI-RADS score | - | -, 83 | - | - | [92]
2019 | Classifying low-grade and high-grade PCa | Transfer learning, AlexNet NN | - | WG | Pv, PROx-2 | T2w, ADC | 110 | 66 | 44 | No | Gleason score | - | 86.92, - | - | - | [93]
2019 | Prediction of csPCa (PI-RADS ≥ 4) | Transfer learning | 7.9 ± 2.5 | WG | Pv | T2w, ADC | 169 | 47 | - | No | PI-RADS score | Zonal information | 72.3, 72.6 | 63.6, 80 | - | [94]
2015 | PZ cancer detection | Regression, SVM | 4.9–8.6 | PZ | Pv | T2w, ADC | 56 | - | 56 | Yes | Prostatectomy | - | -, 91 | 97, - | - | [95]
2018 | Predictive maps of epithelium and lumen density | Least-squares regression | - | WG | Pv | T2w, ADC, DCE | 20 | - | 19 | No | Prostatectomy | - | -, 77 (epithelium); 84 (lumen) | - | - | [96]
2021 | PCa detection and segmentation | Growcut, Zernike, KNN, SVM, MLP | - | PZ, TZ | Pv | T2w | 217 | - | 54 | No | Prostatectomy | Clinical and histopathological variables | 80.97, - | - | 79 | [97]
2020 | PCa detection and segmentation | 3D CNN | - | WG | Pv | T2w, DWI, ADC | 116 | - | 155 | 3-fold | Biopsy | Location of lesion | -, 65–89 | 82–92, 43–76 | - | [98]
2021 | PCa differentiation and segmentation | SPCNet | 6.8–7.1 | WG | Pv | T2w, ADC | 102 | - | 332 | 5-fold | Prostatectomy | - | -, 75–85 | - | - | [99]
2021 | PCa detection and classification | Cascaded DL | 4.7–9.9 | WG | Pv, PROx | T2w, ADC | 1290 | - | 150 | 5-fold | PI-RADS score | - | 30.8, - | 56.1, - | 35.9 | [100]
2021 | PCa segmentation | Transfer learning, CNN, test-time augmentation | 2.1–18 | WG | Pv, PROx | T2w, DWI and DCE | 16, 16 | - | 16 | Leave-one-out | Prostatectomy | - | - | - | 59 | [54]
2018 | PCa segmentation | Encoder–decoder CNN | - | WG, PZ, CG | I2CVB | T2w | 1413 | 236 | 707 | 10-fold | Radiologist-segmented results | - | 89.4, - | - | - | [101]
2017 | Improve PI-RADS v2 | RBF-SVM, SVM-RFE | 12.5–56.1 | WG, TZ, PZ | Pv | T2w, DWI, DCE | 97 | - | - | Leave-one-out | PI-RADS scores | - | -, 98.3 (PZ); 96.8 (TZ) | 94.4 (PZ); 91.6 (TZ), 97.7 (PZ); 95.5 (TZ) | - | [102]
2020 | Prediction of PI-RADS v2 score | ResNet34 CNN | - | WG | Pv, PROx | T2w, DWI, ADC, DCE | 482 | 137 | 68 | No | PI-RADS score | - | - | - | 40, - | [17]
2019 | PCa detection, prediction of GGG score | UNet, batch normalization, ordinal regression | - | WG | PROx-2 | T2w, ADC | 99 | - | 63 | 5-fold | Gleason score | - | - | - | 32.1 | [103]
2019 | PCa segmentation, prediction of GS score | Multi-class CNN (DeepLab) | - | WG | Pv | T2w, DWI | 417 | - | - | 5-fold | Gleason score | - | -, 80.9 | 88.8, - | - | [104]
2021 | Prediction of GGG score | UNet, ordinal regression | - | WG, TZ, PZ | PROx-2 | T2w, DWI, ADC | 112 | - | 70 | 5-fold | GGG | Zonal information | - | - | 13, 37 | [105]
2019 | Prediction of GGG score | KNN | - | TZ, PZ | Pv | T2w, DCE, DWI, ADC | 112 | - | 70 | 3-fold | GGG | Texture features, zonal information | -, 92 (PZ); 87 (TZ) | - | - | [106]
2018 | Prediction of GGG score | Stacked sparse autoencoders | - | WG | PROx-2 | T2w, DWI, ADC | 112 | - | 70 | 3-fold | GGG | Hand-crafted texture features | 47.3, - | - | 27.72, - | [107]
2021 | Lesion detection and classification | Cascaded DL | 4.7–9.9 | WG | Pv, PROx | T2w, ADC | 1290 | - | 150 | 5-fold | PI-RADS score | - | 30.8, - | 56.1, - | -, 35.9 | [100]
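The detection and scoring studies above report area under the ROC curve (AUC), sensitivity (Ssv), specificity (Spc), Cohen's kappa (Kap), and DSC. The scikit-learn sketch below shows how such metrics are typically derived from per-lesion labels and model scores; the labels, scores, and the 0.5 operating threshold are hypothetical and not taken from any cited study.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, confusion_matrix, cohen_kappa_score

# Hypothetical per-lesion labels (1 = clinically significant PCa) and model scores
y_true = np.array([0, 0, 1, 1, 0, 1, 0, 1])
y_score = np.array([0.1, 0.4, 0.35, 0.8, 0.2, 0.7, 0.6, 0.9])
y_pred = (y_score >= 0.5).astype(int)      # operating threshold chosen for illustration

auc = roc_auc_score(y_true, y_score)       # area under the ROC curve
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)               # Ssv in the table
specificity = tn / (tn + fp)               # Spc in the table
kappa = cohen_kappa_score(y_true, y_pred)  # chance-corrected agreement (Kap)

print(f"AUC={auc:.2f}  Ssv={sensitivity:.2f}  Spc={specificity:.2f}  kappa={kappa:.2f}")
```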
Publication Year | Application | Method | Input Feature | Sample Size | Ground Truth | MRI Sequence(s) | CV | Acc (%) | AUC (%) | C-Index | Refs.
---|---|---|---|---|---|---|---|---|---|---|---
2019 | EPE detection | Bayesian network, texture analysis | Index lesions from biparametric MRI | 39 | Prostatectomy | T2w, ADC | No | 82 | 88 | - | [113]
2020 | ECE prediction | LASSO regression | ROIs of T2w images | 119 | Prostatectomy | T2w, DWI, DCE | 10-fold | - | 82.1 | - | [114]
2020 | EPE detection | LASSO regression | Radiomic features, patients' clinical and pathological variables | 115 | Prostatectomy | T2w, ADC, DWI, DCE | No | 81.8 | 86.5 | - | [115]
2020 | EPE prediction | Combination of RF model, radiology interpretation and clinical nomogram | MR radiomic features | 228 | Prostatectomy | T1w, T2w, DWI, DCE | 10-fold | - | 79 | - | [116]
2021 | EPE detection | SVM | Radiomic features from MRI index lesions | 193 | Prostatectomy | T2w, ADC | 10-fold | 79 | - | - | [117]
2009 | BCR prediction | Cox regression | GS and clinical variables | 610 | BCR defined by NCCN guideline | T2w, DWI, ADC, DCE | No | - | - | 0.776 (5-year), 0.788 (10-year) | [118]
2015 | BCR prediction | Univariate and multivariate analyses using Cox's proportional hazards model | PI-RADS v2 score, surgical parameters | 158 | Two consecutive PSA ≥ 0.2 ng/mL | T2w, DWI, DCE | No | - | - | - | [119]
2019 | Pre-biopsy mpMRI to improve preoperative risk model | Cox regression | Pre-biopsy mpMRI score | 372 | Two consecutive PSA ≥ 0.1 ng/mL | T1w, T2w | No | - | - | - | [120]
2010 | BCR prediction | Univariate and multivariate analyses | Clinical variables and tumour ADC data | 158 | PSA ≥ 0.2 ng/mL | ADC, DWI | No | - | 75.5 | - | [121]
2019 | BCR and bRFS prediction | Univariate and multivariate Cox regression | IBSI-compliant radiomic features | 107 | Two consecutive PSA ≥ 0.2 ng/mL | T2w, ADC | No | - | 76 | - | [122]
2016 | BCR prediction | SVM | Clinicopathologic and bpMRI variables | 205 | PSA ≥ 0.2 ng/mL | T2w, DWI, DCE | 5-fold | 92.2 | - | - | [123]
2018 | Identify predictive radiomic features for BCR | SVM, linear discriminant analysis and RF | Radiomic features from pretreatment bpMRI | 120 | PSA > 0.2 ng/mL (post-RP) and PSA > 2 ng/mL (post-RT) | T2w, ADC | 3-fold | - | 73 | - | [124]
2021 | BCR prediction | Radiomics-based DL | Quantitative features of MRI | 485 | PSA ≥ 0.2 ng/mL | T1w, T2w, DWI, ADC | No | - | - | 0.802 | [125]
2018 | Post-prostatectomy pathology prediction | RF | Demographics, PSA trends, and location-specific biopsy findings | 1560 | Prostatectomy | - | - | - | 75 (OCD), 73 (ECE), 64 (pN+) | - | [126]
2019 | IMRT response prediction | Univariate radiomic analysis, ML classification models | Pre-/post-IMRT mpMRI radiomic features | 33 | Change of ADC values before and after IMRT | T2w, ADC | 10-fold | - | 63.2 | - | [127]
2004 | BCR prediction | ANN | MRI findings, PSA, biopsy Gleason score | 210 | PSA level ≥ 0.1 ng/mL | T2w, DWI, ADC, DCE | 5-fold | - | 89.7 | - | [128]
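Several of the biochemical recurrence studies above report Harrell's concordance index (C-index) [129], the probability that, of a comparable pair of patients, the one who recurs earlier is assigned the higher risk score. A minimal pure-NumPy sketch follows; the follow-up times, risk scores, and event indicators are hypothetical, and the function name is illustrative.

```python
import numpy as np

def concordance_index(times: np.ndarray, scores: np.ndarray, events: np.ndarray) -> float:
    """Harrell's C-index: fraction of comparable patient pairs in which the patient
    with the shorter observed time to recurrence received the higher risk score.
    `events` is 1 if recurrence was observed and 0 if the patient was censored."""
    concordant, comparable = 0.0, 0.0
    n = len(times)
    for i in range(n):
        if not events[i]:
            continue                    # a pair is comparable only if the earlier time is an event
        for j in range(n):
            if times[i] < times[j]:
                comparable += 1
                if scores[i] > scores[j]:
                    concordant += 1
                elif scores[i] == scores[j]:
                    concordant += 0.5   # ties in risk score count as half-concordant
    return concordant / comparable

# Hypothetical follow-up times (months), predicted risk scores, and event indicators
t = np.array([12.0, 30.0, 24.0, 60.0])
risk = np.array([0.9, 0.2, 0.6, 0.1])
event = np.array([1, 1, 0, 0])
print(f"C-index = {concordance_index(t, risk, event):.2f}")  # perfectly concordant toy data -> 1.00
```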
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations. |
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).