Deep Learning and Gastric Cancer: Systematic Review of AI-Assisted Endoscopy
Abstract
1. Introduction
2. Methods
2.1. Search Strategy
2.2. Selection Criteria
2.3. Data Extraction
2.4. Data Synthesis and Analysis
3. Results
3.1. Gastric Neoplasm Classification
3.2. Gastric Neoplasm Detection
3.3. Assessment of Tumor Invasion Depth
3.4. Gastric Neoplasm Segmentation
3.5. Pre-Malignant Lesion Detection, Classification, and Segmentation
3.6. Early Gastric Cancer Detection
4. Discussion
Author Contributions
Funding
Conflicts of Interest
Abbreviations
Abbreviation | Definition |
---|---|
GC | Gastric Cancer |
AI | Artificial Intelligence |
CNN | Convolutional Neural Network |
CADe | Computer-Aided Detection |
CADx | Computer-Aided Diagnosis |
CDSS | Clinical Decision Support Systems |
GIM | Gastric Intestinal Metaplasia |
DL | Deep Learning |
PRISMA | Preferred Reporting Items for Systematic Reviews and Meta-Analyses |
MeSH | Medical Subject Headings |
AUC | Area Under the Receiver Operating Characteristic Curve |
SSD | Single Shot MultiBox Detector |
YOLOv4 | You Only Look Once, Version 4 |
mAP | Mean Average Precision |
CE | Capsule Endoscopy |
WLE | White Light Endoscopy |
CSA-CA-TB-ResUnet | Co-Spatial Attention and Channel Attention-Based Triple-Branch ResUnet |
ME-NBI | Magnifying Endoscopy with Narrow-Band Imaging |
DCNN | Deep Convolutional Neural Network |
TResNet | Tensor Residual Network |
BiSeNet | Bilateral Segmentation Network |
ResNet | Residual Network |
VGGNet | Visual Geometry Group Network |
ResNet152 | Residual Network 152 Layers |
ResNet50 | Residual Network 50 Layers |
CAD | Computer-Aided Diagnosis |
GNLs | Gastric Neoplastic Lesions |
EUS | Endoscopic Ultrasonography |
EGCM | Early Gastric Cancer Model |
Grad-CAM | Gradient-weighted Class Activation Mapping |
References
- Sung, H.; Ferlay, J.; Siegel, R.L.; Laversanne, M.; Soerjomataram, I.; Jemal, A.; Bray, F. Global Cancer Statistics 2020: GLOBOCAN Estimates of Incidence and Mortality Worldwide for 36 Cancers in 185 Countries. CA Cancer J. Clin. 2021, 71, 209–249.
- Thrift, A.P.; Wenker, T.N.; El-Serag, H.B. Global burden of gastric cancer: Epidemiological trends, risk factors, screening and prevention. Nat. Rev. Clin. Oncol. 2023, 20, 338–349.
- Morgan, E.; Arnold, M.; Camargo, M.C.; Gini, A.; Kunzmann, A.T.; Matsuda, T.; Meheus, F.; Verhoeven, R.H.A.; Vignat, J.; Laversanne, M.; et al. The current and future incidence and mortality of gastric cancer in 185 countries, 2020–2040: A population-based modelling study. eClinicalMedicine 2022, 47, 101404.
- Katai, H.; Ishikawa, T.; Akazawa, K.; Isobe, Y.; Miyashiro, I.; Oda, I.; Tsujitani, S.; Ono, H.; Tanabe, S.; Fukagawa, T.; et al. Five-year survival analysis of surgically resected gastric cancer cases in Japan: A retrospective analysis of more than 100,000 patients from the nationwide registry of the Japanese Gastric Cancer Association (2001–2007). Gastric Cancer 2018, 21, 144–154.
- Januszewicz, W.; Witczak, K.; Wieszczy, P.; Socha, M.; Turkot, M.H.; Wojciechowska, U.; Didkowska, J.; Kaminski, M.F.; Regula, J. Prevalence and risk factors of upper gastrointestinal cancers missed during endoscopy: A nationwide registry-based study. Endoscopy 2022, 54, 653–660.
- Hosokawa, O.; Hattori, M.; Douden, K.; Hayashi, H.; Ohta, K.; Kaizaki, Y. Difference in accuracy between gastroscopy and colonoscopy for detection of cancer. Hepatogastroenterology 2007, 54, 442–444.
- Menon, S.; Trudgill, N. How commonly is upper gastrointestinal cancer missed at endoscopy? A meta-analysis. Endosc. Int. Open 2014, 2, E46–E50.
- Ochiai, K.; Ozawa, T.; Shibata, J.; Ishihara, S.; Tada, T. Current Status of Artificial Intelligence-Based Computer-Assisted Diagnosis Systems for Gastric Cancer in Endoscopy. Diagnostics 2022, 12, 3153.
- Sharma, P.; Hassan, C. Artificial Intelligence and Deep Learning for Upper Gastrointestinal Neoplasia. Gastroenterology 2022, 162, 1056–1066.
- Sarker, I.H. Deep Learning: A Comprehensive Overview on Techniques, Taxonomy, Applications and Research Directions. SN Comput. Sci. 2021, 2, 420.
- Liu, L.; Dong, Z.; Cheng, J.; Bu, X.; Qiu, K.; Yang, C.; Wang, J.; Niu, W.; Wu, X.; Xu, J.; et al. Diagnosis and segmentation effect of the ME-NBI-based deep learning model on gastric neoplasms in patients with suspected superficial lesions—A multicenter study. Front. Oncol. 2023, 12, 1075578.
- Hu, H.; Gong, L.; Dong, D.; Zhu, L.; Wang, M.; He, J.; Shu, L.; Cai, Y.; Cai, S.; Su, W.; et al. Identifying early gastric cancer under magnifying narrow-band images with deep learning: A multicenter study. Gastrointest. Endosc. 2021, 93, 1333–1341.e3.
- Zhu, Y.; Wang, Q.C.; Xu, M.D.; Zhang, Z.; Cheng, J.; Zhong, Y.S.; Zhang, Y.Q.; Chen, W.F.; Yao, L.Q.; Zhou, P.H.; et al. Application of convolutional neural network in the diagnosis of the invasion depth of gastric cancer based on conventional endoscopy. Gastrointest. Endosc. 2019, 89, 806–815.e1.
- Goto, A.; Kubota, N.; Nishikawa, J.; Ogawa, R.; Hamabe, K.; Hashimoto, S.; Ogihara, H.; Hamamoto, Y.; Yanai, H.; Miura, O.; et al. Cooperation between artificial intelligence and endoscopists for diagnosing invasion depth of early gastric cancer. Gastric Cancer 2023, 26, 116–122.
- Islam, M.M.; Poly, T.N.; Walther, B.A.; Lin, M.C.; Li, Y.J. Artificial Intelligence in Gastric Cancer: Identifying Gastric Cancer Using Endoscopic Images with Convolutional Neural Network. Cancers 2021, 13, 5253.
- Du, H.; Dong, Z.; Wu, L.; Li, Y.; Liu, J.; Luo, C.; Zeng, X.; Deng, Y.; Cheng, D.; Diao, W.; et al. A deep-learning based system using multi-modal data for diagnosing gastric neoplasms in real-time (with video). Gastric Cancer 2023, 26, 275–285.
- Pornvoraphat, P.; Tiankanon, K.; Pittayanon, R.; Sunthornwetchapong, P.; Vateekul, P.; Rerknimitr, R. Real-time gastric intestinal metaplasia diagnosis tailored for bias and noisy-labeled data with multiple endoscopic imaging. Comput. Biol. Med. 2023, 154, 106582.
- Gong, E.J.; Bang, C.S.; Lee, J.J.; Baik, G.H.; Lim, H.; Jeong, J.H.; Choi, S.W.; Cho, J.; Kim, D.Y.; Lee, K.B.; et al. Deep learning-based clinical decision support system for gastric neoplasms in real-time endoscopy: Development and validation study. Endoscopy 2023, 55, 701–708.
- Lee, J.H.; Kim, Y.J.; Kim, Y.W.; Park, S.; Choi, Y.I.; Kim, Y.J.; Park, D.K.; Kim, K.G.; Chung, J.W. Spotting malignancies from gastric endoscopic images using deep learning. Surg. Endosc. 2019, 33, 3790–3797.
- Cho, B.J.; Bang, C.S.; Park, S.W.; Yang, Y.J.; Seo, S.I.; Lim, H.; Shin, W.G.; Hong, J.T.; Yoo, Y.T.; Hong, S.H. Automated classification of gastric neoplasms in endoscopic images using a convolutional neural network. Endoscopy 2019, 51, 1121–1129.
- Ueyama, H.; Kato, Y.; Akazawa, Y.; Yatagai, N.; Komori, H.; Takeda, T.; Matsumoto, K.; Ueda, K.; Matsumoto, K.; Hojo, M.; et al. Application of artificial intelligence using a convolutional neural network for diagnosis of early gastric cancer based on magnifying endoscopy with narrow-band imaging. J. Gastroenterol. Hepatol. 2021, 36, 482–489.
- Horiuchi, Y.; Aoyama, K.; Tokai, Y.; Hirasawa, T.; Yoshimizu, S.; Ishiyama, A.; Yoshio, T.; Tsuchida, T.; Fujisaki, J.; Tada, T. Convolutional Neural Network for Differentiating Gastric Cancer from Gastritis Using Magnified Endoscopy with Narrow Band Imaging. Dig. Dis. Sci. 2020, 65, 1355–1363.
- Klang, E.; Barash, Y.; Levartovsky, A.; Barkin Lederer, N.; Lahat, A. Differentiation Between Malignant and Benign Endoscopic Images of Gastric Ulcers Using Deep Learning. Clin. Exp. Gastroenterol. 2021, 14, 155–162.
- Liu, Y.; Zhang, L.; Hao, Z. An xception model based on residual attention mechanism for the classification of benign and malignant gastric ulcers. Sci. Rep. 2022, 12, 15365.
- Zhang, X.; Chen, F.; Yu, T.; An, J.; Huang, Z.; Liu, J.; Hu, W.; Wang, L.; Duan, H.; Si, J. Real-time gastric polyp detection using convolutional neural networks. PLoS ONE 2019, 14, e0214133.
- Durak, S.; Bayram, B.; Bakırman, T.; Erkut, M.; Doğan, M.; Gürtürk, M.; Akpınar, B. Deep neural network approaches for detecting gastric polyps in endoscopic images. Med. Biol. Eng. Comput. 2021, 59, 1563–1574.
- Wu, L.; Shang, R.; Sharma, P.; Zhou, W.; Liu, J.; Yao, L.; Dong, Z.; Yuan, J.; Zeng, Z.; Yu, Y.; et al. Effect of a deep learning-based system on the miss rate of gastric neoplasms during upper gastrointestinal endoscopy: A single-centre, tandem, randomised controlled trial. Lancet Gastroenterol. Hepatol. 2021, 6, 700–708.
- Xu, M.; Zhou, W.; Wu, L.; Zhang, J.; Wang, J.; Mu, G.; Huang, X.; Li, Y.; Yuan, J.; Zeng, Z.; et al. Artificial intelligence in the diagnosis of gastric precancerous conditions by image-enhanced endoscopy: A multicenter, diagnostic study (with video). Gastrointest. Endosc. 2021, 94, 540–548.e4.
- Hirasawa, T.; Aoyama, K.; Tanimoto, T.; Ishihara, S.; Shichijo, S.; Ozawa, T.; Ohnishi, T.; Fujishiro, M.; Matsuo, K.; Fujisaki, J.; et al. Application of artificial intelligence using a convolutional neural network for detecting gastric cancer in endoscopic images. Gastric Cancer 2018, 21, 653–660.
- Ikenoyama, Y.; Hirasawa, T.; Ishioka, M.; Namikawa, K.; Yoshimizu, S.; Horiuchi, Y.; Ishiyama, A.; Yoshio, T.; Tsuchida, T.; Takeuchi, Y.; et al. Detecting early gastric cancer: Comparison between the diagnostic ability of convolutional neural networks and endoscopists. Dig. Endosc. 2021, 33, 141–150.
- Bang, C.S.; Lim, H.; Jeong, H.M.; Hwang, S.H. Use of Endoscopic Images in the Prediction of Submucosal Invasion of Gastric Neoplasms: Automated Deep Learning Model Development and Usability Study. J. Med. Internet Res. 2021, 23, e25167.
- Yoon, H.J.; Kim, S.; Kim, J.H.; Keum, J.S.; Oh, S.I.; Jo, J.; Chun, J.; Youn, Y.H.; Park, H.; Kwon, I.G.; et al. A Lesion-Based Convolutional Neural Network Improves Endoscopic Detection and Depth Prediction of Early Gastric Cancer. J. Clin. Med. 2019, 8, 1310.
- Hamada, K.; Kawahara, Y.; Tanimoto, T.; Ohto, A.; Toda, A.; Aida, T.; Yamasaki, Y.; Gotoda, T.; Ogawa, T.; Abe, M.; et al. Application of convolutional neural networks for evaluating the depth of invasion of early gastric cancer based on endoscopic images. J. Gastroenterol. Hepatol. 2022, 37, 352–357.
- Tang, D.; Zhou, J.; Wang, L.; Ni, M.; Chen, M.; Hassan, S.; Luo, R.; Chen, X.; He, X.; Zhang, L.; et al. A Novel Model Based on Deep Convolutional Neural Network Improves Diagnostic Accuracy of Intramucosal Gastric Cancer (With Video). Front. Oncol. 2021, 11, 622827.
- An, P.; Yang, D.; Wang, J.; Wu, L.; Zhou, J.; Zeng, Z.; Huang, X.; Xiao, Y.; Hu, S.; Chen, Y.; et al. A deep learning method for delineating early gastric cancer resection margin under chromoendoscopy and white light endoscopy. Gastric Cancer 2020, 23, 884–892.
- Du, W.; Rao, N.; Yong, J.; Adjei, P.E.; Hu, X.; Wang, X.; Gan, T.; Zhu, L.; Zeng, B.; Liu, M.; et al. Early gastric cancer segmentation in gastroscopic images using a co-spatial attention and channel attention based triple-branch ResUnet. Comput. Methods Programs Biomed. 2023, 231, 107397.
- Ling, T.; Wu, L.; Fu, Y.; Xu, Q.; An, P.; Zhang, J.; Hu, S.; Chen, Y.; He, X.; Wang, J.; et al. A deep learning-based system for identifying differentiation status and delineating the margins of early gastric cancer in magnifying narrow-band imaging endoscopy. Endoscopy 2021, 53, 469–477.
- Teramoto, A.; Shibata, T.; Yamada, H.; Hirooka, Y.; Saito, K.; Fujita, H. Detection and Characterization of Gastric Cancer Using Cascade Deep Learning Model in Endoscopic Images. Diagnostics 2022, 12, 1996.
- Lin, N.; Yu, T.; Zheng, W.; Hu, H.; Xiang, L.; Ye, G.; Zhong, X.; Ye, B.; Wang, R.; Deng, W.; et al. Simultaneous Recognition of Atrophic Gastritis and Intestinal Metaplasia on White Light Endoscopic Images Based on Convolutional Neural Networks: A Multicenter Study. Clin. Transl. Gastroenterol. 2021, 12, e00385.
- Wu, L.; Xu, M.; Jiang, X.; He, X.; Zhang, H.; Ai, Y.; Tong, Q.; Lv, P.; Lu, B.; Guo, M.; et al. Real-time artificial intelligence for detecting focal lesions and diagnosing neoplasms of the stomach by white-light endoscopy (with videos). Gastrointest. Endosc. 2022, 95, 269–280.e6.
- Hirai, K.; Kuwahara, T.; Furukawa, K.; Kakushima, N.; Furune, S.; Yamamoto, H.; Marukawa, T.; Asai, H.; Matsui, K.; Sasaki, Y.; et al. Artificial intelligence-based diagnosis of upper gastrointestinal subepithelial lesions on endoscopic ultrasonography images. Gastric Cancer 2022, 25, 382–391.
- Sakai, Y.; Takemoto, S.; Hori, K.; Nishimura, M.; Ikematsu, H.; Yano, T.; Yokota, H. Automatic detection of early gastric cancer in endoscopic images using a transferring convolutional neural network. Annu. Int. Conf. IEEE Eng. Med. Biol. Soc. 2018, 2018, 4138–4141.
- Li, L.; Chen, Y.; Shen, Z.; Zhang, X.; Sang, J.; Ding, Y.; Yang, X.; Li, J.; Chen, M.; Jin, C.; et al. Convolutional neural network for the diagnosis of early gastric cancer based on magnifying narrow band imaging. Gastric Cancer 2020, 23, 126–132.
- Tang, D.; Wang, L.; Ling, T.; Lv, Y.; Ni, M.; Zhan, Q.; Fu, Y.; Zhuang, D.; Guo, H.; Dou, X.; et al. Development and validation of a real-time artificial intelligence-assisted system for detecting early gastric cancer: A multicentre retrospective diagnostic study. eBioMedicine 2020, 62, 103146.
- Wu, L.; Wang, J.; He, X.; Zhu, Y.; Jiang, X.; Chen, Y.; Wang, Y.; Huang, L.; Shang, R.; Dong, Z.; et al. Deep learning system compared with expert endoscopists in predicting early gastric cancer and its invasion depth and differentiation status (with videos). Gastrointest. Endosc. 2022, 95, 92–104.e3.
- He, X.; Wu, L.; Dong, Z.; Gong, D.; Jiang, X.; Zhang, H.; Ai, Y.; Tong, Q.; Lv, P.; Lu, B.; et al. Real-time use of artificial intelligence for diagnosing early gastric cancer by magnifying image-enhanced endoscopy: A multicenter diagnostic study (with videos). Gastrointest. Endosc. 2022, 95, 671–678.e4.
- Yao, Z.; Jin, T.; Mao, B.; Lu, B.; Zhang, Y.; Li, S.; Chen, W. Construction and Multicenter Diagnostic Verification of Intelligent Recognition System for Endoscopic Images from Early Gastric Cancer Based on YOLO-V3 Algorithm. Front. Oncol. 2022, 12, 815951.
- Li, J.; Zhu, Y.; Dong, Z.; He, X.; Xu, M.; Liu, J.; Zhang, M.; Tao, X.; Du, H.; Chen, D.; et al. Development and validation of a feature extraction-based logical anthropomorphic diagnostic system for early gastric cancer: A case-control study. eClinicalMedicine 2022, 46, 101366.
- Jin, J.; Zhang, Q.; Dong, B.; Ma, T.; Mei, X.; Wang, X.; Song, S.; Peng, J.; Wu, A.; Dong, L.; et al. Automatic detection of early gastric cancer in endoscopy based on Mask region-based convolutional neural networks (Mask R-CNN) (with video). Front. Oncol. 2022, 12, 927868.
- Zhou, B.; Rao, X.; Xing, H.; Ma, Y.; Wang, F.; Rong, L. A convolutional neural network-based system for detecting early gastric cancer in white-light endoscopy. Scand. J. Gastroenterol. 2023, 58, 157–162.
- Su, X.; Liu, Q.; Gao, X.; Ma, L. Evaluation of deep learning methods for early gastric cancer detection using gastroscopic images. Technol. Health Care 2023, 31 (Suppl. S1), 313–322.
- Nakahira, H.; Ishihara, R.; Aoyama, K.; Kono, M.; Fukuda, H.; Shimamoto, Y.; Nakagawa, K.; Ohmori, M.; Iwatsubo, T.; Iwagami, H.; et al. Stratification of gastric cancer risk using a deep neural network. JGH Open 2019, 4, 466–471.
- Igarashi, S.; Sasaki, Y.; Mikami, T.; Sakuraba, H.; Fukuda, S. Anatomical classification of upper gastrointestinal organs under various image capture conditions using AlexNet. Comput. Biol. Med. 2020, 124, 103950.
- Xiao, Z.; Ji, D.; Li, F.; Li, Z.; Bao, Z. Application of Artificial Intelligence in Early Gastric Cancer Diagnosis. Digestion 2022, 103, 69–75.
- Kim, J.H.; Nam, S.J.; Park, S.C. Usefulness of artificial intelligence in gastric neoplasms. World J. Gastroenterol. 2021, 27, 3543–3555.
- Hsiao, Y.J.; Wen, Y.C.; Lai, W.Y.; Lin, Y.Y.; Yang, Y.P.; Chien, Y.; Yarmishyn, A.A.; Hwang, D.K.; Lin, T.C.; Chang, Y.C.; et al. Application of artificial intelligence-driven endoscopic screening and diagnosis of gastric cancer. World J. Gastroenterol. 2021, 27, 2979.
- Suzuki, H.; Yoshitaka, T.; Yoshio, T.; Tada, T. Artificial intelligence for cancer detection of the upper gastrointestinal tract. Dig. Endosc. 2021, 33, 254–262.
- Zhang, J.; Cui, Y.; Wei, K.; Li, Z.; Li, D.; Song, R.; Ren, J.; Gao, X.; Yang, X. Deep learning predicts resistance to neoadjuvant chemotherapy for locally advanced gastric cancer: A multicenter study. Gastric Cancer 2022, 25, 1050–1059.
- Liu, X.; Cruz Rivera, S.; Moher, D.; Calvert, M.J.; Denniston, A.K.; SPIRIT-AI and CONSORT-AI Working Group. Reporting guidelines for clinical trial reports for interventions involving artificial intelligence: The CONSORT-AI extension. Nat. Med. 2020, 26, 1364–1374.
- Cruz Rivera, S.; Liu, X.; Chan, A.W.; Denniston, A.K.; Calvert, M.J.; SPIRIT-AI and CONSORT-AI Working Group; SPIRIT-AI and CONSORT-AI Steering Group; SPIRIT-AI and CONSORT-AI Consensus Group. Guidelines for clinical trial protocols for interventions involving artificial intelligence: The SPIRIT-AI extension. Nat. Med. 2020, 26, 1351–1363.
- Parasa, S.; Repici, A.; Berzin, T.; Leggett, C.; Gross, S.A.; Sharma, P. Framework and metrics for the clinical use and implementation of artificial intelligence algorithms into endoscopy practice: Recommendations from the American Society for Gastrointestinal Endoscopy Artificial Intelligence Task Force. Gastrointest. Endosc. 2023, 97, 815–824.e1.
REF | Subject | Clinical Task | Study Design | Sample Size | Detection/Classification Objective | AI Model/Algorithm | AI Model Performance | Clinical Implications |
---|---|---|---|---|---|---|---|---|
Liu L et al. [11] | Pre-malignant lesions | Diagnosis and segmentation of GNLs | Retrospective, two centers | 3757 images from 392 patients with GNLs and 2420 images from 568 patients with non-GNLs | Diagnosis and segmentation of GNLs under magnifying endoscopy with narrow-band imaging (ME-NBI) | Two convolutional neural network (CNN) modules | Accuracy: 90.8%, Sensitivity: 92.5%, Specificity: 89.0% | CAD system can assist endoscopists in more accurately diagnosing GNLs and delineating their extent |
Hu H et al. [12] | Early gastric cancer | Diagnosis of EGC under ME-NBI | Multicenter, randomized | 295 cases | Diagnosis of EGC | VGG-19 | AUC: 0.808 (ITC), 0.813 (ETC), Accuracy: 0.770, Sensitivity: 0.792, Specificity: 0.745 | The model exhibited comparable performance with senior endoscopists in the diagnosis of EGC and showed potential value in aiding and improving the diagnosis of EGC by endoscopists. |
Zhu Y et al. [13] | Gastric cancer | Prediction of invasion depth | Retrospective and prospective | 790 images in the development dataset, 203 images in the test dataset | Determination of invasion depth of gastric cancer | ResNet50 | Area under the ROC curve: 0.94 | Assists in screening patients for endoscopic resection by accurately predicting invasion depth. |
Goto A et al. [14] | Early gastric cancer | Invasion depth determination | Retrospective, single-center | 250 intramucosal cancers and 250 submucosal cancers | Differentiating intramucosal and submucosal gastric cancers | Not specified | Accuracy: 77%, Sensitivity: 76%, Specificity: 78%, F1 measure: 0.768 | Improvement in diagnostic ability to determine invasion depth of early gastric cancer |
Du H et al. [16] | Detection of gastric pathologies | Real-time diagnosis | Retrospective and prospective | 4201 images, 7436 image-pairs, 162 videos | Real-time diagnosis of gastric neoplasms | ENDOANGEL-MM | Accuracy: 86.54% for images, 90.00% for videos, 93.55% for prospective patients | ENDOANGEL-MM identifies gastric neoplasms with good accuracy and has a potential role in real clinical practice |
Pornvoraphat P et al. [17] | Pre-malignant lesions | Real-time segmentation of GIM | Retrospective, single-center, case–control | 940 GIM images and 1239 non-GIM images | Segmentation of GIM from a healthy stomach | BiSeNet-based model | Sensitivity: 91%, Specificity: 96%, Accuracy: 96%, Mean IoU: 55% | Real-time detection of GIM for improved diagnostic precision |
Gong EJ et al. [18] | Detection of gastric pathologies | Automated detection and classification of gastric neoplasms | Prospective, multicenter | 5017 images for training, 2524 procedures for internal testing, 3976 images from five institutions for external testing | Automated detection and classification of gastric neoplasms in real-time endoscopy | Clinical decision support system (CDSS) based on deep learning | Detection rate: 95.6% (internal test); Accuracy: 81.5% in four-class classification and 86.4% in binary classification (external test) | The CDSS showed high lesion detection and classification performance and has potential for real-life clinical application |
Lee JH et al. [19] | Classification of gastric pathologies | Classification of benign ulcer and cancer | Retrospective, single-center, case–control | 200 normal, 367 cancer, and 220 ulcer cases | Classification of normal, benign ulcer, and cancer images | Inception, ResNet, and VGGNet | AUC for the three classifiers: 0.95, 0.97, and 0.85, respectively | Automatic classification can complement manual inspection and minimize the risk of missing positives in repetitive sequences of endoscopic frames |
Cho BJ et al. [20] | Classification of gastric pathologies | Classification of gastric neoplasms | Retrospective and prospective, case–control, two centers | 5017 images from 1269 patients for training, 812 images from 212 patients for testing, and 200 images from 200 patients for validation | Automatic classification of gastric neoplasms | Inception-ResNet-v2 | Weighted average accuracy: 84.6%; AUC for differentiating gastric cancer and neoplasm: 0.877 and 0.927, respectively | Potentially useful for clinical application in classifying gastric cancer or neoplasm using endoscopic white-light images |
Ueyama H et al. [21] | Gastric cancer | Diagnosis | Retrospective and prospective | 5574 ME-NBI images (3797 EGCs, 1777 non-cancerous) | Diagnosis of early gastric cancer (EGC) | ResNet50 | Accuracy: 98.7%, Sensitivity: 98%, Specificity: 100% | The AI-assisted CNN-CAD system for ME-NBI diagnosis of EGC could process many stored ME-NBI images in a short period of time and had a high diagnostic ability. This system may have great potential for future application to real clinical settings. |
Horiuchi Y et al. [22] | Early gastric cancer | Differentiating EGC from gastritis | Retrospective and prospective, single-center | Training: 1492 EGC and 1078 gastritis ME-NBI images; test: a separate dataset of 151 EGC and 107 gastritis ME-NBI images | Differentiating EGC from gastritis | 22-layer CNN | Accuracy: 85.3%, Sensitivity: 95.4%, Specificity: 71.0% | May provide rapid and sensitive differentiation between EGC and gastritis |
Klang E et al. [23] | Gastric ulcers | Malignancy detection | Retrospective | 1978 GU images | Discrimination between benign and malignant gastric ulcers | CNN | AUC: 0.91, Sensitivity: 92%, Specificity: 75% | The algorithm may improve the accuracy of differentiating benign from malignant ulcers during endoscopies and assist in patients’ stratification, allowing accelerated patient management and an individualized approach towards surveillance endoscopy. |
Liu Y et al. [24] | Gastric ulcers | Classification of benign and malignant gastric ulcer lesions | Retrospective, single-center, case–control | 109 cases in the benign group, 69 in the malignant group | Automatic classification and diagnosis of benign and malignant gastric ulcer lesions | Xception CNN with a residual attention module | Accuracy: 81.411%, F1 score: 81.815%, Sensitivity: 83.751%, Specificity: 76.827%, Precision: 80.111% | The residual attention mechanism improves the classification performance of the Xception CNN on benign and malignant gastric ulcer lesions |
Zhang X et al. [25] | Detection of gastric pathologies | Polyp detection | Single center, retrospective, cases | 404 cases | Automatic detection of gastric polyps | Single Shot MultiBox Detector (SSD-GPNet) | mAP: 90.4%; Improved polyp detection recall by over 10% | Assists in reducing gastric polyp miss rate and potentially decreases the burden on physicians. |
Durak S et al. [26] | Detection of gastric pathologies | Detection of gastric polyps | Retrospective study | 2195 endoscopic images and 3031 polyp labels | Automatic gastric polyp detection | YOLOv4, CenterNet, EfficientNet, Cross Stage ResNext50-SPP, YOLOv3, YOLOv3-SPP, Single Shot Detection, Faster Regional CNN | YOLOv4 had 87.95% mean average precision | YOLOv4 can be used effectively in gastrointestinal CAD systems for polyp detection |
Wu L et al. [27] | Detection of gastric pathologies | Detection of gastric neoplasms | Single-center, randomized controlled, tandem trial | 1812 patients | AI-assisted detection of gastric neoplasms | Not specified | Lower gastric neoplasm miss rate in the AI-first group | AI can reduce the miss rate of gastric neoplasms in clinical practice |
Xu M et al. [28] | Detection of gastric pathologies | Detecting gastric precancerous conditions | Retrospective, multicenter | 760 patients | Detecting gastric atrophy (GA) and intestinal metaplasia (IM) | ENDOANGEL (a deep CNN) | Diagnostic accuracy of GA was 0.901 | The model shows potential for real-time detection of gastric precancerous conditions in clinical practice. |
Hirasawa T et al. [29] | Gastric cancer | Gastric cancer detection | Retrospective and prospective | Training: 13,584 endoscopic images of gastric cancer; test: an independent set of 2296 stomach images from 69 consecutive patients with 77 gastric cancer lesions | Automatic detection of gastric cancer | Single Shot MultiBox Detector (SSD) | Sensitivity: 92.2%; Positive predictive value: 30.6% | Reduces the burden on endoscopists by processing numerous stored endoscopic images in a very short time |
Ikenoyama Y et al. [30] | Gastric cancer | Early detection | Retrospective and prospective, single-center | Training: 13,584 endoscopic images from 2639 gastric cancer lesions; test: an independent dataset of 2940 images from 140 cases, with performance compared against 67 endoscopists | Detecting early gastric cancer | CNN | Sensitivity: 58.4%, Specificity: 87.3% | The CNN detected more early gastric cancers in a shorter time than the endoscopists; a CNN-based diagnostic support tool for gastric cancer appears feasible in the near future |
Bang CS et al. [31] | Classification of gastric pathologies | Classify the invasion depth of gastric neoplasms | Prospective, multicenter | A total of 5017 images from 1269 individuals; of these, 812 images from 212 subjects were used for testing | Classifying the invasion depth of gastric neoplasms | AutoDL models (Neuro-T, Create ML Image Classifier, AutoML Vision) | Accuracy: 89.3% | AutoDL models showed high accuracy in classifying the invasion depth of gastric neoplasms, suggesting their potential for improving diagnostic accuracy and efficiency in clinical practice |
Yoon HJ et al. [32] | Early gastric cancer | Tumor invasion depth prediction | Retrospective, single-center | 11,539 images | Classification of endoscopic images as EGC or non-EGC | VGG-16 | AUC for EGC detection: 0.981, AUC for depth prediction: 0.851 | May improve depth prediction in EGC, particularly in undifferentiated-type histology; requires further validation |
Hamada K et al. [33] | Early gastric cancer | Evaluating the depth of invasion of early gastric cancer | Retrospective | 200 cases | Evaluating the depth of invasion of early gastric cancer | ResNet152 | Sensitivity, specificity, and accuracy for diagnosing M cancer were 84.9% | Assist in endoscopic diagnosis of early gastric cancer |
Tang D et al. [34] | Gastric cancer | Diagnosis of intramucosal GC | Retrospective, single-center | 666 gastric cancer patients, 3407 endoscopic images | Discrimination of intramucosal GC from advanced GC | DCNN | AUC: 0.942 | The model achieved high accuracy in discriminating intramucosal GC from advanced GC, indicating its potential to assist endoscopists in diagnosing intramucosal GC |
An P et al. [35] | Early gastric cancer | Delineation of cancer margins | Retrospective and prospective | CE: 546 images from 67 patients, with 34 images from 14 patients in the test dataset; WLE: 343 training images from 260 patients and 321 test images from 218 patients | Delineating the resection margin of early gastric cancer | Fully convolutional network (ENDOANGEL) | Accuracy: 85.7% on CE images, 88.9% on WLE images | Assists endoscopists in delineating the resection extent of EGC during ESD |
Du W et al. [36] | Early gastric cancer | Automatic segmentation of EGC lesions | Retrospective and prospective | 7169 images from 2480 patients | Segmentation of EGC lesions in gastroscopic images | Co-spatial attention and channel attention-based triple-branch ResUnet (CSA-CA-TB-ResUnet) | Jaccard similarity index (JSI) of 84.54% | Accurate segmentation of EGC lesions for aiding clinical diagnosis and treatment |
Ling T et al. [37] | Gastric cancer | Margin delineation | Retrospective | 2217 images from 145 EGC patients, 1870 images from 139 EGC patients | Identification of differentiation status and delineation of margins of EGC | CNN | Accuracy: 83.3% | The AI system accurately identifies the differentiation status of EGCs and may assist in determining the surgical strategy and achieving curative resection in EGC patients. |
Teramoto A et al. [38] | Gastric cancer | Detection and segmentation of invasive gastric cancer | Retrospective single-center | 2378 images from different patients | Classification of endoscopic images and identification of the extent of cancer invasion | Cascaded deep learning model, U-Net | Sensitivity: 97.0%, Specificity: 99.4%, Case-based evaluation: 100% | The method could be useful for the classification of endoscopic images and identification of the extent of cancer invasion |
Lin N et al. [39] | Pre-malignant lesions | Recognition of AG and GIM | Retrospective, multicenter | 7037 endoscopic images from 2741 participants used to develop the CNN | Simultaneous recognition of AG and GIM | Deep convolutional neural network (DCNN) based on TResNet | Not specified | The CNN model could be used for diagnosing AG and GIM |
Wu L et al. [40] | Early gastric cancer | Detecting gastric neoplasm, identifying EGC, and predicting EGC invasion depth and differentiation status | Multicenter, prospective, real-time, competitive comparative, diagnostic study | 100 videos | Detecting neoplasms and diagnosing EGCs | Not specified | Sensitivity rates of the system for detecting neoplasms and diagnosing EGCs were 87.81% and 100%, respectively | AI system can enhance the performance of endoscopists in diagnosing EGC |
Hirai K et al. [41] | Pre-malignant lesions | Differentiating gastrointestinal stromal tumors from benign subepithelial lesions | Retrospective | 631 cases | Classifying SELs on EUS images | CNN | Accuracy of 86.1% for five-category classification, sensitivity and accuracy of 98.8% and 89.3%, respectively, for differentiating GISTs from non-GISTs | Assist in improving the diagnosis of SELs in clinical practice |
Sakai Y et al. [42] | Early gastric cancer | Early gastric cancer detection | Retrospective, single-center | 1000 images | Automatic detection of early gastric cancer | CNN (transfer learning; architecture not specified) | Accuracy: 87.6%; balanced sensitivity and specificity | Assists endoscopists in decision-making by providing a heat map of candidate regions of early gastric cancer |
Li L et al. [43] | Early gastric cancer | Diagnosis of early gastric cancer | Retrospective and prospective | Training: 1702 early gastric cancer images and 386 non-cancerous lesion images; test: 341 endoscopic images (170 early gastric cancer, 171 non-cancerous lesions) used to compare the CNN with endoscopists | Diagnosis of early gastric cancer | Inception-v3 | Sensitivity: 91.18%, Specificity: 90.64%, Accuracy: 90.91% | May enhance the diagnostic efficacy of non-experts in differentiating early gastric cancer from non-cancerous lesions |
Tang D et al. [44] | Early gastric cancer | Detection of early gastric cancer | Retrospective and prospective, multicenter | All 45,240 endoscopic images from 1364 patients were divided into a training dataset (35,823 images from 1085 patients) and a validation dataset (9417 images from 279 patients). Another 1514 images from three other hospitals were used as external validation. | Detection of early gastric cancer | Deep Convolutional Neural Networks (DCNNs) | Accuracy: 85.1%–91.2%, Sensitivity: 85.9%–95.5%, Specificity: 81.7%–90.3% | Multicenter prospective validation needed for clinical application |
Wu L et al. [45] | Pre-malignant lesions | Detecting gastric lesions and predicting neoplasms | Retrospective and prospective | Over 10,000 | Assisting in detecting gastric lesions and predicting neoplasms by WLE | ENDOANGEL-LD | Sensitivity of 96.9% for detecting gastric lesions and 92.9% for diagnosing neoplasms in internal patients | Assisting endoscopists in screening gastric lesions and suspicious neoplasms in clinical work |
He X et al. [46] | Early gastric cancer | Diagnosing EGC in magnifying image-enhanced endoscopy | Retrospective and prospective | 3099 cases | Diagnosing EGC in M-IEE | ENDOANGEL-ME | Diagnostic accuracy of 88.44% and 90.49% in internal and external images, respectively | Assist in diagnosing early gastric cancer |
Yao Z et al. [47] | Early gastric cancer | Diagnosing early gastric cancer | Prospective | 1653 cases | Rapid and accurate diagnosis of endoscopic images from early gastric cancer | YOLO | Accuracy, sensitivity, specificity, and positive predictive value of 85.15%, 85.36%, 84.41%, and 95.22%, respectively, for Test Set 1 | Assist in the efficient, accurate, and rapid detection of early gastric cancer lesions |
Li J et al. [48] | Early gastric cancer | Diagnosis of EGC under M-IEE | Retrospective | 692 patients | Develop a logical anthropomorphic AI diagnostic system for EGCs under M-IEE | ENDOANGEL-LA, based on feature extraction, deep learning (DL), and machine learning (ML) | Accuracy of ENDOANGEL-LA in images was 88.76% and in videos was 87.00% | ENDOANGEL-LA has the potential to increase interactivity between endoscopists and CADs, and improve trust and acceptability of CADs for endoscopists |
Jin J et al. [49] | Early gastric cancer | Automatic detection of early gastric cancer | Controlled trial | 7133 images from different patients | Automatic detection of early gastric cancer | Mask R-CNN | WLI test: accuracy 90.25%, sensitivity 91.06%, specificity 89.01%; NBI test: accuracy 95.12%, sensitivity 97.59% | Can be effectively applied in clinical settings for the detection of EGC, especially for real-time analysis of WLIs |
Zhou B et al. [50] | Early gastric cancer | Detection of early gastric cancer | Single-center retrospective study | 5770 images from 194 patients | Automatic detection of early gastric cancer | EfficientDet | Case-based Sensitivity: 98.4%, Image-based Accuracy: 88.3%, Sensitivity: 84.5%, Specificity: 90.5% | Shows great potential in assisting endoscopists with the detection of EGC |
Su X et al. [51] | Early gastric cancer | EGC detection | Retrospective single-center | 3659 cases | Detection of early gastric cancer | Faster RCNN, Cascade RCNN, Mask RCNN | Accuracy: 0.935 (Faster RCNN), 0.938 (Cascade RCNN), 0.935 (Mask RCNN), Specificity: 0.908 (Faster RCNN, Mask RCNN), 0.946 (Cascade RCNN) | These deep learning methods can assist in early gastric cancer diagnosis using endoscopic images. |
Nakahira H et al. [52] | Gastric cancer | Risk stratification of GC | Retrospective single-center | 107,284 images | Stratification of GC risk | Not explicitly stated | Not explicitly stated | Could provide effective surveillance for GC, stratifying GC risk based on endoscopic examinations |
Igarashi S et al. [53] | Gastric cancer | Automated localization of digestive lesions and prediction of cancer invasion depth | Retrospective | 441 patients | Classification of upper GI organ images into anatomical categories | AlexNet | Accuracy: Training dataset 0.993, Validation dataset 0.965 | Facilitating data collection and assessment of EGD images, potentially useful for both expert and non-expert endoscopists |
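Note on reported metrics: the studies tabulated above express performance mainly as accuracy, sensitivity, specificity, and AUC computed on held-out images or cases. The sketch below is purely illustrative and is not code from any cited system; the 0.5 decision threshold and the toy labels/scores are assumptions. It shows how these per-image metrics are conventionally derived from a binary classifier's outputs.

```python
import numpy as np

def diagnostic_metrics(y_true, y_score, threshold=0.5):
    """Compute accuracy, sensitivity, specificity, and AUC for a binary
    classifier, as commonly reported in the studies summarized above.

    y_true  : 0/1 ground-truth labels (1 = neoplastic/cancer image)
    y_score : predicted probabilities for the positive class
    """
    y_true = np.asarray(y_true, dtype=int)
    y_score = np.asarray(y_score, dtype=float)
    y_pred = (y_score >= threshold).astype(int)  # assumed operating point

    tp = np.sum((y_pred == 1) & (y_true == 1))
    tn = np.sum((y_pred == 0) & (y_true == 0))
    fp = np.sum((y_pred == 1) & (y_true == 0))
    fn = np.sum((y_pred == 0) & (y_true == 1))

    accuracy = (tp + tn) / len(y_true)
    sensitivity = tp / (tp + fn)   # recall on cancer images
    specificity = tn / (tn + fp)   # recall on non-cancer images

    # AUC via the rank-sum (Mann-Whitney U) formulation: the probability that
    # a randomly chosen positive scores higher than a randomly chosen negative.
    # Ties are not averaged here, which is adequate for illustration.
    order = np.argsort(y_score)
    ranks = np.empty(len(y_score))
    ranks[order] = np.arange(1, len(y_score) + 1)
    n_pos = y_true.sum()
    n_neg = len(y_true) - n_pos
    auc = (ranks[y_true == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

    return {"accuracy": accuracy, "sensitivity": sensitivity,
            "specificity": specificity, "auc": auc}

# Illustrative use with made-up predictions (not data from any cited study).
labels = [1, 1, 1, 0, 0, 0, 1, 0]
scores = [0.92, 0.81, 0.45, 0.30, 0.10, 0.55, 0.77, 0.20]
print(diagnostic_metrics(labels, scores))
```

Sensitivity and specificity depend on the chosen operating threshold, whereas AUC summarizes discrimination across all thresholds; studies reporting both therefore give a fuller picture of model behavior than either alone.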
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Citation
Klang, E.; Soroush, A.; Nadkarni, G.N.; Sharif, K.; Lahat, A. Deep Learning and Gastric Cancer: Systematic Review of AI-Assisted Endoscopy. Diagnostics 2023, 13, 3613. https://doi.org/10.3390/diagnostics13243613