Artificial Intelligence for Gastrointestinal Diseases

The development of convolutional neural networks has driven impressive advances in machine learning in recent years, leading to increasing use of artificial intelligence (AI) in the field of gastrointestinal (GI) diseases. AI networks have been trained to differentiate benign from malignant lesions, analyze endoscopic and radiological GI images, and assess histological diagnoses, achieving excellent results and high overall diagnostic accuracy. Nevertheless, data on the adverse effects of AI in gastroenterology are lacking, and high-quality studies comparing the performance of AI networks with that of health care professionals are still limited.

Keywords: artificial intelligence; radiomics; deep learning; gastrointestinal lesions; gastrointestinal cancers

1. Introduction

Continuous innovation has improved many aspects of gastroenterologists’ daily clinical practice, from increasing early-stage diagnoses to expanding therapeutic boundaries. In recent decades, a great deal of attention has been focused on the development of computer-assisted systems that can be applied in endoscopy, radiology, and pathology to improve the diagnosis, treatment, and prognosis of many gastrointestinal diseases. Indeed, machine learning has evolved rapidly in recent years thanks to convolutional neural networks (CNNs), improved methods for training the networks that form the basis of artificial intelligence (AI), the availability of powerful computers with advanced graphics processing, and the increasing use of these tools in many diagnostic fields. However, although AI has been applied to a wide range of gastrointestinal diseases, high-quality studies comparing the performance of AI networks with that of health care professionals are lacking, especially prospective studies conducted in real-time clinical settings.

This narrative entry gives an overview of some of the most relevant potential applications of AI in both upper and lower gastrointestinal diseases (Table 1), highlighting advantages and main limitations and providing considerations for future development.

Table 1. Key points of AI application in GI disease.

ESOPHAGUS
  1. AI seems a useful tool for the detection of BE, EAC, and ESCC, although current evidence is limited by study type.
  2. Further studies could better address the role of AI in predicting prognosis and treatment response in esophageal cancers.
STOMACH
  1. CNN-based algorithms showed good diagnostic performance for HP detection.
  2. AI could improve not only lesion detection in GC but also patient selection for chemotherapy and definition of prognosis.
LOWER GI TRACT
  1. In the field of UC and CD, AI can be used for automatic detection and grading of disease activity.
  2. CADx and CADe are currently the most promising and effective clinical applications of AI.
Legend: BE, Barrett’s esophagus; EAC, esophageal adenocarcinoma; ESCC, esophageal squamous cell carcinoma; HP, Helicobacter pylori; GC, gastric cancer; UC, ulcerative colitis; CD, Crohn’s disease; CADx, automatic polyp characterization; CADe, automatic polyp detection; GI, gastro-intestinal.

2. Upper Gastro-Intestinal Tract

Accumulating evidence shows the potential benefits of computer assistance (CA) in the management of esophageal conditions, such as Barrett’s esophagus (BE) and esophageal adenocarcinoma (EAC) [1]. In recent years, the ARGOS project has developed, validated, and benchmarked a computer-aided detection (CAD) system to assist endoscopists in the detection and delineation of Barrett’s neoplasia. In their study, De Groof et al. showed that the system achieved higher diagnostic accuracy than non-expert endoscopists and was potentially fast enough for real-time use, taking 0.24 s to classify and delineate a Barrett’s lesion within an endoscopic image [2]. The study used a total of five independent image datasets to train, validate, and benchmark the system. The first two datasets were used for pre-training and training, respectively: the first contained 494,364 endoscopic images from all intestinal segments, and the second contained 1247 high-definition (HD) white-light imaging (WLI) pictures of confirmed early BE neoplasia and non-dysplastic BE. A third dataset of 297 images was used to refine the training and for internal validation. The fourth and fifth datasets, each containing 80 images of early BE neoplasia and non-dysplastic BE delineated by three and six experts, respectively, were used for external validation. The fifth dataset was also used to benchmark the algorithm against 53 general endoscopists, showing an accuracy of 88% vs. 73%, a sensitivity of 93% vs. 72%, and a specificity of 83% vs. 74%, respectively [2]. Similarly, in 2020, Hashimoto and colleagues published a single-center retrospective study of a system developed to detect early esophageal neoplasia in BE. The algorithm was programmed to distinguish images of lesions with or without dysplasia. A total of 916 images of early esophageal neoplasia in BE and 919 images of BE without high-grade dysplasia were used to train the system. It was validated on 458 images, 225 with dysplasia and 233 without, yielding a per-image sensitivity, specificity, and accuracy of 96.4%, 94.2%, and 95.4%, respectively. The authors also found that specificity was significantly higher for images acquired with advanced imaging techniques, such as narrow-band imaging (NBI) and near focus, than for WLI and standard focus [3].
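
As a purely illustrative sketch (not the implementation of the systems above), the snippet below fine-tunes a pretrained CNN as a binary “dysplasia vs. no dysplasia” frame classifier, in the spirit of the training/validation pipelines these studies describe; the folder layout, backbone, and hyperparameters are assumptions.

```python
# Illustrative sketch only: fine-tuning a pretrained CNN as a binary
# "dysplasia vs. no dysplasia" classifier of endoscopic frames.
# Folder layout, backbone, and hyperparameters are assumptions, not details
# of the published systems.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

device = "cuda" if torch.cuda.is_available() else "cpu"

# Assumed layout: data/train/{dysplasia,no_dysplasia}/*.jpg
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])
train_ds = datasets.ImageFolder("data/train", transform=preprocess)
train_dl = DataLoader(train_ds, batch_size=32, shuffle=True)

# Start from an ImageNet-pretrained backbone and replace the classification head.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)
model = model.to(device)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(10):
    for images, labels in train_dl:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```

A held-out image set, kept fully separate from training as in the studies above, would then provide the per-image sensitivity, specificity, and accuracy estimates quoted in this section.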

Dedicated models analyzing WLI and NBI images have shown high disease-specific diagnostic accuracy not only for BE and EAC [4] but also for esophageal squamous cell carcinoma (ESCC) [5]. Dedicated algorithms have also been implemented to analyze enhanced endoscopic imaging, allowing evaluation of disease-specific mucosal and vascular patterns [6], the presence of submucosal invasion [7], and the depth of invasion [8], as well as microendoscopy images for both ESCC [9] and BE [10]. Moreover, a recent meta-analysis has shown an overall high accuracy in the detection of early EAC, with significantly better performance than endoscopists in terms of pooled sensitivity (0.94 vs. 0.82, p = 0.01). However, these results were based mainly on studies in which endoscopic images were reviewed retrospectively, whereas data from prospective trials are more limited [11].
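
To give a rough sense of where a figure such as “pooled sensitivity” comes from, the sketch below pools per-study sensitivities on the logit scale with inverse-variance weights; this is a simplified fixed-effect illustration with invented study counts, not the bivariate random-effects method typically used in diagnostic meta-analyses such as the one cited above.

```python
# Simplified fixed-effect, logit-scale pooling of per-study sensitivities.
# Real diagnostic meta-analyses usually use bivariate random-effects models;
# this sketch only shows the basic inverse-variance idea. Counts are invented.
import math

# (true positives, false negatives) for each hypothetical study
studies = [(45, 3), (88, 7), (120, 6)]

weights, logits = [], []
for tp, fn in studies:
    sens = tp / (tp + fn)
    logit = math.log(sens / (1 - sens))
    var = 1 / tp + 1 / fn          # approximate variance of the logit
    logits.append(logit)
    weights.append(1 / var)

pooled_logit = sum(w * l for w, l in zip(weights, logits)) / sum(weights)
pooled_sens = 1 / (1 + math.exp(-pooled_logit))
print(f"pooled sensitivity = {pooled_sens:.3f}")
```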

A recent study addressed protruding lesions of the esophagus, integrating standard WLI endoscopic images with endoscopic ultrasound (EUS) images [12]. In differentiating the subtypes of protruding lesions, the AI system outperformed most of the endoscopists enrolled to interpret the images. In addition, when the predictions of the CA models and the endoscopists were combined, a higher diagnostic accuracy was achieved than with the endoscopists alone [12]. CA has also been used for image recognition of histology and pathology specimens to categorize dysplastic and non-dysplastic BE and EAC [13], as well as for cytology samples obtained by wide-area transepithelial sampling (WATS3D) [14] or by Cytosponge [15], achieving promising results and matching the diagnostic performance of experienced pathologists.

Another worthwhile issue in the management of patients diagnosed with gastric cancer is defining the real need for neoadjuvant chemotherapy (NAC). A study by Wang Y et al. [16], published in 2020, investigated the role of CT radiomics in differentiating T2- from T3/4-stage gastric cancer, so as to spare the adverse events of NAC in patients who should proceed directly to surgery. A total of 244 consecutive patients with pathologically proven gastric cancer were retrospectively included and divided into a training cohort of 171 patients and a validation cohort of 73 patients. Preoperative arterial- and portal-phase contrast-enhanced CT images were retrieved. The arterial- and portal-phase-based radiomics models showed areas under the curve (AUC) of 0.899 (95% CI 0.812–0.955) and 0.843 (95% CI 0.746–0.914) in the training cohort and 0.825 (95% CI 0.718–0.904) and 0.818 (95% CI 0.711–0.899) in the validation cohort, respectively. These results indicate that radiomics models based on CT images may provide potential value for assessing the depth of tumor invasion in gastric cancer. Also concerning radiomics, the study by Shin J et al. [17] aimed to develop a radiomics-based prognostic model for recurrence-free survival (RFS) using preoperative contrast-enhanced CT in locally advanced gastric cancer. This retrospective study included a training cohort of 349 patients and an external validation cohort of 61 patients who underwent curative resection for gastric cancer without neoadjuvant therapy. The integrated area under the curve (iAUC) values for RFS prediction were 0.616 (95% CI 0.570–0.663), 0.714 (95% CI 0.667–0.759), and 0.719 (95% CI 0.674–0.764) for the clinical, radiomic, and merged models, respectively. In external validation, the iAUC values were 0.584 (95% CI 0.554–0.636), 0.652 (95% CI 0.628–0.674), and 0.651 (95% CI 0.630–0.673), respectively. The radiomic model showed significantly higher iAUC values than the clinical model.
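
A minimal sketch of how such a CT radiomics model is typically built and validated is shown below, assuming a pre-computed feature table (for example from a toolkit such as PyRadiomics). The file names, column names, and the choice of a penalized logistic regression are illustrative assumptions, not the pipelines of the cited studies.

```python
# Illustrative only: a generic radiomics classification workflow.
# Assumes a pre-computed feature table with one row per patient; the files,
# column names, and model are placeholders, not the cited studies' pipelines.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

train = pd.read_csv("radiomics_train.csv")   # assumed training cohort
valid = pd.read_csv("radiomics_valid.csv")   # assumed external validation cohort
feature_cols = [c for c in train.columns if c.startswith("feat_")]

X_train, y_train = train[feature_cols], train["t3_or_t4"]   # assumed label column
X_valid, y_valid = valid[feature_cols], valid["t3_or_t4"]

# Standardize features and fit an L2-penalized logistic regression.
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)

# Discrimination in the training and external validation cohorts,
# analogous to the AUC values reported above.
for name, X, y in [("training", X_train, y_train), ("validation", X_valid, y_valid)]:
    auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
    print(f"{name} AUC = {auc:.3f}")
```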

3. Lower Gastro-Intestinal Tract

In addition to lesion detection, AI has also been investigated for automatic polyp characterization (CADx), i.e., whether it can distinguish precancerous from benign lesions and thereby avoid unnecessary polyp removal for histological evaluation. In this setting, a pioneering study was performed by Tischendorf et al. [18] with a CADx system able to discriminate non-adenomatous from adenomatous polyps based on vascularization features under NBI magnification. Although good performance was obtained, human observers performed better than AI in terms of both sensitivity (93.8% vs. 90%) and specificity (85.7% vs. 70%).

Similar to CADe, CADx achieved better results with the introduction of deep-learning systems. A benchmark study in this setting was performed by Byrne et al. [19], who tested an AI system on 125 polyps histologically classified as adenomatous or hyperplastic. The AI performed real-time evaluation of the polyps on non-magnified NBI according to the Narrow-band Imaging International Colorectal Endoscopic (NICE) classification [20]. The model did not reach sufficient confidence to predict the histology of 19 polyps; for the remaining 106 polyps, it showed an accuracy of 94% (95% CI 86–97%), a sensitivity for the identification of adenomas of 98% (95% CI 92–100%), a specificity of 83% (95% CI 67–93%), an NPV of 97%, and a PPV of 90%.

CADx was also evaluated using endocytoscopy (EC-CAD), a technique that permits in vivo visualization of cellular nuclei at ultra-magnification (×450). Mori et al. [21] reported the results of EC-CAD in four patients using EndoBRAIN (Cybernet Systems Corp., Tokyo, Japan), an AI-based system that analyzes cell nuclei, crypt structure, and micro-vessels in endoscopic images to identify colon cancers. This AI system was further investigated in a comparison between AI and humans (20 trainees and 10 expert endoscopists) [22]. Using methylene blue staining or NBI, EndoBRAIN identified colonic lesions significantly better than non-expert endoscopists, whereas only its sensitivity and NPV were significantly higher than those of the experts. Several studies analyzed the potential application of AI-based CADx to diminutive polyps [23][24][25], with promising results that were confirmed by a recent meta-analysis [26] reporting a sensitivity of 93.5% (95% CI 90.7–95.6%) and a specificity of 90.8% (95% CI 86.3–95.9%).
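
For reference, the per-image and per-polyp figures quoted throughout this section (sensitivity, specificity, PPV, NPV, and accuracy) all derive from the same 2×2 confusion matrix; a minimal helper is sketched below, with counts invented purely for illustration.

```python
# Minimal helper for the diagnostic metrics quoted in this section.
# The counts in the example call are invented for illustration only.
def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    return {
        "sensitivity": tp / (tp + fn),            # true-positive rate
        "specificity": tn / (tn + fp),            # true-negative rate
        "ppv": tp / (tp + fp),                    # positive predictive value
        "npv": tn / (tn + fn),                    # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# Example with made-up counts (e.g., adenoma vs. hyperplastic calls):
print(diagnostic_metrics(tp=90, fp=5, fn=3, tn=40))
```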

This level of performance could justify a “resect and discard” or a “diagnose and leave” strategy. In the first case, polyps are still removed but not sent for histological analysis. According to Hassan and co-workers [27], this strategy could result in an annual saving of $25 per person and a total of $33 million in the United States, with no relevant impact on the efficacy of CRC screening. On the other hand, a “diagnose and leave” strategy would avoid the risks of unnecessary polypectomy and spare the cost of endoscopic polypectomy, estimated at approximately $179 per person, for a total saving of about $1 billion per year for the United States health care system [28]. However, this strategy could expose patients to the risk of interval CRC due to misdiagnosed precancerous colonic lesions left in place. Few data are available on CAD systems applied to computed tomography colonography (CTC) for the detection of colorectal polyps, mainly because of the high number of false positives (FPs). To overcome this issue, Ren et al. [29] proposed a CAD-CTC scheme using the shape index, multiscale enhancement filters, and a total of 440 radiomic features. This scheme was evaluated on 152 oral contrast-enhanced CT datasets from 76 patients with 103 polyps ≥ 5 mm. The detection results were encouraging, achieving high sensitivity while maintaining a low FP rate for polyps ≥ 5 mm. In addition, a recent proof-of-concept study [30] evaluated non-invasive, radiomics-based, machine-learning differentiation of benign and premalignant colorectal polyps on CT colonography datasets from an asymptomatic, average-risk colorectal cancer screening cohort of 59 patients. The results showed a sensitivity of 82% (95% CI 74–91%), a specificity of 85% (95% CI 72–95%), and an AUC of 0.91 (95% CI 0.85–0.96), providing a potential basis for future prospective studies on the non-invasive analysis of polyps detected at CT colonography.
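
One of the hand-crafted descriptors mentioned above, the shape index, condenses the two principal curvatures of the colonic surface into a single value; assuming the common convention that convex (dome-like) patches have positive curvature, cap-like shapes typical of polyps score near +1. The sketch below shows the classic Koenderink formulation with placeholder curvature values; it is an illustration, not the cited CAD-CTC implementation.

```python
# Illustrative sketch of the Koenderink shape index used in CAD-CTC schemes.
# With principal curvatures k1 >= k2 and the convention that convex (dome-like)
# patches have positive curvature, caps score near +1, ridge-like folds near
# +0.5, and cups near -1. Curvature estimation from the CT volume is not shown.
import numpy as np

def shape_index(k1: np.ndarray, k2: np.ndarray) -> np.ndarray:
    """Shape index from principal curvatures, elementwise, with k1 >= k2."""
    # arctan2 handles the k1 == k2 (perfect cap/cup) case without dividing by zero.
    return (2.0 / np.pi) * np.arctan2(k1 + k2, k1 - k2)

# Placeholder curvature values: a nearly spherical cap vs. a ridge-like fold.
k1 = np.array([0.8, 0.9])
k2 = np.array([0.7, 0.0])
print(shape_index(k1, k2))   # approx. [0.96, 0.5]: the first point looks polyp-like
```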

AI is rapidly being integrated into clinical practice [31] and has become, within a few years, a reliable tool for supporting physicians in the study of the GI tract. This review focused on AI in the diagnostic work-up (endoscopy, radiology, and pathology) of GI diseases and showed that AI appears to have great potential for the detection of inflammatory, precancerous, and cancerous lesions of the GI tract (Table 2).

Table 2. Summary of topics investigated.

ESOPHAGUS
  BE
    • Detection of dysplasia
  EAC
    • Detection of lesions
    • Depth of invasion
  ESCC
    • Detection of lesions
    • Depth of invasion
    • Prediction of lymph node invasion
    • Guidance for radiotherapy treatment
    • Prediction of treatment response
    • Prediction of risk of recurrence
STOMACH
  HP
    • Prediction of infection
  CAG
    • Detection
  GIM
    • Detection
  GC
    • Detection
    • Depth of invasion
    • Prediction of recurrence
LOWER GI TRACT
  UC
    • Diagnosis
    • Disease activity evaluation
  CD
    • Ulcer detection
    • Ulcer severity grading
  PCL
    • Detection
    • Characterization
Legend: BE, Barrett’s esophagus; EAC, esophageal adenocarcinoma; ESCC, esophageal squamous cell carcinoma; HP, Helicobacter pylori; CAG, chronic atrophic gastritis; GIM, gastric intestinal metaplasia; GC, gastric cancer; UC, ulcerative colitis; CD, Crohn’s disease; PCL, precancerous colonic lesions; GI, gastro-intestinal.

 

From the available data, AI seems to have high overall accuracy for the diagnosis of neoplastic lesions, while for inflammatory diseases fewer studies have been performed, albeit with encouraging results. Nevertheless, major limitations should be carefully taken into account. First, AI performance results were sometimes heterogeneous from one study to another, making them difficult to compare [32]. Second, the size of training and test datasets varied widely across studies. Third, most CAD or CNN systems were developed in single centers, and many data come from pre-clinical studies, raising concerns about selection and spectrum bias. Finally, most AI systems for endoscopy were derived from retrospective, non-randomized settings, and standardization remains an issue. In conclusion, AI is definitely changing our work, with potentially enormous benefits, but performance thresholds and guidelines for standard patient care are needed to overcome the limitations that, to date, represent important ethical issues and obstacles to its widespread use and implementation.

References

  1. Kou, W.; Carlson, D.A.; Baumann, A.J.; Donnan, E.; Luo, Y.; Pandolfino, J.E.; Etemadi, M. A deep-learning-based unsupervised model on esophageal manometry using variational autoencoder. Artif. Intell. Med. 2021, 112, 102006.
  2. De Groof, A.J.; Struyvenberg, M.R.; van der Putten, J.; van der Sommen, F.; Fockens, K.N.; Curvers, W.L.; Zinger, S.; Pouw, R.E.; Coron, E.; Baldaque-Silva, F.; et al. Deep-learning system detects neoplasia in patients with Barrett’s esophagus with higher accuracy than endoscopists in a multistep training and validation study with benchmarking. Gastroenterology 2020, 158, 915–929.e4.
  3. Hashimoto, R.; Requa, J.; Dao, T.; Ninh, A.; Tran, E.; Mai, D.; Lugo, M.; El-Hage Chehade, N.; Chang, K.J.; Karnes, W.E.; et al. Artificial intelligence using convolutional neural networks for real-time detection of early esophageal neoplasia in Barrett’s esophagus (with video). Gastrointest. Endosc. 2020, 91, 1264–1271.e1.
  4. Struyvenberg, M.R.; de Groof, A.J.; van der Putten, J.; van der Sommen, F.; Baldaque-Silva, F.; Omae, M.; Pouw, R.; Bisschops, R.; Vieth, M.; Schoon, E.J.; et al. A computer-assisted algorithm for narrow-band imaging-based tissue characterization in Barrett’s esophagus. Gastrointest. Endosc. 2021, 93, 89–98.
  5. Li, B.; Cai, S.-L.; Tan, W.-M.; Li, J.-C.; Yalikong, A.; Feng, X.-S.; Yu, H.-H.; Lu, P.-X.; Feng, Z.; Yao, L.-Q.; et al. Comparative study on artificial intelligence systems for detecting early esophageal squamous cell carcinoma between narrow-band and white-light imaging. World J. Gastroenterol. 2021, 27, 281–293.
  6. Uema, R.; Hayashi, Y.; Tashiro, T.; Saiki, H.; Kato, M.; Amano, T.; Tani, M.; Yoshihara, T.; Inoue, T.; Kimura, K.; et al. Use of a convolutional neural network for classifying microvessels of superficial esophageal squamous cell carcinomas. J. Gastroenterol. Hepatol. 2021, 36, 2239–2246.
  7. Ebigbo, A.; Mendel, R.; Rückert, T.; Schuster, L.; Probst, A.; Manzeneder, J.; Prinz, F.; Mende, M.; Steinbrück, I.; Faiss, S.; et al. Endoscopic prediction of submucosal invasion in Barrett’s cancer with the use of artificial intelligence: A pilot study. Endoscopy 2020.
  8. Tokai, Y.; Yoshio, T.; Aoyama, K.; Horie, Y.; Yoshimizu, S.; Horiuchi, Y.; Ishiyama, A.; Tsuchida, T.; Hirasawa, T.; Sakakibara, Y.; et al. Application of artificial intelligence using convolutional neural networks in determining the invasion depth of esophageal squamous cell carcinoma. Esophagus 2020, 17, 250–256.
  9. Tan, M.C.; Bhushan, S.; Quang, T.; Schwarz, R.; Patel, K.H.; Yu, X.; Li, Z.; Wang, G.; Zhang, F.; Wang, X.; et al. Automated software-assisted diagnosis of esophageal squamous cell neoplasia using high-resolution microendoscopy. Gastrointest. Endosc. 2021, 93, 831–838.e2.
  10. Trindade, A.J.; McKinley, M.J.; Fan, C.; Leggett, C.L.; Kahn, A.; Pleskow, D.K. Endoscopic Surveillance of Barrett’s Esophagus Using Volumetric Laser Endomicroscopy With Artificial Intelligence Image Enhancement. Gastroenterology 2019, 157, 303–305.
  11. Zhang, S.M.; Wang, Y.J.; Zhang, S.T. Accuracy of artificial intelligence-assisted detection of esophageal cancer and neoplasms on endoscopic images: A systematic review and meta-analysis. J. Dig. Dis. 2021.
  12. Zhang, M.; Zhu, C.; Wang, Y.; Kong, Z.; Hua, Y.; Zhang, W.; Si, X.; Ye, B.; Xu, X.; Li, L.; et al. Differential diagnosis for esophageal protruded lesions using a deep convolution neural network in endoscopic images. Gastrointest. Endosc. 2021, 93, 1261–1272.e2.
  13. Sali, R.; Moradinasab, N.; Guleria, S.; Ehsan, L.; Fernandes, P.; Shah, T.U.; Syed, S.; Brown, D.E. Deep Learning for Whole-Slide Tissue Histopathology Classification: A Comparative Study in the Identification of Dysplastic and Non-Dysplastic Barrett’s Esophagus. J. Pers. Med. 2020, 10, 141.
  14. Kaul, V.; Gross, S.; Corbett, F.S.; Malik, Z.; Smith, M.S.; Tofani, C.; Infantolino, A. Clinical utility of wide-area transepithelial sampling with three-dimensional computer-assisted analysis (WATS3D) in identifying Barrett’s esophagus and associated neoplasia. Dis. Esophagus 2020, 33.
  15. Gehrung, M.; Crispin-Ortuzar, M.; Berman, A.G.; O’Donovan, M.; Fitzgerald, R.C.; Markowetz, F. Triage-driven diagnosis of Barrett’s esophagus for early detection of esophageal adenocarcinoma using deep learning. Nat. Med. 2021, 833–841.
  16. Wang, Y.; Liu, W.; Yu, Y.; Liu, J.-J.; Jiang, L.; Xue, H.-D.; Lei, J.; Jin, Z.; Yu, J.-C. Prediction of the Depth of Tumor Invasion in Gastric Cancer: Potential Role of CT Radiomics. Acad. Radiol. 2020, 27, 1077–1084.
  17. Shin, J.; Lim, J.S.; Huh, Y.-M.; Kim, J.-H.; Hyung, W.J.; Chung, J.-J.; Han, K.; Kim, S. A radiomics-based model for predicting prognosis of locally advanced gastric cancer in the preoperative setting. Sci. Rep. 2021, 11, 1879.
  18. Hassan, C.; Spadaccini, M.; Iannone, A.; Maselli, R.; Jovani, M.; Chandrasekar, V.T.; Antonelli, G.; Yu, H.; Areia, M.; Dinis-Ribeiro, M.; et al. Performance of artificial intelligence in colonoscopy for adenoma and polyp detection: A systematic review and meta-analysis. Gastrointest. Endosc. 2021, 93, 77–85.e6.
  19. Tischendorf, J.; Gross, S.; Winograd, R.; Hecker, H.; Auer, R.; Behrens, A.; Trautwein, C.; Aach, T.; Stehle, T. Computer-aided classification of colorectal polyps based on vascular patterns: A pilot study. Endoscopy 2010, 42, 203–207.
  20. Byrne, M.F.; Chapados, N.; Soudan, F.; Oertel, C.; Linares Pérez, M.; Kelly, R.; Iqbal, N.; Chandelier, F.; Rex, D.K. Real-time differentiation of adenomatous and hyperplastic diminutive colorectal polyps during analysis of unaltered videos of standard colonoscopy using a deep learning model. Gut 2019, 68, 94–100.
  21. Bisschops, R.; Hassan, C.; Bhandari, P.; Coron, E.; Neumann, H.; Pech, O.; Correale, L.; Repici, A. BASIC (BLI Adenoma Serrated International Classification) classification for colorectal polyp characterization with blue light imaging. Endoscopy 2018, 50, 211–220.
  22. Mori, Y.; Kudo, S.-E.; Misawa, M.; Mori, K. Simultaneous detection and characterization of diminutive polyps with the use of artificial intelligence during colonoscopy. VideoGIE 2019, 4, 7–10.
  23. Kudo, S.-E.; Misawa, M.; Mori, Y.; Hotta, K.; Ohtsuka, K.; Ikematsu, H.; Saito, Y.; Takeda, K.; Nakamura, H.; Ichimasa, K.; et al. Artificial Intelligence-assisted System Improves Endoscopic Identification of Colorectal Neoplasms. Clin. Gastroenterol. Hepatol. 2019, 18, 1874–1881.
  24. Chen, P.-J.; Lin, M.-C.; Lai, M.-J.; Lin, J.-C.; Lu, H.H.-S.; Tseng, V.S. Accurate Classification of Diminutive Colorectal Polyps Using Computer-Aided Analysis. Gastroenterology 2018, 154, 568–575.
  25. Zachariah, R.; Samarasena, J.; Luba, D.; Duh, E.; Dao, T.; Requa, J.; Ninh, A.; Karnes, W. Prediction of Polyp Pathology Using Convolutional Neural Networks Achieves ‘Resect and Discard’ Thresholds. Am. J. Gastroenterol. 2020, 115, 138–144.
  26. Lui, T.K.L.; Guo, C.-G.; Leung, W.K. Accuracy of artificial intelligence on histology prediction and detection of colorectal polyps: A systematic review and meta-analysis. Gastrointest. Endosc. 2020, 92, 11–22.e6.
  27. Hassan, C.; Pickhardt, P.J.; Rex, D.K. A resect and discard strategy would improve cost-effectiveness of colorectal cancer screening. Clin. Gastroenterol. Hepatol. 2010, 8, 865–869.e3.
  28. Abu Dayyeh, B.K.; Thosani, N.; Konda, V.; Wallace, M.B.; Rex, D.K.; Chauhan, S.S.; Hwang, J.H.; Komanduri, S.; Manfredi, M.; Maple, J.T.; et al. ASGE Technology Committee systematic review and meta-analysis assessing the ASGE PIVI thresholds for adopting real-time endoscopic assessment of the histology of diminutive colorectal polyps. Gastrointest. Endosc. 2015, 81, 502.e1–502.e16.
  29. Ren, Y.; Ma, J.; Xiong, J.; Lu, L.; Zhao, J. High-Performance CAD-CTC Scheme Using Shape Index, Multiscale Enhancement Filters, and Radiomic Features. IEEE Trans. Biomed. Eng. 2017, 64, 1924–1934.
  30. Grosu, S.; Wesp, P.; Graser, A.; Maurus, S.; Schulz, C.; Knösel, T.; Cyran, C.C.; Ricke, J.; Ingrisch, M.; Kazmierczak, P.M. Machine Learning-based Differentiation of Benign and Premalignant Colorectal Polyps Detected with CT Colonography in an Asymptomatic Screening Population: A Proof-of-Concept Study. Radiology 2021, 299, 326–335.
  31. Le Berre, C.; Sandborn, W.J.; Aridhi, S.; Devignes, M.D.; Fournier, L.; Smaïl-Tabbone, M.; Danese, S.; Peyrin-Biroulet, L. Application of Artificial Intelligence to Gastroenterology and Hepatology. Gastroenterology 2020, 158, 76–94.e2.
  32. Abadir, A.P.; Ali, M.F.; Karnes, W.; Samarasena, J.B. Artificial Intelligence in Gastrointestinal Endoscopy. Clin. Endosc. 2020, 53, 132–141.