
BACKGROUND & AIMS: Measurement of Barrett's epithelium using the widely accepted Prague C&M classification is highly operator dependent. We propose a novel methodology for measuring this risk score automatically. The method also enables quantification of the area of Barrett's epithelium (BEA) and of islands, which was not possible before. Furthermore, it allows 3-dimensional (3D) reconstruction of the esophageal surface, enabling interactive 3D visualization. We aimed to assess the accuracy of the proposed artificial intelligence system on both phantom and endoscopic patient data.

METHODS: Using advanced deep learning, a depth estimator network predicts the endoscope camera's distance from the gastric folds. By segmenting the BEA and the gastroesophageal junction and projecting them to the estimated distances in millimeters, we measure C&M scores, including the BEA. The derived endoscopic artificial intelligence system was tested on a purpose-built 3D-printed esophagus phantom with varying BEAs and on 194 high-definition videos from 131 patients with C&M values scored by expert endoscopists.

RESULTS: On endoscopic phantom video data, the system achieved 97.2% accuracy with a marginal ±0.9 mm average deviation for C&M and island measurements, while for the BEA it achieved 98.4% accuracy with only ±0.4 cm² average deviation from ground truth. On patient data, the C&M measurements provided by our system concurred with expert scores, with a marginal overall relative error (mean difference) of 8% (3.6 mm) for C scores and 7% (2.8 mm) for M scores.

CONCLUSIONS: The proposed methodology automatically extracts Prague C&M scores with high accuracy. Quantification and 3D reconstruction of the entire Barrett's area provide new opportunities for risk stratification and assessment of therapy response.
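To make the measurement idea concrete, the sketch below shows one way C&M scores could be derived once a depth map and segmentation masks are available. This is an illustrative toy, not the authors' published pipeline: the input format (2D lists of per-pixel depths in mm, boolean masks for the BEA and the gastroesophageal junction) and the column-wise extent computation are assumptions made for demonstration only.

```python
def prague_cm_scores(depth_mm, bea_mask, gej_mask):
    """Toy Prague C&M computation from per-pixel depth estimates (mm).

    Hypothetical inputs (equal-shaped 2D lists), assumed for illustration:
      depth_mm : estimated axial camera distance of each pixel, in mm
      bea_mask : True where Barrett's epithelium was segmented
      gej_mask : True along the gastroesophageal junction (reference line)
    """
    rows, cols = len(depth_mm), len(depth_mm[0])

    # Reference depth: mean estimated depth over the segmented GEJ pixels.
    gej_pix = [depth_mm[r][c]
               for r in range(rows) for c in range(cols) if gej_mask[r][c]]
    gej_level = sum(gej_pix) / len(gej_pix)

    # For each image column (one circumferential position), the extent of
    # the Barrett's segment = GEJ depth minus depth of its most proximal
    # (closest-to-camera) Barrett's pixel.
    extents = []
    for c in range(cols):
        col_rows = [r for r in range(rows) if bea_mask[r][c]]
        if col_rows:
            extents.append(gej_level - depth_mm[min(col_rows)][c])

    m_score = max(extents)  # M: maximal extent of any Barrett's tongue
    # C: circumferential extent -- the length reached at *every* sampled
    # circumferential position (0 if any column has no Barrett's pixels).
    c_score = min(extents) if len(extents) == cols else 0.0
    return c_score, m_score
```

In practice the projection from pixels to millimeters would use the full camera geometry rather than a single depth value per pixel; the sketch only conveys how a reference plane at the GEJ turns segmented pixels into C and M lengths.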

Original publication

Type: Journal article

Pages: 865–878.e8


Keywords: Deep learning, Esophageal cancer, Imaging, Risk assessment, Three-dimensional, Aged, Automation, Barrett Esophagus, Deep Learning, Disease Progression, Esophageal Mucosa, Esophagogastric Junction, Esophagoscopy, Female, Humans, Image Interpretation, Computer-Assisted, Imaging, Three-Dimensional, Male, Pilot Projects, Predictive Value of Tests, Reproducibility of Results, Risk Assessment, Risk Factors, Severity of Illness Index, Treatment Outcome