🧑🏼‍💻 Research - January 15, 2026

AI-powered hierarchical classification of ampullary neoplasms: a deep learning approach using white-light and narrow-band imaging.

🌟 Stay Updated!
Join AI Health Hub to receive the latest insights in health and AI.

⚡ Quick Summary

This study presents an AI-powered hierarchical classification system for diagnosing ampullary neoplasms from both white-light and narrow-band endoscopic images. The model achieved an overall diagnostic accuracy of 92.2% and detected high-risk lesions with high sensitivity (83.3% for high-grade dysplasia, 87.5% for cancer).

🔍 Key Details

  • 📊 Dataset: 4244 endoscopic images from 464 patients (2693/833/718 for train/validation/test)
  • 🧩 Imaging modalities: white-light (WL) and narrow-band imaging (NBI)
  • ⚙️ Technology: EfficientNet-B4 classifiers per stage and modality, with StyleGAN2-ADA for synthetic data augmentation (setup sketched after this list)
  • 🏆 Performance: overall accuracy of 92.2%, with stage-specific accuracies of 95.6% (normal vs. abnormal), 94.4% (adenoma vs. cancer), and 92.7% (LGD vs. HGD)
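
The paper does not ship code, but the backbone it names is a standard one. Below is a minimal sketch, assuming PyTorch and torchvision, of how one stage's binary classifier could be set up on EfficientNet-B4; the function name, weight choice, and per-modality duplication are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch: one binary-classification stage built on EfficientNet-B4.
# Assumes PyTorch + torchvision; the authors' exact training setup is not published.
import torch.nn as nn
from torchvision import models

def build_stage_classifier(num_classes: int = 2) -> nn.Module:
    """Return an EfficientNet-B4 backbone with a binary classification head."""
    weights = models.EfficientNet_B4_Weights.IMAGENET1K_V1  # ImageNet pretraining (assumption)
    model = models.efficientnet_b4(weights=weights)
    in_features = model.classifier[1].in_features           # 1792 for EfficientNet-B4
    model.classifier[1] = nn.Linear(in_features, num_classes)
    return model

# One independent classifier per modality and per stage, as the abstract describes.
wl_stage1 = build_stage_classifier()   # white-light: normal vs. abnormal
nbi_stage1 = build_stage_classifier()  # narrow-band: normal vs. abnormal
```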

🔑 Key Takeaways

  • 🤖 AI integration in medical imaging can significantly improve diagnostic accuracy.
  • 📈 Hierarchical classification allows for stepwise diagnosis of ampullary lesions.
  • 💡 Confidence-based dual-modality imaging (WL + NBI) outperformed either modality alone.
  • 🎨 Synthetic image generation with StyleGAN2-ADA effectively addressed data scarcity and class imbalance; a sketch of folding synthetic images into training follows this list.
  • 🏥 High sensitivity was achieved for high-grade dysplasia (83.3%) and cancer (87.5%).
  • 🌍 Study conducted at Seoul National University Hospital.
  • 🆔 PMID: 41532990.
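
The augmentation takeaway can be made concrete without touching the GAN itself: once StyleGAN2-ADA images for the rare classes (HGD, cancer) have been generated offline, they can be folded into the real training set. The sketch below shows one generic way to do this in PyTorch; the folder layout, paths, and 380×380 input size are assumptions, not details from the paper.

```python
# Sketch: folding pre-generated StyleGAN2-ADA images into the real training set
# to ease class imbalance. Paths, folder layout, and image size are hypothetical.
from torch.utils.data import ConcatDataset, DataLoader
from torchvision import datasets, transforms

tfm = transforms.Compose([
    transforms.Resize((380, 380)),  # EfficientNet-B4's usual input size (an assumption here)
    transforms.ToTensor(),
])

# ImageFolder assigns labels by sorted subfolder names, so the synthetic root must
# contain the same class subfolders as the real root (e.g. HGD/, LGD/, cancer/),
# even if only the rare classes are heavily populated, for the labels to line up.
real = datasets.ImageFolder("data/train_real", transform=tfm)
synthetic = datasets.ImageFolder("data/train_synth", transform=tfm)  # GAN outputs

train_loader = DataLoader(ConcatDataset([real, synthetic]), batch_size=16, shuffle=True)
```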

📚 Background

The diagnosis of lesions in the Ampulla of Vater (AoV) is notoriously challenging due to their complex morphology and the limited availability of representative images. This difficulty is particularly pronounced for high-risk dysplastic lesions, which require precise identification to guide treatment decisions. The advent of advanced imaging techniques and artificial intelligence offers new avenues for improving diagnostic accuracy in this critical area of gastroenterology.

🗒️ Study

The study aimed to develop a hierarchical deep learning framework for classifying ampullary lesions from both white-light and narrow-band endoscopic images. The framework consists of three sequential binary classifications: normal vs. abnormal, adenoma vs. cancer, and high-grade dysplasia (HGD) vs. low-grade dysplasia (LGD) within adenomas, with separate classifiers per modality whose predictions are combined by confidence-based voting. The model was trained and evaluated on 4244 endoscopic images from 464 patients collected at Seoul National University Hospital.
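
Put as code, the stepwise logic is a short decision cascade. The sketch below is a schematic reading of the three stages; the stage callables and their string labels are placeholders standing in for the trained dual-modality classifiers, not the authors' API.

```python
# Sketch of the three-stage decision cascade described above.
# Each stage_* callable stands in for a trained (dual-modality) binary classifier;
# names, signatures, and return values are illustrative.

def classify_ampullary_lesion(image, stage1, stage2, stage3) -> str:
    """Return one of: 'normal', 'cancer', 'HGD', 'LGD'."""
    if stage1(image) == "normal":   # stage 1: normal vs. abnormal
        return "normal"
    if stage2(image) == "cancer":   # stage 2: adenoma vs. cancer
        return "cancer"
    return stage3(image)            # stage 3: HGD vs. LGD within adenomas
```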

📈 Results

The hierarchical model achieved stage-specific accuracies of 95.6% for normal vs. abnormal, 94.4% for adenoma vs. cancer, and 92.7% for LGD vs. HGD, for an overall diagnostic accuracy of 92.2%, with sensitivities of 83.3% for HGD and 87.5% for cancer. The confidence-based dual-modality approach (AUROC: 0.921) outperformed white-light alone (AUROC: 0.866) and NBI alone (AUROC: 0.895), and GAN-based augmentation further raised sensitivity for cancer (87.5% to 91.7%) and HGD (83.3% to 86.5%).
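
The abstract describes the dual-modality integration only as "confidence-based voting," so the exact rule is not public. One plausible reading, sketched below under that assumption, is to keep the prediction of whichever modality's softmax output is more confident for a given image pair.

```python
# One plausible reading of "confidence-based voting": per image pair, keep the
# prediction of whichever modality (WL or NBI) is more confident. The paper's
# exact integration rule is not spelled out in the abstract; treat this as a sketch.
import torch
import torch.nn.functional as F

@torch.no_grad()
def fused_prediction(wl_model, nbi_model, wl_img, nbi_img) -> int:
    wl_probs = F.softmax(wl_model(wl_img), dim=1)    # shape: (1, 2)
    nbi_probs = F.softmax(nbi_model(nbi_img), dim=1)
    wl_conf, wl_pred = wl_probs.max(dim=1)
    nbi_conf, nbi_pred = nbi_probs.max(dim=1)
    return wl_pred.item() if wl_conf >= nbi_conf else nbi_pred.item()
```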

🌍 Impact and Implications

The findings from this study have the potential to revolutionize the diagnostic landscape for ampullary lesions. By integrating AI with advanced imaging techniques, healthcare professionals can achieve more accurate and timely diagnoses, ultimately leading to improved patient outcomes. This approach not only enhances the detection of high-risk lesions but also paves the way for broader applications of AI in gastroenterology and beyond.

🔮 Conclusion

This study highlights the transformative potential of AI-powered diagnostic tools in the field of gastroenterology. The integration of dual-modality imaging and synthetic data augmentation has significantly improved the diagnostic performance for ampullary lesions. As we continue to explore the capabilities of AI in healthcare, the future looks promising for enhanced diagnostic accuracy and patient care.

💬 Your comments

What are your thoughts on the integration of AI in medical diagnostics? We would love to hear your insights! 💬 Leave your comments below or connect with us on social media.

AI-powered hierarchical classification of ampullary neoplasms: a deep learning approach using white-light and narrow-band imaging.

Abstract

BACKGROUND: Endoscopic diagnosis of Ampulla of Vater (AoV) lesions remains challenging owing to complex morphology and limited representative images, particularly for high-risk dysplastic lesions. This study aimed to develop a hierarchical deep learning framework for the stepwise classification of ampullary lesions using white-light (WL) and narrow-band endoscopic images (NBI).
METHODS: The framework employs three sequential binary classifications: (1) normal vs. abnormal, (2) adenoma vs. cancer, and (3) high-grade dysplasia (HGD) vs. low-grade dysplasia (LGD) within adenomas. Each stage uses EfficientNet-B4 classifiers trained independently on WL and NBI. Predictions are integrated using confidence-based voting. To overcome data scarcity and class imbalance, for HGD and cancer, we used StyleGAN2-ADA to generate synthetic images. The hierarchical model was developed using 4244 endoscopic images from 464 patients collected at Seoul National University Hospital (2693/833/718 for train/validation/test).
RESULTS: The hierarchical model achieved stage-specific accuracies of 95.6% (normal vs. abnormal), 94.4% (adenoma vs. cancer), and 92.7% (LGD vs. HGD), resulting in overall diagnostic accuracy of 92.2%. The model demonstrated excellent sensitivity of 83.3% for HGD and 87.5% for cancer, with specificities exceeding 98%. The confidence-based dual-modality approach (AUROC: 0.921) significantly outperformed single-modality approaches using WL alone (AUROC: 0.866) or NBI alone (AUROC: 0.895), by integrating their complementary diagnostic strengths. Generative adversarial network-based augmentation substantially improved sensitivity for cancer (from 87.5% to 91.7%) and HGD (from 83.3% to 86.5%), while overall accuracy increased from 94.5% to 95.1%.
CONCLUSIONS: A hierarchical deep learning approach integrating dual-modality imaging and synthetic data augmentation significantly improves diagnostic performance for ampullary lesions.
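
For readers who want to reproduce the kind of per-stage metrics reported above (sensitivity, specificity, AUROC), the scikit-learn snippet below shows the standard computation; the labels and scores in it are illustrative placeholders, not study data.

```python
# Sketch: computing per-stage metrics (sensitivity, specificity, AUROC) with
# scikit-learn. The labels and scores below are illustrative placeholders.
from sklearn.metrics import confusion_matrix, roc_auc_score

y_true = [0, 0, 1, 1, 1, 0, 1, 0]                    # 1 = positive class (e.g. cancer)
y_score = [0.1, 0.4, 0.8, 0.9, 0.3, 0.2, 0.7, 0.6]   # model confidence for the positive class
y_pred = [int(s >= 0.5) for s in y_score]

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)   # recall for the positive class
specificity = tn / (tn + fp)
auroc = roc_auc_score(y_true, y_score)
print(f"sensitivity={sensitivity:.3f} specificity={specificity:.3f} AUROC={auroc:.3f}")
```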

Authors: Yoon D, Chang SH, Paik WH, Kim CH, Kim BS, Kim YG, Chung H, Ryu JK, Lee SH, Cho IR, Choi SJ, Kim JS, Kim S, Choi JH

Journal: Surg Endosc

Citation: Yoon D, et al. AI-powered hierarchical classification of ampullary neoplasms: a deep learning approach using white-light and narrow-band imaging. Surg Endosc. 2026. doi: 10.1007/s00464-025-12534-2
