International Journal of Innovative Research in Computer and Communication Engineering

ISSN Approved Journal | Impact factor: 8.771 | ESTD: 2013 | Follows UGC CARE Journal Norms and Guidelines



TITLE Explainable AI System for Brain Tumor Detection & Severity Prediction
ABSTRACT Accurate brain tumor diagnosis is a critical task in medical imaging and oncology. Explainable artificial intelligence, applied to MRI scans through deep learning models, can significantly enhance early diagnosis and severity prediction of brain tumors. By processing brain MRI images and extracting tumor-specific features, AI models can classify tumors and estimate their severity, for example distinguishing low-grade from high-grade tumors. To ensure transparency and clinical reliability, explainability methods such as Grad-CAM and SHAP are used to highlight the MRI regions and feature contributions most responsible for a prediction. These interpretations allow medical professionals to verify the AI’s reasoning, improving trust and enabling informed decision-making. This approach supports precise tumor assessment, reduces diagnostic ambiguity, and strengthens the role of interpretable AI in real-world healthcare applications involving MRI-based brain tumor analysis.
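The abstract describes classifying MRI slices into low-grade versus high-grade tumors and then using Grad-CAM to highlight the image regions that drove the prediction. As a rough illustration only, the sketch below shows how such a Grad-CAM heat map could be computed for a CNN classifier. The ResNet50 backbone, the two-class head, the 224x224 RGB-converted input, the ImageNet normalization, and the file path are assumptions made for this example, not details taken from the paper.

```python
# Minimal Grad-CAM sketch for an MRI tumor-grade classifier.
# Assumptions (not from the paper): ResNet50 backbone, 2 classes
# (0 = low-grade, 1 = high-grade), grayscale MRI slice converted to RGB.

import torch
import torch.nn.functional as F
from torchvision import models, transforms
from PIL import Image

NUM_CLASSES = 2  # hypothetical label set

# Build the classifier; in practice the weights would come from fine-tuning on MRI data.
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
model.fc = torch.nn.Linear(model.fc.in_features, NUM_CLASSES)
model.eval()

# Hooks that capture the last convolutional stage's activations and gradients.
activations, gradients = {}, {}

def forward_hook(_module, _inputs, output):
    activations["value"] = output.detach()

def backward_hook(_module, _grad_in, grad_out):
    gradients["value"] = grad_out[0].detach()

target_layer = model.layer4  # final conv stage of ResNet50
target_layer.register_forward_hook(forward_hook)
target_layer.register_full_backward_hook(backward_hook)

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],   # ImageNet stats (assumption)
                         std=[0.229, 0.224, 0.225]),
])

def grad_cam(image_path: str):
    """Return (predicted class, 224x224 heat map scaled to [0, 1]) for one MRI slice."""
    img = Image.open(image_path).convert("RGB")
    x = preprocess(img).unsqueeze(0)                 # shape: (1, 3, 224, 224)

    logits = model(x)
    pred = logits.argmax(dim=1).item()

    model.zero_grad()
    logits[0, pred].backward()                       # gradients w.r.t. predicted class score

    acts = activations["value"]                      # (1, C, h, w)
    grads = gradients["value"]                       # (1, C, h, w)
    weights = grads.mean(dim=(2, 3), keepdim=True)   # per-channel importance
    cam = F.relu((weights * acts).sum(dim=1, keepdim=True))   # (1, 1, h, w)

    cam = F.interpolate(cam, size=(224, 224), mode="bilinear", align_corners=False)
    cam = (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)  # normalize to [0, 1]
    return pred, cam.squeeze().numpy()

# Example usage (path is hypothetical):
# label, heatmap = grad_cam("mri_slice.png")
```

The heat map can be overlaid on the original slice so a radiologist can check whether the highlighted region coincides with the suspected tumor; SHAP would play the analogous role for tabular or feature-level contributions, as noted in the abstract.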
AUTHOR AKASH K L, DHANUSHA K, PRAFFUL KUMAR S A, REHAN MALLICK (U.G. Students, Department of CS&E, Bapuji Institute of Engineering and Technology, Davanagere, Karnataka, India); PROF. SHANKAR SARJI P (Assistant Professor, Department of CS&E, Bapuji Institute of Engineering and Technology, Davanagere, Karnataka, India)
VOLUME 13, ISSUE 12
DOI: 10.15680/IJIRCCE.2025.1312063
PDF pdf/63_Explainable AI System for Brain Tumor Detection & Severity Prediction.pdf
KEYWORDS