Neurosymbolic Approaches for Knowledge-Infused Clinical Reasoning in Explainable Artificial Intelligence for Differential Diagnosis
Abstract
Differential diagnosis in clinical decision-making demands robust reasoning and explainability to ensure trustworthy and efficient patient care. While deep learning models have demonstrated promise in extracting patterns from medical data, they often lack interpretability and clinical trustworthiness. Neurosymbolic AI—an approach that integrates neural networks with symbolic reasoning—offers a promising paradigm for embedding clinical knowledge into machine learning pipelines. This paper explores the integration of medical ontologies, expert systems, and logic-based reasoning with deep learning techniques to improve diagnostic accuracy and interpretability. We propose a hybrid framework leveraging medical knowledge graphs and probabilistic logic programming to enhance AI-assisted clinical reasoning. Case-based evaluations demonstrate improved diagnostic performance and greater transparency compared with purely neural approaches.