Abstract
Artificial Intelligence (AI) is increasingly being leveraged to transform healthcare delivery, particularly in low-resource settings where clinical staff and infrastructure are severely limited. This paper proposes the design and conceptual implementation of an AI-based symptom checker tailored for such environments, incorporating explainability features to enhance trust and usability among healthcare workers and patients. By integrating lightweight AI models with low-cost mobile platforms and adopting explainable AI (XAI) techniques, we aim to foster responsible diagnostic assistance. This research synthesizes recent advancements and identifies the challenges of, and pathways to, deploying explainable AI tools in low-resource settings. Findings from prior implementations in sub-Saharan Africa, India, and Latin America inform the architectural framework presented in this study.