Interpretable signal processing for early seizure prediction among epileptic patients
Date
2025
Authors
Ahimbisibwe, Acleophas
Publisher
Makerere University
Abstract
Neurological disorders affect approximately 3.4 billion people worldwide, representing a significant global health challenge. Among these, epilepsy is one of the most prevalent, impacting over 70 million individuals globally. The burden of epilepsy is particularly pronounced in low- and middle-income countries (LMICs), where access to specialized neurological care is often limited. Although antiepileptic drugs (AEDs) are the standard treatment, nearly one-third of patients suffer from drug-resistant epilepsy (DRE), where seizures persist despite trials of two or more appropriate medications. For these individuals, the unpredictable nature of seizures poses severe risks, including physical injury, cognitive decline, psychological distress, and decreased quality of life. Thus, the ability to predict seizures before onset remains a crucial, yet unsolved, challenge in clinical neurology.

Electroencephalography (EEG) plays a central role in epilepsy diagnosis and monitoring, as it non-invasively records brain activity and captures neural dynamics associated with seizures. However, EEG signals are highly complex, often exhibiting noise, inter-patient differences, and intra-patient variability, which complicate seizure detection and prediction. Recent advances in artificial intelligence (AI), particularly in machine learning and deep learning, have opened new frontiers in EEG-based seizure prediction. AI models can uncover subtle pre-ictal patterns invisible to human observers, offering the potential for early warnings and proactive clinical interventions.

Despite this progress, a major limitation remains: many deep learning models operate as black boxes, lacking transparency and interpretability. In medical contexts, where clinical accountability and trust are paramount, explainable AI (XAI) is critical to bridge the gap between algorithmic intelligence and medical decision-making.

To address these limitations, this study adopts a multi-strategic learning approach that integrates advanced signal processing, deep learning, and explainable AI to develop an interpretable and clinically reliable seizure prediction system. The goal is not only to enhance predictive performance but also to provide transparent, human-understandable explanations that can assist clinicians in real-time epilepsy management. Ultimately, the research aims to improve patient outcomes by enabling personalized, data-driven interventions for individuals with drug-resistant epilepsy.
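As a rough illustration of the kind of EEG signal processing the abstract alludes to, and not the dissertation's actual pipeline, the Python sketch below band-pass filters one EEG channel and extracts band-power features of the sort commonly fed to seizure-prediction classifiers. The sampling rate, filter bands, window length, and the helper names bandpass and band_powers are assumptions introduced here for demonstration; the snippet runs on synthetic data.

    # Illustrative sketch only: minimal EEG preprocessing and band-power
    # features. Sampling rate, bands, and data are assumed, not taken
    # from the dissertation.
    import numpy as np
    from scipy.signal import butter, filtfilt, welch

    FS = 256  # assumed sampling rate (Hz)

    def bandpass(sig, low, high, fs=FS, order=4):
        """Zero-phase Butterworth band-pass filter."""
        b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
        return filtfilt(b, a, sig)

    def band_powers(window, fs=FS):
        """Mean spectral power in the classic EEG bands (delta..gamma)."""
        freqs, psd = welch(window, fs=fs, nperseg=fs)
        bands = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
                 "beta": (13, 30), "gamma": (30, 70)}
        return np.array([psd[(freqs >= lo) & (freqs < hi)].mean()
                         for lo, hi in bands.values()])

    # Example: featurize a 10-second single-channel window of synthetic EEG.
    rng = np.random.default_rng(0)
    eeg = rng.standard_normal(10 * FS)   # stand-in for one EEG channel
    clean = bandpass(eeg, 0.5, 70)       # remove drift and high-frequency noise
    features = band_powers(clean)        # 5-dim feature vector per window
    print(features)

In a full system of the kind the abstract describes, such per-window features (or the raw filtered windows) would feed a learned classifier, with an XAI layer attributing each pre-ictal alarm back to channels and frequency bands that clinicians can inspect.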
Description
A dissertation submitted to the Directorate of Graduate Training in partial fulfillment of the requirements for the award of the Degree of Master of Science in Computer Science of Makerere University
Keywords
Citation
Ahimbisibwe, A. (2025). Interpretable signal processing for early seizure prediction among epileptic patients (Unpublished Master's dissertation). Makerere University, Kampala.