Swaleh Omar and 3 more
Accurate classification of electroencephalography (EEG) signals is essential for diagnosing brain disorders such as epilepsy. While deep learning models such as Convolutional Neural Networks (CNNs) and Long Short-Term Memory (LSTM) networks have improved EEG classification performance over traditional methods, existing attention mechanisms such as Additive, Luong, and Multihead attention struggle to capture EEG's complex temporal dependencies. This study proposes Scaled Custom Attention (SCA), a mechanism for temporal dependency modeling during EEG classification. Unlike traditional Query-Key-Value (QKV) approaches, which rely on semantic weighting schemes, SCA employs a direct feature-weighting strategy that adapts to the unique temporal dependencies of EEG signals and introduces a scaling strategy that enhances stability. To validate our approach, experiments were conducted on the TUH EEG Epilepsy Corpus (TUEP), where SCA achieved improved classification performance (accuracy: 98.07%, F1-score: 98.06%), marginally higher than the Additive (97.60%, 97.61%), Multihead (97.66%, 97.66%), and Luong (97.68%, 97.66%) attention mechanisms when integrated into the LConvNet EEG classification model. Additionally, SCA achieves a balanced performance profile, with competitive inference time (2.83 vs. 1.32–3.89 for baselines), parameter efficiency (58.5 params/sample vs. 58.5–63.7), and comparable generalization, with an average training–validation difference of 0.0191, making it a promising enhancement for EEG-based deep learning models.
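To make the contrast with QKV attention concrete, the following is a minimal NumPy sketch of one plausible reading of the abstract: each timestep's feature vector is scored directly by a learned weight vector (no separate query/key/value projections), and the scores are scaled by the square root of the feature dimension for numerical stability. The function name `scaled_custom_attention`, the scoring vector `w`, and the exact scaling choice are illustrative assumptions, not the paper's verified formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_custom_attention(X, w):
    """Hypothetical direct feature-weighting attention.

    X : (T, d) EEG feature sequence (T timesteps, d features).
    w : (d,) learned scoring vector applied directly to features,
        in place of Q/K/V projections (assumption, not the paper's spec).
    """
    d = X.shape[-1]
    scores = X @ w / np.sqrt(d)   # scaling step for stability
    alpha = softmax(scores)       # attention weights over timesteps
    context = alpha @ X           # weighted sum of timestep features
    return context, alpha

# Toy usage: 128 timesteps of 64-dimensional features.
rng = np.random.default_rng(0)
X = rng.standard_normal((128, 64))
w = rng.standard_normal(64)
context, alpha = scaled_custom_attention(X, w)
```

Compared with multihead QKV attention, a scheme like this has far fewer parameters per attention unit (a single `d`-dimensional vector rather than three `d × d` projection matrices), which is consistent with the parameter-efficiency figures reported above.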