CO425 — Final Year Project · Group 23

Lightweight Models for
Neurological Disorders
Detection

EEG-based AI detection of Epilepsy, Alzheimer's & Parkinson's Disease using TEECNet-inspired Error Correction Architecture

Dataset · CHB-MIT · Siena · OpenNeuro
Framework · PyTorch · EEGNet · CNN-TCN
University of Peradeniya · 2026
01 — Overview

Abstract

Neurological disorders such as Epilepsy, Alzheimer's Disease (AD), and Parkinson's Disease (PD) affect hundreds of millions globally, yet their diagnosis relies heavily on specialist-dependent, resource-intensive clinical assessment. Electroencephalography (EEG) offers a non-invasive, affordable window into brain activity—but automated analysis remains computationally expensive and clinically inaccessible.

This project proposes a unified family of lightweight EEG-based machine learning models for the detection of all three neurological disorders. Drawing inspiration from the TEECNet architecture originally designed for physics simulations, we introduce a two-stage training paradigm: a lightweight Base Network learns dominant EEG patterns, followed by a small Error-Correction Network (ECN) that refines predictions by targeting the systematic errors the base model makes.

Across all three disorders, the ECN consistently improves accuracy and reduces misclassifications—demonstrating that error-aware residual correction is a powerful, generalizable strategy for EEG classification with constrained model budgets.

93.26%
Alzheimer's
Accuracy
96.42%
Parkinson's
Max Accuracy
+21.2%
F1 Improvement
(Epilepsy ECN)
0.9945
Epilepsy
AUC Score
Epilepsy Detection
EEGNet + ECN · CHB-MIT & Siena datasets
🧠
Alzheimer's Disease
EEGNet-FCN · OpenNeuro ds004504
🔬
Parkinson's Disease
CNN-TCN + ECN · ds004584 & ds002778
03 — Architecture

Proposed Methodology

Two-Stage Training Pipeline

Stage 1: EEG Preprocessing → Train Base Network → Freeze Base Parameters
Stage 2: Train ECN on Residuals
Output: Base + ECN Correction

Final output: Y = Base + ECN Correction
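
The pipeline above can be sketched in PyTorch. Module names and sizes here (`BaseNet`, `ECN`, the 16-d embedding) are illustrative stand-ins, not the project's exact code:

```python
import torch
import torch.nn as nn

class BaseNet(nn.Module):
    """Stand-in for the lightweight EEGNet-style base classifier (Stage 1)."""
    def __init__(self, n_ch=23, emb_dim=16, n_classes=2):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv1d(n_ch, emb_dim, kernel_size=7, padding=3),
            nn.SiLU(),
            nn.AdaptiveAvgPool1d(1),
            nn.Flatten(),
        )
        self.head = nn.Linear(emb_dim, n_classes)

    def forward(self, x):
        emb = self.encoder(x)          # embedding exposed for ECN consumption
        return self.head(emb), emb

class ECN(nn.Module):
    """Small gated corrector trained on the frozen base's residuals (Stage 2)."""
    def __init__(self, emb_dim=16, n_classes=2):
        super().__init__()
        self.delta = nn.Linear(emb_dim + n_classes, n_classes)
        self.gate = nn.Sequential(nn.Linear(emb_dim + n_classes, 1), nn.Sigmoid())

    def forward(self, logits, emb):
        z = torch.cat([logits, emb], dim=1)
        return logits + self.gate(z) * self.delta(z)   # Y = Base + gated correction

base, ecn = BaseNet(), ECN()
# Stage 1: train `base` ... then freeze its parameters:
for p in base.parameters():
    p.requires_grad = False
# Stage 2: only the ECN receives gradients.
x = torch.randn(4, 23, 1024)           # (batch, channels, samples) = 4 s @ 256 Hz
logits, emb = base(x)
final = ecn(logits, emb)
```

Freezing the base guarantees the corrector can only add a bounded, gated term on top of an unchanged prediction.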

Base Network Architecture

A lightweight EEGNet-inspired encoder extracts primary temporal–spatial features from the raw multi-channel EEG input. Key components:

Depthwise-separable convolutions (temporal then spatial) to minimise parameter count
Batch normalisation and SiLU activations for stable training
Optional Squeeze-and-Excitation (SE1D) channel attention block
Spectral Band Power branch fusing frequency-domain embeddings (delta, theta, alpha, beta, gamma)
Outputs class logits + 128-d embedding for ECN consumption
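
As a rough illustration of why depthwise-separable convolutions keep the parameter count low (channel counts are illustrative, matching the 23-channel input):

```python
import torch
import torch.nn as nn

# Depthwise-separable temporal convolution: a per-channel (depthwise) conv
# followed by a 1x1 (pointwise) conv, compared against one dense convolution
# with the same receptive field and output channels.
depthwise = nn.Conv1d(23, 23, kernel_size=15, padding=7, groups=23)
pointwise = nn.Conv1d(23, 16, kernel_size=1)
dense = nn.Conv1d(23, 16, kernel_size=15, padding=7)

x = torch.randn(2, 23, 1024)
y = pointwise(depthwise(x))            # same output shape as dense(x)

sep_params = sum(p.numel() for p in depthwise.parameters()) \
           + sum(p.numel() for p in pointwise.parameters())
dense_params = sum(p.numel() for p in dense.parameters())
```

Here the separable pair uses 752 parameters versus 5,536 for the dense convolution, a roughly 7× saving at the same receptive field.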

Error-Correction Network (ECN)

A small gated corrector network, TEECNet-inspired, that refines the base prediction:

Inputs: base logits, prediction confidence/uncertainty, optional raw signal features
Correction Heads: Δemb (embedding correction), Δlogits (logit correction), gate (correction strength)
Gated Combination: base_emb + gate·Δemb → head + gate·Δlogits → final logits
Trained exclusively on residual errors from the frozen base model

Custom Loss Functions

📉

Focal Loss

Down-weights easy, correctly classified examples so the model focuses training capacity on hard borderline cases—critical for highly imbalanced EEG datasets.
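
A minimal binary focal loss in PyTorch (standard formulation from Lin et al.; the γ and α values below are common defaults, not necessarily the project's settings):

```python
import torch
import torch.nn.functional as F

def focal_loss(logits, targets, gamma=2.0, alpha=0.25):
    """Binary focal loss: down-weights easy examples by (1 - p_t)^gamma."""
    bce = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")
    p = torch.sigmoid(logits)
    p_t = p * targets + (1 - p) * (1 - targets)          # prob of the true class
    alpha_t = alpha * targets + (1 - alpha) * (1 - targets)
    return (alpha_t * (1 - p_t) ** gamma * bce).mean()

# An easy, confidently-correct example contributes far less than a hard one:
easy = focal_loss(torch.tensor([4.0]), torch.tensor([1.0]))
hard = focal_loss(torch.tensor([-1.0]), torch.tensor([1.0]))
```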

⚖️

Class-Weighted Cross-Entropy

Adjusts loss contribution by inverse class frequency, compensating for the natural scarcity of seizure windows relative to interictal EEG.

🔗

Residual MSE (ECN Objective)

ECN minimises mean-squared error on the prediction residuals from the frozen base—directly optimising the correction term rather than the full classification loss.
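
A toy sketch of the Stage-2 objective, assuming the residual is taken between one-hot targets and the frozen base's output probabilities (the exact residual definition is an assumption for illustration):

```python
import torch

base_logits = torch.tensor([[2.0, 0.5], [0.2, 1.5]])   # frozen base outputs
base_probs = torch.softmax(base_logits, dim=1)
targets = torch.tensor([[1.0, 0.0], [1.0, 0.0]])        # one-hot labels

residual = targets - base_probs        # the systematic error the ECN must predict
ecn_pred = torch.zeros_like(residual)  # untrained corrector: predicts no correction
loss = torch.mean((ecn_pred - residual) ** 2)           # Residual MSE objective
```

Because the target is the residual itself, gradient updates push the ECN directly toward the correction term rather than re-solving the full classification problem.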

Model Complexity Summary

Component     Parameters   Role
Base EEGNet   94,333       Primary classifier
FCN / ECN     132,804      Error corrector
Total         227,137      Full pipeline
04 — Setup & Data

Experiment Setup & Implementation

Datasets

CHB-MIT (Primary)

Patients: 24 pediatric
Channels: 23 EEG
Sample Rate: 256 Hz
Duration: ~916 hours
Seizure Events: 198

Siena Scalp EEG

Population: Adult epilepsy
Purpose: Cross-dataset eval
Generalisation: Out-of-distribution

Preprocessing Pipeline

Channel pad/trim → 23 channels, 256 Hz target
Resample → per-channel Z-score normalisation
4-second sliding windows with 50% overlap
Labels via majority rule within each window
1% seizure / 99% non-seizure class imbalance handled via focal loss
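
The windowing and normalisation steps above can be sketched with NumPy (the function name `make_windows` is illustrative):

```python
import numpy as np

def make_windows(x, labels, fs=256, win_s=4, overlap=0.5):
    """4 s sliding windows with 50% overlap; window label via majority rule."""
    win = int(fs * win_s)
    step = int(win * (1 - overlap))
    xw, yw = [], []
    for start in range(0, x.shape[1] - win + 1, step):
        seg = x[:, start:start + win]
        # per-channel Z-score normalisation
        seg = (seg - seg.mean(axis=1, keepdims=True)) \
              / (seg.std(axis=1, keepdims=True) + 1e-8)
        xw.append(seg)
        yw.append(int(labels[start:start + win].mean() > 0.5))  # majority rule
    return np.stack(xw), np.array(yw)

x = np.random.randn(23, 256 * 10)                # 23 channels, 10 s at 256 Hz
labels = np.zeros(256 * 10)
labels[1280:] = 1                                # second half marked "seizure"
Xw, yw = make_windows(x, labels)                 # 4 windows of shape (23, 1024)
```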

Model Architecture

Base: EEGNet — learns dominant temporal–spatial EEG patterns, outputs class logits (seizure / non-seizure).

ECN: Lightweight MLP / small CNN operating on logits + uncertainty estimates. Outputs correction or refined seizure probability.

Training Details

Optimiser: Adam (lr=1e-3)
Stage 1 Loss: Focal + BCE
Stage 2 Loss: Residual MSE
Val Strategy: Patient-wise split
Hardware: NVIDIA RTX 6000 Ada

Dataset — OpenNeuro ds004504

Subjects: AD patients + CN controls
Channels: 19 channels @ 500 Hz
Derivative: Preprocessed version used

Preprocessing (Derivative)

Band-pass filter 0.5–45 Hz
Re-referencing to A1–A2 mastoid
Artifact Subspace Reconstruction (ASR)
ICA (RunICA) + ICLabel classifier for component removal
Custom pipeline: subject filtering, resampling (500→250 Hz), 20s windows with 50% overlap, 80/20 train/eval split

Model — EEGNet-FCN

A modified EEGNet-inspired base with a Spectral Band Power branch fused at the embedding level. On top, a Feature Correction Network (FCN) refines predictions through:

Signal Branch: convolutional stack → 64-d representation
Meta Extractor: 6 logit-derived confidence features
Shared MLP: combines all streams → shared representation
Gated combination: base_emb + gate·Δemb + gate·Δlogits → CN/AD prediction
Phase 1 (Base): 88.78% Bal. Acc.
Phase 2 (+FCN): 92.89% Bal. Acc.
Phase 3 (Fine-tune): 93.12% Bal. Acc.

Datasets

ds004584 (Primary)

Subjects: 149
Channels: 60 EEG
Type: Resting-state EEG

ds002778 (Secondary)

Subjects: 31
Channels: 40 EEG
Purpose: Robustness test

Preprocessing

Bandpass 0.5–45 Hz + 50 Hz notch filter
Average re-referencing + ICA artifact removal (20 components)
Downsample to 250 Hz, 2-second windows with 50% overlap
Stratified subject-wise 70/15/15 train/val/test split
Channel-wise Z-score normalisation
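
The band-pass and notch steps map directly onto SciPy (filter order, notch Q, and the pre-downsampling rate of 500 Hz are illustrative assumptions):

```python
import numpy as np
from scipy.signal import butter, iirnotch, filtfilt

fs = 500.0  # assumed sampling rate before the 250 Hz downsample

# 0.5-45 Hz band-pass (4th-order Butterworth, zero-phase via filtfilt)
b_bp, a_bp = butter(4, [0.5, 45.0], btype="bandpass", fs=fs)
# 50 Hz mains notch
b_n, a_n = iirnotch(50.0, Q=30.0, fs=fs)

t = np.arange(0, 2.0, 1 / fs)
sig = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 50 * t)  # 10 Hz EEG-like + mains
clean = filtfilt(b_n, a_n, filtfilt(b_bp, a_bp, sig))
```

`filtfilt` runs each filter forward and backward, so the EEG-band content keeps its phase while the 50 Hz component is strongly suppressed.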

Three Base Models

1

Modified EEGNet

Depthwise-separable CNN, lowest parameter count, focal loss with PD class boost.

2

LW Temporal-Spatial CNN

Separates temporal rhythm and spatial channel learning. Noise + masking + channel dropout augmentation.

3

CNN + TCN

CNN spatial front-end with dilated causal TCN backbone for long-range EEG dependencies. Mixup augmentation.
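
The mixup augmentation mentioned for the CNN+TCN model can be sketched as follows (α and the batch shapes are illustrative, not the project's tuned values):

```python
import numpy as np

def mixup(x, y_onehot, alpha=0.2, rng=np.random.default_rng(0)):
    """Mixup: convex combination of shuffled pairs of windows and their labels."""
    lam = rng.beta(alpha, alpha)          # mixing coefficient in (0, 1)
    idx = rng.permutation(len(x))         # random partner for each sample
    x_mix = lam * x + (1 - lam) * x[idx]
    y_mix = lam * y_onehot + (1 - lam) * y_onehot[idx]
    return x_mix, y_mix

x = np.random.randn(8, 60, 500)              # (batch, channels, samples): 2 s @ 250 Hz
y = np.eye(2)[np.random.randint(0, 2, 8)]    # one-hot PD / control labels
xm, ym = mixup(x, y)                         # soft labels still sum to 1
```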

05 — Evaluation

Results & Analysis

This section consolidates the strongest findings across Epilepsy, Alzheimer's disease, and Parkinson's disease, then drills into task-specific metrics, cross-dataset behaviour, and model-to-model comparisons.

Evaluation Snapshot

Cross-disorder evidence for lightweight error correction

Each task is evaluated with the metric profile most meaningful to its setting, while the shared question remains the same: can a compact ECN improve lightweight EEG classifiers consistently across distinct neurological conditions?

Epilepsy
0.9945
AUC with ECN

Strong seizure discrimination with fewer false alarms under cross-dataset evaluation.

Alzheimer's
93.26%
Accuracy

EEGNet-FCN achieves the strongest overall classification performance among the lightweight baselines shown.

Parkinson's
97.19%
Best F1 with CNN-TCN + ECN

ECN refinement improves all three lightweight Parkinson's models, with the best final performance from CNN-TCN.

0.9945
AUC (ECN)
+21.2%
F1 Improvement
Siena cross-dataset
−3
False Alarms
ECN vs Base
0.9944
Base AUC

Cross-Dataset Performance

Model                 Base   Base + ECN
EEGNet (CHB-MIT)      0.84   0.87
TwoStream (CHB-MIT)   0.92   0.95
Siena Rescue          0.36   0.57

Distillation Comparison

Model                      Accuracy   F1     AUC
HeavyTeacher-ResNet        0.97       0.90   0.96
LightBase-EEGNet           0.79       0.34   0.62
EEGNet + ECN (Distilled)   0.86       0.56   0.72

Key Findings
ECN never degrades correct predictions
Improvements come from reducing false alarms
Particularly valuable in low-seizure-prevalence settings
Demonstrates the ECN "Rescue Effect" on cross-dataset generalisation
93.26%
Accuracy
93.12%
Balanced Accuracy
93.21%
Macro F1
227K
Total Parameters

Phase-wise Balanced Accuracy

Phase 1 — Base Only: 88.78%
Phase 2 — + FCN: 92.89%
Phase 3 — Fine-Tuned: 93.12%

Confusion Matrix Summary

CN Class

Precision: 94.3%
Recall: 91.0%
F1-score: 92.6%

AD Class

Precision: 92.4%
Recall: 95.2%
F1-score: 93.8%

Comparison with Lightweight Architectures

Model               Accuracy   Bal. Acc.   Macro F1
EEGNet-FCN (Ours)   0.9326     0.9312      0.9321
DSCNN               0.8448     0.8491      0.8448
TinyResNet          0.8302     0.8491      0.8296
TCNLite             0.8506     0.8447      0.8474
MobileNet1D         0.8245     0.8182      0.8212
ShuffleNet1D        0.8009     0.8063      0.8189
SqueezeNet1D        0.7830     0.7926      0.8025

Training measured on NVIDIA RTX 6000 Ada Generation GPU with CUDA AMP. EEGNet-FCN trains in 581s.

96.42%
CNN-TCN +ECN
Best Accuracy
97.19%
CNN-TCN +ECN
Best F1
265
Fewer Errors
EEGNet ECN gain
+7.03%
Max Acc Gain
EEGNet Base→ECN

Base vs. Base+ECN — All Models

Model                        Base Acc   +ECN Acc   Base F1   +ECN F1
1. Modified EEGNet           88.3%      95.4%      90.7%     96.4%
2. LW Temporal-Spatial CNN   89.9%      95.9%      92.0%     96.8%
3. CNN + TCN                 90.7%      96.4%      92.6%     97.2%

Error Reduction Summary

Model                     Base Acc   Base F1   +ECN Acc   +ECN F1   Acc Gain   Error Reduction
Modified EEGNet           88.32%     90.65%    95.35%     96.35%    +7.03%     265 fewer
LW Temporal-Spatial CNN   89.91%     91.97%    95.89%     96.77%    +5.97%     225 fewer
CNN-TCN                   90.71%     92.61%    96.42%     97.19%    +5.71%     215 fewer
06 — Takeaways

Conclusion

This project demonstrates that a TEECNet-inspired Error Correction Network (ECN) is a powerful, general-purpose enhancement for lightweight EEG classifiers. By training a small corrector network exclusively on the systematic prediction residuals of a frozen base model, we achieve consistent, meaningful accuracy gains across three neurologically distinct conditions—without increasing the base model's inference-time parameter count.

The results validate a key hypothesis: lightweight models contain systematic, learnable blind spots, and a targeted corrector is a more efficient route to accuracy recovery than simply making the base model larger.

For Alzheimer's detection, our EEGNet-FCN achieves 93.26% accuracy and 93.21% Macro F1—outperforming all evaluated lightweight baselines (DSCNN, TinyResNet, TCNLite, MobileNet1D, ShuffleNet1D, SqueezeNet1D) by a significant margin, while remaining within practical parameter budgets.

ECN Universality

Consistent improvement across Epilepsy, Alzheimer's, and Parkinson's with the same base+corrector design philosophy.

📱

Edge Deployability

Total parameter count under 230K supports real-time inference on wearable and edge devices.

🔁

Residual Learning Works

The base never degrades from ECN application; corrections are bounded and stabilised by clipping.

Future Work

Priority Improvements

Expand dataset size and diversity to improve generalisation
Strengthen methods for handling class imbalance
Extend evaluation to real-time and online settings
Measure hardware latency and power efficiency directly

Near-Term Work

Test on larger, more diverse EEG datasets
Enable deployment on edge / portable devices
Improve overall accuracy and ECN stability
Validate in real clinical environments

Long-Term Vision

Unified multi-disorder detection pipeline
Personalised adaptive ECN fine-tuning per patient
Integration with wearable EEG headsets
Prospective clinical trial validation
07 — People

Team

Sameera Kumarasinghe
E/20/212
Epilepsy Detection

Nuwan Dilshan
E/20/455
Alzheimer's Disease Detection

Sachin Dulaj
E/20/456
Parkinson's Disease Detection

Supervisors

Prof. Roshan G. Ragel
Supervisor · Department of Computer Engineering
roshanr@eng.pdn.ac.lk

Mr. Sivaraj Nimishan
Supervisor · Department of Computer Engineering
nimishan@ccs.sab.ac.lk