Dr. L. Merin Singh

Assistant Professor



Electronics and Communication Engineering

Manipur Technical University

Manipur Technical University, Imphal West, Manipur, India 795004



EEGPARnet


Effective electroencephalogram (EEG) signal processing necessitates the mitigation of physiological artifacts. While deep learning frameworks have demonstrated superior performance over traditional methods for this task, their high complexity and computational demands hinder deployment on resource-constrained platforms. In this work, a lightweight denoising network called EEGPARnet is proposed to address this limitation. The proposed architecture integrates a single-stage transformer encoder, equipped with temporal and spectral attention modules, with a Gated Recurrent Unit (GRU)-based decoder. This fusion enables the model to capture long-range time-frequency similarities, facilitating efficient feature extraction with a reduced number of trainable parameters. Experimental validation on the EEGDenoiseNet dataset yielded an average temporal relative root mean square error (RRMSEt) of 0.230, a spectral relative root mean square error (RRMSEs) of 0.228, and a correlation coefficient (CC) of 0.967 for ocular artifact removal. For muscular artifact removal, the proposed method achieved results competitive with state-of-the-art techniques, with mean RRMSEt, RRMSEs, and CC values of 0.498, 0.439, and 0.842, respectively. Compared to the state-of-the-art model, EEGPARnet achieves significant reductions in computational complexity, with 277x fewer trainable parameters, 358x fewer FLOPs, and a 554x smaller storage footprint, making it highly suitable for real-time EEG denoising on resource-constrained devices without compromising performance.
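To make the described composition concrete, below is a minimal, hypothetical PyTorch sketch of a time-frequency attention encoder paired with a GRU decoder, as outlined in the abstract. The layer sizes, the FFT-based spectral branch, and the fusion by broadcast addition are illustrative assumptions, not the published EEGPARnet configuration.

```python
# Hypothetical sketch of an EEGPARnet-style denoiser: a single-stage transformer
# encoder with temporal and spectral attention, followed by a GRU decoder.
# All hyperparameters and the fusion strategy are assumptions for illustration.
import torch
import torch.nn as nn


class TimeFrequencyEncoder(nn.Module):
    """One attention layer over the time samples and one over the
    magnitude spectrum of the segment (assumed spectral branch)."""

    def __init__(self, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        self.embed = nn.Linear(1, d_model)
        self.temporal_attn = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=128, batch_first=True)
        self.spectral_attn = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=128, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, samples) noisy EEG segment
        t = self.embed(x.unsqueeze(-1))          # (batch, samples, d_model)
        t = self.temporal_attn(t)                # long-range temporal similarity
        spec = torch.fft.rfft(x, dim=-1).abs()   # magnitude spectrum
        s = self.embed(spec.unsqueeze(-1))
        s = self.spectral_attn(s)                # long-range spectral similarity
        # Fuse by adding pooled spectral context onto the temporal features
        return t + s.mean(dim=1, keepdim=True)


class EEGDenoiser(nn.Module):
    def __init__(self, d_model: int = 64):
        super().__init__()
        self.encoder = TimeFrequencyEncoder(d_model)
        self.decoder = nn.GRU(d_model, d_model, batch_first=True)
        self.head = nn.Linear(d_model, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.encoder(x)
        h, _ = self.decoder(h)
        return self.head(h).squeeze(-1)          # denoised segment


if __name__ == "__main__":
    noisy = torch.randn(8, 512)                  # 8 segments of 512 samples
    print(EEGDenoiser()(noisy).shape)            # torch.Size([8, 512])
```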

Publications


EEGPARnet: time-frequency attention transformer encoder and GRU decoder for removal of ocular and muscular artifacts from EEG signals


Kiyam Babloo Singh, Aheibam Dinamani Singh, Merin Loukrakpam

Medical & Biological Engineering & Computing, Springer Berlin Heidelberg, 2025, pp. 1-25


