
Power quality disturbance signal segmentation and classification based on modified BI-LSTM with double attention mechanism
  • Poras Khetarpal, Delhi Technological University, Department of Electrical Engineering
  • Dr Neelu Nagpal, Maharaja Agrasen Institute of Technology (Corresponding Author: nagpalneelu1971@ieee.org)
  • Pierluigi Siano, University of Salerno
  • Mohammed Al-Numay, King Saud University

Abstract

This paper proposes a recurrent neural network (RNN) based model to segment and classify multiple combined power quality disturbances (PQDs) from the PQD voltage signal. A modified bi-directional long short-term memory (BI-LSTM) model with two different types of attention mechanism is developed. First, an attention gate is added to the basic LSTM cell to reduce training time and focus the memory on the important parts of the PQD signal. Second, an attention layer is added on top of the BI-LSTM to emphasize the more informative parts of the voltage signal by assigning weights to the BI-LSTM outputs. Finally, a softmax classifier is applied to classify the combined PQD signal into 96 different combinations. The performance of the proposed BI-LSTM model with attention gate and attention layer mechanisms is compared with that of baseline models based on RNNs and convolutional neural networks (CNNs). With this model, the PQD signal is easily segmented from the voltage signal, which makes PQD classification more accurate, less computationally complex, and faster.
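The attention layer described in the abstract assigns a weight to each BI-LSTM output and forms a weighted summary of the sequence. The paper's exact formulation is not given here, so the following is a minimal dependency-free sketch of dot-product attention pooling; the context vector `context` (in the paper this would be learned) and the toy hidden states are illustrative assumptions, not the authors' values.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention_pool(hidden_states, context):
    """Weight each timestep's BI-LSTM output by its dot-product
    similarity to a context vector, then return the weighted sum.
    hidden_states: list of T hidden vectors (lists of floats).
    context: vector of the same dimension (learned in practice;
    fixed here for illustration)."""
    scores = [sum(h_d * c_d for h_d, c_d in zip(h, context))
              for h in hidden_states]
    weights = softmax(scores)
    pooled = [sum(w * h[d] for w, h in zip(weights, hidden_states))
              for d in range(len(context))]
    return pooled, weights

# Toy example: 3 timesteps, hidden size 2. The second timestep has the
# largest activation, so it should receive the largest attention weight.
H = [[0.1, 0.2], [0.9, 0.8], [0.0, 0.1]]
ctx = [1.0, 1.0]  # assumed fixed context vector for this sketch
pooled, weights = attention_pool(H, ctx)
```

In the full model, `pooled` would be fed to the softmax classifier over the 96 PQD combinations; here it simply shows how the weighting concentrates the summary on the most relevant segment of the signal.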
07 Mar 2023: Submitted to IET Generation, Transmission & Distribution
10 Mar 2023: Submission Checks Completed
10 Mar 2023: Assigned to Editor
23 May 2023: Reviewer(s) Assigned
07 Jun 2023: Review(s) Completed, Editorial Evaluation Pending
18 Jun 2023: Editorial Decision: Revise Major
13 Jul 2023: 1st Revision Received
14 Jul 2023: Submission Checks Completed
14 Jul 2023: Assigned to Editor
23 Jul 2023: Reviewer(s) Assigned
10 Aug 2023: Review(s) Completed, Editorial Evaluation Pending
25 Sep 2023: Editorial Decision: Revise Minor
04 Oct 2023: 2nd Revision Received
25 Oct 2023: Review(s) Completed, Editorial Evaluation Pending
14 Nov 2023: Editorial Decision: Accept