Transformers are critical and costly components of power
systems whose health deteriorates over time due to factors such as poor cooling
and heavy loading. Consequently, predictive maintenance is emerging as an
effective alternative to conventional corrective maintenance, enabling
continuous monitoring and early fault detection.
To enhance the effectiveness of predictive maintenance for
power transformers under limited Dissolved Gas Analysis (DGA) data conditions,
this study proposes a customised Long Short-Term Memory (C-LSTM) deep learning
model. The developed C-LSTM architecture is specifically designed to address
the limitations of conventional LSTM networks, which often exhibit higher
classification error rates when trained on small datasets and may underperform
compared to traditional machine learning approaches.
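The abstract does not describe the C-LSTM layer configuration, so the following is only a minimal Keras sketch of a stacked-LSTM fault classifier for DGA data, included to make the setting concrete. The input shape (short sequences of five dissolved-gas concentrations: H2, CH4, C2H6, C2H4, C2H2), layer sizes, dropout rate, and the synthetic data are illustrative assumptions, not the authors' architecture.

    # Illustrative LSTM classifier for DGA-based fault diagnosis (not the authors' C-LSTM).
    # Assumed inputs: sequences of 5 gas concentrations over 10 time steps; 7 fault classes.
    import numpy as np
    from tensorflow import keras
    from tensorflow.keras import layers

    TIME_STEPS, N_GASES, N_CLASSES = 10, 5, 7

    model = keras.Sequential([
        layers.Input(shape=(TIME_STEPS, N_GASES)),
        layers.LSTM(64, return_sequences=True),
        layers.Dropout(0.3),                      # regularisation against overfitting on small datasets
        layers.LSTM(32),
        layers.Dense(32, activation="relu"),
        layers.Dense(N_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

    # Synthetic stand-in data; the study itself uses labelled DGA records.
    X = np.random.rand(200, TIME_STEPS, N_GASES).astype("float32")
    y = np.random.randint(0, N_CLASSES, size=200)
    model.fit(X, y, validation_split=0.2, epochs=5, batch_size=16, verbose=0)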
A comprehensive performance evaluation was conducted by
comparing the proposed C-LSTM model with several well-established traditional
machine learning algorithms using multiple metrics, including validation
accuracy, test accuracy, precision, recall, and F1-score. Additionally, the
diagnostic capability of the model was rigorously assessed across seven
transformer fault categories, namely low- and high-energy discharges,
partial discharge, combined electrical and thermal faults, and low-, medium-, and
high-temperature thermal faults.
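For reference, the evaluation metrics named above can be computed from held-out predictions as in the sketch below (scikit-learn, with placeholder labels; macro averaging is an assumption, since the abstract does not state the averaging scheme).

    # Hedged sketch of the listed metrics; y_true / y_pred are placeholder labels.
    from sklearn.metrics import accuracy_score, precision_recall_fscore_support

    y_true = [0, 1, 2, 3, 4, 5, 6, 2, 1, 0]   # example ground-truth fault classes
    y_pred = [0, 1, 2, 3, 4, 5, 6, 2, 0, 0]   # example predicted fault classes

    accuracy = accuracy_score(y_true, y_pred)
    precision, recall, f1, _ = precision_recall_fscore_support(
        y_true, y_pred, average="macro", zero_division=0)

    print(f"accuracy={accuracy:.4f} precision={precision:.4f} "
          f"recall={recall:.4f} f1={f1:.4f}")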
The experimental results demonstrate the superior
classification and diagnostic performance of the proposed C-LSTM model,
achieving a validation accuracy of 100% and a test accuracy of 98.57%,
significantly outperforming conventional machine learning techniques. These
findings confirm that the proposed C-LSTM framework offers a robust and
reliable solution for transformer fault diagnosis and predictive maintenance,
particularly in scenarios characterised by scarce DGA datasets.
Author(s) Details

G.V.S.S.N. Srirama Sarma
Department of Electrical and Electronics Engineering, Matrusri Engineering College, Saidabad, Hyderabad, India.
Please see the book here: https://doi.org/10.9734/bpi/nhstc/v9/6804