Electronics, Vol. 15, Pages 325: Multi-Scale Temporal Learning with EEMD Reconstruction for Non-Stationary Error Forecasting in Current Transformers
Electronics doi: 10.3390/electronics15020325
Authors:
Jian Liu
Chen Hu
Zhenhua Li
Jiuxi Cui
Current transformer measurement errors exhibit strong non-stationarity and multi-scale temporal dynamics, which make accurate prediction challenging for conventional deep learning models. This paper presents a hybrid signal processing and temporal learning framework that integrates ensemble empirical mode decomposition (EEMD) with a dual-scale temporal convolutional architecture. EEMD adaptively decomposes the error sequence into intrinsic mode functions, while a Pearson correlation-based selection step removes redundant and noise-dominated components. The refined signal is then processed by a dual-scale temporal convolutional network (TCN) designed with parallel dilated kernels to capture both high-frequency transients and long-range drift patterns. Experimental evaluations on 110 kV substation data confirm that the proposed decomposition-enhanced dual-scale temporal convolutional framework significantly improves generalization and robustness, reducing the root mean square error by 40.9% and the mean absolute error by 37.0% compared with benchmark models. The results demonstrate that combining decomposition-based preprocessing with multi-scale temporal learning effectively enhances the accuracy and stability of non-stationary current transformer error forecasting.
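The pipeline described above (decompose, select correlated modes, then apply parallel dilated convolutions at two time scales) can be sketched in a minimal, hypothetical form. The EEMD step itself is stubbed out with synthetic mode functions, and the correlation threshold, kernel weights, and dilation factors below are illustrative assumptions rather than the paper's actual values.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 500)

# Stand-ins for the IMFs an EEMD routine would return (the EEMD
# algorithm itself is omitted): a noise-dominated mode, a fast
# transient, a mid-band oscillation, and a slow drift.
imfs = np.stack([
    0.2 * rng.standard_normal(t.size),   # noise-dominated mode
    np.sin(2 * np.pi * 5 * t),           # high-frequency transient
    np.sin(2 * np.pi * 0.3 * t),         # mid-band oscillation
    0.2 * t,                             # long-range drift
])
error_signal = imfs.sum(axis=0)

def select_imfs(imfs, signal, threshold=0.3):
    """Keep IMFs whose Pearson correlation with the original signal
    exceeds the threshold; the rest are treated as noise/redundant."""
    kept = [imf for imf in imfs
            if abs(np.corrcoef(imf, signal)[0, 1]) >= threshold]
    return np.stack(kept)

selected = select_imfs(imfs, error_signal)
reconstructed = selected.sum(axis=0)   # refined, denoised sequence

def dilated_causal_conv(x, kernel, dilation):
    """1-D causal convolution with the given dilation rate,
    zero-padded on the left so the output keeps the input length."""
    k = len(kernel)
    pad = (k - 1) * dilation
    xp = np.concatenate([np.zeros(pad), x])
    out = np.zeros_like(x)
    for i in range(x.size):
        for j in range(k):
            out[i] += kernel[j] * xp[pad + i - j * dilation]
    return out

# Two parallel branches, as in a dual-scale TCN: a small dilation
# for fast transients and a large one for long-range drift, fused
# here by simple summation (a real model would learn the fusion).
kernel = np.array([0.25, 0.5, 0.25])
fast = dilated_causal_conv(reconstructed, kernel, dilation=1)
slow = dilated_causal_conv(reconstructed, kernel, dilation=8)
fused = fast + slow
```

With these synthetic components, the noise-dominated mode falls below the correlation threshold and is discarded, while the transient, oscillation, and drift modes are retained and passed to the two convolutional branches.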
