Electronics, Vol. 14, Pages 2378: iTransformer-FFC: A Frequency-Aware Transformer Framework for Multi-Scale Time Series Forecasting
Electronics doi: 10.3390/electronics14122378
Authors:
Yongli Tang
Zhongqi Cai
Capturing complex temporal dependencies across multiple scales remains a fundamental challenge in time series forecasting. Transformer-based models have achieved impressive performance on sequence tasks, but vanilla designs often struggle to integrate information from both local fluctuations and global trends, especially in non-stationary sequences. We propose iTransformer-FFC, a novel forecasting framework that addresses these issues through frequency-domain analysis and multi-scale feature fusion. In particular, iTransformer-FFC introduces a Fast Fourier Convolution (FFC) module that transforms time series data into the frequency domain, isolating dominant periodic components and attenuating noise before attention is applied. A hierarchical feature fusion mechanism then integrates features extracted at multiple temporal resolutions to jointly model global and local temporal patterns, while a factorized self-attention architecture reduces the quadratic complexity of standard Transformers, improving efficiency without sacrificing accuracy. Together, these innovations enable more effective long-range dependency modeling and adaptability to regime shifts in the data. Extensive experiments on five public benchmark datasets demonstrate that iTransformer-FFC consistently outperforms state-of-the-art models, including the vanilla Transformer, an earlier iTransformer variant, and PatchTST. Notably, our model achieves on average an 8.73% lower MSE and 6.95% lower MAE than the best-performing baseline, confirming its superior predictive accuracy and generalization in multi-scale time series forecasting.
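The frequency-domain denoising idea behind the FFC module can be illustrated with a minimal sketch: transform a series with an FFT, keep only the dominant frequency components, and invert. This is a hypothetical simplification; the paper's actual FFC module (with learned spectral convolutions) is not specified in the abstract, and `keep_k` is an assumed parameter.

```python
import numpy as np

def frequency_filter(x, keep_k=4):
    """Retain only the keep_k dominant frequency components of a 1-D series.

    Illustrative stand-in for FFT-based periodicity isolation and noise
    attenuation; not the paper's FFC implementation.
    """
    spec = np.fft.rfft(x)
    # zero out all but the keep_k largest-magnitude frequency bins
    drop = np.argsort(np.abs(spec))[:-keep_k]
    spec[drop] = 0
    return np.fft.irfft(spec, n=len(x))

# A noisy sinusoid: filtering should recover the dominant period.
rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, 256)
signal = np.sin(t)
noisy = signal + 0.3 * rng.standard_normal(256)
denoised = frequency_filter(noisy, keep_k=4)
```

Because the noise energy is spread across many frequency bins while the signal concentrates in a few, keeping only the dominant bins suppresses most of the noise before any downstream attention is applied.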
Source: www.mdpi.com