This is an early access version; the complete PDF, HTML, and XML versions will be available soon.
Article

An Adaptive Learning Time Series Forecasting Model Based on Decoder Framework

Jianlong Hao and Qiwei Sun *
School of Information, Shanxi University of Finance and Economics, Taiyuan 030006, China
* Author to whom correspondence should be addressed.
Mathematics 2025, 13(3), 490; https://doi.org/10.3390/math13030490
Submission received: 11 December 2024 / Revised: 24 January 2025 / Accepted: 29 January 2025 / Published: 31 January 2025

Abstract

Time series forecasting is a fundamental technique for analyzing dynamic changes in temporal data and predicting future trends across various domains. Effective modeling, however, is challenged by complex factors such as accurately capturing relationships among temporally distant data points and accommodating rapid shifts in the data distribution over time. While Transformer-based models have recently demonstrated remarkable capabilities in handling long-range dependencies, directly applying them to the evolving distributions within temporal datasets remains a challenging task. To tackle these issues, this paper presents a sequence-to-sequence adaptive learning approach centered on a decoder framework for temporal modeling tasks. An end-to-end Transformer decoding framework is introduced that adaptively discerns the interdependencies within temporal data. Experiments carried out on multiple datasets indicate that the decoder-based adaptive learning model achieves an overall reduction of 2.6% in MSE (Mean Squared Error) loss and 1.8% in MAE (Mean Absolute Error) loss compared with the most advanced Transformer-based time series forecasting model.
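For readers unfamiliar with the decoder-only setup referred to in the abstract, the sketch below is a minimal illustrative example in PyTorch, not the authors' code: a causally masked self-attention stack over embedded past values with a direct multi-step projection head. All layer choices, hyperparameters, and the omission of positional encodings are simplifying assumptions made here for illustration.

```python
# Minimal sketch of a decoder-only Transformer forecaster (illustrative only).
# Assumptions: value embedding via a linear layer, no positional encoding,
# direct multi-step forecasting from the last observed time step.
import torch
import torch.nn as nn

class DecoderOnlyForecaster(nn.Module):
    def __init__(self, n_vars: int, d_model: int = 64, n_heads: int = 4,
                 n_layers: int = 2, horizon: int = 96):
        super().__init__()
        self.embed = nn.Linear(n_vars, d_model)            # value embedding
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True)
        # "Decoder-only" here means a self-attention stack with a causal mask,
        # so each time step attends only to earlier steps.
        self.blocks = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, n_vars * horizon)    # direct multi-step head
        self.horizon, self.n_vars = horizon, n_vars

    def forward(self, x):                                   # x: (batch, seq_len, n_vars)
        seq_len = x.size(1)
        # additive causal mask: -inf above the diagonal blocks future positions
        causal = torch.triu(torch.full((seq_len, seq_len), float("-inf")), diagonal=1)
        h = self.blocks(self.embed(x), mask=causal)
        out = self.head(h[:, -1])                            # forecast from last step
        return out.view(-1, self.horizon, self.n_vars)

# Usage: predict 96 future steps of a 7-variate series from 336 observed steps.
model = DecoderOnlyForecaster(n_vars=7, horizon=96)
y_hat = model(torch.randn(32, 336, 7))                      # shape (32, 96, 7)
```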
Keywords: time series forecasting; Transformer; decoder-only; concept drift; low-rank decomposition
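The abstract reports relative reductions in MSE and MAE. As a reference, the sketch below shows the standard definitions of these two metrics; this is an assumption for illustration, since the paper's exact evaluation protocol is not reproduced on this page.

```python
# Standard definitions of the two metrics quoted in the abstract (assumed here).
import numpy as np

def mse(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Mean Squared Error averaged over all forecast points."""
    return float(np.mean((y_true - y_pred) ** 2))

def mae(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Mean Absolute Error averaged over all forecast points."""
    return float(np.mean(np.abs(y_true - y_pred)))
```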

Share and Cite

MDPI and ACS Style

Hao, J.; Sun, Q. An Adaptive Learning Time Series Forecasting Model Based on Decoder Framework. Mathematics 2025, 13, 490. https://doi.org/10.3390/math13030490

AMA Style

Hao J, Sun Q. An Adaptive Learning Time Series Forecasting Model Based on Decoder Framework. Mathematics. 2025; 13(3):490. https://doi.org/10.3390/math13030490

Chicago/Turabian Style

Hao, Jianlong, and Qiwei Sun. 2025. "An Adaptive Learning Time Series Forecasting Model Based on Decoder Framework" Mathematics 13, no. 3: 490. https://doi.org/10.3390/math13030490

APA Style

Hao, J., & Sun, Q. (2025). An Adaptive Learning Time Series Forecasting Model Based on Decoder Framework. Mathematics, 13(3), 490. https://doi.org/10.3390/math13030490

Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
