Schedule-free optimization of the transformers-based time series forecasting model

Kyrylo Yemets, Michal Greguš

Abstract


The task of time series forecasting is important for many scientific, technical, and applied fields, such as finance, economics, meteorology, medicine, transportation, and telecommunications. Existing methods, such as autoregressive and moving-average models, have limitations, especially when working with non-stationary and seasonal data. In this work, the basic transformer architecture was modified to solve time series forecasting problems. Additionally, state-of-the-art optimizers were investigated and experimentally compared, including AdamW, stochastic gradient descent (SGD), and new methods such as schedule-free SGD and schedule-free AdamW, to improve forecasting accuracy and the efficiency of the training procedure for the transformer architecture. Modeling was conducted on meteorological data that included seasonal time series. The accuracy of the optimization methods studied in this work was evaluated using a range of performance indicators. The results showed that the new optimization methods significantly improve forecasting accuracy compared to traditional optimizers.
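The schedule-free methods compared in the abstract replace the usual learning-rate schedule with an interpolation/averaging scheme: gradients are evaluated at an interpolation point between a fast iterate and a running average, and the average is what is used at evaluation time. The following is a minimal NumPy sketch of the schedule-free SGD update in the style of Defazio et al.; the toy quadratic objective and the hyperparameter values (`lr`, `beta`) are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

def schedule_free_sgd(grad_fn, w0, lr=0.1, beta=0.9, steps=2000):
    """Schedule-free SGD sketch: the gradient is taken at an interpolation y
    of the fast iterate z and the averaged iterate x; x is returned for
    evaluation. No learning-rate schedule is used."""
    z = np.asarray(w0, dtype=float)  # fast (gradient-step) iterate
    x = z.copy()                     # averaged iterate used at eval time
    for t in range(1, steps + 1):
        y = (1.0 - beta) * z + beta * x  # interpolation point for the gradient
        z = z - lr * grad_fn(y)          # plain SGD step on z, constant lr
        c = 1.0 / t                      # uniform averaging weight
        x = (1.0 - c) * x + c * z        # running average of the z iterates
    return x

if __name__ == "__main__":
    # Toy problem: minimize ||w - target||^2; x ends up near the target.
    target = np.array([3.0, -1.0])
    grad = lambda w: 2.0 * (w - target)  # gradient of the quadratic
    print(schedule_free_sgd(grad, np.zeros(2)))
```

In practice the schedule-free AdamW variant follows the same interpolation/averaging pattern with Adam-style preconditioned steps on `z`; this sketch only illustrates the SGD case.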

Keywords


AdaGrad; Forecasting; RMSprop; Schedule-Free AdamW; Schedule-Free SGD; Time series; Transformers



DOI: http://doi.org/10.11591/ijai.v14.i2.pp1067-1076



This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

IAES International Journal of Artificial Intelligence (IJ-AI)
ISSN/e-ISSN 2089-4872/2252-8938
This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).
