Academic Paper

Correlation Recurrent Units: A Novel Neural Architecture for Improving the Predictive Performance of Time-Series Data
Document Type
Periodical
Author
Source
IEEE Transactions on Pattern Analysis and Machine Intelligence, 45(12):14266-14283, Dec. 2023
Subject
Computing and Processing
Bioengineering
Predictive models
Computer architecture
Transformers
Time series analysis
Forecasting
Data models
Logic gates
Time-series forecasting
STL state
autocorrelation gate
correlation gate
correlation recurrent unit
Language
ISSN
0162-8828
2160-9292
1939-3539
Abstract
Time-series forecasting (TSF) is a traditional problem in the field of artificial intelligence, and models such as recurrent neural networks, long short-term memory, and gated recurrent units have contributed to improving its predictive accuracy. Furthermore, model structures have been proposed that combine time-series decomposition methods such as seasonal-trend decomposition using LOESS (STL). However, these approaches train an independent model for each decomposed component and therefore cannot learn the relationships between the time-series components. In this study, we propose a new neural architecture called a correlation recurrent unit (CRU) that can perform time-series decomposition within a neural cell and learn the correlations (autocorrelation and cross-correlation) between the decomposition components. The proposed neural architecture was evaluated through comparative experiments with previous studies using four univariate and four multivariate time-series datasets. The results showed that both long- and short-term predictive performance improved by more than 10%. The experimental results indicate that the proposed CRU is an excellent method for TSF problems compared to other neural architectures.
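The abstract contrasts the conventional pipeline, which first decomposes a series into trend, seasonal, and residual components and then fits an independent forecaster to each, with the CRU, which decomposes inside the recurrent cell so that correlations between components can be learned. A minimal sketch of that conventional decomposition step is shown below; a centered moving average stands in for STL's LOESS smoother, and the function name is illustrative, not from the paper:

```python
import math

def stl_like_decompose(y, period):
    """Additive decomposition y[i] = trend[i] + seasonal[i] + residual[i].

    A centered moving average replaces STL's LOESS smoother here;
    this is only an illustrative sketch of the pipeline the paper
    builds on, not the CRU itself.
    """
    n = len(y)
    half = period // 2
    # Trend: centered moving average over roughly one period.
    trend = []
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        trend.append(sum(y[lo:hi]) / (hi - lo))
    # Seasonal: mean of the detrended values at each phase of the period.
    detrended = [yi - ti for yi, ti in zip(y, trend)]
    profile = [
        sum(detrended[p::period]) / len(detrended[p::period])
        for p in range(period)
    ]
    seasonal = [profile[i % period] for i in range(n)]
    # Residual: whatever trend and seasonal do not explain.
    residual = [y[i] - trend[i] - seasonal[i] for i in range(n)]
    return trend, seasonal, residual

# In the pipeline the abstract criticizes, each of these three series
# would now be fed to its own independent forecasting model, so any
# correlation between components is lost -- the gap the CRU targets.
```

By construction the three components sum back to the original series, which is the property an additive decomposition must preserve before any per-component forecasting takes place.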