Academic Paper

Transformers for Modeling Long-Term Dependencies in Time Series Data: A Review
Document Type
Conference
Source
2023 IEEE Signal Processing in Medicine and Biology Symposium (SPMB), pp. 1-5, Dec. 2023
Subject
Bioengineering
Signal Processing and Analysis
Deep learning
Recurrent neural networks
Time series analysis
Finance
Medical services
Big Data
Signal processing
Climate change
Language
English
ISSN
2473-716X
Abstract
Analysis of time series data for classification or prediction tasks is valuable in applications such as healthcare, climate studies, and finance. As big data resources have recently become available in a number of fields such as healthcare [1], finance [3]–[5], and climate change [6], it is now possible to apply state-of-the-art deep learning models. Traditional methods such as the autoregressive integrated moving average (ARIMA) model [7], long short-term memory networks (LSTMs) [8], gated recurrent units (GRUs) [9], and recurrent neural networks (RNNs) [10] have provided robust frameworks for the analysis of time series data. However, these methods have limitations when applied to big data sets and when used to model long-term dependencies. The emergence of transformer-based architectures [11], as shown in Figure 1, and technologies such as ChatGPT [12] has demonstrated the potential for analyzing time series data with long-term dependencies and for advancing basic science by discovering new underlying structure. In this review, we provide a detailed analysis of the state of the art in deep learning systems that model long-term context.
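To make the contrast with recurrent models concrete, below is a minimal sketch, not taken from the paper, of how a transformer encoder can be applied to a univariate time series for one-step-ahead forecasting. It assumes PyTorch; the class name `TimeSeriesTransformer` and the hyperparameters (window length 128, model width 64, 4 heads, 2 layers) are illustrative choices, not values reported in the review. The key point it illustrates is that self-attention lets every time step attend to every other step directly, rather than propagating information sequentially as an RNN or LSTM does.

```python
# Illustrative sketch only: a transformer encoder over a scalar time series.
# All names and hyperparameters here are assumptions for demonstration.
import torch
import torch.nn as nn

class TimeSeriesTransformer(nn.Module):
    def __init__(self, d_model=64, nhead=4, num_layers=2, seq_len=128):
        super().__init__()
        self.input_proj = nn.Linear(1, d_model)            # embed scalar observations
        self.pos_emb = nn.Parameter(torch.zeros(seq_len, d_model))  # learned positions
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.head = nn.Linear(d_model, 1)                  # one-step-ahead forecast

    def forward(self, x):                                  # x: (batch, seq_len, 1)
        h = self.input_proj(x) + self.pos_emb              # broadcast over the batch
        h = self.encoder(h)                                # attention spans all steps
        return self.head(h[:, -1])                         # predict from last position

model = TimeSeriesTransformer()
x = torch.randn(8, 128, 1)                                 # 8 windows of length 128
print(model(x).shape)                                      # torch.Size([8, 1])
```

Because attention connects any two positions in one hop, the path length for a dependency spanning the whole window is constant, which is one reason transformer-based models are attractive for the long-term dependencies discussed above.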