Academic Journal Article

Transfer learning model for cash-instrument prediction adopting a Transformer derivative
Document Type
article
Source
Journal of King Saud University: Computer and Information Sciences, Vol 36, Iss 3, Pp 102000- (2024)
Subject
Price Prediction
Transfer Learning
Deep Learning
Transformer Model
Autocorrelation
Electronic computers. Computer science
QA75.5-76.95
Language
English
ISSN
1319-1578
Abstract
Investors aiming for high market returns must accurately predict the prices of various cash instruments. However, making accurate predictions is challenging because of the complex cyclic and trending characteristics of markets, which exhibit high volatility and unpredictable fluctuations. Furthermore, many studies overlook how interactions between different markets affect price movements. To address these problems, this research introduces a deep transfer-learning approach derived from the Transformer model, named the rotary-positional-encoding autocorrelation Transformer (RAT). Unlike traditional methods, the RAT employs autocorrelation instead of self-attention to capture periodic features more effectively, while rotary positional encoding preserves both absolute and relative positions within sequences to improve trend understanding. Through transfer learning, the RAT model extracts deep features from a source domain and applies them to a target domain, outperforming LSTM, CNN-LSTM, gated recurrent unit (GRU), and Transformer models in multi-day predictions across 12 cash-instrument datasets. It achieved a substantial increase in accuracy, with a 35.83% reduction in mean squared error (MSE), a 23.95% reduction in mean absolute error (MAE), and a 32.63% increase in the coefficient of determination (R²). This study validates the RAT model's effectiveness in predicting financial-instrument prices.
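
Illustrative sketch (not from the paper)
The abstract names two mechanisms: rotary positional encoding on the query/key projections and an autocorrelation block that replaces dot-product self-attention. The single-head sketch below illustrates one plausible reading of those ideas in PyTorch, with FFT-based lag scoring in the style of Autoformer-type autocorrelation. All shapes, hyperparameters (e.g. `top_k`), and layer names are assumptions for illustration, not the authors' published RAT implementation.

```python
# Hedged sketch: rotary positional encoding + autocorrelation in place of self-attention.
# Hyperparameters, shapes, and the single-head layout are illustrative assumptions only.
import torch
import torch.nn as nn


def apply_rotary(x: torch.Tensor) -> torch.Tensor:
    """Rotate consecutive channel halves of x (batch, length, dim) by position-dependent angles."""
    _, length, dim = x.shape
    half = dim // 2
    # Standard RoPE-style frequencies: theta_i = 10000^(-i/half)
    freqs = 10000 ** (-torch.arange(0, half, dtype=torch.float32) / half)
    angles = torch.arange(length, dtype=torch.float32)[:, None] * freqs[None, :]  # (length, half)
    cos, sin = angles.cos(), angles.sin()
    x1, x2 = x[..., :half], x[..., half:]
    # The rotation encodes absolute position (the angle) while relative position
    # survives in angle differences between time steps.
    return torch.cat([x1 * cos - x2 * sin, x1 * sin + x2 * cos], dim=-1)


class AutoCorrelationBlock(nn.Module):
    """Replace dot-product self-attention with lag-based aggregation scored by autocorrelation."""

    def __init__(self, dim: int, top_k: int = 3):
        super().__init__()
        self.q_proj = nn.Linear(dim, dim)
        self.k_proj = nn.Linear(dim, dim)
        self.v_proj = nn.Linear(dim, dim)
        self.out_proj = nn.Linear(dim, dim)
        self.top_k = top_k

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        q = apply_rotary(self.q_proj(x))
        k = apply_rotary(self.k_proj(x))
        v = self.v_proj(x)
        # Autocorrelation over the time axis via the Wiener-Khinchin theorem.
        q_f = torch.fft.rfft(q, dim=1)
        k_f = torch.fft.rfft(k, dim=1)
        corr = torch.fft.irfft(q_f * torch.conj(k_f), n=x.size(1), dim=1)  # (batch, length, dim)
        scores = corr.mean(dim=-1)                                          # per-lag score: (batch, length)
        weights, lags = torch.topk(scores, self.top_k, dim=1)
        weights = torch.softmax(weights, dim=1)
        # Aggregate the value series rolled by the most correlated lags.
        out = torch.zeros_like(v)
        for i in range(self.top_k):
            for b in range(x.size(0)):
                out[b] += weights[b, i] * torch.roll(v[b], shifts=-int(lags[b, i]), dims=0)
        return self.out_proj(out)


if __name__ == "__main__":
    # Toy usage: a batch of 4 price windows, 64 time steps, 16 features per step.
    block = AutoCorrelationBlock(dim=16)
    series = torch.randn(4, 64, 16)
    print(block(series).shape)  # torch.Size([4, 64, 16])
```

In a transfer-learning setup such as the one the abstract describes, a block like this would be pretrained on the source-domain series and then reused (optionally with frozen weights) when fitting the target-domain instrument; that training procedure is not shown here.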