Academic Paper

Multi-attention based Feature Embedding for Irregular Asynchronous Time Series Modelling
Document Type
Conference
Author
Source
IECON 2023 - 49th Annual Conference of the IEEE Industrial Electronics Society, pp. 1-6, Oct. 2023
Subject
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
Computing and Processing
Engineering Profession
Power, Energy and Industry Applications
Robotics and Control Systems
Signal Processing and Analysis
Transportation
Training
Industrial electronics
Machine learning algorithms
Time series analysis
Transfer learning
Statistical learning
Training data
Predictive models
Minimization
Forecasting
Language
English
ISSN
2577-1647
Abstract
Forecasting time series values based on historical covariates has been an active area of research in statistics and machine learning. With the availability of computation resources and big-data infrastructure supporting massive volume, velocity and variety, the algorithms have evolved from classic statistical learning to neural-network-driven loss-minimisation techniques. While state-of-the-art attention- and self-attention-based transformers have shown promise of improved performance given sufficient training data, most of them fail to generalise to different time-series modelling problems (such as classification and extremum forecasting) with asynchronously sampled covariates. This paper introduces the concept of a generalised time series embedding and transfer learning for time series (analogous to token-to-vector or image-to-vector embeddings in language and vision models, respectively) that allow joint training through a unified interface. The major benefit of this work is a unified embedding model employing multi-attention for feature representation, which enables benchmark performance against state-of-the-art models from recent literature.
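To make the general idea of an attention-based embedding for irregular, asynchronously sampled series concrete, the short PyTorch sketch below maps a padded set of (time, variable, value) observation triplets to a fixed-length feature vector via multi-head self-attention and masked mean pooling. This is an illustrative sketch only, not the authors' architecture: the triplet encoding, the class name MultiAttentionSeriesEmbedding, and all dimensions and layer choices are assumptions made here for exposition.

# Minimal sketch (assumed design, not the paper's model): embed each
# (time, variable-id, value) observation, mix observations with multi-head
# self-attention, and mean-pool into one vector usable by forecasting or
# classification heads sharing a unified interface.
import torch
import torch.nn as nn


class MultiAttentionSeriesEmbedding(nn.Module):
    def __init__(self, n_variables: int, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        self.value_proj = nn.Linear(1, d_model)               # scalar observation value -> d_model
        self.time_proj = nn.Linear(1, d_model)                # continuous timestamp -> d_model
        self.var_embed = nn.Embedding(n_variables, d_model)   # identity of the observed covariate
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, times, var_ids, values, pad_mask):
        # times, values: (batch, n_obs, 1); var_ids: (batch, n_obs);
        # pad_mask: (batch, n_obs) with True marking padded slots.
        x = self.value_proj(values) + self.time_proj(times) + self.var_embed(var_ids)
        attn_out, _ = self.attn(x, x, x, key_padding_mask=pad_mask)
        x = self.norm(x + attn_out)
        keep = (~pad_mask).unsqueeze(-1).float()               # exclude padding from the pooled average
        return (x * keep).sum(dim=1) / keep.sum(dim=1).clamp(min=1.0)


# Usage example: 3 covariates observed at unaligned timestamps, padded to 5 observations.
emb = MultiAttentionSeriesEmbedding(n_variables=3)
times = torch.rand(2, 5, 1)
var_ids = torch.randint(0, 3, (2, 5))
values = torch.randn(2, 5, 1)
pad_mask = torch.zeros(2, 5, dtype=torch.bool)
print(emb(times, var_ids, values, pad_mask).shape)             # torch.Size([2, 64])

Because every observation carries its own timestamp and variable identity, the sketch needs no resampling onto a regular grid, which is the property the abstract attributes to the proposed embedding; how the paper actually combines its multiple attention mechanisms is not reproduced here.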