Academic Paper

Compatible Transformer for Irregularly Sampled Multivariate Time Series
Document Type
Conference
Source
2023 IEEE International Conference on Data Mining (ICDM), pp. 1409-1414, Dec. 2023
Subject
Communication, Networking and Broadcast Technologies
Computing and Processing
Time series analysis
Transformer cores
Transformers
Time measurement
Robustness
Data mining
Task analysis
multivariate time series
irregular sampling
Language
English
ISSN
2374-8486
Abstract
To analyze multivariate time series, most previous methods assume regular sampling, where the interval between adjacent measurements and the number of samples remain fixed. In practice, data collection systems can produce irregularly sampled time series due to sensor failures and interventions. Existing methods designed for regularly sampled multivariate time series cannot directly handle this irregularity, owing to misalignment along both the temporal and variate dimensions. To fill this gap, we propose Compatible Transformer (CoFormer), a transformer-based encoder that achieves comprehensive temporal-interaction feature learning for each individual sample in an irregular multivariate time series. In CoFormer, we view each sample as a unique variate-time point and leverage intra-variate/inter-variate attentions to learn sample-wise temporal/interaction features based on intra-variate/inter-variate neighbors. With CoFormer as the core, we can analyze irregularly sampled multivariate time series for many downstream tasks, including classification and prediction. We conduct extensive experiments on three real-world datasets and validate that the proposed CoFormer significantly and consistently outperforms existing methods. Code will be available at https://github.com/MediaBrain-SJTU/CoFormer.
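To make the abstract's central idea concrete, the sketch below illustrates (in plain Python) what it means to treat each sample of an irregular multivariate series as a variate-time point with separate intra-variate and inter-variate neighbor sets. All names and the toy proximity-weighted aggregation are illustrative assumptions, not the paper's actual attention mechanism.

```python
import math

# Each sample is a "variate-time point": (variate_id, timestamp, value).
# Timestamps are uneven and per-variate counts differ, i.e. the series is
# irregularly sampled along both temporal and variate dimensions.
samples = [
    (0, 0.0, 1.0), (0, 1.5, 2.0), (0, 3.1, 1.5),  # variate 0, uneven gaps
    (1, 0.2, 0.5), (1, 2.8, 0.7),                 # variate 1, fewer samples
]

def neighbors(i, intra=True):
    """Indices of intra-variate (same variate) or inter-variate neighbors."""
    vi = samples[i][0]
    return [j for j in range(len(samples))
            if j != i and ((samples[j][0] == vi) == intra)]

def attend(i, idxs, tau=1.0):
    """Toy stand-in for attention: weight neighbor values by temporal
    proximity (closer in time -> larger weight), then average."""
    if not idxs:
        return samples[i][2]
    ti = samples[i][1]
    scores = [math.exp(-abs(samples[j][1] - ti) / tau) for j in idxs]
    z = sum(scores)
    return sum(w / z * samples[j][2] for w, j in zip(scores, idxs))

# Sample-wise feature: (temporal aggregate from intra-variate neighbors,
# interaction aggregate from inter-variate neighbors).
feats = [(attend(i, neighbors(i, True)), attend(i, neighbors(i, False)))
         for i in range(len(samples))]
```

In CoFormer itself, the two neighbor sets feed learned intra-variate and inter-variate attention layers rather than this fixed proximity weighting; the sketch only shows how per-sample indexing sidesteps the need for aligned grids.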