Academic Paper

Low-Complexity Low-Rank Approximation SVD for Massive Matrix in Tensor Train Format
Document Type
Conference
Source
ICASSP 2023 - 2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 1-5, Jun. 2023
Subject
Bioengineering
Communication, Networking and Broadcast Technologies
Computing and Processing
Signal Processing and Analysis
Tensors
Simulation
Software algorithms
Signal processing algorithms
Signal processing
Approximation algorithms
Software
Language
English
ISSN
2379-190X
Abstract
Tensor train decomposition (TTD) has recently been proposed for high-dimensional signals because it can substantially reduce storage in various signal processing applications. This paper presents a low-rank approximation algorithm for the singular value decomposition (SVD) of large-scale matrices in tensor train format (TT-format). The proposed alternating least squares block power SVD (ALS-BPSVD) algorithm reduces computational complexity by replacing the large-scale SVD with a low-rank approximation scheme that uses a fixed-iteration block power method to search for singular values and vectors. Moreover, a low-complexity two-step truncation scheme is proposed to further reduce complexity and facilitate parallel processing. The ALS-BPSVD algorithm supports low-rank approximation SVD for matrices with dimensions larger than 2^11 × 2^11. Simulation results show that ALS-BPSVD achieves up to a 21.3-times speed-up over the benchmark ALS-SVD algorithm on random matrices with prescribed singular values.
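The core idea the abstract describes, replacing a full SVD with a fixed number of block power (subspace) iterations that converge to the dominant singular subspace, can be sketched on an ordinary dense matrix. The snippet below is an illustrative NumPy sketch only: the function name block_power_svd and its parameters are hypothetical, and it does not reproduce the paper's TT-format arithmetic, ALS coupling, or two-step truncation scheme.

```python
import numpy as np

def block_power_svd(A, k, num_iters=10, seed=0):
    """Rank-k SVD approximation via a fixed-iteration block power
    (subspace) method. Illustrative sketch; the paper's ALS-BPSVD
    additionally operates on matrices kept in tensor train format."""
    m, n = A.shape
    rng = np.random.default_rng(seed)
    # Random block spanning an initial k-dimensional subspace.
    Q = np.linalg.qr(rng.standard_normal((n, k)))[0]
    for _ in range(num_iters):
        # Alternate A and A^T multiplications, re-orthonormalizing so
        # the block converges to the dominant singular subspaces.
        Q = np.linalg.qr(A @ Q)[0]    # m x k: left subspace estimate
        Q = np.linalg.qr(A.T @ Q)[0]  # n x k: right subspace estimate
    # Project A onto the subspace; a small k x k SVD then recovers the
    # approximate top-k singular values and vectors.
    B = A @ Q                          # m x k
    U, s, Wt = np.linalg.svd(B, full_matrices=False)
    V = Q @ Wt.T
    return U, s, V                     # A ~= U @ np.diag(s) @ V.T

# Usage: relative error of a rank-20 approximation of a 512 x 512 matrix.
A = np.random.default_rng(1).standard_normal((512, 512))
U, s, V = block_power_svd(A, k=20)
print(np.linalg.norm(A - U @ np.diag(s) @ V.T) / np.linalg.norm(A))
```

Because the iteration count is fixed rather than adaptive, the cost of extracting each block of singular triplets is predictable, which is what makes this style of scheme amenable to the parallel processing the abstract mentions.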