Journal Article

Channel Estimation for Quantized Systems Based on Conditionally Gaussian Latent Models
Document Type
Periodical
Source
IEEE Transactions on Signal Processing, vol. 72, pp. 1475-1490, 2024
Subject
Signal Processing and Analysis
Communication, Networking and Broadcast Technologies
Computing and Processing
Keywords
Channel estimation
Quantization (signal)
Adaptation models
Training
Channel models
Vectors
Signal to noise ratio
generative latent model
coarse quantization
Bussgang theorem
covariance recovery
Language
English
ISSN
1053-587X (Print)
1941-0476 (Electronic)
Abstract
This work introduces a novel class of channel estimators tailored to coarsely quantized systems. The proposed estimators are founded on conditionally Gaussian latent generative models, specifically Gaussian mixture models (GMMs), mixtures of factor analyzers (MFAs), and variational autoencoders (VAEs). These models effectively learn the unknown channel distribution inherent in radio propagation scenarios, providing valuable prior information. Conditioning on the latent variable of these generative models yields a locally Gaussian channel distribution, thus enabling the application of the well-known Bussgang decomposition. By exploiting the resulting conditional Bussgang decomposition, we derive parameterized linear minimum mean square error (MMSE) estimators for the considered generative latent variable models. In this context, we explore leveraging model-based structural features to reduce the memory and complexity overhead associated with the proposed estimators. Furthermore, we devise the necessary training adaptations, enabling direct learning of the generative models from quantized pilot observations without requiring ground-truth channel samples during the training phase. Through extensive simulations, we demonstrate the superiority of our introduced estimators over existing state-of-the-art methods for coarsely quantized systems, as evidenced by significant improvements in mean square error (MSE) and achievable rate metrics.
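As a rough illustration of the estimator family described in the abstract, the following Python sketch combines a Gaussian mixture prior with the Bussgang decomposition for one-bit quantized observations: each mixture component is treated as a locally Gaussian channel model, a per-component LMMSE estimate is formed via the Bussgang gain and arcsine law, and the estimates are combined with component responsibilities. It assumes a real-valued channel, zero-mean mixture components, an identity pilot matrix, and a Gaussian surrogate likelihood for the responsibilities; the function names and these simplifications are illustrative and do not reproduce the paper's exact estimators.

import numpy as np

def one_bit_bussgang(C_y):
    # Bussgang gain and output covariance for r = sign(y), y ~ N(0, C_y), real-valued.
    d = np.sqrt(np.diag(C_y))
    A = np.sqrt(2.0 / np.pi) * np.diag(1.0 / d)              # Bussgang gain matrix
    R = C_y / np.outer(d, d)                                  # correlation matrix of y
    C_r = (2.0 / np.pi) * np.arcsin(np.clip(R, -1.0, 1.0))    # arcsine law for sign(y)
    return A, C_r

def gmm_quantized_lmmse(r, C_h_list, weights, sigma2):
    # Conditionally Gaussian LMMSE estimate of h from r = sign(h + n), n ~ N(0, sigma2 I),
    # under a zero-mean GMM prior with component covariances C_h_list and weights.
    N = r.shape[0]
    log_post = np.zeros(len(weights))
    est = np.zeros((len(weights), N))
    for k, (C_h, w) in enumerate(zip(C_h_list, weights)):
        C_y = C_h + sigma2 * np.eye(N)                  # unquantized receive covariance
        A, C_r = one_bit_bussgang(C_y)
        C_hr = C_h @ A                                   # cross-covariance of h and r (A is diagonal)
        est[k] = C_hr @ np.linalg.solve(C_r, r)          # per-component LMMSE estimate
        _, logdet = np.linalg.slogdet(C_r)               # Gaussian surrogate likelihood (assumption)
        log_post[k] = np.log(w) - 0.5 * (r @ np.linalg.solve(C_r, r) + logdet)
    post = np.exp(log_post - log_post.max())
    post /= post.sum()                                   # responsibilities p(component | r)
    return post @ est                                    # mixture of per-component estimates

# Toy usage with a hypothetical two-component prior for an 8-dimensional channel:
rng = np.random.default_rng(0)
C1 = np.eye(8)
C2 = 0.5 * np.eye(8) + 0.5 * np.ones((8, 8))
h = rng.multivariate_normal(np.zeros(8), C2)
r = np.sign(h + np.sqrt(0.1) * rng.standard_normal(8))
h_hat = gmm_quantized_lmmse(r, [C1, C2], [0.5, 0.5], 0.1)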