Academic Paper

Noise Variance Estimation in DS-CDMA and its Effects on the Individually Optimum Receiver
Document Type
Conference
Source
2006 IEEE 7th Workshop on Signal Processing Advances in Wireless Communications (SPAWC '06), pp. 1-5, Jul. 2006
Subject
Signal Processing and Analysis
Computing and Processing
Communication, Networking and Broadcast Technologies
Multiaccess communication
Bit error rate
Detectors
Signal to noise ratio
Turbo codes
Decoding
Analysis of variance
Transmitters
Matched filters
Language
English
ISSN
1948-3244
1948-3252
Abstract
In the context of synchronous random DS-CDMA (Direct-Sequence Code Division Multiple Access) communications over a mobile network, the receiver that minimizes the per-user bit error rate (BER) is the symbol maximum a posteriori (MAP) detector. This receiver is derived under the hypothesis of perfect channel state information at the receiver. In this paper we consider the case where the channel noise variance is estimated and analyze the effect of this mismatch. We show that the BER is piecewise monotonic with respect to the estimated noise variance, reaching its minimum at the true channel variance. We also provide an upper bound on the performance of the individually optimum receiver under noise variance mismatch. We thus give a theoretical justification for the usual bias toward noise variance underestimation adopted by the community.
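As a rough illustration of the mismatch effect described in the abstract, the sketch below simulates a toy two-user synchronous random DS-CDMA link and evaluates the individually optimum (symbol-MAP) detector when its likelihood uses an assumed noise variance that may differ from the true one. The user count, spreading gain, signature normalization, variance grid, and Monte-Carlo setup are illustrative assumptions, not the paper's experimental setup; under these assumptions the measured BER should be smallest near the true variance, consistent with the piecewise-monotonic behavior the abstract claims.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (our assumptions): K users, spreading gain N, BPSK symbols.
K, N = 2, 8
sigma2_true = 0.5                                       # true channel noise variance
S = rng.choice([-1.0, 1.0], size=(N, K)) / np.sqrt(N)   # random binary signatures, unit energy
B = np.array([[+1, +1], [+1, -1], [-1, +1], [-1, -1]])  # all 2^K symbol vectors
M = B @ S.T                                             # noiseless received point per hypothesis

def map_ber(sigma2_hat, n_trials=100_000):
    """Monte-Carlo BER of the individually optimum (symbol-MAP) detector
    when its likelihood assumes variance sigma2_hat instead of the true one."""
    b = rng.choice([-1.0, 1.0], size=(n_trials, K))
    r = b @ S.T + rng.normal(scale=np.sqrt(sigma2_true), size=(n_trials, N))
    # Squared distance from each received vector to each hypothesis point.
    d2 = ((r[:, None, :] - M[None, :, :]) ** 2).sum(axis=2)
    # Posterior weights under the assumed variance and equal priors
    # (the per-row minimum is subtracted for numerical stability; it
    # cancels in the normalization below).
    w = np.exp(-(d2 - d2.min(axis=1, keepdims=True)) / (2.0 * sigma2_hat))
    # Per-user MAP decision: sign of the posterior mean of each bit.
    soft = (w @ B) / w.sum(axis=1, keepdims=True)
    return np.mean(np.sign(soft) != b)

for s2 in [0.1, 0.25, 0.5, 1.0, 2.0]:   # grid around the true variance 0.5
    print(f"assumed variance {s2:4.2f}: BER = {map_ber(s2):.4f}")
```

Note that enumerating all 2^K hypotheses is only feasible for a small number of users: the complexity of the individually optimum receiver grows exponentially in K.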