Academic Paper

Effects of Mismatched Training on Adaptive Detection
Document Type
Conference
Source
2018 52nd Asilomar Conference on Signals, Systems, and Computers, pp. 2081-2085, Oct. 2018
Subject
Bioengineering
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
Computing and Processing
Signal Processing and Analysis
Covariance matrices
Training
Detectors
Interference suppression
Mathematical model
Random variables
Loss factor for mismatched training
Detection performance of CFAR algorithms for mismatched training
Language
English
ISSN
2576-2303
Abstract
Interference cancellation in the adaptive radar detection context typically relies on training samples to estimate the covariance matrix of the interference and noise in the test vector. Adaptive detection algorithms are generally developed under the assumption that the interference-plus-noise covariance matrix of the test vector (say C) is the same as the interference-plus-noise covariance matrix of the training vectors (say Σ). When the two covariance matrices are not perfectly matched, the constant false alarm rate (CFAR) property of adaptive detectors no longer holds. Under mismatched conditions, standard scalar CFAR techniques can be applied to the adaptive detector outputs to regain the CFAR property. In this paper we consider the Adaptive Matched Filter (AMF) statistic-based CFAR detector and show that the effects of covariance matrix mismatch can be condensed into a single scalar quantity, referred to as the loss factor ρ. The loss factor is a random variable whenever the estimate of Σ is a random matrix. Sample results are provided for the deterministic case.
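To make the setting concrete, the Python sketch below computes the AMF statistic and a commonly used deterministic SINR loss factor under an assumed model. The array size, the steering-vector model, the exponentially correlated covariances, and the exact definition of ρ are illustrative assumptions, not expressions taken from the paper.

```python
# Minimal sketch of AMF detection with mismatched training (assumed model).
# The loss-factor definition below is one common textbook form; the paper's
# exact expression for rho may differ.
import numpy as np

rng = np.random.default_rng(0)
N = 8  # vector dimension (assumed for illustration)

def steering(theta, n=N):
    """Unit-norm steering vector for a uniform linear array (assumed model)."""
    v = np.exp(1j * np.pi * np.arange(n) * np.sin(theta))
    return v / np.linalg.norm(v)

def ar1_cov(r, n=N):
    """Exponentially correlated interference-plus-noise covariance (assumed)."""
    idx = np.arange(n)
    return r ** np.abs(idx[:, None] - idx[None, :])

s = steering(0.2)
C = ar1_cov(0.9)      # covariance of the test vector
Sigma = ar1_cov(0.7)  # mismatched covariance of the training data

def amf_statistic(x, s, Sigma_hat):
    """AMF statistic: |s^H Sigma_hat^{-1} x|^2 / (s^H Sigma_hat^{-1} s)."""
    Si = np.linalg.inv(Sigma_hat)
    return np.abs(s.conj() @ Si @ x) ** 2 / np.real(s.conj() @ Si @ s)

def loss_factor(s, Sigma, C):
    """Deterministic SINR loss factor (one common definition, assumed here):
    rho = |s^H Sigma^{-1} s|^2 / ((s^H Sigma^{-1} C Sigma^{-1} s)(s^H C^{-1} s)).
    rho = 1 when Sigma = C; rho < 1 quantifies the mismatch loss."""
    Si = np.linalg.inv(Sigma)
    num = np.abs(s.conj() @ Si @ s) ** 2
    den = np.real(s.conj() @ Si @ C @ Si @ s) * np.real(s.conj() @ np.linalg.inv(C) @ s)
    return num / den

# One noise-only draw with the test-vector covariance C, scored by the AMF
# statistic built from the mismatched training covariance Sigma.
x = np.linalg.cholesky(C) @ (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)
print(f"AMF statistic on a noise-only draw: {amf_statistic(x, s, Sigma):.4f}")
print(f"deterministic loss factor rho = {loss_factor(s, Sigma, C):.4f}")
```

With Sigma equal to C the printed ρ is exactly 1; the gap below 1 for the mismatched pair above illustrates how the mismatch effect collapses into a single scalar, as the abstract describes for the deterministic case.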