Performance Indicator for MIMO MMSE Receivers in the Presence of Channel Estimation Error

We derive the post-processing SNR of minimum-mean-squared-error (MMSE) receivers operating with imperfect channel estimates, and show that it accurately predicts the error-rate performance of MIMO systems subject to channel estimation error. Simulation results confirm the tightness of the analysis.


šŸ’” Research Summary

The paper addresses a critical gap in the performance analysis of multiple‑input multiple‑output (MIMO) systems that employ minimum‑mean‑squared‑error (MMSE) receivers when the channel state information (CSI) is imperfect. While the MMSE filter is optimal under the assumption of perfect CSI, real‑world wireless links inevitably suffer from channel estimation error (CEE) due to finite‑length training, noise, and hardware impairments. The authors model the estimated channel as (\hat{H}=H+\Delta H), where (\Delta H) is a matrix whose entries are i.i.d. zero‑mean complex Gaussian with variance (\sigma_e^2). This statistical model captures the behavior of common estimators such as least‑squares (LS) and linear‑MMSE (LMMSE).
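The CEE model above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' code; the antenna dimensions, the i.i.d. Rayleigh channel, and the specific value of (\sigma_e^2) are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
Nr, Nt = 4, 4          # receive / transmit antennas (assumed for illustration)
sigma_e2 = 0.01        # per-entry channel-estimation-error variance

# True channel H: i.i.d. CN(0, 1) entries (Rayleigh fading assumption)
H = (rng.standard_normal((Nr, Nt)) + 1j * rng.standard_normal((Nr, Nt))) / np.sqrt(2)

# Estimation error Delta H: i.i.d. CN(0, sigma_e2) entries
dH = np.sqrt(sigma_e2 / 2) * (
    rng.standard_normal((Nr, Nt)) + 1j * rng.standard_normal((Nr, Nt))
)

# Estimated channel available to the receiver: H_hat = H + Delta H
H_hat = H + dH
```

The same additive-Gaussian-error structure arises for both LS and LMMSE estimators, which is why a single (\sigma_e^2) parameter suffices.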

Using the estimated channel, the MMSE filter is constructed as (W=(\hat{H}\hat{H}^H+N_0I)^{-1}\hat{H}). When this filter is applied to the actual received signal (y=Hx+n), the output consists of the desired signal component, residual inter‑stream interference, thermal noise, and an additional noise term caused by the mismatch between (\hat{H}) and (H). By carefully expanding the expression and retaining second‑order terms in (\Delta H), the authors derive a closed‑form post‑processing signal‑to‑noise ratio (PPSNR) for each data stream:
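The mismatched filtering step can be made concrete with a short sketch. The function below builds (W) from (\hat{H}) exactly as in the expression above and applies it to the true received signal (y=Hx+n); the function name and unit-energy-symbol assumption are illustrative, not from the paper.

```python
import numpy as np

def mmse_output(H, H_hat, x, N0, rng):
    """Apply the MMSE filter built from the ESTIMATE H_hat to y = H x + n,
    where H is the true channel. Returns the per-stream filter outputs."""
    Nr, Nt = H.shape
    # Thermal noise n ~ CN(0, N0 I)
    n = np.sqrt(N0 / 2) * (rng.standard_normal(Nr) + 1j * rng.standard_normal(Nr))
    y = H @ x + n
    # W = (H_hat H_hat^H + N0 I)^{-1} H_hat; solve() avoids an explicit inverse
    W = np.linalg.solve(H_hat @ H_hat.conj().T + N0 * np.eye(Nr), H_hat)
    return W.conj().T @ y
```

With (\hat{H}=H) and vanishing noise the output approaches (x); with (\hat{H}\neq H) the residual (W^H\Delta H\,x) term appears, which is exactly the CEE-induced noise the derivation tracks to second order.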

\[
\gamma_k \;=\; \frac{1}{\Big[\Big(I + \frac{1}{N_0 + N_t\,\sigma_e^2}\,\hat{H}^H\hat{H}\Big)^{-1}\Big]_{kk}} \;-\; 1, \qquad k = 1,\dots,N_t,
\]

where (N_t) is the number of transmit streams and the effective noise power (N_0 + N_t\sigma_e^2) lumps the thermal noise together with the CEE-induced interference, assuming unit-energy symbols.
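Evaluating the closed-form PPSNR is a one-liner per stream. The sketch below assumes unit-energy symbols and the effective-noise form (N_0 + N_t\sigma_e^2); the function name is illustrative.

```python
import numpy as np

def ppsnr(H_hat, N0, sigma_e2):
    """Closed-form post-processing SNR per stream for an MMSE filter built
    from the estimate H_hat, with effective noise N0 + Nt * sigma_e2
    (thermal noise plus CEE-induced interference, unit-energy symbols)."""
    Nr, Nt = H_hat.shape
    Neff = N0 + Nt * sigma_e2
    A = np.eye(Nt) + (H_hat.conj().T @ H_hat) / Neff
    d = np.real(np.diag(np.linalg.inv(A)))   # k-th diagonal entry of A^{-1}
    return 1.0 / d - 1.0                     # gamma_k = 1/[A^{-1}]_kk - 1
```

Setting (\sigma_e^2=0) recovers the familiar perfect-CSI MMSE post-SNR, so the expression degrades gracefully as the estimate improves.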