Well-Conditioned Linear Minimum Mean Square Error Estimation

6 Jan 2022 · Edwin K. P. Chong

Linear minimum mean square error (LMMSE) estimation is often ill-conditioned, suggesting that unconstrained minimization of the mean square error is an inadequate approach to filter design. To address this, we first develop a unifying framework for studying constrained LMMSE estimation problems. Using this framework, we explore an important structural property of constrained LMMSE filters involving a certain prefilter. Optimality is invariant under invertible linear transformations of the prefilter. This parameterizes all optimal filters by equivalence classes of prefilters. We then clarify that merely constraining the rank of the filter does not suitably address the problem of ill-conditioning. Instead, we adopt a constraint that explicitly requires solutions to be well-conditioned in a certain specific sense. We introduce two well-conditioned filters and show that they converge to the unconstrained LMMSE filter as their truncation-power loss goes to zero, at the same rate as the low-rank Wiener filter. We also show extensions to the case of weighted trace and determinant of the error covariance as objective functions. Finally, our quantitative results with historical VIX data demonstrate that our two well-conditioned filters have stable performance while the standard LMMSE filter deteriorates with increasing condition number.
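
To make the conditioning issue concrete, here is a minimal numerical sketch, not the paper's construction: it contrasts the unconstrained LMMSE (Wiener) filter W = C_xy C_yy^{-1}, which becomes unstable when the observation covariance C_yy is nearly singular, with a rank-truncated variant that inverts C_yy only on its leading eigen-subspace. The symbol names, the toy covariances, and the truncation rule are illustrative assumptions, and the truncated filter stands in generically for the low-rank Wiener filter mentioned in the abstract, not for the paper's two well-conditioned filters.

```python
import numpy as np

def lmmse_filter(C_xy, C_yy):
    """Unconstrained LMMSE (Wiener) filter W = C_xy @ inv(C_yy)."""
    return C_xy @ np.linalg.inv(C_yy)

def truncated_lmmse_filter(C_xy, C_yy, rank):
    """Low-rank Wiener-type filter: invert C_yy only on the subspace of its
    `rank` leading eigenvectors (a well-conditioned pseudo-inverse) instead
    of taking the full, possibly ill-conditioned inverse."""
    eigvals, eigvecs = np.linalg.eigh(C_yy)        # eigenvalues in ascending order
    U, lam = eigvecs[:, -rank:], eigvals[-rank:]   # keep the leading `rank` modes
    return C_xy @ (U @ np.diag(1.0 / lam) @ U.T)

# Toy usage (hypothetical data): y = x + noise with a nearly singular C_yy.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))
C_xx = A @ A.T                      # rank-deficient signal covariance
C_yy = C_xx + 1e-6 * np.eye(5)      # observation covariance with a tiny noise floor
C_xy = C_xx                         # cross-covariance for y = x + noise

print("cond(C_yy):", np.linalg.cond(C_yy))
W_full = lmmse_filter(C_xy, C_yy)                    # amplifies noise along weak eigen-directions
W_low = truncated_lmmse_filter(C_xy, C_yy, rank=3)   # discards the near-null directions
```

As the abstract notes, merely truncating the rank in this way does not by itself resolve ill-conditioning; the paper instead imposes an explicit well-conditioning constraint on the filter.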
