Cramér-Rao Bound for Estimation After Model Selection and its Application to Sparse Vector Estimation

Citation Author(s):
Elad Meir, and Tirza Routtenberg
Submitted by:
Elad Meir
Last updated:
9 May 2022 - 3:39am
Document Type:
Presentation Slides
Event:
ICASSP 2022
Presenters:
Elad Meir
Paper Code:
SPTM-23.4

Abstract

In many practical parameter estimation problems,
such as coefficient estimation of polynomial regression, the true
model is unknown and thus, a model selection step is performed
prior to estimation. The data-based model selection step affects
the subsequent estimation. In particular, the oracle Cramér-Rao
bound (CRB), which is based on knowledge of the true model, is
inappropriate for post-model-selection performance analysis and
system design outside the asymptotic region. In this paper, we
investigate post-model-selection parameter estimation of a vector
with an unknown support set, where this support set represents
the model. We analyze the estimation performance of coherent
estimators that force unselected parameters to zero. We use the
mean-squared-selected-error (MSSE) criterion and introduce the
concept of selective unbiasedness in the sense of Lehmann unbiasedness.
We derive a non-Bayesian Cramér-Rao-type bound on
the MSSE and on the mean-squared-error (MSE) of any coherent
estimator with a specific selective-bias function in the Lehmann
sense. We implement the selective CRB for the special case of
sparse vector estimation with an unknown support set. Finally,
we demonstrate in simulations that the proposed selective CRB is
an informative lower bound on the performance of the maximum
selected likelihood estimator for a general linear model with the
generalized information criterion and for sparse vector estimation
with one-step thresholding. It is shown that for these cases the selective
CRB outperforms the oracle CRB and the Sando-Mitra-Stoica CRB.
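The two-stage pipeline the abstract describes (a data-based model selection step followed by a coherent estimator that forces unselected parameters to zero) can be illustrated with a minimal sketch. This is an assumption-laden toy example, not the paper's implementation: the linear model dimensions, noise level, and the use of one-step thresholding on correlations followed by restricted least squares are all choices made here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup (values assumed for illustration, not from the paper):
# linear model y = A x + noise, where x is sparse with unknown support.
n, m, k = 50, 20, 3                      # observations, parameters, sparsity
A = rng.standard_normal((n, m))
x_true = np.zeros(m)
support_true = rng.choice(m, size=k, replace=False)
x_true[support_true] = rng.uniform(1.0, 2.0, size=k)
y = A @ x_true + 0.1 * rng.standard_normal(n)

# Model selection step: one-step thresholding keeps the k entries of
# A^T y with the largest magnitudes as the selected support (the "model").
corr = A.T @ y
support_hat = np.argsort(np.abs(corr))[-k:]

# Coherent estimation step: least squares restricted to the selected
# columns; all unselected parameters are forced to zero.
x_hat = np.zeros(m)
x_hat[support_hat], *_ = np.linalg.lstsq(A[:, support_hat], y, rcond=None)

# The error of x_hat depends on the (random) selection step, which is
# what the selective CRB, unlike the oracle CRB, accounts for.
mse = np.mean((x_hat - x_true) ** 2)
print("selected support:", sorted(support_hat), "MSE:", mse)
```

Because the support estimate is itself data-dependent, averaging this estimator's error over repeated trials conditions implicitly on the selection outcome, which is the regime where the oracle CRB becomes inappropriate.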


Files

ICASSP 2022 presentation slides SPTM-23.4