General Total Variation Regularized Sparse Bayesian Learning for Robust Block-Sparse Signal Recovery
Block-sparse signal recovery without knowledge of block sizes and boundaries, such as that encountered in multi-antenna mmWave channel models, is a hard problem for compressed sensing (CS) algorithms. We propose a novel Sparse Bayesian Learning (SBL) method for block-sparse recovery, built on popular CS regularizers whose input variable is related to the total variation (TV). Contrary to conventional approaches, which impose the regularization on the signal components, we regularize the SBL hyperparameters.
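The intuition behind penalizing the hyperparameters rather than the signal can be illustrated with a toy total-variation penalty: a blocky variance profile (neighboring coefficients sharing the same variance) incurs a small TV cost, while a spiky profile of the same energy incurs a large one. This is only an illustrative sketch; `tv_penalty` and the sample profiles are hypothetical, not the paper's exact regularizer.

```python
import numpy as np

def tv_penalty(gamma):
    """Total-variation penalty on a vector of SBL hyperparameters
    (per-coefficient variances): the sum of absolute differences
    between neighbors. Small values indicate a piecewise-constant,
    i.e. block-structured, variance profile."""
    return float(np.sum(np.abs(np.diff(gamma))))

# A blocky hyperparameter profile has a small TV penalty...
blocky = np.array([0.0, 0.0, 5.0, 5.0, 5.0, 0.0])
# ...while a spiky profile with the same nonzero values has a large one.
spiky = np.array([0.0, 5.0, 0.0, 5.0, 0.0, 5.0])
print(tv_penalty(blocky))  # 10.0
print(tv_penalty(spiky))   # 25.0
```

Minimizing such a penalty over the hyperparameters pushes neighboring variances toward equal values, which is what lets blocks emerge without specifying their sizes or boundaries in advance.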
A Partially Collapsed Gibbs Sampler for Unsupervised Nonnegative Sparse Signal Restoration
In this paper, the problem of unsupervised restoration of nonnegative sparse signals is addressed in a Bayesian framework. We introduce a new probabilistic hierarchical prior, based on the Generalized Hyperbolic (GH) distribution, which explicitly accounts for sparsity. In particular, this new prior allows us to take the non-negativity into account.
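A hierarchical prior of this kind can be sketched as a variance-mixture construction: each coefficient is either inactive or drawn with its own random variance, and non-negativity is enforced at the bottom level. The sketch below is a simplified analogue (Bernoulli support, gamma-distributed variances, half-normal amplitudes), not the paper's GH-based model; all names and parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_prior(n, sparsity=0.2, shape=0.5, scale=1.0):
    """Illustrative hierarchical prior for nonnegative sparse signals
    (simplified stand-in for a GH-type prior):
      s_i ~ Bernoulli(sparsity)   -- support indicator => sparsity
      v_i ~ Gamma(shape, scale)   -- per-coefficient variance (mixing)
      x_i = s_i * |N(0, v_i)|     -- half-normal => non-negativity
    """
    s = rng.random(n) < sparsity
    v = rng.gamma(shape, scale, size=n)
    x = s * np.abs(rng.normal(0.0, np.sqrt(v)))
    return x

x = sample_prior(1000)
print(bool((x >= 0).all()))   # True: every draw is nonnegative
print(round(np.mean(x > 0), 2))  # roughly the sparsity level
```

In a (partially collapsed) Gibbs sampler, one would alternate between sampling such latent variances, the support, and the signal conditionally on the data, rather than drawing from the prior alone as done here.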
Faster and Still Safe: Combining Screening Techniques and Structured Dictionaries to Accelerate the Lasso
Accelerating the solution of the Lasso problem becomes crucial when scaling to very high-dimensional data. In this paper, we propose a way to combine two existing acceleration techniques: safe screening tests, which simplify the problem by eliminating useless dictionary atoms, and structured dictionaries, which are faster to operate with. A structured approximation of the true dictionary is used in the initial stage of the optimization, and we show how to define screening tests that remain safe despite the approximation error.
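For context, a safe screening test works as follows: the Lasso dual optimum is the projection of y/lambda onto the dual feasible set, so any dual-feasible point yields a ball guaranteed to contain it, and any atom whose correlation stays below the dual constraint over that whole ball is provably inactive. The sketch below implements the textbook sphere test (in the style of El Ghaoui et al.), not this paper's structured-dictionary variant; the toy problem is illustrative.

```python
import numpy as np

def safe_screen(D, y, lam, x_hat):
    """Basic 'safe sphere' screening test for the Lasso
        min_x 0.5*||y - D x||^2 + lam*||x||_1.
    The dual optimum is the projection of y/lam onto
    {theta : |d_j^T theta| <= 1 for all j}, so for any feasible theta
    it lies in the ball centered at y/lam of radius ||theta - y/lam||.
    Atoms whose correlation stays below 1 over that ball are provably
    inactive at the optimum and can be safely removed."""
    theta = (y - D @ x_hat) / lam            # dual candidate from the residual
    scale = np.max(np.abs(D.T @ theta))
    if scale > 1.0:                          # rescale to enforce dual feasibility
        theta = theta / scale
    center = y / lam
    radius = np.linalg.norm(theta - center)
    norms = np.linalg.norm(D, axis=0)
    keep = np.abs(D.T @ center) + radius * norms >= 1.0
    return keep                              # False => atom provably inactive

# Toy problem: orthonormal atoms, only the first one is active.
D = np.eye(3)
y = np.array([1.0, 0.0, 0.0])
keep = safe_screen(D, y, lam=0.9, x_hat=np.array([0.1, 0.0, 0.0]))
print(keep)  # [ True False False]
```

The better the dual candidate, the smaller the ball and the more atoms are screened; the paper's contribution is keeping such tests valid when the correlations are computed with a fast structured approximation of D rather than D itself.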