
Sparsity and Optimization

Multi-scale algorithms for optimal transport


Optimal transport is a geometrically intuitive and robust way to quantify differences between probability measures.
It is becoming increasingly popular as a numerical tool in image processing, computer vision, and machine learning.
A key challenge is its efficient computation, particularly on large problems. Various algorithms exist, each tailored to different special cases.
Multi-scale methods can be applied both to classical discrete algorithms and to entropy regularization techniques, and they provide a good compromise between efficiency and flexibility.
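The entropy-regularization approach mentioned above can be illustrated with the standard Sinkhorn algorithm, which alternately rescales a Gibbs kernel so that the resulting transport plan matches the two marginals. This is a minimal single-scale sketch, not the multi-scale scheme of the talk; the grid, `eps`, and iteration count are illustrative choices.

```python
import numpy as np

def sinkhorn(mu, nu, C, eps=0.05, n_iter=200):
    """Entropy-regularized optimal transport between histograms mu and nu
    with cost matrix C, via Sinkhorn iterations (a sketch)."""
    K = np.exp(-C / eps)                 # Gibbs kernel
    v = np.ones_like(nu)
    for _ in range(n_iter):
        u = mu / (K @ v)                 # scale rows to match mu
        v = nu / (K.T @ u)               # scale columns to match nu
    return u[:, None] * K * v[None, :]   # transport plan

# two Gaussian-like histograms on a 1-D grid
x = np.linspace(0, 1, 50)
mu = np.exp(-((x - 0.3) ** 2) / 0.01); mu /= mu.sum()
nu = np.exp(-((x - 0.7) ** 2) / 0.01); nu /= nu.sum()
C = (x[:, None] - x[None, :]) ** 2       # squared-distance cost

P = sinkhorn(mu, nu, C)
print(P.sum())        # total mass of the coupling (≈ 1)
print((P * C).sum())  # regularized transport cost
```

A multi-scale method would run iterations like these on coarsened versions of the grid first and use the coarse solution to restrict the kernel on finer levels.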

Paper Details

Submitted On: 2 June 2018

Document Files

schmitzer_2018-06_Lausanne.pdf

"Multi-scale algorithms for optimal transport", IEEE SigPort, 2018. [Online]. Available: http://sigport.org/3230.

Convolutional group-sparse coding and source localization


In this paper, we present a new interpretation of non-negatively constrained convolutional coding problems as blind deconvolution problems with spatially variant point spread function. In this light, we propose an optimization framework that generalizes our previous work on non-negative group sparsity for convolutional models. We then link these concepts to source localization problems that arise in scientific imaging, and provide a visual example on an image derived from data captured by the Hubble telescope.
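As a rough illustration of the kind of problem discussed (not the paper's actual framework), the following sketches proximal gradient descent for non-negative convolutional coding with a group-sparse (l2,1) penalty that groups all filters at each sample; the function name, filters, and parameter values are hypothetical, and the non-negative projection followed by group shrinkage is used as a simple approximate proximal step.

```python
import numpy as np

def nn_group_csc(y, D, lam=0.1, step=0.05, n_iter=200):
    """Sketch: non-negative convolutional coding of 1-D signal y with
    K filters D (shape (K, m), m odd) and an l2,1 group penalty across
    filters, solved by proximal gradient descent."""
    K, _ = D.shape
    X = np.zeros((K, len(y)))                 # one coefficient map per filter
    for _ in range(n_iter):
        recon = sum(np.convolve(X[k], D[k], mode="same") for k in range(K))
        r = recon - y
        # gradient of the data term: correlate residual with each filter
        G = np.stack([np.convolve(r, D[k][::-1], mode="same") for k in range(K)])
        X = np.maximum(X - step * G, 0.0)     # gradient step + non-negativity
        norms = np.linalg.norm(X, axis=0)     # group = all filters at one sample
        shrink = np.maximum(1.0 - step * lam / np.maximum(norms, 1e-12), 0.0)
        X *= shrink                           # group soft-thresholding
    return X

# synthetic example: two activations of a Gaussian-like filter
d0 = np.exp(-np.linspace(-2, 2, 5) ** 2); d0 /= np.linalg.norm(d0)
d1 = np.zeros(5); d1[2] = 1.0                # a delta filter as second atom
D = np.stack([d0, d1])
act = np.zeros(100); act[[20, 60]] = 1.0
y = np.convolve(act, d0, mode="same")
X = nn_group_csc(y, D)
```

In the source-localization reading of the paper, the nonzero groups of `X` would mark estimated source positions.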


Paper Details

Authors: Pol del Aguila Pla, Joakim Jaldén
Submitted On: 12 April 2018

Document Files

Poster.pdf

Pol del Aguila Pla, Joakim Jaldén, "Convolutional group-sparse coding and source localization", IEEE SigPort, 2018. [Online]. Available: http://sigport.org/2444.

ADMM Penalty Parameter Selection with Krylov Subspace Recycling Technique for Sparse Coding


The alternating direction method of multipliers (ADMM) has been widely used for a wide variety of imaging inverse problems. One disadvantage of this method, however, is the need to select an algorithm parameter, the penalty parameter, which has a significant effect on the convergence rate of the algorithm. Although a number of heuristic methods have been proposed, there is as yet no general theory providing a good choice of this parameter for all problems.
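To make the role of the penalty parameter concrete, here is a standard ADMM solver for the lasso, used as a simple stand-in for the sparse coding problems the paper targets (not the paper's method). The parameter `rho` appears both in the x-update linear system and in the soft-threshold level, which is why its choice affects the convergence rate.

```python
import numpy as np

def admm_lasso(A, b, lam=0.1, rho=1.0, n_iter=100):
    """ADMM for  min 1/2||Ax-b||^2 + lam*||z||_1  s.t. x = z.
    rho is the penalty parameter on the quadratic coupling term."""
    n = A.shape[1]
    AtA, Atb = A.T @ A, A.T @ b
    L = np.linalg.cholesky(AtA + rho * np.eye(n))   # factor once, reuse
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
    for _ in range(n_iter):
        rhs = Atb + rho * (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))   # x-update
        z = np.sign(x + u) * np.maximum(np.abs(x + u) - lam / rho, 0.0)
        u = u + x - z                                        # dual update
    return z

# recover a sparse vector from noiseless measurements
rng = np.random.default_rng(1)
A = rng.standard_normal((30, 10))
x_true = np.zeros(10); x_true[[1, 5]] = [2.0, -1.5]
b = A @ x_true
z = admm_lasso(A, b, lam=0.01, rho=1.0, n_iter=200)
```

Rerunning with, say, `rho=100` or `rho=0.001` on the same problem shows markedly slower convergence, which is the behavior the proposed selection technique is designed to avoid.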

Paper Details

Authors: Youzuo Lin, Brendt Wohlberg, Velimir Vesselinov
Submitted On: 3 October 2017

Document Files

Presentation slides

Youzuo Lin, Brendt Wohlberg, Velimir Vesselinov, "ADMM Penalty Parameter Selection with Krylov Subspace Recycling Technique for Sparse Coding", IEEE SigPort, 2017. [Online]. Available: http://sigport.org/2254.

Adaptive Sparsity Tradeoff for L1-Constraint NLMS Algorithm


Embedding the l1 norm in gradient-based adaptive filtering is a popular approach to sparse plant estimation. Based on a modal analysis of the adaptive algorithm near steady state, this work shows that the optimal sparsity tradeoff depends on the filter length, the plant sparsity, and the signal-to-noise ratio. In a practical implementation, these terms are obtained with an unsupervised mechanism that tracks the filter weights. Simulation results demonstrate the robustness and superiority of the proposed adaptive-tradeoff sparsity-aware method.
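A minimal sketch of the kind of l1-regularized NLMS update the abstract refers to, in the style of zero-attracting NLMS: the standard normalized LMS step plus a sign(w) term pulling small coefficients toward zero. Here the sparsity tradeoff `rho` is fixed; the paper's contribution is to adapt this tradeoff online, which is not shown. All names and values are illustrative.

```python
import numpy as np

def za_nlms(x, d, M=16, mu=0.5, rho=1e-4, delta=1e-6):
    """Zero-attracting NLMS sketch: identify an M-tap plant from
    input x and desired signal d."""
    w = np.zeros(M)
    e_hist = np.zeros(len(x))
    for n in range(M - 1, len(x)):
        u = x[n - M + 1:n + 1][::-1]          # regressor, most recent first
        e = d[n] - w @ u                      # a-priori error
        w += mu * e * u / (u @ u + delta)     # normalized LMS step
        w -= rho * np.sign(w)                 # zero attraction (l1 subgradient)
        e_hist[n] = e
    return w, e_hist

# sparse plant identification example
rng = np.random.default_rng(0)
h = np.zeros(16); h[[2, 9]] = [1.0, -0.5]     # sparse plant: 2 active taps
x = rng.standard_normal(4000)
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
w, e = za_nlms(x, d)
```

With too large a `rho`, the active taps are biased toward zero; with too small a `rho`, the inactive taps stay noisy, which is the tradeoff the paper's unsupervised mechanism balances.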

Paper Details

Authors: Abdullah Alshabilli, Shihab Jimaa
Submitted On: 19 March 2016

Document Files

icassp2016-poster.pdf

Abdullah Alshabilli, Shihab Jimaa, "Adaptive Sparsity Tradeoff for L1-Constraint NLMS Algorithm", IEEE SigPort, 2016. [Online]. Available: http://sigport.org/825.