
Adaptive Signal Processing

D2L: DECENTRALIZED DICTIONARY LEARNING OVER DYNAMIC NETWORKS

Paper Details

Authors:
Amir Daneshmand, Ying Sun, Gesualdo Scutari, Francisco Facchinei
Submitted On:
9 March 2017 - 1:10am

Document Files

D4L_slides.pdf (55 downloads)

Cite

[1] Amir Daneshmand, Ying Sun, Gesualdo Scutari, Francisco Facchinei, "D2L: DECENTRALIZED DICTIONARY LEARNING OVER DYNAMIC NETWORKS", IEEE SigPort, 2017. [Online]. Available: http://sigport.org/1716. Accessed: Aug. 18, 2017.

RECURSIVE LEAST-SQUARES ALGORITHMS FOR SPARSE SYSTEM MODELING


In this paper, we propose sparsity-aware algorithms, namely the Recursive Least-Squares for sparse systems (S-RLS) and the l0-norm Recursive Least-Squares (l0-RLS), in order to exploit the sparsity of an unknown system. The first algorithm applies a discard function to the weight vector to disregard coefficients close to zero during the update process. The second algorithm employs a sparsity-promoting scheme based on non-convex approximations of the l0-norm.
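
The discard step can be pictured with a minimal sketch: a conventional exponentially weighted RLS update followed by a hard-threshold "discard" of coefficients whose magnitude falls below a small value. This is an illustration of the idea only, not the authors' exact S-RLS recursion; the threshold eps and the plain-RLS core are assumptions.

    import numpy as np

    def s_rls_step(w, P, x, d, lam=0.99, eps=1e-3):
        """Illustrative S-RLS-style step: standard RLS update followed by a
        discard function that zeroes near-zero weights (eps is assumed)."""
        Px = P @ x
        k = Px / (lam + x @ Px)            # gain vector
        e = d - w @ x                      # a priori error
        w = w + k * e                      # coefficient update
        P = (P - np.outer(k, Px)) / lam    # inverse-correlation update
        # Discard function: ignore coefficients that are (nearly) zero.
        w = np.where(np.abs(w) > eps, w, 0.0)
        return w, P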

Paper Details

Authors:
Hamed Yazdanpanah, Paulo Sergio Ramirez Diniz
Submitted On:
3 March 2017 - 9:25pm

Document Files

RECURSIVE LEAST-SQUARES ALGORITHMS FOR SPARSE SYSTEM MODELING (59 downloads)

Cite

[1] Hamed Yazdanpanah, Paulo Sergio Ramirez Diniz, "RECURSIVE LEAST-SQUARES ALGORITHMS FOR SPARSE SYSTEM MODELING", IEEE SigPort, 2017. [Online]. Available: http://sigport.org/1621. Accessed: Aug. 18, 2017.

ADAPTIVE MATCHING PURSUIT FOR SPARSE SIGNAL RECOVERY


Spike and Slab priors have been of much recent interest in signal processing as a means of inducing sparsity in Bayesian inference. Applications domains that benefit from the use of these priors include sparse recovery, regression and classification. It is well-known that solving for the sparse coefficient vector to maximize these priors results in a hard non-convex and mixed integer programming problem. Most existing solutions to this optimization problem either involve simplifying assumptions/relaxations or are computationally expensive.
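
The greedy, matching-pursuit family of solvers named in the title can be illustrated with a minimal sketch of classical orthogonal matching pursuit. This is the standard baseline, not the adaptive variant proposed in the paper, and the fixed sparsity level k is an assumption.

    import numpy as np

    def omp(A, y, k):
        """Classical orthogonal matching pursuit: greedily pick the column of A
        most correlated with the residual, then refit by least squares."""
        m, n = A.shape
        support, r = [], y.copy()
        for _ in range(k):
            j = int(np.argmax(np.abs(A.T @ r)))      # best-matching atom
            if j not in support:
                support.append(j)
            x_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
            r = y - A[:, support] @ x_s              # update residual
        x = np.zeros(n)
        x[support] = x_s
        return x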

Paper Details

Authors:
Tiep H. Vu, Hojjat S. Mousavi, Vishal Monga
Submitted On:
27 February 2017 - 9:59pm

Document Files

Poster_ICASSP_2017_AMP.pdf (66 downloads)

Cite

[1] Tiep H. Vu, Hojjat S. Mousavi, Vishal Monga, "ADAPTIVE MATCHING PURSUIT FOR SPARSE SIGNAL RECOVERY", IEEE SigPort, 2017. [Online]. Available: http://sigport.org/1467. Accessed: Aug. 18, 2017.

Robust Estimation of Self-Exciting Point Process Models with Application to Neuronal Modeling


We consider the problem of estimating discrete self-exciting point process models from limited binary observations, where the history of the process serves as the covariate. We analyze the performance of two classes of estimators, l1-regularized maximum likelihood and greedy estimation, for a discrete version of the Hawkes process, and characterize the sampling tradeoffs required for stable recovery in the non-asymptotic regime. Our results extend those of compressed sensing for linear and generalized linear models with i.i.d.
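
As background, the l1-regularized maximum-likelihood estimator for a discrete self-exciting process can be sketched as a Bernoulli GLM whose covariates are the recent spiking history. The logistic link, the proximal-gradient (ISTA) solver, and the step size below are assumptions chosen for illustration, not the paper's exact setup.

    import numpy as np

    def fit_l1_selfexciting(spikes, p, lam=0.1, steps=500, eta=0.05):
        """Illustrative l1-regularized ML for a discrete self-exciting process:
        binary observations, the last p bins as history covariates, logistic link."""
        spikes = np.asarray(spikes, dtype=float)
        n = len(spikes)
        # Row t holds spikes[t-1], ..., spikes[t-p] (most recent first).
        X = np.array([spikes[t - p:t][::-1] for t in range(p, n)])
        y = spikes[p:]
        Xb = np.hstack([np.ones((len(y), 1)), X])     # bias column + history
        theta = np.zeros(p + 1)
        for _ in range(steps):
            mu = 1.0 / (1.0 + np.exp(-Xb @ theta))    # predicted spike probability
            grad = Xb.T @ (mu - y) / len(y)           # negative log-likelihood gradient
            theta -= eta * grad
            # Soft-threshold the history weights (not the bias): l1 penalty.
            theta[1:] = np.sign(theta[1:]) * np.maximum(np.abs(theta[1:]) - eta * lam, 0.0)
        return theta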

Paper Details

Authors:
Abbas Kazemipour, Min Wu and Behtash Babadi
Submitted On:
12 December 2016 - 9:35am

Document Files

Robust_SEPP_TSP.pdf (91 downloads)

Cite

[1] Abbas Kazemipour, Min Wu and Behtash Babadi, "Robust Estimation of Self-Exciting Point Process Models with Application to Neuronal Modeling", IEEE SigPort, 2016. [Online]. Available: http://sigport.org/1261. Accessed: Aug. 18, 2017.

The recursive Hessian sketch for adaptive filtering


[Figure: block diagram of the recursive Hessian sketch for adaptive filtering]

In this paper we introduce the recursive Hessian sketch, a new adaptive filtering algorithm based on sketching the same exponentially weighted least-squares problem solved by the recursive least squares (RLS) algorithm. The algorithm maintains a number of sketches of the inverse autocorrelation matrix and recursively updates them at random intervals; these are in turn used to update the unknown filter estimate. The complexity of the proposed algorithm compares favorably to that of recursive least squares.
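
For reference, the exponentially weighted RLS recursion whose least-squares problem the sketch-based algorithm reuses is shown below. The sketching and random-refresh steps of the proposed method are not reproduced here; they would replace the exact rank-one update of P.

    import numpy as np

    def rls_step(w, P, x, d, lam=0.999):
        """One step of exponentially weighted recursive least squares.
        The proposed algorithm replaces this exact O(L^2) update of the inverse
        autocorrelation matrix P by randomly refreshed sketches (not shown)."""
        Px = P @ x
        g = Px / (lam + x @ Px)           # Kalman gain
        e = d - w @ x                     # a priori error
        w = w + g * e                     # filter update
        P = (P - np.outer(g, Px)) / lam   # exact inverse-autocorrelation update
        return w, P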

Paper Details

Authors:
Robin Scheibler, Martin Vetterli
Submitted On:
29 March 2016 - 4:50am

Document Files

icassp2016_slides.pdf (183 downloads)

Cite

[1] Robin Scheibler, Martin Vetterli, "The recursive Hessian sketch for adaptive filtering", IEEE SigPort, 2016. [Online]. Available: http://sigport.org/1064. Accessed: Aug. 18, 2017.

Compressed Training Adaptive Equalization



We introduce compressed training adaptive equalization as a novel approach for reducing the number of training symbols in a communication packet. The proposed semi-blind approach is based on exploiting the magnitude boundedness of communication symbols. The algorithms are derived from a convex optimization setting based on the l_\infty norm. The corresponding framework has a direct link with the compressive sensing literature, established by invoking the duality between the l_1 and l_\infty norms.
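
One plausible instantiation of such an l_\infty-based semi-blind criterion (for real-valued PAM symbols) is sketched below as a linear program: minimize the peak magnitude of the equalizer output over the whole packet while forcing the output to match the few known training symbols. This is an illustrative reading of the abstract, not the authors' exact formulation; the variable names and the use of scipy's LP solver are assumptions.

    import numpy as np
    from scipy.optimize import linprog

    def linf_equalizer(Y, Y_train, s_train):
        """Illustrative l_inf-based semi-blind equalizer design (real-valued case).
        Y       : (N, L) regressor rows for the whole packet
        Y_train : (T, L) rows at training positions, s_train : (T,) known symbols
        Solves  min_{w,t} t  s.t.  -t <= Y w <= t,  Y_train w = s_train."""
        N, L = Y.shape
        c = np.zeros(L + 1)
        c[-1] = 1.0                                   # cost selects the peak bound t
        ones = np.ones((N, 1))
        A_ub = np.vstack([np.hstack([Y, -ones]),      #  Y w - t <= 0
                          np.hstack([-Y, -ones])])    # -Y w - t <= 0
        b_ub = np.zeros(2 * N)
        A_eq = np.hstack([Y_train, np.zeros((len(s_train), 1))])
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=s_train,
                      bounds=[(None, None)] * (L + 1), method="highs")
        return res.x[:L]                              # equalizer taps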

Paper Details

Authors:
Baki Berkay Yilmaz, Alper T. Erdogan
Submitted On:
21 March 2016 - 12:13pm

Document Files

icassp2016bbyhazirae.pdf (231 downloads)

Cite

[1] Baki Berkay Yilmaz, Alper T. Erdogan, "Compressed Training Adaptive Equalization", IEEE SigPort, 2016. [Online]. Available: http://sigport.org/932. Accessed: Aug. 18, 2017.

Adaptive Sparsity Tradeoff for L1-Constraint NLMS Algorithm


Embedding the l1 norm in gradient-based adaptive filtering is a popular solution for sparse plant estimation. Supported by a modal analysis of the adaptive algorithm near steady state, this work shows that the optimal sparsity tradeoff depends on the filter length, plant sparsity, and signal-to-noise ratio. In a practical implementation, these terms are obtained with an unsupervised mechanism that tracks the filter weights. Simulation results demonstrate the robustness and superiority of the novel adaptive-tradeoff sparsity-aware method.
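
The kind of l1-constrained (zero-attracting) NLMS update being analyzed can be written in a few lines. The fixed tradeoff parameter rho below is exactly the quantity the paper proposes to adapt online, so treat it as a placeholder.

    import numpy as np

    def za_nlms_step(w, x, d, mu=0.5, rho=1e-4, eps=1e-8):
        """One zero-attracting NLMS step: normalized LMS update plus an
        l1-gradient (sign) term that attracts small coefficients to zero.
        A fixed rho stands in for the paper's adaptive sparsity tradeoff."""
        e = d - w @ x
        w = w + mu * e * x / (x @ x + eps)   # normalized LMS correction
        w = w - rho * np.sign(w)             # sparsity-promoting attraction
        return w, e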

Paper Details

Authors:
Abdullah Alshabilli, Shihab Jimaa
Submitted On:
19 March 2016 - 12:32pm

Document Files

icassp2016-poster.pdf (167 downloads)

Cite

[1] Abdullah Alshabilli, Shihab Jimaa, "Adaptive Sparsity Tradeoff for L1-Constraint NLMS Algorithm", IEEE SigPort, 2016. [Online]. Available: http://sigport.org/825. Accessed: Aug. 18, 2017.

PERFORMANCE LIMITS OF SINGLE-AGENT AND MULTI-AGENT SUB-GRADIENT STOCHASTIC LEARNING

Paper Details

Authors:
Ali H. Sayed
Submitted On:
19 March 2016 - 11:41am

Document Files

Poster_Bicheng_2016_ICASSP.pdf (148 downloads)

Cite

[1] Ali H. Sayed, "PERFORMANCE LIMITS OF SINGLE-AGENT AND MULTI-AGENT SUB-GRADIENT STOCHASTIC LEARNING", IEEE SigPort, 2016. [Online]. Available: http://sigport.org/820. Accessed: Aug. 18, 2017.

Slides for I-SM-PUAP algorithm presentation


In this presentation, we describe an improved set-membership partial-update affine projection (I-SM-PUAP) algorithm, aiming at accelerating the convergence and decreasing the update rate and the computational complexity of the set-membership partial-update affine projection (SM-PUAP) algorithm. To meet these targets, we constrain the weight-vector perturbation to be bounded by a hypersphere instead of the threshold hyperplanes as in the standard algorithm. We use the distance between the present weight vector and the expected update
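
The underlying set-membership, partial-update mechanics can be illustrated in their simplest (single-constraint, NLMS-like) form. This is a generic sketch with M-max coefficient selection, not the improved affine-projection algorithm of the slides; gamma and M are assumed parameters.

    import numpy as np

    def sm_pu_nlms_step(w, x, d, gamma=0.1, M=8, eps=1e-8):
        """Set-membership partial-update sketch: update only when the error
        exceeds the bound gamma, and then only the M coefficients associated
        with the largest-magnitude regressor entries."""
        e = d - w @ x
        if abs(e) <= gamma:                  # already inside the constraint set
            return w, e, False
        mu = 1.0 - gamma / abs(e)            # smallest step reaching the bound
        idx = np.argsort(np.abs(x))[-M:]     # M-max partial-update selection
        w = w.copy()
        w[idx] += mu * e * x[idx] / (x[idx] @ x[idx] + eps)
        return w, e, True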

Paper Details

Authors:
Paulo S. R. Diniz, Hamed Yazdanpanah
Submitted On:
15 March 2016 - 8:58pm

Document Files

ICASSP2016_Presentation_diniz.pdf (159 downloads)

Cite

[1] Paulo S. R. Diniz, Hamed Yazdanpanah, "Slides for I-SM-PUAP algorithm presentation", IEEE SigPort, 2016. [Online]. Available: http://sigport.org/699. Accessed: Aug. 18, 2017.

Locating Salient Group-Structured Image Features via Adaptive Compressive Sensing


In this paper we consider the task of locating salient group-structured features in potentially high-dimensional images. The salient feature detection is modeled as a Robust Principal Component Analysis (RPCA) problem, in which the aim is to locate groups of outlier columns embedded in an otherwise low-rank matrix.
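
A non-adaptive baseline for this outlier-column model can be sketched directly: project each column of the data matrix onto the orthogonal complement of an estimated low-rank column space and flag columns with large residual energy. The rank r and the threshold are assumptions, and the paper's actual contribution, the adaptive compressive sampling of the matrix, is not reproduced here.

    import numpy as np

    def flag_outlier_columns(X, r, thresh):
        """Flag columns of X that do not lie near the dominant rank-r column
        space (a simple baseline for the group-outlier RPCA model)."""
        U, _, _ = np.linalg.svd(X, full_matrices=False)
        Ur = U[:, :r]                            # estimated low-rank column space
        R = X - Ur @ (Ur.T @ X)                  # residual after projection
        energy = np.sum(R**2, axis=0)            # per-column residual energy
        return np.where(energy > thresh)[0]      # indices of candidate outlier columns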

Paper Details

Authors:
Jarvis Haupt
Submitted On:
23 February 2016 - 1:44pm

Document Files

globalsip_XingguoLi.pdf (222 downloads)

Cite

[1] Jarvis Haupt, "Locating Salient Group-Structured Image Features via Adaptive Compressive Sensing", IEEE SigPort, 2015. [Online]. Available: http://sigport.org/495. Accessed: Aug. 18, 2017.
