
Adaptive Signal Processing

DATA CENSORING WITH SET-MEMBERSHIP ALGORITHMS


In this paper, we use the set-membership normalized least-mean-square (SM-NLMS) algorithm to censor the data set in big data applications. First, we use the distribution of the noise signal and the steady-state excess mean-square error (EMSE) to estimate the threshold that achieves a desired update rate in the single-threshold SM-NLMS (ST-SM-NLMS) algorithm. Then, we introduce the double-threshold SM-NLMS (DT-SM-NLMS) algorithm, which defines an acceptable range for the error signal. This algorithm censors the data with very low and very high output estimation errors.
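The single-threshold censoring rule described above can be sketched in a few lines: a sample triggers an update (and is kept) only when its a priori error exceeds the threshold. This is an illustrative NumPy sketch; the function name, the regularizer `delta`, and the data shapes are assumptions, not taken from the paper.

```python
import numpy as np

def sm_nlms_censor(X, d, gamma, delta=1e-6):
    """Single-threshold SM-NLMS: update the filter (and keep the sample)
    only when the a priori error magnitude exceeds gamma; otherwise the
    sample is censored as carrying no innovation."""
    n_samples, order = X.shape
    w = np.zeros(order)
    kept = []
    for n in range(n_samples):
        x = X[n]
        e = d[n] - w @ x                       # a priori output error
        if abs(e) > gamma:                     # informative sample: update
            mu = 1.0 - gamma / abs(e)          # data-dependent step size
            w += mu * e * x / (delta + x @ x)
            kept.append(n)
    return w, kept
```

As the filter converges, the error falls below `gamma` and most samples are censored, which is exactly the data-reduction effect exploited for big data.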

Paper Details

Authors:
Paulo Sergio Ramirez Diniz, Hamed Yazdanpanah
Submitted On:
10 November 2017 - 12:09pm

Document Files

DATA CENSORING WITH SET-MEMBERSHIP ALGORITHMS


[1] Paulo Sergio Ramirez Diniz, Hamed Yazdanpanah, "DATA CENSORING WITH SET-MEMBERSHIP ALGORITHMS", IEEE SigPort, 2017. [Online]. Available: http://sigport.org/2295. Accessed: Dec. 17, 2017.

The Adaptive Complex Shock Diffusion for Seismic Random Noise Attenuation

Paper Details

Authors:
Hongbo Lin
Submitted On:
9 November 2017 - 9:17pm

Document Files

The Adaptive Complex Shock Diffusion for Seismic Random Noise Attenuation


[1] Hongbo Lin, "The Adaptive Complex Shock Diffusion for Seismic Random Noise Attenuation", IEEE SigPort, 2017. [Online]. Available: http://sigport.org/2277. Accessed: Dec. 17, 2017.

D2L: DECENTRALIZED DICTIONARY LEARNING OVER DYNAMIC NETWORKS

Paper Details

Authors:
Amir Daneshmand, Ying Sun, Gesualdo Scutari, Francisco Facchinei
Submitted On:
9 March 2017 - 1:10am

Document Files

D4L_slides.pdf


[1] Amir Daneshmand, Ying Sun, Gesualdo Scutari, Francisco Facchinei, "D2L: DECENTRALIZED DICTIONARY LEARNING OVER DYNAMIC NETWORKS", IEEE SigPort, 2017. [Online]. Available: http://sigport.org/1716. Accessed: Dec. 17, 2017.

RECURSIVE LEAST-SQUARES ALGORITHMS FOR SPARSE SYSTEM MODELING


In this paper, we propose two sparsity-aware algorithms, namely the recursive least-squares algorithm for sparse systems (S-RLS) and the l0-norm recursive least-squares (l0-RLS) algorithm, in order to exploit the sparsity of an unknown system. The first algorithm applies a discard function to the weight vector to disregard coefficients close to zero during the update process. The second algorithm employs a sparsity-promoting scheme via non-convex approximations to the l0 norm.
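The discard idea can be sketched on top of a standard RLS recursion: run the usual update, then zero out coefficients whose magnitude stays below a threshold. The threshold `eps`, forgetting factor, and initialization below are illustrative choices, not the paper's exact parameterization.

```python
import numpy as np

def s_rls(X, d, eps, lam=0.98, delta=1e2):
    """RLS with a discard function: coefficients whose magnitude stays
    below eps are set to zero during the update, exploiting the sparsity
    of the unknown system."""
    n_samples, order = X.shape
    w = np.zeros(order)
    P = delta * np.eye(order)                  # inverse (weighted) autocorrelation
    for n in range(n_samples):
        x = X[n]
        k = P @ x / (lam + x @ P @ x)          # gain vector
        e = d[n] - w @ x                       # a priori error
        w = w + k * e
        w[np.abs(w) < eps] = 0.0               # discard function
        P = (P - np.outer(k, x @ P)) / lam
    return w
```

On a sparse plant, the discard step keeps the inactive taps exactly at zero while the active taps adapt as in plain RLS.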

Paper Details

Authors:
Hamed Yazdanpanah, Paulo Sergio Ramirez Diniz
Submitted On:
3 March 2017 - 9:25pm

Document Files

RECURSIVE LEAST-SQUARES ALGORITHMS FOR SPARSE SYSTEM MODELING


[1] Hamed Yazdanpanah, Paulo Sergio Ramirez Diniz, "RECURSIVE LEAST-SQUARES ALGORITHMS FOR SPARSE SYSTEM MODELING", IEEE SigPort, 2017. [Online]. Available: http://sigport.org/1621. Accessed: Dec. 17, 2017.

ADAPTIVE MATCHING PURSUIT FOR SPARSE SIGNAL RECOVERY


Spike-and-slab priors have been of much recent interest in signal processing as a means of inducing sparsity in Bayesian inference. Application domains that benefit from these priors include sparse recovery, regression, and classification. It is well known that solving for the sparse coefficient vector to maximize these priors results in a hard, non-convex, mixed-integer programming problem. Most existing solutions to this optimization problem either involve simplifying assumptions/relaxations or are computationally expensive.
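For context, the classical greedy baseline that such methods are compared against, orthogonal matching pursuit, fits in a few lines. This is generic OMP, not the paper's adaptive, prior-driven variant.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily pick the column most
    correlated with the residual, then re-fit by least squares on the
    selected support; repeat k times."""
    residual = y.copy()
    support = []
    x = np.zeros(A.shape[1])
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))
        support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x[support] = coef
    return x
```

In the noiseless, well-conditioned regime OMP typically recovers the support exactly; the priors discussed above aim to do better when measurements are scarce or noisy.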

Paper Details

Authors:
Tiep H. Vu, Hojjat S. Mousavi, Vishal Monga
Submitted On:
27 February 2017 - 9:59pm

Document Files

Poster_ICASSP_2017_AMP.pdf


[1] Tiep H. Vu, Hojjat S. Mousavi, Vishal Monga, "ADAPTIVE MATCHING PURSUIT FOR SPARSE SIGNAL RECOVERY", IEEE SigPort, 2017. [Online]. Available: http://sigport.org/1467. Accessed: Dec. 17, 2017.

Robust Estimation of Self-Exciting Point Process Models with Application to Neuronal Modeling


We consider the problem of estimating discrete self-exciting point process models from limited binary observations, where the history of the process serves as the covariate. We analyze the performance of two classes of estimators, l1-regularized maximum likelihood and greedy estimation, for a discrete version of the Hawkes process, and characterize the sampling tradeoffs required for stable recovery in the non-asymptotic regime. Our results extend those of compressed sensing for linear and generalized linear models with i.i.d. covariates.
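A toy version of this setup can be simulated and fit with an l1-regularized Bernoulli-GLM likelihood, where the lagged spike history plays the role of the covariates. The logistic link, the ISTA solver, and all parameter values below are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

def simulate_discrete_sepp(mu, h, T, rng):
    """Bernoulli spikes whose log-odds depend linearly on the last
    len(h) observations: the process history is the covariate."""
    p_len = len(h)
    y = np.zeros(T)
    for t in range(T):
        hist = y[max(0, t - p_len):t][::-1]     # most recent sample first
        eta = mu + h[:len(hist)] @ hist
        y[t] = rng.random() < 1.0 / (1.0 + np.exp(-eta))
    return y

def l1_logistic_ml(y, p_len, lam=0.001, iters=2000, step=1.0):
    """l1-regularized ML for the history kernel via proximal gradient
    (ISTA) on the normalized logistic negative log-likelihood."""
    T = len(y)
    X = np.zeros((T, p_len + 1))
    X[:, 0] = 1.0                               # intercept (baseline rate)
    for k in range(1, p_len + 1):
        X[k:, k] = y[:T - k]                    # lag-k history column
    theta = np.zeros(p_len + 1)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ theta))
        theta -= step * (X.T @ (p - y) / T)     # gradient step
        # soft-threshold everything except the intercept
        theta[1:] = np.sign(theta[1:]) * np.maximum(
            np.abs(theta[1:]) - step * lam, 0.0)
    return theta
```

With enough samples the estimated kernel recovers the excitatory and inhibitory lags of the true history dependence.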

Paper Details

Authors:
Abbas Kazemipour, Min Wu and Behtash Babadi
Submitted On:
12 December 2016 - 9:35am

Document Files

Robust_SEPP_TSP.pdf


[1] Abbas Kazemipour, Min Wu and Behtash Babadi, "Robust Estimation of Self-Exciting Point Process Models with Application to Neuronal Modeling", IEEE SigPort, 2016. [Online]. Available: http://sigport.org/1261. Accessed: Dec. 17, 2017.

The recursive Hessian sketch for adaptive filtering


[Figure: the recursive Hessian sketch for adaptive filtering as a block diagram]

We introduce in this paper the recursive Hessian sketch, a new adaptive filtering algorithm based on sketching the same exponentially weighted least squares problem solved by the recursive least squares algorithm. The algorithm maintains a number of sketches of the inverse autocorrelation matrix and recursively updates them at random intervals. These are in turn used to update the unknown filter estimate. The complexity of the proposed algorithm compares favorably to that of recursive least squares.
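The core idea of sketching a least-squares problem can be illustrated with a one-shot Gaussian sketch (sketch-and-solve): project the tall system onto a few random rows and solve the small problem. This is the generic, non-recursive illustration of least-squares sketching, not the paper's recursive algorithm.

```python
import numpy as np

def sketched_ls(A, b, m, rng):
    """Sketch-and-solve least squares: compress the n-row problem to m
    sketched rows with a Gaussian sketch S, then solve the small system.
    For m well above the number of unknowns, the solution is close to
    the exact least-squares solution at a fraction of the cost."""
    S = rng.standard_normal((m, A.shape[0])) / np.sqrt(m)
    w, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)
    return w
```

The recursive algorithm of the paper maintains and updates such sketches over time instead of recomputing them, which is what makes the comparison with recursive least squares meaningful.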

Paper Details

Authors:
Robin Scheibler, Martin Vetterli
Submitted On:
29 March 2016 - 4:50am

Document Files

icassp2016_slides.pdf


[1] Robin Scheibler, Martin Vetterli, "The recursive Hessian sketch for adaptive filtering", IEEE SigPort, 2016. [Online]. Available: http://sigport.org/1064. Accessed: Dec. 17, 2017.

Compressed Training Adaptive Equalization



We introduce compressed training adaptive equalization as a novel approach for reducing the number of training symbols in a communication packet. The proposed semi-blind approach exploits the special magnitude boundedness of communication symbols. The algorithms are derived from a convex optimization setting based on the l_\infty norm. The framework has a direct link with the compressive sensing literature, established by invoking the duality between the l_1 and l_\infty norms.

Paper Details

Authors:
Baki Berkay Yilmaz, Alper T. Erdogan
Submitted On:
21 March 2016 - 12:13pm

Document Files

icassp2016bbyhazirae.pdf


[1] Baki Berkay Yilmaz, Alper T. Erdogan, "Compressed Training Adaptive Equalization", IEEE SigPort, 2016. [Online]. Available: http://sigport.org/932. Accessed: Dec. 17, 2017.

Adaptive Sparsity Tradeoff for L1-Constraint NLMS Algorithm


Embedding the l1 norm in gradient-based adaptive filtering is a popular solution for sparse plant estimation. Supported by a modal analysis of the adaptive algorithm near steady state, this work shows that the optimal sparsity tradeoff depends on the filter length, plant sparsity, and signal-to-noise ratio. In a practical implementation, these terms are obtained with an unsupervised mechanism that tracks the filter weights. Simulation results demonstrate the robustness and superiority of the novel adaptive-tradeoff sparsity-aware method.
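The l1-constraint mechanism can be sketched as an NLMS update plus a zero-attractor (l1 subgradient) term. Here the tradeoff `rho` is a fixed illustrative constant, whereas the work above adapts it online from the filter weights.

```python
import numpy as np

def za_nlms(X, d, mu=0.5, rho=1e-4, delta=1e-6):
    """Zero-attracting NLMS: the standard NLMS correction plus an l1
    zero attractor -rho*sign(w) that pulls inactive taps toward zero.
    rho sets the sparsity tradeoff (fixed here for simplicity)."""
    n_samples, order = X.shape
    w = np.zeros(order)
    for n in range(n_samples):
        x = X[n]
        e = d[n] - w @ x                       # a priori error
        w += mu * e * x / (delta + x @ x) - rho * np.sign(w)
    return w
```

A too-large `rho` biases the active taps, a too-small one wastes the sparsity prior; that dependence on filter length, sparsity, and SNR is precisely what the adaptive tradeoff addresses.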

Paper Details

Authors:
Abdullah Alshabilli, Shihab Jimaa
Submitted On:
19 March 2016 - 12:32pm

Document Files

icassp2016-poster.pdf


[1] Abdullah Alshabilli, Shihab Jimaa, "Adaptive Sparsity Tradeoff for L1-Constraint NLMS Algorithm", IEEE SigPort, 2016. [Online]. Available: http://sigport.org/825. Accessed: Dec. 17, 2017.

PERFORMANCE LIMITS OF SINGLE-AGENT AND MULTI-AGENT SUB-GRADIENT STOCHASTIC LEARNING

Paper Details

Authors:
Ali H. Sayed
Submitted On:
19 March 2016 - 11:41am

Document Files

Poster_Bicheng_2016_ICASSP.pdf


[1] Ali H. Sayed, "PERFORMANCE LIMITS OF SINGLE-AGENT AND MULTI-AGENT SUB-GRADIENT STOCHASTIC LEARNING", IEEE SigPort, 2016. [Online]. Available: http://sigport.org/820. Accessed: Dec. 17, 2017.
