
Machine Learning for Signal Processing

Sparse Modeling


Sparse Modeling in Image Processing and Deep Learning

Sparse approximation is a well-established theory, with a profound impact on the fields of signal and image processing. In this talk we start by presenting this model and its features, and then turn to describe two special cases of it – the convolutional sparse coding (CSC) and its multi-layered version (ML-CSC). Amazingly, as we will carefully show, ML-CSC provides a solid theoretical foundation to … deep learning.
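As a concrete illustration of the pursuit problem at the heart of this model, here is a minimal NumPy sketch of sparse coding via iterative soft-thresholding (ISTA). The dictionary, signal, and parameters are synthetic stand-ins for illustration, not material from the talk.

```python
import numpy as np

def ista(D, x, lam=0.05, n_iter=300):
    """Iterative soft-thresholding (ISTA): find a sparse z with x ~= D @ z."""
    L = np.linalg.norm(D, 2) ** 2                  # Lipschitz constant of the gradient
    z = np.zeros(D.shape[1])
    for _ in range(n_iter):
        z = z - D.T @ (D @ z - x) / L              # gradient step on the data term
        z = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return z

rng = np.random.default_rng(0)
D = rng.standard_normal((20, 50))
D /= np.linalg.norm(D, axis=0)                     # unit-norm dictionary atoms
z_true = np.zeros(50)
z_true[[3, 17, 41]] = [1.5, -2.0, 1.0]             # a 3-sparse representation
x = D @ z_true                                     # signal living in the model
z_hat = ista(D, x)                                 # recovered sparse code
```

With the signal exactly 3-sparse over the dictionary, the pursuit recovers a code that is both sparse and an accurate reconstruction of x.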

Paper Details

Authors: Michael Elad
Submitted On: 22 December 2017 - 1:26pm

Document Files

ICIP_KeyNote_Talk_small size.pdf

[1] Michael Elad, "Sparse Modeling", IEEE SigPort, 2017. [Online]. Available: http://sigport.org/2260. Accessed: Oct. 29, 2020.

ESRGAN+ : Further Improving Enhanced Super-Resolution Generative Adversarial Network


Enhanced Super-Resolution Generative Adversarial Network (ESRGAN) is a perceptually driven approach to single-image super-resolution that produces photorealistic images. Despite the visual quality of the generated images, there is still room for improvement. To this end, we extend the model to further improve the perceptual quality of its outputs. We design a network architecture with a novel basic block to replace the one used by the original ESRGAN. Moreover, we introduce noise inputs to the generator network in order to exploit stochastic variation.
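The effect of noise inputs can be illustrated with a toy sketch. A dense layer stands in for the convolutional block here; this is an assumption-laden simplification for illustration, not the ESRGAN+ architecture.

```python
import numpy as np

def residual_block_with_noise(x, W, noise_scale, rng):
    """Toy residual block with a noise input: a dense layer stands in for the
    convolution, and Gaussian noise scaled by a (notionally learnable) factor
    is added to the features to inject stochastic variation."""
    h = np.maximum(W @ x, 0.0)                          # conv + ReLU stand-in
    h = h + noise_scale * rng.standard_normal(h.shape)  # per-call noise input
    return x + h                                        # residual connection

rng = np.random.default_rng(0)
x = rng.standard_normal(8)
W = 0.1 * rng.standard_normal((8, 8))

# noise_scale = 0 recovers a deterministic block; noise_scale > 0 makes two
# forward passes of the same input differ (stochastic variation in the output)
y1 = residual_block_with_noise(x, W, 0.0, np.random.default_rng(1))
y2 = residual_block_with_noise(x, W, 0.0, np.random.default_rng(2))
z1 = residual_block_with_noise(x, W, 0.5, np.random.default_rng(1))
z2 = residual_block_with_noise(x, W, 0.5, np.random.default_rng(2))
```

Setting the scale to zero recovers the deterministic generator, so the noise path can be learned per block without changing the architecture's baseline behaviour.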

Paper Details

Submitted On: 3 June 2020 - 8:27am

Document Files

Presentation ICASSP 2020.pdf

[1] "ESRGAN+ : Further Improving Enhanced Super-Resolution Generative Adversarial Network", IEEE SigPort, 2020. [Online]. Available: http://sigport.org/5450. Accessed: Oct. 29, 2020.

Attention based Curiosity-driven Exploration in Deep Reinforcement Learning


Reinforcement learning enables training an agent through interaction with its environment. However, in the majority of real-world scenarios, the extrinsic feedback is sparse or insufficient, so intrinsic reward formulations are needed to train the agent successfully. This work investigates and extends the paradigm of curiosity-driven exploration. First, a probabilistic approach is taken to exploit the advantages of the attention mechanism, which has been applied successfully in other domains of deep learning.
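A minimal sketch of the curiosity-driven idea: the intrinsic reward is the prediction error of a learned forward model (ICM-style), so poorly predicted, novel transitions earn a larger bonus. The linear forward model and numbers below are hypothetical stand-ins, not the paper's attention-based formulation.

```python
import numpy as np

def intrinsic_reward(forward_model, state, action, next_state):
    """Curiosity bonus: squared prediction error of a forward dynamics model."""
    predicted = forward_model(state, action)
    return 0.5 * np.sum((predicted - next_state) ** 2)

# Hypothetical linear forward model s' = A s + B a, purely for illustration
A = 0.9 * np.eye(3)
B = np.array([[1.0], [0.0], [0.5]])
model = lambda s, a: A @ s + B @ a

s = np.array([1.0, -0.5, 0.2])
a = np.array([0.3])

familiar = model(s, a)                        # transition the model predicts exactly
novel = familiar + np.array([0.0, 2.0, 0.0])  # surprising transition

r_familiar = intrinsic_reward(model, s, a, familiar)
r_novel = intrinsic_reward(model, s, a, novel)

# the agent trains on the sum of the sparse extrinsic reward and the bonus
r_extrinsic = 0.0
r_total = r_extrinsic + r_novel
```

Even when the extrinsic reward is zero, the bonus gives the agent a non-zero training signal that steers it toward transitions its model cannot yet predict.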

Paper Details

Authors: Patrik Reizinger, Márton Szemenyei
Submitted On: 17 May 2020 - 3:59pm

Document Files

ICASSP_presentation.pdf

[1] Patrik Reizinger, Márton Szemenyei, "Attention based Curiosity-driven Exploration in Deep Reinforcement Learning", IEEE SigPort, 2020. [Online]. Available: http://sigport.org/5386. Accessed: Oct. 29, 2020.

MULTI-LABEL CONSISTENT CONVOLUTIONAL TRANSFORM LEARNING: APPLICATION TO NON-INTRUSIVE LOAD MONITORING

Paper Details

Authors: Shikha Singh, Jyoti Maggu, Angshul Majumdar, Emilie Chouzenoux, Giovanni Chierchia
Submitted On: 16 May 2020 - 11:32am

Document Files

ICASSP_PPT20.pdf

[1] Shikha Singh, Jyoti Maggu, Angshul Majumdar, Emilie Chouzenoux, Giovanni Chierchia, "MULTI-LABEL CONSISTENT CONVOLUTIONAL TRANSFORM LEARNING: APPLICATION TO NON-INTRUSIVE LOAD MONITORING", IEEE SigPort, 2020. [Online]. Available: http://sigport.org/5376. Accessed: Oct. 29, 2020.

SEQUENCE-TO-SUBSEQUENCE LEARNING WITH CONDITIONAL GAN FOR POWER DISAGGREGATION


Non-intrusive load monitoring (a.k.a. power disaggregation) refers to identifying and extracting the consumption patterns of individual appliances from the mains signal, which records the whole-house energy consumption. Recently, deep learning has been shown to be a promising approach to this problem, and many methods based on it have been proposed.
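The sequence-to-subsequence training layout can be sketched as a slicing step: the network sees a long window of the mains signal and predicts a shorter, centered subsequence of one appliance's consumption. The window lengths, step, and synthetic arrays below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def make_seq2subseq_pairs(mains, appliance, in_len, out_len):
    """Slice whole-house mains readings into (input window, centered target
    subsequence) training pairs for a sequence-to-subsequence model."""
    pad = (in_len - out_len) // 2                 # context kept on each side
    X, Y = [], []
    for start in range(0, len(mains) - in_len + 1, out_len):
        X.append(mains[start:start + in_len])                  # long input
        Y.append(appliance[start + pad:start + pad + out_len])  # short target
    return np.array(X), np.array(Y)

mains = np.arange(100.0)            # stand-in for aggregate power readings
appliance = 0.1 * np.arange(100.0)  # stand-in for one appliance's ground truth
X, Y = make_seq2subseq_pairs(mains, appliance, in_len=20, out_len=10)
```

Stepping by the output length makes consecutive targets tile the recording without overlap, while each input window still supplies extra context on both sides of its target.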

Paper Details

Authors: Yungang Pan, Ke Liu, Zhaoyan Shen, Xiaojun Cai, Zhiping Jia
Submitted On: 15 May 2020 - 5:10am

Document Files

SEQUENCE-TO-SUBSEQUENCE LEARNING WITH CONDITIONAL GAN FOR POWER DISAGGREGATION.pdf

[1] Yungang Pan, Ke Liu, Zhaoyan Shen, Xiaojun Cai, Zhiping Jia, "SEQUENCE-TO-SUBSEQUENCE LEARNING WITH CONDITIONAL GAN FOR POWER DISAGGREGATION", IEEE SigPort, 2020. [Online]. Available: http://sigport.org/5344. Accessed: Oct. 29, 2020.

FAST BLOCK-SPARSE ESTIMATION FOR VECTOR NETWORKS


While there is now a significant literature on sparse inverse covariance estimation, that literature, with only a couple of exceptions, has dealt only with univariate (or scalar) networks, where each node carries a univariate signal. In many, perhaps most, applications, however, each node may carry multivariate signals representing multi-attribute data, possibly of different dimensions. Modelling such multivariate (or vector) networks requires fitting block-sparse inverse covariance matrices. Here we achieve maximal block sparsity by maximizing a block-l0-sparse penalized likelihood.
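To illustrate what block sparsity means here (this is not the paper's estimator), the sketch below zeroes off-diagonal blocks of a precision matrix whose Frobenius norm falls below a threshold, a crude block-l0 projection; the block size, threshold, and matrix are hypothetical.

```python
import numpy as np

def block_threshold(Theta, b, tau):
    """Crude block-l0 projection: zero every off-diagonal b-by-b block of the
    precision matrix whose Frobenius norm is below tau. The paper's actual
    estimator maximizes a block-l0-penalized likelihood; this only shows the
    block-sparsity pattern such a penalty enforces."""
    p = Theta.shape[0] // b
    out = Theta.copy()
    for i in range(p):
        for j in range(p):
            if i != j:
                blk = np.s_[i * b:(i + 1) * b, j * b:(j + 1) * b]
                if np.linalg.norm(out[blk]) < tau:
                    out[blk] = 0.0
    return out

# 3 nodes, each carrying a 2-dimensional signal -> a 6x6 precision matrix
Theta = np.eye(6)
Theta[0:2, 2:4] = Theta[2:4, 0:2] = 0.4   # nodes 0 and 1 genuinely interact
Theta += 0.01 * np.ones((6, 6))           # small spurious entries everywhere
sparse = block_threshold(Theta, b=2, tau=0.1)
```

A zeroed block states that the two nodes' vector signals are conditionally independent given the rest of the network, which is the structural information the estimator is after.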

Paper Details

Submitted On: 15 May 2020 - 3:00am

Document Files

blockSpGGM.pdf

[1] "FAST BLOCK-SPARSE ESTIMATION FOR VECTOR NETWORKS", IEEE SigPort, 2020. [Online]. Available: http://sigport.org/5340. Accessed: Oct. 29, 2020.

LARGE-SCALE TIME SERIES CLUSTERING WITH k-ARs


Time-series clustering groups homogeneous time series together based on a chosen similarity measure. The mixture AR model (MxAR) has already been developed for time-series clustering, as has an associated EM algorithm. However, this EM clustering algorithm fails to perform satisfactorily in large-scale applications due to its high computational complexity. This paper proposes a new algorithm, k-ARs, which is a limiting version of the existing EM algorithm.
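A minimal sketch of the k-means-style alternation suggested by the name: fit one AR model per cluster, then hard-assign each series to the model with the smallest one-step prediction error. The AR(1) models, pooled least-squares fit, and synthetic data are simplifying assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def fit_ar1(cluster_series):
    """Pooled least-squares AR(1) coefficient over the series in one cluster."""
    num = sum(float(s[1:] @ s[:-1]) for s in cluster_series)
    den = sum(float(s[:-1] @ s[:-1]) for s in cluster_series)
    return num / den

def k_ars(series, k, n_iter=10):
    """k-means-style alternation (a hard-assignment limit of EM for mixture
    AR): fit one AR(1) per cluster, then reassign each series to the model
    with the smallest one-step prediction error."""
    labels = np.arange(len(series)) % k            # simple deterministic init
    for _ in range(n_iter):
        coeffs = []
        for c in range(k):
            members = [s for s, lab in zip(series, labels) if lab == c]
            coeffs.append(fit_ar1(members if members else [series[c]]))
        errs = np.array([[np.mean((s[1:] - a * s[:-1]) ** 2) for a in coeffs]
                         for s in series])
        labels = errs.argmin(axis=1)               # hard reassignment
    return labels, coeffs

rng = np.random.default_rng(42)

def simulate_ar1(a, T=300):
    s = np.zeros(T)
    for t in range(1, T):
        s[t] = a * s[t - 1] + rng.standard_normal()
    return s

# two homogeneous groups of series with opposite AR(1) dynamics
series = [simulate_ar1(0.9) for _ in range(5)] + [simulate_ar1(-0.9) for _ in range(5)]
labels, coeffs = k_ars(series, k=2)
```

Each pass costs only one least-squares fit per cluster and one error evaluation per series, which is the kind of saving over full EM that makes a limiting, hard-assignment scheme attractive at scale.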

Paper Details

Submitted On: 15 May 2020 - 2:55am

Document Files

kARs.pdf

[1] "LARGE-SCALE TIME SERIES CLUSTERING WITH k-ARs", IEEE SigPort, 2020. [Online]. Available: http://sigport.org/5339. Accessed: Oct. 29, 2020.

Generative pre-training for speech with autoregressive predictive coding

Paper Details

Authors: Yu-An Chung, James Glass
Submitted On: 14 May 2020 - 5:56pm

Document Files

icassp-20.generative.slides.pdf

[1] Yu-An Chung, James Glass, "Generative pre-training for speech with autoregressive predictive coding", IEEE SigPort, 2020. [Online]. Available: http://sigport.org/5323. Accessed: Oct. 29, 2020.

A Generalization of Principal Component Analysis [POSTER]

Paper Details

Authors: Samuele Battaglino, Erdem Koyuncu
Submitted On: 14 May 2020 - 12:48pm

Document Files

Poster.pdf

[1] Samuele Battaglino, Erdem Koyuncu, "A Generalization of Principal Component Analysis [POSTER]", IEEE SigPort, 2020. [Online]. Available: http://sigport.org/5317. Accessed: Oct. 29, 2020.

FAST CLUSTERING WITH CO-CLUSTERING VIA DISCRETE NON-NEGATIVE MATRIX FACTORIZATION FOR IMAGE IDENTIFICATION

Paper Details

Submitted On: 14 May 2020 - 12:02pm

Document Files

math-beamer.pdf

[1] "FAST CLUSTERING WITH CO-CLUSTERING VIA DISCRETE NON-NEGATIVE MATRIX FACTORIZATION FOR IMAGE IDENTIFICATION", IEEE SigPort, 2020. [Online]. Available: http://sigport.org/5313. Accessed: Oct. 29, 2020.
