- Bayesian learning; Bayesian signal processing (MLR-BAYL)
- Bounds on performance (MLR-PERF)
- Applications in Systems Biology (MLR-SYSB)
- Applications in Music and Audio Processing (MLR-MUSI)
- Applications in Data Fusion (MLR-FUSI)
- Cognitive information processing (MLR-COGP)
- Distributed and Cooperative Learning (MLR-DIST)
- Learning theory and algorithms (MLR-LEAR)
- Neural network learning (MLR-NNLR)
- Information-theoretic learning (MLR-INFO)
- Independent component analysis (MLR-ICAN)
- Graphical and kernel methods (MLR-GRKN)
- Other applications of machine learning (MLR-APPL)
- Pattern recognition and classification (MLR-PATT)
- Source separation (MLR-SSEP)
- Sequential learning; sequential decision methods (MLR-SLER)

## Fast Exemplar Selection Algorithm for Matrix Approximation and Representation: A Variant of oASIS Algorithm

Extracting inherent patterns from large data using decompositions of the data matrix by a sampled subset of exemplars has found many applications in machine learning. We propose a computationally efficient algorithm for adaptive exemplar sampling, called fast exemplar selection (FES). The proposed algorithm can be seen as an efficient variant of the oASIS algorithm (Patel et al.). FES iteratively selects incoherent exemplars based on the exemplars that are already sampled. This is done by ensuring that the selected exemplars form a positive
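Exemplar-selection methods of this family build a Nyström-style approximation of a symmetric PSD matrix from the sampled columns. Below is a minimal sketch of greedy exemplar (column) selection via incremental pivoted Cholesky, choosing at each step the column with the largest diagonal residual. This is a generic illustration of the idea, not the exact FES or oASIS update rule:

```python
import numpy as np

def greedy_exemplar_selection(K, k, eps=1e-10):
    """Greedily pick up to k exemplar columns of a symmetric PSD matrix K.

    Uses incremental pivoted Cholesky: the selection score is the diagonal
    of the current Nystrom residual K - G G^T. Generic sketch, not the
    authors' exact FES/oASIS criterion.
    """
    n = K.shape[0]
    selected = []
    residual_diag = np.diag(K).astype(float).copy()
    G = np.zeros((n, 0))  # factors of the current approximation K ~ G G^T
    for _ in range(k):
        i = int(np.argmax(residual_diag))
        if residual_diag[i] < eps:
            break  # remaining columns are already well approximated
        # new factor column from the Schur complement of the selected set
        g = (K[:, i] - G @ G[i, :]) / np.sqrt(residual_diag[i])
        G = np.hstack([G, g[:, None]])
        residual_diag -= g ** 2
        residual_diag[i] = 0.0
        selected.append(i)
    return selected, G
```

The returned factors give the low-rank approximation `K ≈ G @ G.T` from only the selected exemplars; on an exactly rank-`k` matrix the approximation is exact after `k` selections.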

## Approximate Support Recovery of Atomic Line Spectral Estimation: A Tale of Resolution and Precision

This work investigates the parameter estimation performance of super-resolution line spectral estimation using atomic norm minimization. The focus is on analyzing the algorithm's accuracy of inferring the frequencies and complex magnitudes from noisy observations. When the Signal-to-Noise Ratio is reasonably high and the true frequencies are separated by $O(\frac{1}{n})$, the atomic norm estimator is shown to localize the correct number of frequencies, each within a neighborhood of size $O(\sqrt{\frac{\log n}{n^3}} \sigma)$ of one of the true frequencies.
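For reference, the atomic norm estimator discussed above is commonly posed as the following denoising program (a standard formulation from the line spectral estimation literature; the paper's exact setup and choice of regularization parameter $\tau$ may differ):

```latex
% Observations: y = x^\star + w, with x^\star_j = \sum_k c_k e^{i 2\pi f_k j}
% Atomic norm denoising:
\hat{x} = \arg\min_{x \in \mathbb{C}^n} \; \frac{1}{2}\lVert y - x \rVert_2^2 + \tau \lVert x \rVert_{\mathcal{A}},
\qquad
\lVert x \rVert_{\mathcal{A}} = \inf\Big\{ \sum_k \lvert c_k \rvert \,:\, x = \sum_k c_k\, a(f_k),\; c_k \in \mathbb{C} \Big\},
% where a(f) = (1, e^{i 2\pi f}, \ldots, e^{i 2\pi f (n-1)})^{\mathsf{T}} are the atoms.
```

The frequencies and complex magnitudes are then read off from the decomposition achieving the infimum (in practice, via the dual polynomial of an equivalent semidefinite program).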

## Minimum-Volume Weighted Symmetric Nonnegative Matrix Factorization for Clustering

In recent years, nonnegative matrix factorization (NMF) has attracted much attention in the machine learning and signal processing fields due to its interpretable representation of data in a low-dimensional subspace. For clustering problems, symmetric nonnegative matrix factorization (SNMF), an extension of NMF, factorizes the similarity matrix of the data points directly and outperforms NMF when dealing with nonlinear data structures. However, the clustering results of SNMF are very sensitive to noisy data.
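As background for the SNMF baseline the paper extends, here is a minimal sketch of plain SNMF with the standard damped multiplicative update (Ding et al.-style). This illustrates only the baseline factorization, not the minimum-volume weighted variant proposed in the paper:

```python
import numpy as np

def snmf(A, k, iters=1000, beta=0.5, seed=0):
    """Plain symmetric NMF:  min_{H >= 0} ||A - H H^T||_F^2.

    A is a symmetric nonnegative similarity matrix; H is n x k.
    Uses the damped multiplicative update H <- H * (1 - beta + beta * (AH)/(H H^T H)).
    Sketch of the SNMF baseline only, not the paper's weighted variant.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    H = rng.random((n, k))
    for _ in range(iters):
        numer = A @ H
        denom = H @ (H.T @ H) + 1e-12  # small constant avoids division by zero
        H *= (1.0 - beta) + beta * numer / denom
    return H
```

For clustering, the label of point `i` is typically taken as `H[i].argmax()`, so each column of `H` plays the role of a soft cluster indicator.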

## Recurrent neural networks for polyphonic sound event detection in real life recordings

Slides from the presentation held at ICASSP 2016 for the paper: Recurrent neural networks for polyphonic sound event detection in real life recordings

## FILTERBANK LEARNING USING CONVOLUTIONAL RESTRICTED BOLTZMANN MACHINE FOR SPEECH RECOGNITION

A Convolutional Restricted Boltzmann Machine (ConvRBM) as a model for the speech signal is presented in this paper. We have developed ConvRBM with sampling from noisy rectified linear units (NReLUs). ConvRBM is trained in an unsupervised way to model speech signals of arbitrary lengths. The weights of the model can represent an auditory-like filterbank. Our
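The NReLU sampling step mentioned above follows Nair and Hinton's noisy rectified linear units: a unit with pre-activation x is sampled as max(0, x + ε) with ε drawn from a Gaussian whose variance is sigmoid(x). A minimal sketch of just this sampling step (the convolutional weight updates of the ConvRBM are not shown):

```python
import numpy as np

def sample_nrelu(x, rng):
    """Sample noisy rectified linear units (Nair & Hinton, 2010):
    y = max(0, x + eps), with eps ~ N(0, sigmoid(x)).

    x: array of pre-activations; rng: numpy Generator.
    Sketch of the hidden-unit sampling step used in ConvRBM training.
    """
    sigma2 = 1.0 / (1.0 + np.exp(-x))             # noise variance = sigmoid(x)
    noise = rng.standard_normal(x.shape) * np.sqrt(sigma2)
    return np.maximum(0.0, x + noise)
```

For strongly positive pre-activations the noise variance saturates near 1 and the unit behaves like a noisy linear unit; for strongly negative ones the output is almost always clipped to zero.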

## poster.pdf


The problem of estimating sparse eigenvectors of a symmetric matrix has attracted a lot of attention in many applications, especially those with high-dimensional data sets. While classical eigenvectors can be obtained as the solution of a maximization
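A common and simple baseline for the sparse-eigenvector problem described above is power iteration with hard thresholding (truncated power iteration). The sketch below illustrates that generic idea under the assumption of a known sparsity level k; it is not the specific algorithm of this poster:

```python
import numpy as np

def truncated_power_iteration(S, k, iters=200, seed=0):
    """Estimate a k-sparse leading eigenvector of a symmetric matrix S.

    Alternates a power-iteration step with hard thresholding that keeps
    only the k largest-magnitude coordinates. Generic sparse-PCA sketch.
    """
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(S.shape[0])
    x /= np.linalg.norm(x)
    for _ in range(iters):
        y = S @ x
        idx = np.argsort(np.abs(y))[:-k]  # indices outside the top-k magnitudes
        y[idx] = 0.0                       # hard-threshold them to zero
        x = y / np.linalg.norm(y)
    return x
```

Once the support stabilizes, the iteration reduces to ordinary power iteration on the k-by-k principal submatrix, so it inherits the usual linear convergence rate governed by the eigengap.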

## SPCAposter.pdf

## Symmetric Matrix Perturbation For Differentially-Private Principal Component Analysis

Differential privacy is a strong, cryptographically-motivated definition of privacy that has recently received a significant amount of research attention for its robustness to known attacks. The principal component analysis (PCA) algorithm is frequently used in signal processing, machine learning and statistics pipelines. In this paper, we propose a new algorithm for differentially-private computation of PCA and compare the performance empirically with some recent state-of-the-art algorithms on different data sets.
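The general shape of input-perturbation approaches to differentially private PCA is: form the sample second-moment matrix, add a symmetric random noise matrix, and eigendecompose the result. The sketch below illustrates that pattern only; the noise scale shown is a placeholder, and the paper's calibrated mechanism and privacy analysis may differ:

```python
import numpy as np

def dp_pca(X, d, epsilon, rng):
    """Differentially private PCA via symmetric matrix perturbation (sketch).

    X: n x p data matrix (privacy calibration typically assumes rows have
    bounded L2 norm); d: number of components; epsilon: privacy parameter.
    The noise scale below is a placeholder, not a proven calibration.
    """
    n, p = X.shape
    A = (X.T @ X) / n                       # sample second-moment matrix
    sigma = 1.0 / (n * epsilon)             # placeholder noise scale
    E = rng.standard_normal((p, p)) * sigma
    E = np.triu(E) + np.triu(E, 1).T        # symmetrize the noise matrix
    vals, vecs = np.linalg.eigh(A + E)      # eigenvalues in ascending order
    return vecs[:, -d:]                     # top-d perturbed principal directions
```

Symmetrizing the noise keeps the perturbed matrix symmetric, so its eigenvectors remain real and orthonormal; utility then degrades gracefully as epsilon shrinks and the noise grows.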

## Audio Word Similarity for Clustering with Zero Resources based on iterative HMM Classification

Recent work on zero resource word discovery makes intensive use of audio fragment clustering to find repeating speech patterns. In the absence of acoustic models, the clustering step traditionally relies on dynamic time warping (DTW) to compare two samples and thus suffers from the known limitations of this technique. We propose a new sample comparison method, called 'similarity by iterative classification', that exploits the modeling capacities of hidden Markov models (HMM) with no supervision.

## Discriminant Correlation Analysis for Feature Level Fusion with Application to Multimodal Biometrics

In this paper, we present Discriminant Correlation Analysis (DCA), a feature-level fusion technique that incorporates class associations into the correlation analysis of the feature sets. DCA performs effective feature fusion by maximizing the pairwise correlations across the two feature sets while simultaneously eliminating between-class correlations and restricting the correlations to be within classes.