We consider the problem of learning smooth multivariate probability density functions. Invoking the canonical decomposition of multivariate functions, we show that if a joint probability density function admits a truncated Fourier series representation, then the classical univariate Fejér-Riesz Representation Theorem can be used to learn bona fide joint densities. We propose a scalable, flexible, and direct framework for learning smooth multivariate probability density functions, even from potentially incomplete datasets.
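The Fejér-Riesz connection can be illustrated in one dimension: any nonnegative trigonometric polynomial is the squared magnitude of another trigonometric polynomial, so parametrizing a density as such a squared magnitude guarantees nonnegativity by construction. The sketch below (function names and the unit-interval support are assumptions for illustration, not the paper's formulation) evaluates such a density and normalizes it to integrate to one:

```python
import numpy as np

def fejer_riesz_density(coeffs, x):
    """Evaluate a bona fide density on [0, 1] as the squared magnitude of a
    trigonometric polynomial p(x) = sum_k c_k exp(2*pi*i*k*x).

    |p(x)|^2 is nonnegative by construction, and because the complex
    exponentials are orthonormal on [0, 1], dividing by sum_k |c_k|^2
    normalizes the density to integrate to one.
    """
    k = np.arange(len(coeffs))
    p = np.exp(2j * np.pi * np.outer(x, k)) @ coeffs
    return np.abs(p) ** 2 / np.sum(np.abs(coeffs) ** 2)

# toy example: random coefficients still yield a valid density
rng = np.random.default_rng(0)
c = rng.standard_normal(4) + 1j * rng.standard_normal(4)
xs = np.linspace(0.0, 1.0, 2000, endpoint=False)
f = fejer_riesz_density(c, xs)
```

Because the grid is uniform and the integrand is a low-degree trigonometric polynomial, the sample mean of `f` equals its integral over [0, 1], which is one by the normalization above.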

In this paper, we propose a method based on deep algorithm unrolling for restoring time-varying graph signals, i.e., signals on a graph whose values change over time. Deep algorithm unrolling learns the parameters of an iterative optimization algorithm with deep learning techniques; it is expected to improve convergence speed and accuracy while keeping the iterative steps interpretable. In the proposed method, the minimization problem is formulated so that the time-varying graph signal is smooth in both the temporal and spatial (graph) domains.
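Unrolling turns each iteration of an optimization solver into one "layer" whose tuning constants are learned from data. A minimal sketch of the model-based part, assuming a quadratic objective with fidelity, graph-Laplacian (spatial), and first-difference (temporal) smoothness terms — the variable names and objective are illustrative assumptions, not the paper's exact formulation — with fixed step sizes standing in for the parameters that unrolling would learn:

```python
import numpy as np

def unrolled_smoother(Y, L, alpha, beta, step_sizes):
    """Unrolled gradient descent for time-varying graph signal restoration.

    Minimizes 0.5*||X - Y||^2 + 0.5*alpha*tr(X^T L X) + 0.5*beta*||X D^T||^2:
    data fidelity + spatial smoothness (graph Laplacian L) + temporal
    smoothness (first differences D across time columns). Each entry of
    `step_sizes` plays the role of a per-layer parameter that deep
    algorithm unrolling would learn end-to-end.
    """
    n, T = Y.shape
    D = np.diff(np.eye(T), axis=0)        # (T-1) x T temporal difference
    M = D.T @ D                           # temporal Laplacian
    X = Y.copy()
    for mu in step_sizes:                 # one unrolled "layer" per step
        grad = (X - Y) + alpha * (L @ X) + beta * (X @ M)
        X = X - mu * grad
    return X

# toy example: 4-node path graph, 5 time steps, noisy sinusoid
A = np.diag([1.0, 1.0, 1.0], 1); A = A + A.T
L = np.diag(A.sum(1)) - A
rng = np.random.default_rng(0)
Y = np.sin(np.linspace(0, np.pi, 5))[None, :] + 0.3 * rng.standard_normal((4, 5))
X = unrolled_smoother(Y, L, alpha=0.5, beta=0.5, step_sizes=[0.2] * 10)
```

With a small enough step size each layer decreases the objective, so even this untrained stack smooths the signal; training would instead fit the per-layer steps (and possibly the regularization weights) to data.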

In the past two decades, convex optimization has gained increasing popularity in signal processing and communications, as many fundamental problems in this area can be modelled, analyzed, and solved using convex optimization theory and algorithms. In emerging large-scale applications such as compressed sensing, massive MIMO, and machine learning, the underlying optimization problems often exhibit convexity; however, classical interior-point methods do not scale well with the problem dimensions.

The unlabeled sensing problem is to solve a noisy linear system of equations under unknown permutation of the measurements. We study a particular case of the problem where the

Simulated annealing (SA) is a widely used approach to solve global optimization problems in signal processing. The initial non-convex problem is recast as the exploration of a sequence of Boltzmann probability distributions, which are increasingly harder to sample from. They are parametrized by a temperature that is iteratively decreased, following the so-called cooling schedule. Convergence results of SA methods usually require the cooling schedule to be set a priori with slow decay. In this work, we introduce a new SA approach that selects the cooling schedule on the fly.
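The basic SA loop accepts uphill moves with Boltzmann probability exp(-Δf/t) and lowers the temperature t over time. A minimal sketch is given below; the on-the-fly temperature update shown here is a generic acceptance-rate heuristic used only to illustrate the idea of adapting the schedule during the run, not the rule proposed in the paper:

```python
import numpy as np

def simulated_annealing(f, x0, n_iter=5000, t0=1.0, target_accept=0.4, seed=0):
    """Minimal SA sketch with an adaptive temperature (illustrative only).

    A Gaussian move is accepted with the Boltzmann probability
    exp(-(f(y) - f(x)) / t). Every 100 iterations the temperature is
    nudged down when recent acceptance exceeds `target_accept` and up
    otherwise, instead of following a fixed a-priori cooling schedule.
    """
    rng = np.random.default_rng(seed)
    x, fx, t = x0, f(x0), t0
    best_x, best_f = x, fx
    accepted = 0
    for k in range(1, n_iter + 1):
        y = x + t * rng.standard_normal()   # proposal scale tied to temperature
        fy = f(y)
        if fy <= fx or rng.random() < np.exp(-(fy - fx) / t):
            x, fx = y, fy
            accepted += 1
            if fx < best_f:
                best_x, best_f = x, fx
        if k % 100 == 0:                    # adapt the schedule on the fly
            rate = accepted / 100
            t = max(t * (0.9 if rate > target_accept else 1.05), 1e-6)
            accepted = 0
    return best_x, best_f

# non-convex toy objective with global minimum 0 at x = 1
f = lambda x: (x - 1.0) ** 2 + 0.5 * (1.0 - np.cos(5.0 * (x - 1.0)))
x_best, f_best = simulated_annealing(f, x0=4.0)
```

The motivation for adapting t is visible even in this toy: a schedule fixed a priori must be conservative (slow decay) to guarantee convergence, whereas feedback from the sampler itself can cool faster where the landscape allows it.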

The smoothing task lies at the core of many signal processing applications. It deals with the recovery of a sequence of hidden state variables from a sequence of noisy observations in a one-shot manner. In this work we propose RTSNet, a highly efficient model-based and data-driven smoothing algorithm. RTSNet integrates dedicated trainable models into the flow of the classical Rauch-Tung-Striebel (RTS) smoother, and is able to outperform it under model mismatch and non-linearities while retaining its efficiency and interpretability.
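For reference, the classical RTS smoother that serves as the model-based skeleton runs a Kalman filter forward in time and then a backward correction pass. The sketch below implements that classical baseline for a linear Gaussian state-space model; the trainable components that RTSNet inserts into this flow are not shown:

```python
import numpy as np

def rts_smoother(y, F, H, Q, R, m0, P0):
    """Classical Rauch-Tung-Striebel smoother (no learned components).

    y: (T, dy) observations; F, H: state transition / observation matrices;
    Q, R: process / observation noise covariances; m0, P0: Gaussian prior.
    Returns smoothed means (T, dx) and covariances (T, dx, dx).
    """
    T, dx = len(y), F.shape[0]
    m_f = np.zeros((T, dx)); P_f = np.zeros((T, dx, dx))   # filtered
    m_p = np.zeros((T, dx)); P_p = np.zeros((T, dx, dx))   # predicted
    m, P = m0, P0
    for t in range(T):
        # predict (the prediction at t = 0 is the prior itself)
        if t > 0:
            m = F @ m
            P = F @ P @ F.T + Q
        m_p[t], P_p[t] = m, P
        # update with the Kalman gain
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        m = m + K @ (y[t] - H @ m)
        P = P - K @ H @ P
        m_f[t], P_f[t] = m, P
    # backward (smoothing) pass
    m_s = m_f.copy(); P_s = P_f.copy()
    for t in range(T - 2, -1, -1):
        G = P_f[t] @ F.T @ np.linalg.inv(P_p[t + 1])       # smoother gain
        m_s[t] = m_f[t] + G @ (m_s[t + 1] - m_p[t + 1])
        P_s[t] = P_f[t] + G @ (P_s[t + 1] - P_p[t + 1]) @ G.T
    return m_s, P_s

# toy example: nearly-constant scalar state observed in heavy noise
rng = np.random.default_rng(1)
y = 1.0 + 0.5 * rng.standard_normal((50, 1))
m_s, P_s = rts_smoother(y, F=np.array([[1.0]]), H=np.array([[1.0]]),
                        Q=np.array([[1e-4]]), R=np.array([[0.25]]),
                        m0=np.zeros(1), P0=np.eye(1))
```

Because the backward pass uses future as well as past observations, the smoothed covariances never exceed the filtered ones; RTSNet keeps this two-pass structure but learns the gains, which is what lets it cope with model mismatch.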
