We introduce the Generative Barankin Bound (GBB), a learned Barankin Bound, for evaluating the achievable performance in estimating the direction of arrival (DOA) of a source in non-asymptotic conditions, when the statistics of the measurement are unknown. We first learn the measurement distribution using a conditional normalizing flow (CNF) and then use it to derive the GBB.
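
As a concrete illustration of the second stage, the sketch below estimates a Barankin-type bound by Monte Carlo from a conditional log-density. A toy Gaussian model stands in for the learned CNF, and the `log_prob` interface, test points, and sample sizes are illustrative assumptions rather than the paper's implementation.

```python
# Minimal sketch of the Barankin-bound evaluation stage, assuming a learned
# conditional density model exposing log_prob(x, theta). Here a toy Gaussian
# stands in for the conditional normalizing flow; the GBB idea is to plug the
# CNF's log-density into the same Monte-Carlo estimator.
import numpy as np

rng = np.random.default_rng(0)

def log_prob(x, theta, sigma=1.0):
    """Stand-in conditional log-density log q(x | theta): here N(theta, sigma^2)."""
    return -0.5 * ((x - theta) / sigma) ** 2 - 0.5 * np.log(2 * np.pi * sigma**2)

def barankin_bound(theta0, test_points, n_samples=100_000, sigma=1.0):
    """Monte-Carlo Barankin bound at theta0 for a fixed set of test points."""
    x = theta0 + sigma * rng.standard_normal(n_samples)        # x ~ q(. | theta0)
    logp0 = log_prob(x, theta0, sigma)
    # Likelihood ratios L_i(x) = q(x | theta_i) / q(x | theta0)
    L = np.exp(np.stack([log_prob(x, t, sigma) for t in test_points]) - logp0)
    B = L @ L.T / n_samples - 1.0                               # B_ij = E[L_i L_j] - 1
    h = np.asarray(test_points) - theta0
    return h @ np.linalg.solve(B, h)                            # h^T B^{-1} h

theta0 = 0.3
print(barankin_bound(theta0, test_points=[theta0 + 0.5]))       # K=1: Chapman-Robbins
```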

Multiscale convolutional neural networks (CNNs) have demonstrated remarkable capabilities in solving various vision problems. However, fusing features of different scales always results in large model sizes, impeding the application of multiscale CNNs in RGB-D saliency detection. In this paper, we propose a customized feature fusion module, called Saliency Enhanced Feature Fusion (SEFF), for RGB-D saliency detection. SEFF utilizes saliency maps of the neighboring scales
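
Since the excerpt above cuts off before the details of SEFF, the following is only a hypothetical sketch of a saliency-guided fusion block in PyTorch: a neighboring scale's saliency map gates that scale's features before a lightweight 1x1-convolution fusion. All module and argument names are illustrative, not the paper's design.

```python
# Hypothetical saliency-guided fusion block (not the actual SEFF architecture).
import torch
import torch.nn as nn
import torch.nn.functional as F

class SaliencyGuidedFusion(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.fuse = nn.Conv2d(2 * channels, channels, kernel_size=1)

    def forward(self, feat, neighbor_feat, neighbor_saliency):
        # Bring the neighboring-scale features and saliency map to the current resolution.
        neighbor_feat = F.interpolate(neighbor_feat, size=feat.shape[-2:],
                                      mode="bilinear", align_corners=False)
        sal = F.interpolate(neighbor_saliency, size=feat.shape[-2:],
                            mode="bilinear", align_corners=False)
        # Gate the neighboring features by their predicted saliency, then fuse.
        gated = neighbor_feat * torch.sigmoid(sal)
        return self.fuse(torch.cat([feat, gated], dim=1))

block = SaliencyGuidedFusion(channels=64)
out = block(torch.randn(1, 64, 32, 32),          # current-scale features
            torch.randn(1, 64, 16, 16),          # neighboring-scale features
            torch.randn(1, 1, 16, 16))           # neighboring-scale saliency map
print(out.shape)                                 # torch.Size([1, 64, 32, 32])
```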

We consider the problem of learning smooth multivariate probability density functions. We invoke the canonical decomposition of multivariate functions and we show that if a joint probability density function admits a truncated Fourier series representation, then the classical univariate Fejér-Riesz Representation Theorem can be used for learning bona fide joint probability density functions. We propose a scalable, flexible, and direct framework for learning smooth multivariate probability density functions even from potentially incomplete datasets.
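
To make the Fejér-Riesz construction concrete, the one-dimensional sketch below parametrizes a density as the squared magnitude of a truncated Fourier series, which is nonnegative by construction and normalizable in closed form; the coefficients are arbitrary here, whereas the framework above learns them (in the multivariate setting) from data.

```python
# 1-D illustration of the Fejer-Riesz idea: a nonnegative trigonometric
# polynomial is the squared magnitude of a truncated Fourier series, so this
# parametrization yields a bona fide density on [0, 2*pi) by construction.
import numpy as np

rng = np.random.default_rng(0)
K = 5
a = rng.standard_normal(K + 1) + 1j * rng.standard_normal(K + 1)   # spectral factor

def density(x, a):
    """p(x) = |sum_k a_k e^{ikx}|^2 / (2*pi*sum_k |a_k|^2)."""
    phases = np.exp(1j * np.outer(np.arange(len(a)), x))            # (K+1, len(x))
    poly = a @ phases                                               # truncated series
    return np.abs(poly) ** 2 / (2 * np.pi * np.sum(np.abs(a) ** 2))

x = np.linspace(0, 2 * np.pi, 10_000, endpoint=False)
p = density(x, a)
print(p.min() >= 0, p.mean() * 2 * np.pi)    # nonnegative, integrates to ~1
```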

In this paper, we propose a restoration method for time-varying graph signals, i.e., signals on a graph whose values change over time, using deep algorithm unrolling. Deep algorithm unrolling learns the parameters of an iterative optimization algorithm with deep learning techniques; it is expected to improve convergence speed and accuracy while keeping the iterative steps interpretable. In the proposed method, the minimization problem is formulated so that the time-varying graph signal is smooth in both the temporal and spatial domains.
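
The sketch below illustrates the general unrolling idea under an assumed formulation: each layer performs one gradient step on a least-squares objective with graph-Laplacian (spatial) and finite-difference (temporal) smoothness penalties, and the per-layer step sizes and penalty weights are the learnable parameters. The paper's exact objective and update rule may differ.

```python
# Minimal unrolled restoration network for time-varying graph signals,
# under an assumed smoothness-regularized least-squares objective.
import torch
import torch.nn as nn

class UnrolledGraphDenoiser(nn.Module):
    def __init__(self, laplacian, n_layers=10):
        super().__init__()
        self.L = laplacian                                       # (N, N) graph Laplacian
        self.step = nn.Parameter(torch.full((n_layers,), 0.1))   # per-layer step size
        self.alpha = nn.Parameter(torch.full((n_layers,), 0.5))  # spatial weight
        self.beta = nn.Parameter(torch.full((n_layers,), 0.5))   # temporal weight

    def forward(self, y, mask):
        """y, mask: (T, N) observations and 0/1 sampling pattern."""
        x = y.clone()
        for k in range(len(self.step)):
            data_grad = mask * (x - y)                  # gradient of 0.5*||mask*(x - y)||^2
            spatial_grad = x @ self.L.T                 # gradient of 0.5*sum_t x_t^T L x_t
            diff = x[1:] - x[:-1]                       # temporal first differences
            temporal_grad = torch.zeros_like(x)         # gradient of 0.5*sum_t ||x_{t+1}-x_t||^2
            temporal_grad[1:] += diff
            temporal_grad[:-1] -= diff
            x = x - self.step[k] * (data_grad
                                    + self.alpha[k] * spatial_grad
                                    + self.beta[k] * temporal_grad)
        return x

# Toy usage: a 4-node path graph observed over 20 time steps with missing samples.
A = torch.tensor([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], dtype=torch.float32)
L = torch.diag(A.sum(1)) - A
net = UnrolledGraphDenoiser(L)
y = torch.randn(20, 4)
mask = (torch.rand(20, 4) > 0.3).float()
print(net(y * mask, mask).shape)                        # torch.Size([20, 4])
```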

In the past two decades, convex optimization has gained increasing popularity in signal processing and communications, as many fundamental problems in these areas can be modelled, analyzed, and solved using convex optimization theory and algorithms. In emerging large-scale applications such as compressed sensing, massive MIMO, and machine learning, the underlying optimization problems often exhibit convexity; however, classic interior-point methods do not scale well with the problem dimensions.

The unlabeled sensing problem is to solve a noisy linear system of equations under an unknown permutation of the measurements. We study a particular case of the problem where the
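
For concreteness, the toy sketch below instantiates the general model y = P A x + noise and recovers the permutation and x jointly by brute-force enumeration, which is feasible only for very small systems; it illustrates the problem statement, not the particular case or method studied in the paper.

```python
# Toy unlabeled sensing instance: y = P A x + noise with P an unknown permutation.
# Brute force over all permutations, keeping the least-squares fit with the
# smallest residual (only illustrative; infeasible beyond very small m).
import itertools
import numpy as np

rng = np.random.default_rng(0)
m, n = 6, 2
A = rng.standard_normal((m, n))
x_true = rng.standard_normal(n)
perm = rng.permutation(m)                       # unknown shuffling of the measurements
y = (A @ x_true)[perm] + 0.01 * rng.standard_normal(m)

best = (np.inf, None)
for p in itertools.permutations(range(m)):      # try every reordering of y
    x_hat = np.linalg.lstsq(A, y[list(p)], rcond=None)[0]
    r = np.linalg.norm(A @ x_hat - y[list(p)])
    if r < best[0]:
        best = (r, x_hat)

print(x_true, best[1])                          # the two estimates should nearly match
```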

Simulated annealing (SA) is a widely used approach to solve global optimization problems in signal processing. The initial non-convex problem is recast as the exploration of a sequence of Boltzmann probability distributions, which are increasingly harder to sample from. They are parametrized by a temperature that is iteratively decreased, following the so-called cooling schedule. Convergence results of SA methods usually require the cooling schedule to be set a priori with slow decay. In this work, we introduce a new SA approach that selects the cooling schedule on the fly.
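
As background for the cooling-schedule discussion, the sketch below implements plain simulated annealing with a Boltzmann (Metropolis) acceptance rule and a fixed geometric cooling schedule; the decay rate and proposal step are arbitrary choices, and the work above replaces such an a-priori schedule with one selected on the fly.

```python
# Plain simulated annealing with a fixed geometric cooling schedule
# (the baseline setting; the adaptive schedule of the paper is not shown).
import math
import random

def simulated_annealing(f, x0, n_iter=5000, t0=1.0, decay=0.999, step=0.5):
    x, fx = x0, f(x0)
    best, fbest = x, fx
    temp = t0
    for _ in range(n_iter):
        cand = x + random.gauss(0.0, step)      # random-walk proposal
        fcand = f(cand)
        # Boltzmann/Metropolis acceptance at the current temperature.
        if fcand <= fx or random.random() < math.exp(-(fcand - fx) / temp):
            x, fx = cand, fcand
            if fx < fbest:
                best, fbest = x, fx
        temp *= decay                           # geometric cooling schedule
    return best, fbest

# Toy non-convex objective with many local minima.
f = lambda x: x**2 + 10 * math.sin(3 * x)
print(simulated_annealing(f, x0=4.0))
```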
