
We investigate the use of the entropy-regularized optimal transport (EOT) cost in developing generative models that learn implicit distributions. Two generative models are proposed: one uses the EOT cost directly in a one-shot optimization problem, and the other uses the EOT cost iteratively in an adversarial game. The proposed generative models show improved performance over contemporary models on sample-based test scores.
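The entropy-regularized OT cost between discrete distributions can be computed with Sinkhorn iterations. The following is a minimal numpy sketch of that computation for two toy histograms; it illustrates only the regularized cost itself, not the proposed generative models, and the histograms, cost matrix, and regularization strength `eps` are illustrative assumptions.

```python
import numpy as np

def sinkhorn_cost(a, b, C, eps=0.1, n_iters=200):
    """Entropy-regularized OT cost between histograms a and b
    with ground-cost matrix C, via Sinkhorn iterations."""
    K = np.exp(-C / eps)               # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)              # alternate scaling to match marginal b
        u = a / (K @ v)                # ... and marginal a
    P = u[:, None] * K * v[None, :]    # approximate transport plan
    return np.sum(P * C)               # transport cost under the plan

# toy example: two 3-bin histograms on a line, squared-distance cost
a = np.array([0.5, 0.3, 0.2])
b = np.array([0.2, 0.3, 0.5])
x = np.arange(3.0)
C = (x[:, None] - x[None, :]) ** 2
print(sinkhorn_cost(a, b, C))
```

For small `eps` the value approaches the unregularized OT cost; larger `eps` gives a smoother, cheaper-to-compute surrogate, which is what makes EOT attractive inside a training loop.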


We reveal an interesting link between tensors and multivariate statistics. The rank of a multivariate probability tensor can be interpreted as a nonlinear measure of statistical dependence of the associated random variables. Rank equals one when the random variables are independent, and complete statistical dependence corresponds to full rank; but we show that rank as low as two can already model strong statistical dependence.
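The independence-versus-dependence endpoints of this rank interpretation can be checked directly on a small bivariate example. The sketch below is an illustrative construction, not drawn from the paper: a rank-1 joint PMF built as an outer product of marginals (independence), versus a permutation-like joint PMF (complete dependence) that has full rank.

```python
import numpy as np

# Marginal PMFs of two discrete random variables X and Y.
px = np.array([0.2, 0.8])
py = np.array([0.5, 0.3, 0.2])

# Independence: the joint PMF factors into the outer product of the
# marginals, i.e. a rank-1 probability tensor (a matrix, bivariate case).
P_indep = np.outer(px, py)
print(np.linalg.matrix_rank(P_indep))   # 1

# Complete dependence (Y determined by X): mass only on the diagonal,
# so the joint PMF has full rank.
P_dep = np.array([[0.2, 0.0],
                  [0.0, 0.8]])
print(np.linalg.matrix_rank(P_dep))     # 2
```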


Deep neural networks (DNNs) have found applications in diverse signal processing (SP) problems. Most efforts either adopt the DNN directly as a black box to perform certain SP tasks without taking into account any known properties of the signal models, or insert a pre-defined SP operator into a DNN as an add-on data processing stage. This paper presents a novel hybrid-NN framework in which one or more SP layers are inserted into the DNN architecture in a coherent manner to enhance the network's capability and efficiency in feature extraction.
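The idea of interleaving a fixed SP operator with learned layers can be sketched in a few lines. The forward pass below is a hypothetical illustration, not the paper's architecture: a magnitude-FFT layer (a non-learned SP operation) sits between two learned dense layers, exposing frequency-domain features to the layers that follow; all shapes and weights are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sp_layer(x):
    """A fixed, non-learned SP layer: magnitude spectrum via the real FFT.
    It has no trainable parameters but encodes known signal structure."""
    return np.abs(np.fft.rfft(x, axis=-1))

def dense_relu(x, W, b):
    """A standard learned layer: affine map followed by ReLU."""
    return np.maximum(x @ W + b, 0.0)

# Forward pass of a tiny hybrid network: dense -> SP layer -> dense.
x = rng.standard_normal((4, 32))                    # batch of 4 signals, length 32
W1, b1 = 0.1 * rng.standard_normal((32, 32)), np.zeros(32)
W2, b2 = 0.1 * rng.standard_normal((17, 8)), np.zeros(8)   # rfft of 32 -> 17 bins
h = dense_relu(x, W1, b1)
out = dense_relu(sp_layer(h), W2, b2)
print(out.shape)   # (4, 8)
```

Because the SP layer is differentiable almost everywhere, gradients can still flow through it during training, which is what allows such operators to be inserted coherently rather than as a detached pre/post-processing stage.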


Many state-of-the-art machine learning models, such as deep neural networks, have recently been shown to be vulnerable to adversarial perturbations, especially in classification tasks. Motivated by adversarial machine learning, in this paper we investigate the robustness of sparse regression models with strongly correlated covariates to adversarially designed measurement noise. Specifically, we consider the family of ordered weighted L1 (OWL) regularized regression methods and study the case of OSCAR (octagonal shrinkage and clustering algorithm for regression) in the adversarial setting.
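For readers unfamiliar with the OWL family, the regularizer itself is simple to state: the entries of |x| are sorted in decreasing order and paired with a nonincreasing weight vector, and OSCAR is the special case with linearly decaying weights. The numpy sketch below shows only this norm computation, with illustrative values; it is not the paper's adversarial analysis.

```python
import numpy as np

def owl_norm(x, w):
    """Ordered weighted L1 norm: inner product of the decreasing
    sort of |x| with a nonincreasing, nonnegative weight vector w."""
    return np.sort(np.abs(x))[::-1] @ w

def oscar_weights(p, lam1, lam2):
    """OSCAR is the OWL special case with linearly decaying weights:
    w_i = lam1 + lam2 * (p - i), for i = 1, ..., p."""
    return lam1 + lam2 * np.arange(p - 1, -1, -1)

x = np.array([3.0, -1.0, 2.0])
w = oscar_weights(3, lam1=1.0, lam2=0.5)   # weights [2.0, 1.5, 1.0]
print(owl_norm(x, w))                      # 3*2 + 2*1.5 + 1*1 = 10.0
```

The `lam2` term is what ties large coefficients together and induces the clustering of strongly correlated covariates that the abstract refers to; with `lam2 = 0` the norm reduces to the ordinary L1 penalty.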


This paper presents a simple yet effective method to improve the visual quality of images generated by Generative Adversarial Networks (GANs). In typical GAN architectures, the discriminator block is designed mainly to capture class-specific content from images, without explicitly imposing constraints on the visual quality of the generated images. A key insight from the image quality assessment literature is that natural scenes possess a unique local structural, and hence statistical, signature, and that distortions affect this signature.
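One widely used local signature of this kind is the mean-subtracted contrast-normalized (MSCN) coefficients from the natural scene statistics literature, whose histogram is close to a unit Gaussian for pristine images and is perturbed by distortions. The sketch below computes a simplified MSCN map (a uniform window standing in for the Gaussian window of the NSS literature); it is background for the abstract's insight, not necessarily the paper's exact constraint.

```python
import numpy as np

def local_stats(img, k=7):
    """Local mean and std over a uniform k x k window (a simplified
    stand-in for the Gaussian window used in the NSS literature)."""
    pad = k // 2
    padded = np.pad(img, pad, mode='reflect')
    # stack all k*k shifted views and reduce across the window
    windows = np.stack([padded[i:i + img.shape[0], j:j + img.shape[1]]
                        for i in range(k) for j in range(k)])
    return windows.mean(axis=0), windows.std(axis=0)

def mscn(img, C=1e-3):
    """Mean-subtracted contrast-normalized coefficients: for natural
    images these are approximately unit-Gaussian; distortions change
    that statistical signature."""
    mu, sigma = local_stats(img)
    return (img - mu) / (sigma + C)

img = np.random.default_rng(0).standard_normal((32, 32))
coeffs = mscn(img)
print(coeffs.shape)
```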


Graph convolutional networks adapt the architecture of convolutional neural networks to learn rich representations of data supported on arbitrary graphs by replacing the convolution operations of convolutional neural networks with graph-dependent linear operations. However, these graph-dependent linear operations are developed for scalar functions supported on undirected graphs. We propose both a generalization of the underlying graph and a class of linear operations for stochastic (time-varying) processes on directed (or undirected) graphs to be used in graph convolutional networks.
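A standard example of such a graph-dependent linear operation is a polynomial graph filter, H(S)x = Σ_k h_k S^k x, where S is a graph shift operator (e.g. the adjacency matrix); on a directed graph S is simply nonsymmetric. The numpy sketch below applies such a filter on a small directed cycle, as generic background for the abstract, not the paper's proposed operator class.

```python
import numpy as np

def graph_filter(S, h, X):
    """Apply the polynomial graph filter sum_k h[k] * S^k to the
    columns of X, where S is the graph shift (adjacency) matrix."""
    Y = np.zeros_like(X)
    Sk = np.eye(S.shape[0])      # S^0
    for hk in h:
        Y += hk * (Sk @ X)
        Sk = S @ Sk              # advance to the next power of S
    return Y

# 3-node directed cycle: node 0 <- 1 <- 2 <- 0
S = np.array([[0., 1., 0.],
              [0., 0., 1.],
              [1., 0., 0.]])
x = np.array([[1.], [0.], [0.]])        # impulse at node 0
y = graph_filter(S, [0.5, 0.5], x)
print(y.ravel())                        # [0.5 0.  0.5]
```

On an undirected graph the same code applies with a symmetric S; replacing the ordinary convolutions of a CNN with filters of this form is exactly the substitution the first sentence describes.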


Network science provides valuable insights across numerous disciplines, including sociology, biology, neuroscience, and engineering. A task of major practical importance in these application domains is inferring the network structure from noisy observations at a subset of nodes. Available methods for topology inference typically assume that the process over the network is observed at all nodes. However, application-specific constraints may prevent acquiring network-wide observations.


False discovery rate (FDR) control is highly desirable in several high-dimensional estimation problems. In such problems, traditional approaches such as the Lasso are observed to select a high number of false positives, and this number increases with the noise and correlation levels in the dataset. Stability selection is a procedure that combines randomization with the Lasso to reduce the number of false positives.
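The core of stability selection is to refit the Lasso on many random subsamples of the data and keep only variables that are selected in a large fraction of the fits. The sketch below is a minimal, self-contained illustration of that loop (with a basic ISTA Lasso solver in place of a library implementation); the data, regularization level, and subsampling fraction are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def lasso_ista(X, y, lam, n_iters=500):
    """Minimal Lasso solver via ISTA (proximal gradient descent)."""
    n, p = X.shape
    L = np.linalg.norm(X, 2) ** 2 / n          # Lipschitz constant of the gradient
    beta = np.zeros(p)
    for _ in range(n_iters):
        g = X.T @ (X @ beta - y) / n           # gradient of the least-squares term
        z = beta - g / L
        beta = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return beta

def stability_selection(X, y, lam, n_subsamples=50, frac=0.5, seed=0):
    """Fraction of random subsamples on which each variable is selected."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    freq = np.zeros(p)
    for _ in range(n_subsamples):
        idx = rng.choice(n, size=int(frac * n), replace=False)
        beta = lasso_ista(X[idx], y[idx], lam)
        freq += (np.abs(beta) > 1e-8)
    return freq / n_subsamples

# toy data: only the first of 10 features carries signal
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 10))
y = 3.0 * X[:, 0] + 0.1 * rng.standard_normal(100)
freq = stability_selection(X, y, lam=0.1)
print(freq)
```

Thresholding the selection frequencies (e.g. keeping variables with `freq > 0.6`) is what yields the FDR-style guarantees that motivate the procedure.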

