Network data can be conveniently modeled as a graph signal, where data values are assigned to the nodes of a graph describing the underlying network topology. Successful learning from network data requires methods that effectively exploit this graph structure. Graph neural networks (GNNs) provide one such method and have exhibited promising performance on a wide range of problems. Understanding why GNNs work is of paramount importance, particularly in applications involving physical networks.
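The notion of a graph signal and its processing by a GNN layer can be made concrete with a small sketch. This is a minimal illustration, not the paper's method: the adjacency matrix, signal values, and scalar weights below are all hypothetical, and a real GNN would learn weight matrices rather than the two scalars used here.

```python
import numpy as np

# Hypothetical 4-node network: the adjacency matrix A encodes the topology.
A = np.array([
    [0, 1, 0, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [1, 0, 1, 0],
], dtype=float)

# A graph signal assigns one data value to each node of the network.
x = np.array([1.0, -2.0, 0.5, 3.0])

# One graph-shift (diffusion) step: each node aggregates its neighbors' values.
shifted = A @ x

# Minimal graph "perceptron": sigma(w0 * x + w1 * A x) with hypothetical
# scalar weights w0, w1; GNN layers compose such shifts with a nonlinearity.
w0, w1 = 0.5, 0.3
layer_out = np.tanh(w0 * x + w1 * shifted)
print(shifted)    # neighbor aggregate per node
print(layer_out)  # one GNN-style layer output
```

The key point the sketch exposes is that the only structural operation is multiplication by the graph matrix, which is exactly how a GNN exploits the underlying topology.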

An increasing number of distributed machine learning applications require efficient communication of neural network parameterizations. DeepCABAC, an algorithm in the current working draft of the emerging MPEG-7 part 17 standard for compression of neural networks for multimedia content description and analysis, has demonstrated high compression gains for a variety of neural network models. In this paper we propose a method for employing DeepCABAC in a Federated Learning scenario for the exchange of intermediate differential parameterizations.
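The exchange of intermediate differential parameterizations can be sketched as follows. This is an illustrative outline only, with hypothetical weight vectors; the actual DeepCABAC quantization and entropy-coding step is omitted, since the point here is just that clients transmit weight *differences* rather than full models.

```python
import numpy as np

# Hypothetical global model and one client's locally updated model,
# represented as flat weight vectors for simplicity.
global_weights = np.array([0.10, -0.50, 0.30, 0.00])
client_weights = np.array([0.12, -0.55, 0.30, 0.02])

# In a federated round the client transmits the difference (delta) to the
# server; these sparse, small-magnitude deltas are what a codec such as
# DeepCABAC would then compress (compression step omitted in this sketch).
delta = client_weights - global_weights

# The server applies received deltas (here, a single client's) to update
# the global model; with one client this recovers the client's weights.
global_weights = global_weights + delta
print(delta)
print(global_weights)
```

Because successive updates are small and often sparse, the deltas compress far better than the full parameterization, which is the setting the paper targets.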

Phoneme boundary detection is an essential first step for a variety of speech processing applications such as speaker diarization, speech science, and keyword spotting. In this work, we propose a neural architecture coupled with a parameterized structured loss function to learn segmental representations for the task of phoneme boundary detection. First, we evaluated our model when the spoken phonemes were not given as input.
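A toy sketch of the underlying idea, under stated assumptions: the frame embeddings below are fixed, hand-picked vectors standing in for the learned segmental representations the paper proposes, and the simple distance-peak rule is an illustration rather than the paper's structured-loss decoder. It only shows why boundaries appear where consecutive representations change sharply.

```python
import numpy as np

# Hypothetical frame-level embeddings for a short utterance (5 frames, 3 dims).
# In the paper these would be learned segmental representations.
frames = np.array([
    [1.0, 0.0, 0.0],
    [0.9, 0.1, 0.0],
    [0.0, 1.0, 0.0],   # representation changes sharply -> likely boundary
    [0.1, 0.9, 0.0],
    [0.0, 0.0, 1.0],   # another sharp change -> likely boundary
])

# Score each frame transition by the Euclidean distance between consecutive
# embeddings; transitions above a threshold are predicted phoneme boundaries.
dists = np.linalg.norm(np.diff(frames, axis=0), axis=1)
boundaries = np.where(dists > 0.5)[0] + 1  # +1: boundary sits before this frame
print(boundaries)  # frame indices where a new phoneme is predicted to start
```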

Building upon advances in optimal transport and anomaly detection, we propose a generalization of an unsupervised and automatic method for detecting significant deviation from reference signals. Unlike most existing approaches to anomaly detection, our method is built on a non-parametric framework that exploits optimal transport to estimate deviation from an observed distribution.
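The non-parametric deviation idea can be sketched with the one-dimensional Wasserstein-1 distance, which for two equal-size empirical samples reduces to the mean absolute difference of their sorted values. This is a minimal sketch, not the paper's estimator: the Gaussian data, window sizes, and threshold below are all hypothetical.

```python
import numpy as np

def w1(a, b):
    # 1-D Wasserstein-1 distance between equal-size empirical samples:
    # the optimal transport plan matches order statistics, so the distance
    # is the mean absolute difference of the sorted values.
    return float(np.mean(np.abs(np.sort(a) - np.sort(b))))

rng = np.random.default_rng(0)

# Hypothetical reference signal and two observed windows.
reference = rng.normal(0.0, 1.0, size=500)
nominal = rng.normal(0.0, 1.0, size=500)   # drawn from the same distribution
deviant = rng.normal(3.0, 1.0, size=500)   # clear deviation from reference

# Non-parametric deviation score: flag windows whose transport cost from the
# reference exceeds a (hypothetical) threshold.
threshold = 0.5
for name, window in [("nominal", nominal), ("deviant", deviant)]:
    score = w1(reference, window)
    print(name, "anomaly" if score > threshold else "ok")
```

No distributional form is assumed anywhere, which is the sense in which such a detector is non-parametric.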
