Learning Flexible Representations of Stochastic Processes on Graphs

Citation Author(s):
Brian M. Sadler, Radu V. Balan
Submitted by:
Addison Bohannon
Last updated:
30 May 2018 - 1:35pm
Document Type:
Poster
Document Year:
2018
Event:
Presenter's Name:
Addison Bohannon
Paper Code:
DSW18001

Abstract:

Graph convolutional networks adapt the architecture of convolutional neural networks to learn rich representations of data supported on arbitrary graphs, replacing the convolution operations with graph-dependent linear operations. However, these graph-dependent linear operations are designed for scalar functions supported on undirected graphs. We propose both a generalization of the underlying graph and a class of linear operations for stochastic (time-varying) processes on directed (or undirected) graphs, to be used in graph convolutional networks. By parameterizing the proposed linear operations using functional calculus, we can achieve arbitrarily low learning complexity. The combined graphical model and filtering approach is shown to model richer behaviors and to offer greater flexibility in learning representations than product graph methods.
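To make the functional-calculus idea concrete, here is a minimal sketch (not the authors' implementation) of a graph filter h(S) = Σ_k c_k S^k built from a shift operator S, here the adjacency matrix of a possibly directed graph, applied to a time-varying graph signal. The function name, the toy graph, and the coefficient values are illustrative assumptions; the point is that the learnable parameters are the K+1 coefficients c_k, independent of the graph size.

```python
import numpy as np

def graph_filter(S, coeffs, X):
    """Apply h(S) = sum_k coeffs[k] * S^k to a graph signal X.

    S : (n, n) shift operator (e.g., adjacency matrix, possibly directed).
    X : (n, T) signal: one value per node per time step.
    """
    Y = np.zeros_like(X, dtype=float)
    Sk_X = X.astype(float)      # starts at S^0 @ X
    for c in coeffs:
        Y += c * Sk_X
        Sk_X = S @ Sk_X         # advance to the next power of S
    return Y

# Toy example: a directed 3-cycle, 4 time samples, an order-2 filter.
S = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]], dtype=float)
X = np.arange(12, dtype=float).reshape(3, 4)
Y = graph_filter(S, coeffs=[1.0, 0.5, 0.25], X=X)
print(Y.shape)  # (3, 4)
```

Because each term only multiplies by S once more, the filter costs K sparse matrix-vector products per time slice, and directedness of the graph poses no difficulty since no eigendecomposition of S is required.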

dsw.pdf
