Improving Dynamic Graph Convolutional Network with Fine-grained Attention Mechanism
- Submitted by:
- Bo Wu
- Last updated:
- 4 May 2022 - 11:43pm
- Document Type:
- Poster
- Document Year:
- 2022
- Presenters:
- Bo Wu
- Paper Code:
- MLSP-29.5
Graph convolutional network (GCN) is a framework that utilizes a pre-defined Laplacian matrix to learn from graph data effectively. With its powerful nonlinear fitting ability, GCN can produce high-quality node embeddings. However, standard GCN can only handle static graphs, whereas a large number of real-world graphs are dynamic and evolve over time, which limits the range of GCN applications. To address this challenge, GCN is naturally combined with recurrent neural networks (e.g., RNN) and jointly trained to capture dynamic graph changes. However, these methods must use node information over the entire timeline, and they ignore two subtle factors: the influence of a node changes over time and is related to the frequency of events. We therefore propose FADGC, a stable and scalable dynamic GCN method with a fine-grained attention mechanism. We use GCN to obtain static node vectors at each timestep and integrate node influence factors with multi-head attention for graph time-series learning. Experiments on multiple datasets show that our approach better captures the inherent characteristics of different dynamic graphs and achieves higher performance than related approaches.
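The pipeline the abstract describes — a GCN producing node embeddings at each timestep, followed by multi-head attention over the resulting time series — can be sketched with the two standard building blocks below. This is a minimal NumPy illustration, not the FADGC implementation: the poster's fine-grained attention and node influence factors are not specified here, so all dimensions, the random snapshot graphs, and the plain scaled dot-product attention are assumptions for demonstration.

```python
import numpy as np

def gcn_layer(A, X, W):
    # One GCN propagation step: ReLU(D^{-1/2} (A + I) D^{-1/2} X W),
    # i.e. the standard symmetrically normalized graph convolution.
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return np.maximum(d_inv_sqrt @ A_hat @ d_inv_sqrt @ X @ W, 0.0)

def multi_head_attention(H, Wq, Wk, Wv, num_heads):
    # H: (T, d) sequence of per-timestep embeddings for a single node.
    # Plain scaled dot-product attention, applied head by head over time.
    T, d = H.shape
    dh = d // num_heads
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    out = np.zeros_like(H)
    for h in range(num_heads):
        s = slice(h * dh, (h + 1) * dh)
        scores = Q[:, s] @ K[:, s].T / np.sqrt(dh)
        scores -= scores.max(axis=1, keepdims=True)   # numerical stability
        attn = np.exp(scores)
        attn /= attn.sum(axis=1, keepdims=True)       # softmax over timesteps
        out[:, s] = attn @ V[:, s]
    return out

rng = np.random.default_rng(0)
N, F, d, T, heads = 5, 8, 4, 3, 2   # nodes, features, embed dim, timesteps, heads
W = rng.normal(size=(F, d))
X = rng.normal(size=(N, F))

# One random symmetric 0/1 snapshot graph per timestep (assumed for the demo).
embeds = []
for _ in range(T):
    A = np.triu((rng.random((N, N)) < 0.4).astype(float), 1)
    embeds.append(gcn_layer(A + A.T, X, W))
H_seq = np.stack(embeds)            # (T, N, d): embeddings across time

# Attend over the time series of embeddings for node 0.
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
node0 = multi_head_attention(H_seq[:, 0, :], Wq, Wk, Wv, heads)
```

Here the attention weights determine how much each timestep contributes to a node's final representation; FADGC's contribution is to make those weights sensitive to time-varying node influence and event frequency, which this generic sketch does not model.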