
Personalized PageRank Graph Attention Networks

Citation Author(s): Julie Choi
Submitted by: Julie Choi
Last updated: 5 May 2022 - 3:09am
Document Type: Poster
Document Year: 2022
Presenters: Julie Choi
Paper Code: 9085

Interest in graph neural networks (GNNs) for representation learning has grown rapidly over the past few years. GNNs provide a general and efficient framework for learning from graph-structured data. However, to avoid over-smoothing, GNNs typically aggregate information from only a very limited neighborhood of each node, whereas a larger neighborhood would provide the model with more information. In this work, we incorporate the limit distribution of Personalized PageRank (PPR) into graph attention networks (GATs) to capture information from a larger neighborhood without introducing over-smoothing. Intuitively, message aggregation based on Personalized PageRank corresponds to infinitely many neighborhood aggregation layers. We show that our models outperform a variety of baseline models on four widely used benchmark datasets. Our implementation is publicly available online at https://github.com/juliechoi12/pprgat.
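The central quantity behind this idea is the limit (fully propagated) distribution of Personalized PageRank. The sketch below is a minimal NumPy illustration of its standard closed form; the function name, the symmetric normalization, and the teleport value are illustrative assumptions and not necessarily the implementation used in the linked repository.

```python
import numpy as np

def ppr_limit_distribution(adj, alpha=0.15):
    """Illustrative (hypothetical) helper: dense PPR limit distribution.

    adj:   (n, n) adjacency matrix of an undirected graph.
    alpha: teleport (restart) probability of the random walk.
    Returns Pi, where row i is the PPR distribution personalized to node i.
    """
    n = adj.shape[0]
    a_tilde = adj + np.eye(n)                       # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(a_tilde.sum(axis=1))
    a_hat = a_tilde * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]  # D^{-1/2} A D^{-1/2}
    # Fixed point of pi = (1 - alpha) * A_hat @ pi + alpha * e_i, i.e. the limit
    # of infinitely many propagation steps:
    #   Pi = alpha * (I - (1 - alpha) * A_hat)^{-1}
    return alpha * np.linalg.inv(np.eye(n) - (1 - alpha) * a_hat)
```

Because each row of this matrix assigns nonzero weight to far-away nodes, aggregating messages with PPR weights (possibly combined with learned attention scores, as in the GAT setting described above) lets a node see a large neighborhood in a single step, which is what "infinitely many aggregation layers" refers to; the exact way the PPR scores enter the attention mechanism is specified in the paper and repository, not in this sketch.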
