FastGAT: Simple and Efficient Graph Attention Neural Network with Global-aware Adaptive Computational Node Attention

Citation Author(s):
Li Zhang, Xiaofang Zhang
Submitted by:
Shenzhi Yang
Last updated:
12 April 2024 - 5:49am
Document Type:
Poster
Document Year:
2024
Presenters:
Shenzhi Yang
The graph attention network (GAT) is a fundamental model within graph neural networks and is widely used across applications. It assigns different weights to neighboring nodes during feature aggregation by comparing the feature similarity between nodes. However, as the amount and density of graph data increase, GAT's computational demands rise steeply. In response, we present FastGAT, a simpler and more efficient graph attention neural network with global-aware adaptive computational node attention. FastGAT assigns a trainable attention weight to each node and updates it adaptively. Experiments on eight public datasets show that FastGAT reduces training time by 6.22% to 19.50% while maintaining the same performance. The code is available at https://github.com/szYang2000/FastGAT.
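To make the per-node attention idea concrete, the following is a minimal sketch (not the authors' implementation) of a layer that carries one trainable scalar attention score per node and normalizes it over each neighborhood, in contrast to GAT's pairwise attention coefficients. The class name `PerNodeAttentionLayer`, the dense-adjacency input, and the softmax normalization are all assumptions made for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class PerNodeAttentionLayer(nn.Module):
    """Hypothetical sketch: each node carries a single trainable attention
    score (updated by gradient descent), instead of computing pairwise
    attention coefficients for every edge as in GAT."""

    def __init__(self, num_nodes: int, in_dim: int, out_dim: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim, bias=False)
        # One trainable score per node; an assumption about how the paper's
        # "adaptive computational node attention" might be parameterized.
        self.node_score = nn.Parameter(torch.zeros(num_nodes))

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x:   [N, in_dim]  node features
        # adj: [N, N]       dense adjacency with self-loops (nonzero = edge)
        h = self.linear(x)
        # Broadcast each source node's score along its outgoing edges ...
        scores = self.node_score.unsqueeze(0).expand_as(adj)   # [N, N]
        # ... mask non-edges and normalize over each node's neighborhood.
        scores = scores.masked_fill(adj == 0, float("-inf"))
        alpha = F.softmax(scores, dim=1)
        return alpha @ h                                        # weighted aggregation


if __name__ == "__main__":
    N, D = 5, 8
    x = torch.randn(N, D)
    adj = (torch.rand(N, N) > 0.5).float()
    adj.fill_diagonal_(1.0)  # self-loops so every row has at least one neighbor
    layer = PerNodeAttentionLayer(num_nodes=N, in_dim=D, out_dim=4)
    print(layer(x, adj).shape)  # torch.Size([5, 4])
```

Because the attention score depends only on the source node rather than on each (source, target) pair, the per-edge similarity computation of GAT is avoided, which is the kind of saving the abstract's reported training-time reduction refers to.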
