Prompting Label Efficiency in Federated Graph Learning via Personalized Semi-supervision
- Submitted by:
- Qinghua Mao
- Last updated:
- 1 April 2024 - 10:19pm
- Document Type:
- Poster
Federated graph learning (FGL) enables the collaborative training of graph neural networks (GNNs) in a distributed manner. A critical challenge in FGL is label deficiency, which becomes more intricate with non-IID decentralized data. Existing methods focus on extracting knowledge from abundant unlabeled data, leaving few-shot labeled data unexplored. To this end, we propose ConFGL, a novel FGL framework that enhances label efficiency in federated learning with non-IID subgraphs. We formulate a semi-supervised objective that harnesses both unlabeled and labeled data, where self-supervised learning is achieved via a graph contrastive module. Additionally, a personalized federated learning (FL) strategy concurrently trains a global model and an individual model, which helps alleviate the representation disparities encoded by local models. Extensive experiments on four node-level datasets under non-IID settings show that ConFGL consistently provides an average 4.10% accuracy gain over personalized FL methods while maintaining higher GPU throughput.
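The poster does not include implementation details, but the semi-supervised objective it describes (a supervised loss on few-shot labels combined with a self-supervised graph contrastive term on all nodes) commonly takes an InfoNCE-style form. The sketch below is an illustrative assumption, not ConFGL's actual code: `contrastive_loss`, the temperature `tau`, and the weighting factor `lam` are hypothetical names, and the contrastive term treats each node's two augmented-view embeddings as a positive pair with the rest of the batch as negatives.

```python
import math

def cosine(u, v):
    # Cosine similarity between two embedding vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def contrastive_loss(view1, view2, tau=0.5):
    # InfoNCE-style loss: for node i, (view1[i], view2[i]) is the
    # positive pair; all other nodes in view2 act as negatives.
    n = len(view1)
    total = 0.0
    for i in range(n):
        pos = math.exp(cosine(view1[i], view2[i]) / tau)
        denom = sum(math.exp(cosine(view1[i], view2[j]) / tau)
                    for j in range(n))
        total += -math.log(pos / denom)
    return total / n

def semi_supervised_objective(sup_loss, view1, view2, lam=1.0):
    # Combined objective: supervised loss on the few-shot labeled
    # nodes plus a contrastive term over all (unlabeled) nodes.
    return sup_loss + lam * contrastive_loss(view1, view2)
```

In a personalized FL setup, each client would minimize such an objective locally, with only the global model's parameters aggregated across clients while the individual model stays local.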