Ada-GNN: Adapting to Local Patterns for Improving Graph Neural Networks


Graph Neural Networks (GNNs) have demonstrated strong power in mining various kinds of graph-structured data. Since real-world graphs are usually large-scale, training scalable GNNs has become one of the research trends in recent years. Existing methods produce only a single model to serve all nodes. However, different nodes may exhibit different properties and thus require different models, especially when the graph is large. Forcing all nodes to share a unified model decreases the model's expressiveness. Worse, the patterns of small groups are prone to being ignored by the model because of their minority, making these nodes hard to predict and even raising potential unfairness problems.
In this paper, we propose a model-agnostic framework, Ada-GNN, that provides personalized GNN models for specific sets of nodes. Intuitively, it would be desirable for every node to have its own model, but considering the efficiency and scalability of the framework, we generate specific GNN models at the subgroup level rather than the individual node level. To be specific, Ada-GNN first splits the original graph into several non-overlapping subgroups and tags each node with its subgroup label. After that, a meta adapter is proposed to rapidly adapt a base GNN model to each subgroup. To better facilitate the global-to-local knowledge adaptation, we design a feature enhancement module that captures the distinctions among different subgroups to improve Ada-GNN's performance. Ada-GNN is model-agnostic and can be combined with almost all existing scalable GNN methods such as GraphSAGE, ClusterGCN, SIGN, and SAGN. We conduct extensive experiments with six popular scalable GNNs as base methods on two large-scale datasets, and the results consistently demonstrate the generality and superiority of Ada-GNN. A sketch of the subgroup-level adaptation idea is given below.
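The following is a minimal sketch of the global-to-local, subgroup-level adaptation workflow described above, not the authors' implementation. It assumes a plain MLP stands in for any scalable base GNN whose node representations are precomputed (as in SIGN/SAGN-style methods), uses k-means on node features as a placeholder for a graph partitioner, and approximates the meta adapter with a few plain gradient steps per subgroup.

```python
# Hypothetical sketch: one shared base model, then one adapted model per subgroup.
import copy
import torch
import torch.nn.functional as F
from sklearn.cluster import KMeans

num_nodes, feat_dim, num_classes, num_subgroups = 1000, 64, 10, 8
x = torch.randn(num_nodes, feat_dim)            # stand-in for precomputed node representations
y = torch.randint(0, num_classes, (num_nodes,)) # stand-in for node labels

# 1) Split nodes into non-overlapping subgroups and tag each node with its subgroup label.
#    (Placeholder partition; the paper's splitting strategy may differ.)
subgroup_of = torch.as_tensor(
    KMeans(n_clusters=num_subgroups, n_init=10).fit_predict(x.numpy())
)

# 2) Train one shared base model on all nodes.
base = torch.nn.Sequential(torch.nn.Linear(feat_dim, 128), torch.nn.ReLU(),
                           torch.nn.Linear(128, num_classes))
opt = torch.optim.Adam(base.parameters(), lr=1e-3)
for _ in range(100):
    opt.zero_grad()
    F.cross_entropy(base(x), y).backward()
    opt.step()

# 3) Adapt a copy of the base model to each subgroup with a few gradient steps,
#    yielding one personalized model per subgroup.
adapted = {}
for g in range(num_subgroups):
    idx = (subgroup_of == g).nonzero(as_tuple=True)[0]
    model_g = copy.deepcopy(base)
    opt_g = torch.optim.SGD(model_g.parameters(), lr=1e-2)
    for _ in range(5):                          # inner-loop adaptation steps
        opt_g.zero_grad()
        F.cross_entropy(model_g(x[idx]), y[idx]).backward()
        opt_g.step()
    adapted[g] = model_g

# Inference: route each node to the adapted model of its subgroup.
with torch.no_grad():
    logits = torch.stack([adapted[int(subgroup_of[i])](x[i]) for i in range(num_nodes)])
```

The design choice illustrated here is the trade-off stated in the abstract: adapting at the subgroup level keeps the number of personalized models small (one per subgroup rather than one per node) while still letting each group's local patterns shape its own parameters.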