No Change, No Gain: Empowering Graph Neural Networks with Expected Model Change Maximization for Active Learning

Zixing Song, Yifei Zhang, Irwin King

Research output: Chapter in Book/Report/Conference proceeding › Conference Contribution (Conference Proceeding)

Abstract

Graph Neural Networks (GNNs) are crucial for machine learning applications with graph-structured data, but their success depends on sufficient labeled data. We present a novel active learning (AL) method for GNNs, extending the Expected Model Change Maximization (EMCM) principle to improve prediction performance on unlabeled data. By presenting a Bayesian interpretation of the node embeddings generated by GNNs under the semi-supervised setting, we efficiently compute the closed-form EMCM acquisition function as the selection criterion for AL without re-training. Our method establishes a direct connection with expected prediction error minimization, offering theoretical guarantees for AL performance. Experiments demonstrate our method's effectiveness compared to existing approaches in terms of both accuracy and efficiency.
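The EMCM-style criterion sketched in the abstract (score each unlabeled node by the expected change it would induce in the model if labeled, without re-training) can be illustrated roughly as an expected-gradient-norm score. The function name, the NumPy formulation, and the use of a cross-entropy last-layer gradient are illustrative assumptions for this sketch, not the paper's exact closed-form derivation:

```python
import numpy as np

def emcm_scores(probs, embeddings):
    """Illustrative EMCM-style acquisition score.

    For each unlabeled node, compute the expected norm of the gradient of a
    cross-entropy loss w.r.t. a linear last layer, taking the expectation
    over pseudo-labels weighted by the model's predicted probabilities.

    probs:      (n, c) predicted class probabilities for n unlabeled nodes
    embeddings: (n, d) node embeddings from the GNN
    returns:    (n,) acquisition scores; query the argmax
    """
    n, c = probs.shape
    emb_norms = np.linalg.norm(embeddings, axis=1)  # ||x_i|| for each node
    scores = np.zeros(n)
    for k in range(c):
        onehot = np.zeros(c)
        onehot[k] = 1.0
        # For cross-entropy with a linear head, the last-layer gradient is
        # the outer product (p - y) x^T, whose Frobenius norm factorizes
        # as ||p - y|| * ||x||.
        grad_norm = np.linalg.norm(probs - onehot, axis=1) * emb_norms
        scores += probs[:, k] * grad_norm  # expectation over pseudo-label k
    return scores
```

Under this sketch, an uncertain node with a large embedding norm receives a high score (labeling it would move the model most), while a confidently predicted node scores near zero, matching the "no change, no gain" intuition in the title.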
Original language: English
Title of host publication: The Thirty-seventh Annual Conference on Neural Information Processing Systems
Editors: A. Oh, T. Naumann, A. Globerson, K. Saenko, M. Hardt, S. Levine
Publisher: Curran Associates, Inc.
Pages: 47511-47526
ISBN (Electronic): 9781713899921
Publication status: Published - 10 Dec 2023
Event: 37th Conference on Neural Information Processing Systems, NeurIPS 2023 - New Orleans, United States
Duration: 10 Dec 2023 - 16 Dec 2023

Conference

Conference: 37th Conference on Neural Information Processing Systems, NeurIPS 2023
Country/Territory: United States
City: New Orleans
Period: 10/12/23 - 16/12/23
