It's All Graph To Me: Single-Model Graph Representation Learning on Multiple Domains

Research output: Chapter in Book/Report/Conference proceeding › Conference Contribution (Conference Proceeding)

Abstract

Graph neural networks (GNNs) have revolutionised the field of graph representation learning and play a critical role in graph-based research. Recent work explores applying GNNs to pre-training and fine-tuning, where a model is trained on a large dataset and its learnt representations are then transferred to a smaller dataset. However, current work only explores pre-training on a single domain; for example, a model pre-trained on molecular graphs is fine-tuned on other molecular graphs. This leads to poor generalisability of pre-trained models to novel domains and tasks.

In this work, we curate a multi-graph-domain dataset and apply state-of-the-art Graph Adversarial Contrastive Learning (GACL) methods. We present a pre-trained graph model that may be capable of acting as a foundational graph model. We will evaluate the efficacy of its learnt representations on various downstream tasks against baseline models pre-trained on single domains. In addition, we aim to compare our model to un-trained and non-transferred models, and to show that our foundational model can achieve performance equal to or better than task-specific methods.
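For readers unfamiliar with the pre-training setup the abstract describes, the sketch below illustrates the general idea of graph contrastive pre-training (in the simpler GraphCL style rather than the adversarial GACL variant the paper applies): two augmented views of each graph in a batch are encoded by a shared GNN and pulled together with a contrastive loss. This is not the authors' code; the encoder architecture, edge-dropping augmentation, hidden sizes, and loss form are illustrative assumptions, and it presumes PyTorch Geometric is available.

import torch
import torch.nn.functional as F
from torch_geometric.nn import GINConv, global_mean_pool
from torch_geometric.utils import dropout_edge

class GNNEncoder(torch.nn.Module):
    """Small GIN encoder producing one embedding per graph (illustrative)."""
    def __init__(self, in_dim, hid_dim=128, num_layers=3):
        super().__init__()
        self.convs = torch.nn.ModuleList()
        for i in range(num_layers):
            mlp = torch.nn.Sequential(
                torch.nn.Linear(in_dim if i == 0 else hid_dim, hid_dim),
                torch.nn.ReLU(),
                torch.nn.Linear(hid_dim, hid_dim),
            )
            self.convs.append(GINConv(mlp))

    def forward(self, x, edge_index, batch):
        for conv in self.convs:
            x = F.relu(conv(x, edge_index))
        # Mean-pool node embeddings into a graph-level embedding.
        return global_mean_pool(x, batch)

def nt_xent(z1, z2, tau=0.5):
    # Simplified NT-Xent loss: embeddings of the same graph under two
    # augmentations are positives; other graphs in the batch are negatives.
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / tau
    targets = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, targets)

def pretrain_step(model, optimiser, data):
    # Two stochastic edge-dropping views of the same batch of graphs
    # (a placeholder augmentation; GACL would learn adversarial views).
    ei1, _ = dropout_edge(data.edge_index, p=0.2)
    ei2, _ = dropout_edge(data.edge_index, p=0.2)
    z1 = model(data.x, ei1, data.batch)
    z2 = model(data.x, ei2, data.batch)
    loss = nt_xent(z1, z2)
    optimiser.zero_grad()
    loss.backward()
    optimiser.step()
    return loss.item()

In a multi-domain setting such as the one described above, batches drawn from graphs of several domains would be passed through the same shared encoder, and the resulting pre-trained weights would later be fine-tuned on each downstream task.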
Original language: English
Title of host publication: NeurIPS 2023 Workshop
Subtitle of host publication: New Frontiers in Graph Learning
Place of Publication: New Orleans
Publisher: OpenReview
Pages: 1-28
Number of pages: 28
Publication status: Published - 28 Oct 2023
Event: 37th Conference on Neural Information Processing Systems, NeurIPS 2023 - New Orleans, United States
Duration: 10 Dec 2023 - 16 Dec 2023

Conference

Conference: 37th Conference on Neural Information Processing Systems, NeurIPS 2023
Country/Territory: United States
City: New Orleans
Period: 10/12/23 - 16/12/23
