COSTA: Covariance-Preserving Feature Augmentation for Graph Contrastive Learning

Yifei Zhang, Hao Zhu, Zixing Song, Piotr Koniusz, Irwin King

Research output: Chapter in Book/Report/Conference proceeding › Conference Contribution (Conference Proceeding)

140 Citations (Scopus)

Abstract

Graph contrastive learning (GCL) improves graph representation learning, leading to state-of-the-art (SOTA) results on various downstream tasks. The graph augmentation step is a vital but scarcely studied component of GCL. In this paper, we show that the node embeddings obtained via graph augmentations are highly biased, somewhat limiting contrastive models from learning discriminative features for downstream tasks. Thus, instead of investigating graph augmentation in the input space, we alternatively propose to perform augmentations on the hidden features (feature augmentation). Inspired by so-called matrix sketching, we propose COSTA, a novel COvariance-preServing feaTure-space Augmentation framework for GCL, which generates augmented features by maintaining a "good sketch" of the original features. To highlight the superiority of feature augmentation with COSTA, we investigate a single-view setting (in addition to the multi-view one), which conserves memory and computation. We show that feature augmentation with COSTA achieves comparable or better results than graph-augmentation-based models.
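The abstract's core idea, generating an augmented view of hidden features via matrix sketching so that the feature covariance is (approximately) preserved, can be illustrated with a minimal NumPy sketch. This is an illustrative example only, assuming a plain Gaussian sketching matrix; the function name `costa_sketch` and the specific construction are this example's own, not necessarily the paper's exact algorithm.

```python
import numpy as np

def costa_sketch(H, k, seed=None):
    """Covariance-preserving feature augmentation via Gaussian sketching.

    Given hidden node features H (n x d), return H_aug = R @ H where R is a
    (k x n) Gaussian matrix with entries scaled by 1/sqrt(k). Because
    E[R.T @ R] = I_n, the augmented view satisfies
    E[H_aug.T @ H_aug] = H.T @ H, i.e. the feature covariance is preserved
    in expectation, which is the "good sketch" property described above.
    """
    rng = np.random.default_rng(seed)
    n, _ = H.shape
    R = rng.standard_normal((k, n)) / np.sqrt(k)  # sketching matrix
    return R @ H

# Usage: augment 1000 node embeddings down to a 256-row sketch and check
# that the d x d covariance is close to that of the original features.
rng = np.random.default_rng(0)
H = rng.standard_normal((1000, 16))        # hypothetical hidden features
H_aug = costa_sketch(H, k=256, seed=1)     # augmented (sketched) view

cov_orig = H.T @ H
cov_aug = H_aug.T @ H_aug
rel_err = np.linalg.norm(cov_aug - cov_orig) / np.linalg.norm(cov_orig)
```

The sketch dimension `k` trades compute and memory against fidelity: the covariance error shrinks roughly as the sketch grows, so a moderate `k` already yields a faithful yet randomized view suitable for contrastive training.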
Original language: English
Title of host publication: KDD '22: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining
Pages: 2524-2534
Number of pages: 11
ISBN (Electronic): 9781450393850
DOIs
Publication status: Published - 14 Aug 2022
Event: 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining - Washington, United States
Duration: 14 Aug 2022 - 18 Aug 2022

Conference

Conference: 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining
Abbreviated title: KDD 2022
Country/Territory: United States
City: Washington
Period: 14/08/22 - 18/08/22
