Abstract
Data augmentation is widely used in machine learning to improve model performance on natural language processing and computer vision tasks. However, little research has examined data augmentation for graph neural networks, particularly augmentation at both train and test time. Inspired by the success of augmentation in other domains, we design a method for social influence prediction using graph neural networks with train- and test-time augmentation, which effectively generates multiple augmented graphs for a social network by utilising a variational graph autoencoder in both settings. We evaluate our method on predicting user influence across multiple social network datasets. Our experimental results show that our end-to-end approach, which jointly trains a graph autoencoder and a social influence behaviour classification network, outperforms state-of-the-art approaches, demonstrating the effectiveness of train- and test-time augmentation for graph neural networks on social influence prediction. We observe that this is particularly effective on smaller graphs.
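The augmentation idea in the abstract — encode a graph with a variational graph autoencoder, then sample several reconstructed graphs as augmented views at both train and test time — can be sketched minimally as follows. This is an illustrative NumPy toy, not the authors' implementation: the one-layer GCN-style encoder, the inner-product decoder, the edge threshold, and all dimensions are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(adj, features, w_mu, w_logvar):
    """Simplified VGAE-style encoder: one GCN-like layer producing
    per-node latent means and log-variances (hypothetical weights)."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(deg, 1e-9)))
    a_norm = d_inv_sqrt @ adj @ d_inv_sqrt        # symmetric normalisation
    h = a_norm @ features
    return h @ w_mu, h @ w_logvar

def sample_augmented_graph(mu, logvar, threshold=0.5):
    """Reparameterise, decode with a sigmoid inner product, and threshold
    edge probabilities; repeated sampling yields multiple augmented views."""
    z = mu + np.exp(0.5 * logvar) * rng.standard_normal(mu.shape)
    probs = 1.0 / (1.0 + np.exp(-(z @ z.T)))      # inner-product decoder
    adj_aug = (probs > threshold).astype(float)
    np.fill_diagonal(adj_aug, 0.0)                # no self-loops
    return adj_aug

# Toy 4-node graph with 2-dimensional features and latents (assumed sizes).
adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 0],
                [1, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
x = rng.standard_normal((4, 2))
w_mu, w_logvar = rng.standard_normal((2, 2)), rng.standard_normal((2, 2))

mu, logvar = encode(adj, x, w_mu, w_logvar)
# Three augmented views, usable during either training or inference.
views = [sample_augmented_graph(mu, logvar) for _ in range(3)]
```

In the paper's end-to-end setting the autoencoder is trained jointly with the influence classifier; here the weights are random purely to show the data flow from one input graph to multiple sampled graphs.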
Original language | English
---|---
Title of host publication | International Joint Conference on Neural Networks 2021 (IJCNN 2021)
Publisher | Institute of Electrical and Electronics Engineers (IEEE)
Number of pages | 8
ISBN (Electronic) | 978-1-6654-3900-8
ISBN (Print) | 978-1-6654-4597-9
DOIs |
Publication status | Published - 20 Sept 2021
Event | The International Joint Conference on Neural Networks 2021 (IJCNN 2021), 18 Jun 2021 → 22 Jun 2021, https://www.ijcnn.org/
Publication series

Name |
---|---
Publisher | IEEE
ISSN (Print) | 2161-4393
ISSN (Electronic) | 2161-4407
Conference

Conference | The International Joint Conference on Neural Networks 2021 (IJCNN 2021)
---|---
Period | 18/06/21 → 22/06/21
Internet address |
Keywords
- training
- computer vision
- social networking (online)
- computational modeling
- prediction methods
- graph neural networks
- natural language processing