Graph Contrastive Multi-view Learning: A Pre-training Framework for Graph Classification

Adjeisah, M., Zhu, X., Xu, H. and Ayall, T. A., 2024. Graph Contrastive Multi-view Learning: A Pre-training Framework for Graph Classification. Knowledge-Based Systems, 299, 112112.

Full text available as:

PDF (OPEN ACCESS ARTICLE)
1-s2.0-S0950705124007469-main (1).pdf - Published Version
Available under License Creative Commons Attribution.

1MB

DOI: 10.1016/j.knosys.2024.112112

Abstract

Recent advancements in node and graph classification tasks can be attributed to the adoption of contrastive learning and similarity search. Despite considerable progress, these approaches present challenges: integrating similarity search adds complexity to the model, while applying contrastive learning to non-transferable domains or out-of-domain datasets yields less competitive results. In this work, we propose maintaining domain specificity for these tasks, which has demonstrated the potential to improve performance by eliminating the need for additional similarity searches. We adopt a fraction of domain-specific datasets for pre-training, generating augmented pairs that retain structural similarity to the original graph and thereby increasing the number of views. This strategy involves a comprehensive exploration of optimal augmentations to devise multi-view embeddings. An evaluation protocol focused on error minimization, accuracy enhancement, and overfitting prevention guides this process, so that the model learns inherent, transferable structural representations that span diverse datasets. We combine the pre-trained embeddings and the source graph into a single input, leveraging local and global graph information to enrich downstream tasks. Furthermore, to maximize the utility of negative samples in contrastive learning, we extend the training mechanism during the pre-training stage. Our method consistently outperforms comparative baselines in comprehensive experiments on benchmark graph datasets of varying sizes and characteristics, establishing new state-of-the-art results.
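The abstract does not spell out the paper's exact contrastive objective, but pre-training over augmented graph views is conventionally scored with an NT-Xent (InfoNCE-style) loss, where embeddings of two augmentations of the same graph form the positive pair and all other graphs in the batch serve as negatives. A minimal NumPy sketch of that standard objective (the function name and shapes are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.5):
    """Standard NT-Xent contrastive loss over two augmented views.

    z1, z2: (N, d) graph-level embeddings of the two views.
    Positive pairs are (z1[i], z2[i]); every other pair in the
    concatenated batch acts as a negative sample.
    """
    z = np.concatenate([z1, z2], axis=0)              # (2N, d)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)  # L2-normalise rows
    sim = z @ z.T / temperature                       # cosine similarities
    np.fill_diagonal(sim, -np.inf)                    # exclude self-pairs
    n = z1.shape[0]
    # index of each row's positive partner in the concatenated batch
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), pos].mean()
```

Embeddings of well-aligned views should incur a much lower loss than embeddings paired at random, which is what drives the representations toward augmentation-invariant structure.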

Item Type: Article
ISSN: 0950-7051
Uncontrolled Keywords: Contrastive learning; Graph classification; Graph neural network; Multi-view representation learning; Pre-trained embeddings
Group: Faculty of Media & Communication
ID Code: 40526
Deposited By: Symplectic RT2
Deposited On: 21 Nov 2024 10:43
Last Modified: 21 Nov 2024 10:43
