To address this issue, we propose Probability Graph Complementation Contrastive Learning (PGCCL), which adaptively constructs the complementation graph. We employ a Beta Mixture Model (BMM) to distinguish intra-class similarity from inter-class similarity. Based on the posterior probability, we construct ...
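As a rough sketch of the BMM step, the code below fits a two-component Beta mixture to pairwise similarity scores via EM with a moment-matching M-step, then returns the posterior probability that a similarity belongs to the intra-class (high-similarity) component. The function names, the median-split initialisation, and all hyperparameters are our own illustrative choices, not the paper's implementation.

```python
import numpy as np
from scipy.stats import beta

def fit_beta_mixture(x, n_iter=50):
    """EM for a 2-component Beta mixture with a moment-matching M-step.
    Component 0 models low (inter-class) similarities, 1 models high
    (intra-class) similarities, thanks to the median-split initialisation."""
    x = np.clip(x, 1e-4, 1 - 1e-4)              # keep the Beta pdf finite at 0/1
    labels = (x > np.median(x)).astype(float)    # illustrative initialisation
    r = np.stack([1 - labels, labels], axis=1)   # initial responsibilities
    a, b, pi = np.ones(2), np.ones(2), np.array([0.5, 0.5])
    for _ in range(n_iter):
        # M-step: weighted method-of-moments estimates of (a_k, b_k)
        for k in range(2):
            w = r[:, k] / r[:, k].sum()
            m = np.sum(w * x)                    # weighted mean
            v = np.sum(w * (x - m) ** 2) + 1e-8  # weighted variance
            common = m * (1 - m) / v - 1
            a[k] = max(m * common, 1e-2)
            b[k] = max((1 - m) * common, 1e-2)
        pi = r.mean(axis=0)
        # E-step: posterior responsibilities under the current parameters
        dens = np.stack([pi[k] * beta.pdf(x, a[k], b[k]) for k in range(2)],
                        axis=1)
        r = dens / dens.sum(axis=1, keepdims=True)
    return a, b, pi

def posterior_intra(x, a, b, pi):
    """Posterior probability that a similarity is intra-class (component 1)."""
    x = np.clip(x, 1e-4, 1 - 1e-4)
    d0 = pi[0] * beta.pdf(x, a[0], b[0])
    d1 = pi[1] * beta.pdf(x, a[1], b[1])
    return d1 / (d0 + d1)
```

On synthetic similarities drawn from two well-separated Beta distributions, the posterior cleanly assigns high similarities to the intra-class component.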
The core idea of contrastive learning is to learn data representations from the similarities and differences between samples. Traditional supervised learning relies on labeled data, whereas contrastive ...
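This similarity-versus-difference idea is commonly operationalised with an InfoNCE-style loss, where each sample's two views form a positive pair and all other samples act as negatives. The NumPy sketch below is our own minimal illustration, not any specific paper's code.

```python
import numpy as np

def info_nce_loss(z1, z2, tau=0.5):
    """Minimal InfoNCE loss: z1[i] and z2[i] are embeddings of two views of
    the same sample (positive pair); every other row is a negative."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / tau                   # temperature-scaled cosine similarities
    # row-wise log-softmax; the diagonal holds each positive pair
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))
```

Identical views yield a low loss, while unrelated views yield a loss near log(batch size), which is the contrast the objective exploits.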
Feng S, Jing B, Zhu Y, Tong H (2022) Adversarial graph contrastive learning with information regularization. In: Proceedings of the ACM Web Conference 2022, pp. 1362–1371
Zügner D, Akbarnejad A, Günnemann S (2018) Adversarial attacks on neural networks for graph data. In: Proceedings ...
However, previous methods offer no remedy for such differences, which can alter the properties of the original graph; they simply assume that the graphs before and after data augmentation remain similar. Consequently, this assumption in existing GCL methods may lead to the collapse of ...
1. Definition of modality interaction (Interaction): the mutual exchange and sharing of information between modalities, exploring the semantic relations among them. The focus is on how to ...
Specifically, we construct an augmentation graph by calculating the feature similarity of nodes to capture latent structural information. For the original graph and the augmentation graph, we employ a shared Graph Neural Network (GNN) encoder to extract the semantic features of nodes with different ...
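A minimal sketch of these two steps, assuming cosine similarity with a top-k rule for building the augmentation graph and a single GCN-style layer as the shared encoder; the function names, the top-k rule, and all sizes here are illustrative, not the paper's exact design.

```python
import numpy as np

def augmentation_graph(X, k=5):
    """Build an augmentation graph by linking each node to its k most
    feature-similar neighbours (cosine similarity), then symmetrising."""
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    S = Xn @ Xn.T
    np.fill_diagonal(S, -np.inf)            # exclude self-similarity
    A = np.zeros_like(S)
    idx = np.argsort(-S, axis=1)[:, :k]     # top-k neighbours per node
    rows = np.repeat(np.arange(len(X)), k)
    A[rows, idx.ravel()] = 1.0
    return np.maximum(A, A.T)               # symmetrise

def gcn_layer(A, X, W):
    """One GCN-style propagation step with symmetric normalisation + ReLU."""
    A_hat = A + np.eye(len(A))              # add self-loops
    D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ X @ W, 0.0)

# the SAME weight matrix W encodes both graphs, i.e. a shared encoder
rng = np.random.default_rng(0)
X = rng.normal(size=(10, 8))                # toy node features
A_orig = (rng.random((10, 10)) < 0.2).astype(float)
A_orig = np.maximum(A_orig, A_orig.T)       # toy original graph
W = rng.normal(size=(8, 4))
Z_orig = gcn_layer(A_orig, X, W)            # embeddings from the original graph
Z_aug = gcn_layer(augmentation_graph(X), X, W)  # embeddings from the augmentation graph
```

Passing the same W to both calls is what makes the encoder "shared": the two views differ only in graph structure, not in parameters.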
B. MolCLR: molecular contrastive learning of representations via graph neural networks. CodeOcean https://doi.org/10.24433/CO.8582800.v1 (2021).
Chen, T., Kornblith, S., Swersky, K., Norouzi, M. & Hinton, G. Big self-supervised models are strong semi-supervised learners. Preprint at ...
Self-supervised graph representation learning has recently shown considerable promise in a range of fields, including bioinformatics and social networks. Numerous graph contrastive learning approaches have achieved strong results for representation learning on graphs, training models by maximi...
As $\|z_i - z_j\|_2^2 = \|z_i\|_2^2 + \|z_j\|_2^2 - 2\, z_i \cdot z_j = 2 - 2\, z_i \cdot z_j$ for unit-norm embeddings, the following similarity function can be used as an efficient substitute:

$$\mathrm{distance}(z_i, z_j) = e^{z_i \cdot z_j / \tau}. \tag{7}$$

3.2. Self-supervised clustering by confidence boosting

Graph clustering is essentially unsupervised. To this end, we...
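The identity underlying this substitution, and the similarity in Eq. (7), can be checked numerically for unit-norm embeddings (the embeddings and temperature below are arbitrary toy values):

```python
import numpy as np

rng = np.random.default_rng(0)
z = rng.normal(size=(5, 16))
z /= np.linalg.norm(z, axis=1, keepdims=True)    # unit-norm embeddings

zi, zj = z[0], z[1]
lhs = np.sum((zi - zj) ** 2)                     # ||z_i - z_j||_2^2
rhs = 2 - 2 * np.dot(zi, zj)                     # expansion used in the text
assert np.isclose(lhs, rhs)                      # squared distance is a
                                                 # monotone function of z_i . z_j
tau = 0.5
similarity = np.exp(np.dot(zi, zj) / tau)        # Eq. (7)
```

Because the squared Euclidean distance reduces to a function of the dot product alone, exponentiating the (temperature-scaled) dot product preserves the same ordering of pairs at lower cost.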
Graph neural networks (GNNs) possess the advantage of leveraging both node attributes and graph topology, rendering them potent in modeling graph-structured data [14]. LPIs can be naturally modeled as a graph, with lncRNAs and proteins serving as nodes, and known interactions forming edges. Theref...
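For illustration, such an LPI graph can be assembled as a bipartite adjacency matrix over the combined lncRNA/protein node set; the identifiers and interactions below are hypothetical toy data, not from any real dataset.

```python
import numpy as np

# hypothetical toy data: lncRNA and protein node lists plus known interactions
lncrnas = ["lnc1", "lnc2", "lnc3"]
proteins = ["p1", "p2"]
interactions = [("lnc1", "p1"), ("lnc2", "p2"), ("lnc3", "p1")]

# bipartite interaction matrix B (lncRNA rows x protein columns)
B = np.zeros((len(lncrnas), len(proteins)))
for l, p in interactions:
    B[lncrnas.index(l), proteins.index(p)] = 1.0

# full symmetric adjacency over the combined node set [lncRNAs | proteins];
# the zero diagonal blocks reflect that edges only join lncRNAs to proteins
n, m = B.shape
A = np.zeros((n + m, n + m))
A[:n, n:] = B
A[n:, :n] = B.T
```

A GNN can then propagate messages over A so that each lncRNA embedding aggregates information from its interacting proteins, and vice versa.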