Paper: Self-Supervised Graph Neural Networks for Improved Electroencephalographic Seizure Analysis. Code: github.com/tsy935/eeg-g One-sentence summary: the authors propose two ways of representing EEG signals as graphs and adopt the DCGRU model from an ICLR 2018 paper to model spatio-temporal correlations in the resulting graph-structured data. They further use self-supervised pretraining (predicting the next EEG window from the current one) to further...
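To make that pretext task concrete, here is a minimal sketch of "predict the next EEG window from the current one", with a plain GRU standing in for the paper's DCGRU; the class name `NextWindowPredictor`, the tensor shapes, and the MSE objective are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

# Minimal sketch of the self-supervised pretext task: given the current EEG
# window, predict the next one. A plain GRU replaces the paper's DCGRU; all
# names and shapes here are illustrative only.
class NextWindowPredictor(nn.Module):
    def __init__(self, num_channels=19, hidden_dim=64):
        super().__init__()
        self.encoder = nn.GRU(num_channels, hidden_dim, batch_first=True)
        self.decoder = nn.Linear(hidden_dim, num_channels)

    def forward(self, current_window):
        # current_window: (batch, time_steps, num_channels)
        _, h = self.encoder(current_window)            # h: (1, batch, hidden_dim)
        steps = current_window.size(1)
        # Predict every time step of the next window from the summary state.
        h = h.squeeze(0).unsqueeze(1).expand(-1, steps, -1)
        return self.decoder(h)                          # (batch, time_steps, num_channels)

if __name__ == "__main__":
    model = NextWindowPredictor()
    current = torch.randn(8, 100, 19)       # 8 clips, 100 samples, 19 electrodes
    target_next = torch.randn(8, 100, 19)   # the following window (here random)
    loss = nn.functional.mse_loss(model(current), target_next)
    loss.backward()
    print(loss.item())
```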
Second, real-world data usually contains noise, especially in users' short-term behavior, where it may stem from transient intent or accidental clicks; such noise hurts the accuracy of both graph models and sequence models and further complicates modeling. To overcome these challenges, researchers from Peking University and the University of Hong Kong proposed a new framework for sequential recommendation named Self-Supervised Graph Neural Network (SelfGNN). Paper link: htt...
Self-supervised - SelfGNN: Self-supervised Graph Neural Networks without explicit negative sampling. Tags: self-supervision, graph neural networks, contrastive learning. Motivation: in the real world, most data is unlabeled, and labeling it is very costly; existing contrastive learning...
Paper title: Self-supervised Graph Neural Networks without explicit negative sampling. Authors: Zekarias T. Kefato, Sarunas Girdzijauskas. Venue: WWW 2021. Paper: download. Code: download. 1 Introduction. Core contributions of this paper: a siamese network is used to implement contrastive learning implicitly; ...
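The "siamese network, no explicit negatives" idea can be sketched as a BYOL-style objective: two augmented views of the graph, an online branch with a predictor, a stop-gradient target branch, and only an agreement term between the two. The function and shapes below are an illustrative sketch, not the paper's actual code.

```python
import torch
import torch.nn.functional as F

# Hedged sketch of contrastive learning without explicit negatives
# (BYOL-style siamese setup): agreement between two views is maximized,
# with a stop-gradient on the target branch instead of negative pairs.
def siamese_loss(online_pred, target_proj):
    """Negative cosine similarity between the online prediction of one view
    and the (stop-gradient) target projection of the other view."""
    online_pred = F.normalize(online_pred, dim=-1)
    target_proj = F.normalize(target_proj.detach(), dim=-1)  # no gradient to target
    return 2 - 2 * (online_pred * target_proj).sum(dim=-1).mean()

if __name__ == "__main__":
    # Embeddings of the same nodes under two augmentations (names illustrative).
    z1_pred, z2_tgt = torch.randn(32, 64, requires_grad=True), torch.randn(32, 64)
    z2_pred, z1_tgt = torch.randn(32, 64, requires_grad=True), torch.randn(32, 64)
    loss = 0.5 * (siamese_loss(z1_pred, z2_tgt) + siamese_loss(z2_pred, z1_tgt))
    loss.backward()
    print(loss.item())
```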
We present InfoMotif, a new semi-supervised, motif-regularized learning framework over graphs. We overcome two key limitations of message passing in popular graph neural networks (GNNs): localization (a k-layer GNN cannot utilize features outside the k-hop neighborhood of the labeled training ...
This paper, published at ICML 2021, proposes a novel whole-graph self-supervised representation learning framework (GraphLoG). Beyond the local similarity that traditional graph learning focuses on, it introduces hierarchical prototypes to capture global semantic clusters (a global-local perspective). The model is then learned with an EM algorithm and achieves good results on several benchmark datasets from chemistry, biology, and other domains. Paper link: https://proceedings.mlr....
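As a rough picture of the prototype/EM idea, one can alternate between assigning graph embeddings to their nearest prototype (E-step) and refitting each prototype as the mean of its cluster (M-step). The toy below is flat and k-means-like, not GraphLoG's hierarchical prototypes or its real objective; all names are illustrative.

```python
import torch

# Toy EM-style loop over prototypes: E-step assigns embeddings to the
# closest prototype, M-step recomputes prototypes as cluster means.
def em_prototypes(embeddings, num_prototypes=4, iters=10):
    protos = embeddings[torch.randperm(embeddings.size(0))[:num_prototypes]].clone()
    for _ in range(iters):
        # E-step: hard-assign each embedding to its nearest prototype.
        assign = torch.cdist(embeddings, protos).argmin(dim=1)
        # M-step: move each prototype to the mean of its assigned embeddings.
        for k in range(num_prototypes):
            members = embeddings[assign == k]
            if len(members) > 0:
                protos[k] = members.mean(dim=0)
    return protos, assign

if __name__ == "__main__":
    z = torch.randn(100, 16)              # embeddings of 100 graphs (random here)
    protos, assign = em_prototypes(z)
    print(protos.shape, assign.bincount())
```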
Self-Supervised Graph Neural Networks for Sequential Recommendation. Yuxi Liu, Lianghao Xia, Chao Huang*. SIGIR 2024. * denotes corresponding author. In this paper, we propose a novel framework called Self-Supervised Graph Neural Network (SelfGNN) for sequential recommendation. The SelfGNN framework encodes short...
Official repository for ICLR'23 paper: Multi-task Self-supervised Graph Neural Network Enable Stronger Task Generalization - jumxglhf/ParetoGNN
Moreover, current graph-based recommendation models do not fully take into account the direction of information transfer among shared items when calculating their similarities. This paper proposes TCAUIS, a topic classification augmented user-intent self-supervised hypergraph neural network for session-...
Most self-supervised graph neural networks use only the adjacency matrix as the input topology of the graph and cannot obtain very high-order information, since the number of layers in a graph neural network is fairly limited. If there are too many layers, the phenomenon of over-smoothing will appear.
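The receptive-field point can be illustrated numerically: a k-layer message-passing GNN only mixes information along walks of length at most k, i.e. the support of (A + I)^k, so exposing higher powers of the normalized adjacency matrix is one way to inject high-order structure without stacking layers. The helper below is a purely illustrative sketch.

```python
import numpy as np

# Illustrative sketch: with k message-passing layers, node features can only
# reach k hops, i.e. the nonzero pattern of (A + I)^k after normalization.
def normalized_adj_power(adj, k):
    a_hat = adj + np.eye(adj.shape[0])                 # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))
    a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt           # symmetric normalization
    return np.linalg.matrix_power(a_norm, k)

if __name__ == "__main__":
    # Path graph 0-1-2-3-4: node 0 only "sees" node 4 once k >= 4.
    adj = np.zeros((5, 5))
    for i in range(4):
        adj[i, i + 1] = adj[i + 1, i] = 1
    for k in (1, 2, 4):
        reach = normalized_adj_power(adj, k)[0]
        print(k, np.nonzero(reach > 1e-8)[0])          # nodes reachable from node 0
```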