To further improve graph pooling, SAGPool uses both the graph topology and the node features to learn hierarchical representations in a simpler way.

2. Model

The core of SAGPool is using a GNN to compute self-attention; the SAGPool layer is shown in the figure (an illustration of the SAGPool layer).

Self-attention mask: the self-attention scores are computed with a graph convolution, following Semi-Supervised Classification with Graph Convolutional ...
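As a sketch of that attention-mask computation, here is a minimal NumPy version of the GCN-style scoring Z = tanh(D^-1/2 (A+I) D^-1/2 X Θ_att). This is an illustrative sketch, not the authors' implementation; the function and parameter names are my own.

```python
import numpy as np

def sagpool_scores(A, X, theta_att, eps=1e-9):
    """GCN-style self-attention scores: Z = tanh(D^-1/2 (A+I) D^-1/2 X theta).

    A         : (N, N) adjacency matrix
    X         : (N, F) node feature matrix
    theta_att : (F, 1) attention weight vector (hypothetical name)
    Returns one score in (-1, 1) per node.
    """
    A_hat = A + np.eye(A.shape[0])               # add self-loops
    d = A_hat.sum(axis=1)                        # degrees of A_hat
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d + eps)) # symmetric normalization
    Z = np.tanh(D_inv_sqrt @ A_hat @ D_inv_sqrt @ X @ theta_att)
    return Z.ravel()
```

Because the score is produced by a graph convolution rather than a plain projection of X, it reflects both the node features and the topology.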
From this, the authors propose SAGPool, a Self-Attention Graph Pooling method that learns hierarchical structural information in an end-to-end fashion. The self-attention mechanism distinguishes which nodes should be dropped and which should be retained. Because the attention scores are computed with a graph convolution, both the node features and the graph topology are taken into account. In short, SAGPool...
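The keep/drop decision described above is a top-k selection on the attention scores, with the retained features gated by their scores. A minimal NumPy sketch under those assumptions (names are hypothetical, not the authors' code):

```python
import numpy as np

def sagpool_topk(A, X, Z, ratio=0.5):
    """Keep the top ceil(ratio * N) nodes ranked by attention score Z.

    A : (N, N) adjacency, X : (N, F) features, Z : (N,) attention scores.
    Returns the induced subgraph adjacency, gated features, and kept indices.
    """
    N = X.shape[0]
    k = max(1, int(np.ceil(ratio * N)))
    idx = np.argsort(-Z)[:k]            # indices of the retained nodes
    X_out = X[idx] * Z[idx, None]       # gate kept features by their scores
    A_out = A[np.ix_(idx, idx)]         # adjacency of the induced subgraph
    return A_out, X_out, idx
```

Gating the features by Z keeps the scoring parameters on the gradient path, which is what makes the node selection trainable end-to-end.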
Paper title: Self-Attention Graph Pooling
Paper authors: Junhyun Lee, Inyeop Lee, Jaewoo Kang
Venue: ICML 2019
Paper link: download
Paper code: download

1 Preamble

Apply downsampling (pooling) to graphs.

2 Introduction

Three types of graph pooling: topology-based pooling; ...
(1) [DIFFPOOL] Hierarchical Graph Representation Learning with Differentiable Pooling, NeurIPS 2018. DIFFPOOL is a differentiable graph pooling method that can learn the assignment matrix $S^{(l)} \in \mathbb{R}^{n_l \times n_{l+1}}$ in an end-to-end fashion.
(2) Graph U-Net, ICML 2019. gPool achieves performance comparable to DiffPool.
To further improve graph pooling, the paper proposes SAGPool, which can use...
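For reference, the DIFFPOOL coarsening step driven by the assignment matrix S can be sketched as follows. This is a simplified illustration of the update rules X' = S^T Z and A' = S^T A S from Ying et al. (2018); the function name and the assumption that S is already row-normalized are mine.

```python
import numpy as np

def diffpool_step(A, Z, S):
    """One DIFFPOOL coarsening step.

    A : (n_l, n_l)     adjacency at level l
    Z : (n_l, F)       node embeddings at level l
    S : (n_l, n_{l+1}) soft cluster-assignment matrix (rows sum to 1)
    Returns the coarsened adjacency and features at level l+1.
    """
    X_next = S.T @ Z        # aggregate node features into clusters
    A_next = S.T @ A @ S    # connectivity between clusters
    return A_next, X_next
```

Because every node is softly assigned to every cluster, the whole step is differentiable, which is what lets DIFFPOOL learn S end-to-end; SAGPool avoids this dense n_l × n_{l+1} matrix by scoring and selecting individual nodes instead.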
The method of Ji et al. [link] solves the complexity problem, but it does not take the graph topology into account.
Name: Self-Attention Graph Pooling
Links: https://arxiv.org/abs/1904.08082
Github: https://github.com/inyeoplee77/SAGPool
Conference: ICML 2019

0 Abstract

Transferring deep learning frameworks to structured data has recently become a hot topic. Much recent work ports convolution + pooling to non-grid-structured data, mimicking CNNs (so far no one has mimicked LSTM or GRU); porting convolution...