I have recently been reading papers related to GNTK. Today I want to start with the progenitor of the GNTK line: Graph Neural Tangent Kernel [2]. It was published at NeurIPS 2019; although it does not seem to be widely cited, the idea is genuinely good, since it closely followed the NTK paper [1] that caused a sensation at NeurIPS 2018…
We know that training a very wide network with gradient descent at a small learning rate means the network's dynamics are governed by the Neural Tangent Kernel (NTK). This conclusion was first observed for fully connected networks and later extended to convolutional networks. Recently, graph neural networks have become extremely popular, so a natural question arises: are infinitely wide graph neural networks (GNNs) also governed by an NTK? And if so, how do they perform in practice...
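To make the "governed by the NTK" statement concrete, here is a minimal sketch (not from the paper) that computes the empirical NTK of a one-hidden-layer ReLU network, f(x) = (1/√m) aᵀ relu(Wx): the kernel entry Θ(x₁, x₂) is simply the inner product of the parameter gradients at the two inputs. The function name `ntk_entry` and the specific parameterization are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def ntk_entry(x1, x2, W, a):
    """Empirical NTK Θ(x1, x2) = ⟨∇_θ f(x1), ∇_θ f(x2)⟩ for the
    one-hidden-layer network f(x) = (1/√m) aᵀ relu(W x)."""
    m = W.shape[0]

    def grads(x):
        pre = W @ x                       # pre-activations, shape (m,)
        act = np.maximum(pre, 0.0)        # ReLU
        g_a = act / np.sqrt(m)            # ∂f/∂a
        # ∂f/∂W: outer product of (a ⊙ relu'(pre))/√m with x
        g_W = ((a * (pre > 0)) / np.sqrt(m))[:, None] * x[None, :]
        return g_a, g_W

    ga1, gW1 = grads(x1)
    ga2, gW2 = grads(x2)
    return ga1 @ ga2 + np.sum(gW1 * gW2)

d, m = 5, 10_000                          # very wide hidden layer
W = rng.standard_normal((m, d))
a = rng.standard_normal(m)
x1, x2 = rng.standard_normal(d), rng.standard_normal(d)
print(ntk_entry(x1, x2, W, a))
```

As m grows, this random-initialization kernel concentrates around a deterministic limit, and under small-learning-rate gradient descent it stays essentially fixed during training — that is the sense in which the NTK determines the dynamics.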
Having recently studied the GNTK line of papers in depth, this post focuses on the progenitor of the series: "Graph Neural Tangent Kernel" [2]. The paper appeared at NeurIPS 2019; although its citation count may be modest, its innovative approach deserves credit: it closely followed the NTK paper [1] that caused a stir at NeurIPS 2018 and successfully applied NTK to graph classification. This post will walk you through the paper's theoretical derivations. First, we need...
While graph kernels (GKs) are easy to train and enjoy provable theoretical guarantees, their practical performance is limited by their expressive power, as the kernel function often depends on hand-crafted combinatorial features of graphs. Compared to graph kernels, graph neural networks (GNNs) ...
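The "hand-crafted features limit expressive power" point can be illustrated with a toy graph kernel (my own example, not one from the paper): comparing graphs via their degree histograms. Two non-isomorphic graphs with identical degree sequences become indistinguishable to such a kernel.

```python
import numpy as np

def degree_histogram(adj, max_deg=10):
    """Hand-crafted feature map: histogram of node degrees."""
    degs = adj.sum(axis=1).astype(int)
    hist = np.bincount(np.clip(degs, 0, max_deg), minlength=max_deg + 1)
    return hist.astype(float)

def graph_kernel(adj1, adj2):
    """Toy graph kernel: inner product of degree-histogram features."""
    return degree_histogram(adj1) @ degree_histogram(adj2)

# A 6-cycle and two disjoint triangles are non-isomorphic, yet every
# node in both graphs has degree 2, so the features coincide.
cycle6 = np.roll(np.eye(6), 1, axis=1)
cycle6 = cycle6 + cycle6.T
tri2 = np.kron(np.eye(2), np.ones((3, 3)) - np.eye(3))
print(graph_kernel(cycle6, cycle6) == graph_kernel(cycle6, tri2))  # True
```

A GNN, by contrast, learns its feature map from data, which is exactly the gap GNTK tries to bridge: kernel-method guarantees with GNN-style features.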
This repository implements Graph Neural Tangent Kernel (infinitely wide multi-layer GNNs trained by gradient descent), described in the following paper: Simon S. Du, Kangcheng Hou, Barnabás Póczos, Ruslan Salakhutdinov, Ruosong Wang, Keyulu Xu. Graph Neural Tangent Kernel: Fusing Graph Neural Net...
- [NeurIPS 2019] Graph Neural Tangent Kernel: Fusing Graph Neural Networks with Graph Kernels [paper]
- [KDD 2019] Stability and Generalization of Graph Convolutional Neural Networks [paper]
- [Neural Networks] The Vapnik–Chervonenkis dimension of graph and recursive neural networks [paper]

Other Related ...
- Graph Neural Tangent Kernel: Fusing Graph Neural Networks with Graph Kernels. NeurIPS 2019. paper. Simon Du, Kangcheng Hou, Russ Salakhutdinov, Barnabas Poczos, Ruosong Wang, Keyulu Xu.
- SNEQ: Semi-supervised Attributed Network Embedding with Attention-based Quantisation. AAAI 2020. paper. Tao He, ...
By establishing a novel connection between such kernels and the graph neural tangent kernel (GNTK), we introduce the first GNN confidence bound and use it to design a phased-elimination algorithm with sublinear regret. Our regret bound depends on the GNTK's maximum information gain, which we ...
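The regret bound above is stated in terms of the kernel's maximum information gain. As a rough illustration (my own sketch, not the paper's algorithm), the information gain of a Gaussian-process model with Gram matrix K under observation noise σ² is γ = ½ log det(I + σ⁻²K); the maximum information gain γ_T takes the worst case over T observations. The RBF kernel below is just a stand-in for a GNTK Gram matrix.

```python
import numpy as np

def information_gain(K, noise_var=1.0):
    """Information gain ½ log det(I + σ⁻² K) of a GP with Gram matrix K
    under Gaussian observation noise with variance σ² = noise_var."""
    n = K.shape[0]
    sign, logdet = np.linalg.slogdet(np.eye(n) + K / noise_var)
    return 0.5 * logdet

rng = np.random.default_rng(0)
X = rng.standard_normal((8, 3))
# RBF Gram matrix as a stand-in for a GNTK matrix over 8 graphs.
K = np.exp(-np.sum((X[:, None] - X[None]) ** 2, axis=-1))
print(information_gain(K))
```

Intuitively, a slowly growing γ_T means observations quickly stop being informative, which is what makes a sublinear regret bound possible.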
Graph Neural Tangent Kernel (GNTK): To analyze the trainability and expressive power of graph neural networks, deep learning theory researchers extended Neural Tangent Kernel (NTK) theory to graph data and proposed the graph neural tangent kernel model [1], which represents such over-parameterized graph neural networks as a linearized model that extracts nonlinear features. Formally, the GNTK is a graph kernel method, enjoying simple optimization, ...
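Because the GNTK is formally a graph kernel, once its Gram matrix over a set of graphs is computed, prediction reduces to a standard convex kernel-method solve. A minimal sketch using kernel ridge regression (the function name and the random PSD Gram matrix below are illustrative stand-ins, not the paper's pipeline):

```python
import numpy as np

def kernel_ridge_predict(K_train, y_train, K_test_train, reg=1e-3):
    """Kernel ridge regression on a precomputed (e.g. GNTK) Gram matrix:
    solve (K + λI) α = y, then predict ŷ = K_test,train · α."""
    n = K_train.shape[0]
    alpha = np.linalg.solve(K_train + reg * np.eye(n), y_train)
    return K_test_train @ alpha

# Toy demo: a random PSD Gram matrix over 6 "graphs", 4 train / 2 test.
rng = np.random.default_rng(0)
feats = rng.standard_normal((6, 4))   # pretend per-graph features
K = feats @ feats.T                   # PSD Gram matrix
y = rng.standard_normal(6)
print(kernel_ridge_predict(K[:4, :4], y[:4], K[4:, :4]))
```

This "simple optimization" is exactly the selling point over training a GNN directly: the training problem is a single linear solve with no non-convex landscape.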
Quantum Graph Neural Network (QGNN): captures graph-structural information via quantum circuits, using quantum states to represent node features. This approach can handle small-scale graph data, but when processing large real-world data its applicability is severely limited by the number of available qubits. Graph Quantum Neural Tangent Kernel (GraphQNTK): built on the graph neural tangent kernel, this model uses an infinitely wide quantum graph neural network for graph ...