Kernel Graph Convolutional Neural Networks. 27th International Conference on Artificial Neural Networks, Rhodes, Greece, October 4-7, 2018, Proceedings, Part I. Graph kernels have been successfully applied to many graph classification problems. Typically, a kernel is first designed, and then an SVM ...
Kernel Attention Based Multi-scale Adaptive Graph Convolutional Neural Network for Skeleton-Based ... Graph convolution networks are widely used in skeleton-based action recognition tasks. To improve the adaptability of the graph, the model should learn the s... Y. Liu, H. Zhang, D. Xu - IEEE International...
In this paper, the authors propose a more general pooling framework that captures higher-order interactions between features in the form of kernel functions. They also show that a kernel function, such as the Gaussian kernel, can be approximated to a specified order using a parameter-free, compact explicit feature map. The proposed kernel pooling can be jointly optimized with a CNN. Network Structure Overview Kernel Poolin
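To illustrate the idea of approximating a kernel with an explicit, parameter-free feature map, here is the classic random-Fourier-features construction for the Gaussian kernel. This is a stand-in sketch of the general technique, not the paper's method (the paper's compact feature maps use a Taylor-series-based construction); all sizes below are made up:

```python
import numpy as np

rng = np.random.default_rng(0)
d, D, sigma = 8, 4096, 1.0  # input dim, feature-map dim, kernel bandwidth

# Random Fourier features: an explicit feature map z(x) whose inner
# products approximate the Gaussian kernel,
#   z(x) . z(y)  ~=  exp(-||x - y||^2 / (2 * sigma^2))
W = rng.normal(0.0, 1.0 / sigma, size=(D, d))
b = rng.uniform(0.0, 2.0 * np.pi, size=D)

def feature_map(x):
    return np.sqrt(2.0 / D) * np.cos(W @ x + b)

x = 0.3 * rng.normal(size=d)
y = 0.3 * rng.normal(size=d)
exact = np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))
approx = feature_map(x) @ feature_map(y)
```

Because the map is explicit and parameter-free, the kernel computation reduces to an ordinary inner product of fixed-size vectors, which is what allows such a pooling layer to be optimized jointly with a CNN.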
achieves state-of-the-art results in the task of graph-based semi-supervised classification across three benchmark datasets: Cora, Citeseer and Pubmed. 1 Introduction Convolutional neural networks (CNNs) [LeCun et al., 1998] have been successfully used in various machine learning
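The propagation rule behind such GCN results can be sketched in a few lines using the standard renormalized form H' = ReLU(D^{-1/2}(A + I)D^{-1/2} H W). This is a toy single-layer illustration, not the cited model; the 4-node graph, feature sizes, and random weights are made up:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy 4-node path graph (adjacency matrix) with random node features.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
A_hat = A + np.eye(4)                         # add self-loops
D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
S = D_inv_sqrt @ A_hat @ D_inv_sqrt           # symmetric normalization

H = rng.normal(size=(4, 8))                   # node features
W = rng.normal(size=(8, 16))                  # layer weights
H_next = np.maximum(S @ H @ W, 0.0)           # one GCN layer with ReLU
```

Stacking two such layers and applying a softmax over class logits, with a cross-entropy loss on the labeled nodes only, gives the semi-supervised classifier evaluated on Cora, Citeseer, and Pubmed.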
Graph-BERT [9] utilizes a language model-based embedding, SeqVec, to represent protein sequences, together with a graph convolutional neural network trained on subgraph batches sampled with a top-k intimacy approach. The Ensemble Residual Convolutional Neural Network (EResCNN) [10] model ...
The recognition performance of convolutional neural networks has surpassed that of humans in many computer vision areas. However, deep neural networks contain a large amount of parameter redundancy, especially in the weights of the convolutional kernels. In this work, we propose a simple Diagonal-...
The initialization of Convolutional Neural Networks (CNNs) is about providing reasonable initial values for the convolution kernels and the fully connected layers. In this paper, we propose a convolution kernel initialization method based on two-dimensional principal component analysis (2DPCA), in...
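The snippet above is truncated, but the general idea of deriving initial convolution kernels from 2DPCA can be sketched as follows. This is a speculative illustration on synthetic patches, not the paper's exact procedure; in particular, building kernels as outer products of the leading 2DPCA components is an assumption:

```python
import numpy as np

rng = np.random.default_rng(1)
k, n_patches = 5, 500

# Synthetic k x k image patches standing in for real training data.
patches = rng.normal(size=(n_patches, k, k))

# 2DPCA: the image covariance matrix is built from the patch matrices
# directly, without flattening: G = mean over patches of (A - mean)^T (A - mean).
mean_patch = patches.mean(axis=0)
centered = patches - mean_patch
G = np.einsum('nij,nik->jk', centered, centered) / n_patches  # k x k

eigvals, eigvecs = np.linalg.eigh(G)          # ascending eigenvalues
top = eigvecs[:, ::-1][:, :3]                 # 3 leading components

# Hypothetical kernel construction: outer products of leading components
# give 9 structured k x k kernels to initialize a conv layer with.
kernels = np.stack([np.outer(top[:, i], top[:, j])
                    for i in range(3) for j in range(3)])
```

Since each component is a unit vector, every kernel has unit Frobenius norm, so the initialization starts the layer at a consistent scale aligned with the dominant patch statistics.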
This paper introduces dynamic kernel convolutional neural networks (DK-CNNs), an enhanced type of CNN that performs line-by-line scanning regular convolution to generate a latent dimension of kernel weights. The proposed DK-CNN applies regular convolution to the DK weights, which rely on a ...
Neural networks based on convolutional operations have achieved remarkable results in the field of deep learning, but standard convolutional operations have two inherent flaws. On the one hand, the convolution operation is confined to a local window and cannot capture information from other lo...
While kernel functions are implicitly seen as an infinite-width neural network in the aforementioned methods, some methods use kernel approximation techniques [26, 37] to explicitly construct one layer of a neural network by a kernel function. For example, convolutional kernel networks that link conv...
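One standard way to explicitly construct a single network layer from a kernel function is the Nystroem approximation, where the layer maps inputs to feature vectors whose inner products approximate the kernel. A minimal sketch with a Gaussian kernel (the landmark points, dimensions, and bandwidth below are made up, and this is one representative technique, not necessarily the one in [26, 37]):

```python
import numpy as np

rng = np.random.default_rng(2)

def gaussian_kernel(X, Y, sigma=1.0):
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

# Nystroem layer: phi(x) = k(x, Z) K_zz^{-1/2}, so that
#   phi(X) phi(X)^T  ~=  k(X, X)   (one explicit, fixed "dense layer").
Z = rng.normal(size=(50, 10))                 # landmark points
K_zz = gaussian_kernel(Z, Z)
vals, vecs = np.linalg.eigh(K_zz)
inv_sqrt = vecs @ np.diag(1.0 / np.sqrt(np.clip(vals, 1e-8, None))) @ vecs.T

def layer(X):
    return gaussian_kernel(X, Z) @ inv_sqrt   # acts like one network layer

X = rng.normal(size=(5, 10))
approx = layer(X) @ layer(X).T                # approximates gaussian_kernel(X, X)
```

Convolutional kernel networks apply this kind of explicit approximation patch-wise, which is what ties the kernel view back to an ordinary convolutional layer.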