[17] R. Kim, C. H. So, M. Jeong, S. Lee, J. Kim, and J. Kang. HATS: A hierarchical graph attention network for stock movement prediction, 2019.
[18] T. N. Kipf and M. Welling. Variational graph auto-encoders. NIPS Workshop on Bayesian Deep Learning, 2016.
[19] T. N. K...
Junction Tree Variational Autoencoder for Molecular Graph Generation. Wengong Jin, Regina Barzilay, Tommi Jaakkola. ICML 2018.
MolGAN: An implicit generative model for small molecular graphs. Nicola De Cao, Thomas Kipf. arXiv 1805.
Generative Modeling for Protein Structures. Namrata Anand, Po-Ssu Huang. NeurIPS...
where taking the neighborhood of each pixel into account is critical for the performance of downstream tasks, we introduced a graph convolutional autoencoder that integrates both the gene expression of a cell and that of its neighbors. Our graph-based autoencoder structure decodes both a cell’s...
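The idea of letting the reconstruction depend on both a cell's own expression and that of its neighbors can be sketched with a one-layer graph convolutional autoencoder. The sketch below is illustrative only: the array sizes, weight names, and random graph are assumptions, not the authors' actual architecture.

```python
import numpy as np

def normalize_adj(A):
    # Symmetric normalization with self-loops: A_norm = D^{-1/2} (A + I) D^{-1/2}
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

rng = np.random.default_rng(0)
n_cells, n_genes, n_latent = 6, 10, 4

X = rng.random((n_cells, n_genes))            # gene expression per cell
A = (rng.random((n_cells, n_cells)) < 0.4).astype(float)
A = np.triu(A, 1); A = A + A.T                # symmetric spatial-neighbor graph

A_norm = normalize_adj(A)
W_enc = rng.normal(0, 0.1, (n_genes, n_latent))
W_dec = rng.normal(0, 0.1, (n_latent, n_genes))

# Encoder mixes each cell's expression with its neighbors' via A_norm;
# the decoder aggregates over neighbors again before reconstructing.
Z = np.maximum(A_norm @ X @ W_enc, 0)         # latent embedding (ReLU GCN layer)
X_rec = A_norm @ Z @ W_dec                    # neighborhood-aware reconstruction
loss = np.mean((X - X_rec) ** 2)              # reconstruction objective
```

Because `A_norm` appears in both the encoder and the decoder, each cell's reconstruction error depends on its spatial neighborhood, which is the property the passage above emphasizes.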
MGAE (Wang et al., 2017) utilized marginalized denoising graph auto-encoders. GALA (Park et al., 2019) proposed Laplacian sharpening (the inverse of Laplacian smoothing) to alleviate the over-smoothing issue in GNN training. AGE (Cui et al., 2020) employed adaptive learning for the measurement of ...
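As a minimal illustration of the smoothing/sharpening relationship that GALA exploits (a hand-built 4-node path graph, not GALA's actual implementation), the sketch below shows that the normalized-adjacency operator reduces the variance of a node signal, while the sharpening operator 2I − S increases it:

```python
import numpy as np

# Path graph on 4 nodes: 0-1-2-3
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
A_hat = A + np.eye(4)                       # add self-loops
d = A_hat.sum(axis=1)
S = A_hat / np.sqrt(np.outer(d, d))         # smoothing operator D^-1/2 (A+I) D^-1/2

X = np.array([[0.0], [1.0], [0.0], [1.0]]) # a "high-frequency" node signal
smoothed = S @ X                            # Laplacian smoothing: averages neighbors
sharpened = (2 * np.eye(4) - S) @ X         # Laplacian sharpening: inverse effect

# Smoothing shrinks the signal's variance; sharpening amplifies it.
assert smoothed.var() < X.var() < sharpened.var()
```

Repeated smoothing layers drive all node features toward a common value (over-smoothing); using the sharpening operator in the decoder pushes in the opposite direction, which is the intuition behind GALA's design.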
Variational Graph Auto-Encoders. Thomas N. Kipf, Max Welling. arXiv 1611.
Scalable Graph Embedding for Asymmetric Proximity. Chang Zhou, Yuqiong Liu, Xiaofei Liu, Zhongyi Liu, Jun Gao. AAAI 2017.
Fast Network Embedding Enhancement via High Order Proximity Approximation ...
Collaborative denoising auto-encoders for top-n recommender systems. In Proceedings of the Ninth ACM International Conference on Web Search and Data Mining, San Francisco, CA, USA, 22–25 February 2016; pp. 153–162.
Wang, X.; He, X.; Wang, M.; Feng, F.; Chua, T...
To address these drawbacks, we propose a Multi-Prior Graph Autoencoder (MPGAE) with ranking-based band selection for HAD. It has three main components: the ranking-based band selection component, the adaptive salient weight component, and the graph autoencoder. First, the ranking-based band ...
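The excerpt does not show MPGAE's actual ranking criterion, so the sketch below uses a stand-in score (per-band variance) purely to illustrate the shape of ranking-based band selection: score every spectral band of a hyperspectral cube, rank, and keep the top-k indices.

```python
import numpy as np

def select_bands(cube, k):
    """Rank the spectral bands of an H x W x B hyperspectral cube by a
    score (per-band variance here, a stand-in criterion) and return the
    sorted indices of the top-k bands."""
    H, W, B = cube.shape
    flat = cube.reshape(-1, B)              # one row per pixel, one column per band
    scores = flat.var(axis=0)               # hypothetical ranking score per band
    ranked = np.argsort(scores)[::-1]       # indices, highest score first
    return np.sort(ranked[:k])

rng = np.random.default_rng(1)
cube = rng.random((8, 8, 20))               # toy 8x8 scene with 20 bands
cube[..., 5] *= 10                          # inflate band 5's variance
bands = select_bands(cube, 4)               # band 5 should be among the 4 kept
```

Only the reduced cube `cube[..., bands]` would then be passed on to the later components, shrinking the input dimensionality before the graph autoencoder.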
Graph Convolutional Networks for Text Classification. Liang Yao, Chengsheng Mao, Yuan Luo. AAAI 2019.
Differentiable Perturb-and-Parse: Semi-Supervised Parsing with a Structured Variational Autoencoder. Caio Corro, Ivan Titov. ICLR 2019.
Structured Neural Summarization. Patrick Fernandes, Miltiadis Allamanis, ...
For example, SpaGCN [42] uses a graph convolutional network to integrate multi-modal data, including gene expression, the spatial locations of spots/cells, and histology data. STAGATE [43] obtains spatial representations of spots via a graph attention auto-encoder guided by a spatial neighbor network and ...
In [21], the maximum correntropy criterion was used as the loss function of the deep autoencoder, and the critical parameters of the deep autoencoder were optimized to fit the signal characteristics using an artificial fish swarm algorithm; Wang et al. [22] used a Gaussian radial basis kernel function ...
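For reference, the maximum-correntropy criterion with a Gaussian kernel can be written as a reconstruction loss in a few lines; the kernel width σ and the toy data below are illustrative, not values from [21].

```python
import numpy as np

def correntropy_loss(x, x_rec, sigma=1.0):
    """Negative mean Gaussian-kernel similarity between input and
    reconstruction. Maximizing correntropy = minimizing this loss.
    Unlike MSE, large errors saturate the kernel instead of growing
    quadratically, so the loss is robust to outliers."""
    err = x - x_rec
    return -np.mean(np.exp(-err**2 / (2 * sigma**2)))

x = np.array([0.0, 1.0, 2.0, 100.0])        # last entry is an outlier
good = np.array([0.1, 1.1, 2.1, 0.0])       # close fit except on the outlier
mse = np.mean((x - good)**2)                # dominated by the single outlier
mcc = correntropy_loss(x, good)             # barely affected by it
```

A perfect reconstruction gives a loss of exactly −1, and the loss is bounded in [−1, 0], which is one reason the criterion is better behaved than MSE on impulsive-noise signals such as bearing vibrations.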