Chunfeng Song, Feng Liu, Yongzhen Huang, et al. Auto-encoder based data clustering. In Iberoamerican Congress on Pattern Recognition (CIARP 2013), Part I, pages 117-124. Springer, 2013.
In this paper, we create a new autoencoder variant to efficiently capture the features of high-dimensional data, and propose an unsupervised deep hashing method for large-scale data retrieval, named Autoencoder-based Unsupervised Clustering and Hashing (AUCH). By constructing a hashing layer as...
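The snippet cuts off before describing the hashing layer, but a common way such layers work in deep hashing generally is to squash features to (-1, 1) during training and binarize with a sign/threshold at retrieval time. A hedged sketch of that generic idea (all weights, sizes, and names below are illustrative stand-ins, not taken from the AUCH paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical encoder output for a mini-batch (stand-in for trained features).
d_in, n_bits = 32, 16
W_h = rng.normal(0.0, 0.1, (d_in, n_bits))  # hashing-layer weights (illustrative)
b_h = np.zeros(n_bits)

features = rng.normal(size=(8, d_in))       # encoder features for 8 samples

# Hashing layer: squash to (-1, 1) with tanh during training ...
relaxed = np.tanh(features @ W_h + b_h)
# ... then threshold at retrieval time to get compact binary hash codes.
codes = (relaxed > 0).astype(np.uint8)

# Retrieval compares Hamming distances between codes.
query = codes[0]
hamming = (codes ^ query).sum(axis=1)       # distance of every sample to the query
```

The tanh relaxation keeps the layer differentiable so it can be trained end-to-end with the reconstruction objective; the hard thresholding is applied only when codes are stored or queried.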
processes, displayed in the schematic in Fig. 2. By encoding an input ‘x’, the autoencoder obtains a new representation ‘y’, and can then reconstruct the initial input ‘x’ from ‘y’. The following formula represents the encoding procedure, as shown in Eq....
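The equation itself is cut off in this excerpt, but the encoding and decoding steps described here are conventionally written as follows (with W, b the encoder weights and bias, W', b' the decoder's, and s a nonlinearity such as the sigmoid; these symbols are the standard ones, not necessarily the source's):

```latex
y = s(W x + b)                 % encoding: input x -> representation y
\hat{x} = s(W' y + b')         % decoding: reconstruct x from y
\min_{W, b, W', b'} \; \sum_i \lVert x_i - \hat{x}_i \rVert^2   % reconstruction objective
```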
Autoencoder-based methods generalize better and are less prone to overfitting for a data-restricted problem like ours, since the number of parameters to be learned/estimated is much smaller than the number of learnable parameters in matrix factorization or nuclear norm minimization (more on ...
In this paper, we aim to present a new graph convolutional solution to the node clustering task, which helps label graph-structured data and thereby facilitates the development of deep neural networks. To date, several graph convolutional autoencoder-based clustering models have been proposed (Kipf ...
Transform the data into more clustering-friendly representations: a deep version of k-means is based on learning a data representation and applying k-means in the embedded space. How should a cluster be represented: as a vector, or as an autoencoder network? ...
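A hedged sketch of the first idea, k-means run in an autoencoder's embedded space, using toy blob data, a minimal linear autoencoder, and a hand-rolled k-means (all sizes and hyperparameters here are illustrative, not from the text):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in data: two well-separated Gaussian blobs in 20-D.
X = np.vstack([rng.normal(0.0, 1.0, (50, 20)) + 4.0,
               rng.normal(0.0, 1.0, (50, 20)) - 4.0])

# Minimal linear autoencoder: encode to 2-D, decode back, trained by
# gradient descent on the squared reconstruction error.
d, k, lr = X.shape[1], 2, 1e-3
W_enc = rng.normal(0.0, 0.1, (d, k))
W_dec = rng.normal(0.0, 0.1, (k, d))
for _ in range(500):
    Z = X @ W_enc                      # embed
    err = Z @ W_dec - X                # reconstruction error
    g_dec = Z.T @ err / len(X)
    g_enc = X.T @ (err @ W_dec.T) / len(X)
    W_dec -= lr * g_dec
    W_enc -= lr * g_enc

Z = X @ W_enc                          # clustering-friendly 2-D embedding

# Plain k-means (k = 2) run in the embedded space, seeded with the two
# extreme points along the first embedded coordinate.
centers = np.stack([Z[Z[:, 0].argmin()], Z[Z[:, 0].argmax()]]).copy()
for _ in range(20):
    labels = ((Z[:, None, :] - centers[None, :, :]) ** 2).sum(-1).argmin(1)
    for c in range(2):                 # recompute non-empty centers
        if np.any(labels == c):
            centers[c] = Z[labels == c].mean(0)
```

A nonlinear encoder (and a clustering loss trained jointly with reconstruction, as in deep k-means proper) replaces the linear map here; the two-stage embed-then-cluster structure is the same.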
3d). SCA analysis on setA was also done using a latent space based on kinase targets, but we could not find any robust association with the reference clusters (see https://figshare.com/articles/dataset/Figure_3_from_manuscript_Sparsely-Connected_Autoencoder_SCA_for_single_cell_RNAseq_data_mining/...
So, we’ve integrated both convolutional neural networks and autoencoder ideas to reduce the information in image-based data. That serves as a pre-processing step for clustering. In this way, we can apply k-means clustering with 98 features instead of 784 features. This could speed up the labeling process...
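The speed-up comes from k-means' assignment step costing O(n * C * d) per iteration, so shrinking d from 784 to 98 cuts the work roughly eightfold. A small sketch of that effect on random stand-in data (the arrays below are placeholders for real pixels and real encoder output):

```python
import numpy as np
import time

rng = np.random.default_rng(0)
n, C = 2000, 10                        # samples and clusters (illustrative sizes)
X_full = rng.normal(size=(n, 784))     # raw 28x28 pixel vectors (stand-in data)
X_enc = rng.normal(size=(n, 98))       # encoder output (stand-in embeddings)

def assign_step(X, centers):
    """One k-means assignment step: O(n * C * d), so cost scales with d."""
    d2 = (X ** 2).sum(1, keepdims=True) - 2.0 * X @ centers.T + (centers ** 2).sum(1)
    return d2.argmin(1)

for X in (X_full, X_enc):
    t0 = time.perf_counter()
    labels = assign_step(X, X[:C])     # seed centers with the first C points
    ms = (time.perf_counter() - t0) * 1e3
    print(f"{X.shape[1]:>4} features: {ms:.2f} ms per assignment step")
```

Every other iteration of k-means pays the same per-dimension cost, so the saving compounds over the whole run.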
Previously, we’ve applied a conventional autoencoder to the handwritten digit database (MNIST). That approach was pretty straightforward. We can apply the same model to non-image problems such as fraud or anomaly detection. If the problem is a pixel-based one, you might remember that convolutional neural networks are more successful...