While the most recent work, for the first time, achieved 100% validity, its architecture is rather complex due to auxiliary neural networks beyond the VAE itself, making it difficult to train. This paper presents a molecular hypergraph grammar variational autoencoder (MHG-VAE), which uses a ...
Liu and co-workers used persistent spectral hypergraph-based molecular descriptors in their machine learning approaches to predict protein-ligand binding affinities. Vaisman and Masson have developed a computational geometry approach using Delaunay tessellations to capture the influence of the four nearest ...
Deep neural networks consist of several encoder layers that map the input data onto a low-dimensional manifold, from which the subsequent decoder layers can reconstruct denoised, full-rank data. Applications in scRNA-seq include denoising single-cell transcriptomes [95,122], batch effect remov...
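The encode-then-decode idea above can be sketched with a minimal linear autoencoder; the synthetic low-rank-plus-noise matrix below is a hypothetical stand-in for an expression matrix, and the single linear encoder/decoder pair is an illustrative simplification of the multi-layer networks used in practice.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: a low-rank "signal" matrix plus additive noise,
# standing in for a cells x genes expression matrix.
n_cells, n_genes, n_latent = 200, 50, 5
signal = rng.normal(size=(n_cells, n_latent)) @ rng.normal(size=(n_latent, n_genes))
noisy = signal + 0.5 * rng.normal(size=signal.shape)

# One linear encoder and one linear decoder, trained by gradient
# descent on the mean squared reconstruction error.
W_enc = rng.normal(scale=0.1, size=(n_genes, n_latent))
W_dec = rng.normal(scale=0.1, size=(n_latent, n_genes))

lr = 1e-3
initial_loss = None
for step in range(500):
    z = noisy @ W_enc          # encode onto the low-dimensional manifold
    recon = z @ W_dec          # decode back to the full gene dimension
    err = recon - noisy
    loss = np.mean(err ** 2)
    if initial_loss is None:
        initial_loss = loss
    # Gradients of the mean squared error w.r.t. each weight matrix
    g_dec = z.T @ err * (2 / err.size)
    g_enc = noisy.T @ (err @ W_dec.T) * (2 / err.size)
    W_dec -= lr * g_dec
    W_enc -= lr * g_enc

# The reconstruction is constrained to rank n_latent, which is what
# suppresses the high-dimensional noise component.
denoised = (noisy @ W_enc) @ W_dec
print(loss < initial_loss)
```

Because the bottleneck has only `n_latent` dimensions, the reconstruction cannot represent the full-rank noise, so minimizing reconstruction error preferentially recovers the shared low-rank structure.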
Solutions to the handling of multi-valent bonds have been introduced via the use of hypergraphs; in a hypergraph, edges are sets of at least two atoms (hyperedges) instead of pairs of atoms [37]. However, the use of hypergraphs is not further discussed here as they are not currently wide...
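To make the hyperedge idea concrete, here is a minimal sketch of a molecular hypergraph as plain Python data; the molecule fragment, atom labels, and helper function are illustrative assumptions, not taken from any particular library.

```python
# Atoms are nodes keyed by index; each hyperedge is a frozenset of two or
# more atom indices, so a multi-center interaction such as an aromatic ring
# is a single edge rather than many pairwise bonds.
atoms = {0: "C", 1: "C", 2: "C", 3: "C", 4: "C", 5: "C", 6: "H"}

hyperedges = [
    frozenset({0, 1, 2, 3, 4, 5}),  # six-center aromatic system as ONE hyperedge
    frozenset({0, 6}),              # ordinary two-atom C-H bond
]

def incident_edges(atom, edges):
    """Return the hyperedges that contain the given atom index."""
    return [e for e in edges if atom in e]

print(len(incident_edges(0, hyperedges)))  # → 2 (ring system and C-H bond)
```

An ordinary graph is recovered as the special case where every hyperedge has exactly two members.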