We regularize the Gaussian mixture Bayesian (GMB) classifier in terms of the following two points: 1) the class-conditional probability density functions, and 2) its complexity as a classifier. For the former, we employ the
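As a rough illustration of the GMB setup (not the regularization scheme of the truncated snippet above), the sketch below fits one Gaussian mixture per class and classifies via Bayes' rule; the component count and the covariance ridge `reg_covar` (one simple density regularizer) are illustrative assumptions.

```python
# Minimal sketch of a Gaussian mixture Bayes (GMB) classifier using
# scikit-learn; n_components and reg_covar are illustrative choices.
import numpy as np
from sklearn.mixture import GaussianMixture

class GMBClassifier:
    def __init__(self, n_components=2, reg_covar=1e-3):
        self.n_components = n_components
        self.reg_covar = reg_covar  # ridge on covariances

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.priors_, self.gmms_ = [], []
        for c in self.classes_:
            Xc = X[y == c]
            gmm = GaussianMixture(n_components=self.n_components,
                                  reg_covar=self.reg_covar).fit(Xc)
            self.gmms_.append(gmm)
            self.priors_.append(len(Xc) / len(X))
        return self

    def predict(self, X):
        # Bayes rule: argmax_c  log p(x | c) + log P(c)
        log_post = np.stack([gmm.score_samples(X) + np.log(p)
                             for gmm, p in zip(self.gmms_, self.priors_)],
                            axis=1)
        return self.classes_[np.argmax(log_post, axis=1)]
```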
Semi-discrete optimal transport problems, which evaluate the Wasserstein distance between a discrete and a generic (possibly non-discrete) probability measure, ...
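A crude way to get a feel for the semi-discrete setting is to sample the continuous measure and solve the resulting fully discrete problem; the sketch below does this in one dimension with `scipy.stats.wasserstein_distance` (the 1-D W1 distance). The Gaussian target and the 3-atom discrete measure are illustrative assumptions, and proper semi-discrete solvers work on the continuous measure directly rather than by sampling.

```python
# Sketch: approximate a 1-D semi-discrete Wasserstein distance by
# sampling the continuous measure. Measures chosen for illustration.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)

# Discrete measure: atoms x_i with weights a_i.
atoms = np.array([-1.0, 0.5, 2.0])
weights = np.array([0.2, 0.5, 0.3])

# Continuous measure (here N(0, 1)), represented by samples.
samples = rng.normal(loc=0.0, scale=1.0, size=10_000)

# W1 between the empirical approximation and the discrete measure.
w1 = wasserstein_distance(samples, atoms, v_weights=weights)
print(f"approximate W1: {w1:.3f}")
```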
The widespread application of high-throughput sequencing technology has produced a large number of publicly available gene expression datasets. However, because gene expression datasets have the characteristics of small sample size and high dimensional
Recently, deep convolutional neural networks [18] have also shown a powerful capability to remove AWGN. The mixture of impulse noise (IN) and AWGN, however, makes the denoising problem much more difficult because of the very different properties of the two types of noise. A few methods have ...
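To make the contrast concrete, the sketch below corrupts an image with both noise types: AWGN perturbs every pixel by a small Gaussian amount, while salt-and-pepper impulse noise replaces a sparse subset of pixels with extreme values. The noise parameters and the [0, 1] image range are illustrative assumptions.

```python
# Sketch: mixed AWGN + salt-and-pepper impulse noise on an image in [0, 1].
import numpy as np

def add_mixed_noise(img, sigma=0.1, impulse_ratio=0.2, seed=0):
    rng = np.random.default_rng(seed)
    noisy = img + rng.normal(0.0, sigma, img.shape)       # AWGN: dense, Gaussian
    mask = rng.random(img.shape) < impulse_ratio          # IN: sparse, extreme
    noisy[mask] = rng.choice([0.0, 1.0], size=mask.sum()) # salt (1) or pepper (0)
    return np.clip(noisy, 0.0, 1.0)

clean = np.full((64, 64), 0.5)
noisy = add_mixed_noise(clean)
```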
An alternative approach to supervised learning is semi-supervised learning (SSL), where the learning algorithm can exploit assumptions about the data to learn from a small amount of labeled data [4]. The most common assumptions imposed on the data in SSL are the cluster, smoothness, ...
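One standard SSL method that encodes the cluster/smoothness assumptions (nearby points tend to share labels) is label propagation; a minimal sketch with scikit-learn follows, where -1 marks unlabeled samples. The two-moons dataset and the 10-label budget are illustrative assumptions.

```python
# Sketch: SSL via label propagation on two moons with only 10 labels.
import numpy as np
from sklearn.datasets import make_moons
from sklearn.semi_supervised import LabelPropagation

X, y = make_moons(n_samples=200, noise=0.05, random_state=0)
y_partial = np.full_like(y, -1)                  # -1 = unlabeled
labeled_idx = np.random.default_rng(0).choice(len(y), size=10, replace=False)
y_partial[labeled_idx] = y[labeled_idx]          # keep only 10 labels

model = LabelPropagation().fit(X, y_partial)
print(f"accuracy on all points: {model.score(X, y):.2f}")
```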
Mixture-of-experts (MoE) is an ensemble-based classification technique proposed in [6]. MoE is a probabilistic model composed of a set of networks that stratifies the input space and assigns a local classifier to each partition, leading to a "divide-and-conquer" strategy. MoE has been used ...
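The sketch below shows the MoE forward pass this describes: a softmax gating network softly partitions the input space and weights the class probabilities produced by per-region expert classifiers. The random weights, 3 experts, and 2 classes are illustrative assumptions; no training loop is shown.

```python
# Sketch of the MoE forward pass: gate-weighted mixture of expert outputs.
import numpy as np

rng = np.random.default_rng(0)
d, n_experts, n_classes = 4, 3, 2
Wg = rng.normal(size=(d, n_experts))             # gating network
We = rng.normal(size=(n_experts, d, n_classes))  # one linear expert per region

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def moe_predict_proba(X):
    gate = softmax(X @ Wg)                                    # (n, e): soft partition
    expert_probs = softmax(np.einsum('nd,edc->nec', X, We))   # each expert's p(y|x)
    return np.einsum('ne,nec->nc', gate, expert_probs)        # convex combination

X = rng.normal(size=(5, d))
print(moe_predict_proba(X))   # each row sums to 1
```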
Various concrete regression techniques can be adopted for this purpose, such as linear regression, support vector machines (SVMs) [40,41], and neural networks (NNs) [42]. Then, missing values (MVs) are replaced with the conditional expectation estimated by the regression. Local least squares (LLS) regression is ...
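A minimal sketch of this regression-based imputation idea: fit a regressor on the complete rows and fill each gap with its prediction, i.e., an estimate of the conditional expectation. Linear regression and the toy 2-column matrix are illustrative assumptions; any of the regressors listed above could be substituted.

```python
# Sketch: impute missing values in one column via linear regression
# on the complete rows.
import numpy as np
from sklearn.linear_model import LinearRegression

X = np.array([[1.0, 2.1], [2.0, 3.9], [3.0, np.nan], [4.0, 8.1]])
target_col = 1
missing = np.isnan(X[:, target_col])

# Fit on complete rows, then replace MVs with the predicted
# conditional expectation.
reg = LinearRegression().fit(X[~missing][:, [0]], X[~missing, target_col])
X[missing, target_col] = reg.predict(X[missing][:, [0]])
print(X)
```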