This paper presents a new classifier, the kernel sparse representation-based classifier (KSRC), built on SRC and the kernel trick, a common technique in machine learning. KSRC is a nonlinear extension of SRC.
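To make the kernel trick concrete, the following minimal sketch (not taken from the cited work; the function name rbf_kernel and the gamma parameter are illustrative assumptions) shows how a Gaussian kernel evaluates inner products in an implicit high-dimensional feature space without ever constructing that space explicitly:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """RBF (Gaussian) kernel matrix K[i, j] = exp(-gamma * ||x_i - y_j||^2).

    Each entry equals an inner product <phi(x_i), phi(y_j)> in an implicit
    high-dimensional feature space, so that space is never formed explicitly
    (the "kernel trick").
    """
    sq_dists = (
        np.sum(X ** 2, axis=1)[:, None]
        + np.sum(Y ** 2, axis=1)[None, :]
        - 2.0 * X @ Y.T
    )
    return np.exp(-gamma * sq_dists)

# Toy usage: Gram matrix over 5 training points in R^3.
X_train = np.random.default_rng(0).normal(size=(5, 3))
K = rbf_kernel(X_train, X_train)
print(K.shape)  # (5, 5)
```

Any algorithm written purely in terms of inner products, such as SRC's residual computations, can be "kernelized" by replacing those inner products with entries of such a Gram matrix.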
These kernel algorithms were proposed to improve the performance of the sparse representation-based classifier (SRC) [27], [33], which uses a linear modeling framework, by exploiting the advantages of projecting data into a high-dimensional space [15]. However, linear representations are ...
In this paper, we propose a kernel representation-based nearest neighbor classifier (KRNNC), which can be regarded as a nonlinear extension of the improved nearest neighbor classifier (INNC). Moreover, we provide a detailed analysis of the classification mechanism of KRNNC, which is considered ...
4. Kernel Sparse Representation Classifier
4.1. Sparse Representation Classification
Let a matrix $A_i$ represent features of the $i$th class for auxiliary training samples, namely $A_i = [a_{i,1}, a_{i,2}, \ldots, a_{i,n_i}] \in \mathbb{R}^{m \times n_i}$, where $m$ is the feature dimension and $n_i$ is the number of auxiliary training samples of the $i$th class. The auxiliary ...
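As a rough illustration of the SRC decision rule described in this subsection, the sketch below is an assumption-laden stand-in, not the authors' implementation: it uses scikit-learn's Lasso in place of the exact l1-minimization solver of the original formulation, and the helper name src_classify is hypothetical. It codes a test sample over the stacked dictionary and assigns the class with the smallest reconstruction residual:

```python
import numpy as np
from sklearn.linear_model import Lasso

def src_classify(A, labels, y, alpha=0.01):
    """Sparse representation-based classification (SRC) sketch.

    A      : (m, n) dictionary whose columns are training samples of all classes.
    labels : (n,) class label of each column of A.
    y      : (m,) test sample.
    alpha  : l1 penalty, standing in for the exact l1-minimization of SRC.
    """
    labels = np.asarray(labels)
    # Normalize dictionary columns to unit l2 norm, as is common in SRC.
    A = A / (np.linalg.norm(A, axis=0, keepdims=True) + 1e-12)
    # Sparse coding: min ||y - A x||_2^2 + alpha * ||x||_1
    coder = Lasso(alpha=alpha, fit_intercept=False, max_iter=10000)
    coder.fit(A, y)
    x = coder.coef_
    # Class-wise residuals: keep only the coefficients of one class at a time.
    residuals = {}
    for c in np.unique(labels):
        x_c = np.where(labels == c, x, 0.0)
        residuals[c] = np.linalg.norm(y - A @ x_c)
    # Assign the class whose training samples reconstruct y best.
    return min(residuals, key=residuals.get)
```

The kernelized variant (KSRC) follows the same residual-based rule, but the coding and residuals are computed through kernel evaluations rather than in the original feature space.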
modality (feature type), which indicates the level of significance and contribution of specific feature types in building the kernel-based max-margin classifier. The SPECT features are the most useful, as they are quite discriminative for the task of PD diagnosis. This is also easily ...
accuracy of 86.98% using a KNN classifier. Liu et al. [6] extracted multiple features from the frequency and time domains, and used sparse representation methods for feature fusion, achieving a classification accuracy of 90.875% with the Sparse Representation-based Discriminant Analysis classifier (SRDA ...
The classical approach begins with the optimal Bayes classifier under the assumption of normally distributed classes, which, via linear discriminant analysis, leads to the Fisher algorithm. The Fisher approach is based on projecting d-dimensional data onto a line with the hope that the ...
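A minimal numerical sketch of the Fisher projection just described follows; it is illustrative only (the helper name fisher_direction and the toy data are assumptions, not from the source). It computes the direction w proportional to S_W^{-1}(mu1 - mu2) and projects the d-dimensional samples onto that single line:

```python
import numpy as np

def fisher_direction(X1, X2):
    """Fisher discriminant direction for two classes.

    Returns the unit vector w maximizing the ratio of between-class to
    within-class scatter: w ~ S_W^{-1} (mu1 - mu2).
    """
    mu1, mu2 = X1.mean(axis=0), X2.mean(axis=0)
    # Within-class scatter matrix (sum of unnormalized class covariances).
    S_w = (X1 - mu1).T @ (X1 - mu1) + (X2 - mu2).T @ (X2 - mu2)
    w = np.linalg.solve(S_w, mu1 - mu2)
    return w / np.linalg.norm(w)

# Toy usage: project 4-dimensional samples onto the 1-D Fisher direction.
rng = np.random.default_rng(1)
X1 = rng.normal(loc=0.0, size=(50, 4))
X2 = rng.normal(loc=1.5, size=(50, 4))
w = fisher_direction(X1, X2)
z1, z2 = X1 @ w, X2 @ w  # well-separated one-dimensional projections
```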
Combined techniques can also be applied to the identification, treatment, and processing of images. Systems based on the Extreme Learning Machine and the Sparse Representation-based Classification method have attracted significant attention due to their respective performance characteristics in computer...
where x12 is the pairwise representation of the sequence pair (X1, X2), and K′(·, ·) is a kernel function that operates on vector data. This pairwise kernel is built from three sequence kernels: a spectrum kernel, a motif kernel, and a Pfam kernel. The ROC scores reported for these ker...
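For concreteness, a minimal sketch of the spectrum-kernel component is given below (illustrative only; the choice k = 3, the function names, and the toy sequences are assumptions, and the motif and Pfam kernels are not shown). The k-spectrum kernel is simply the inner product of k-mer count vectors of the two sequences:

```python
from collections import Counter

def kmer_counts(seq, k=3):
    """Count all length-k substrings (k-mers) of a sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def spectrum_kernel(seq_a, seq_b, k=3):
    """k-spectrum kernel: K(a, b) = sum over k-mers u of count_a(u) * count_b(u)."""
    ca, cb = kmer_counts(seq_a, k), kmer_counts(seq_b, k)
    return sum(ca[u] * cb[u] for u in ca.keys() & cb.keys())

# Toy usage on two short protein-like sequences.
print(spectrum_kernel("MKVLAAGIVK", "MKVLAGIVKK", k=3))
```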