Linear dimensionality reduction (LDR) techniques are important in pattern recognition due to their linear time complexity and simplicity. In this paper, we present a novel LDR technique which, though linear, aims to maximize the Chernoff distance in the transformed space, thus augmenting the...
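To make the criterion concrete (this is only an illustrative sketch, not the paper's algorithm), the Chernoff distance between two Gaussian classes can be evaluated directly in a candidate transformed space. The helper below assumes two classes with known means and covariances and a user-supplied projection matrix `A`; the names are placeholders, not notation from the paper.

```python
import numpy as np

def chernoff_distance(m1, S1, m2, S2, s=0.5):
    """Chernoff distance between Gaussians N(m1, S1) and N(m2, S2)."""
    Ss = s * S1 + (1 - s) * S2
    diff = m2 - m1
    quad = 0.5 * s * (1 - s) * diff @ np.linalg.solve(Ss, diff)
    logdet = 0.5 * (np.linalg.slogdet(Ss)[1]
                    - s * np.linalg.slogdet(S1)[1]
                    - (1 - s) * np.linalg.slogdet(S2)[1])
    return quad + logdet

def projected_chernoff(A, m1, S1, m2, S2, s=0.5):
    """Chernoff distance measured in the transformed space y = A.T @ x."""
    return chernoff_distance(A.T @ m1, A.T @ S1 @ A,
                             A.T @ m2, A.T @ S2 @ A, s)
```

An LDR method of this family would then search for the projection `A` that makes `projected_chernoff` as large as possible.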
A Non-Linear Dimensionality-Reduction Technique for Fast Similarity Search in Large Databases. Khanh Vu, Kien A. Hua, Hao Cheng, Sheau-Dong Lang. School of Electrical Engineering and Computer Science, University of Central Florida, Orlando, FL 32816-2362. ABSTRACT: To enable efficient similarity search in large databases, many indexing techniques use a linear transformation...
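The usual way such a transformation supports indexing is filter-and-refine: projecting onto orthonormal directions can only shrink Euclidean distances, so distances computed in the reduced space lower-bound the true ones and can discard candidates without false dismissals. The sketch below illustrates the pattern with a PCA projection standing in for whichever transformation an index actually uses; the data, query, and radius are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(10_000, 64))            # database vectors
q = X[0] + 0.1 * rng.normal(size=64)         # query close to one stored vector

# Orthonormal projection (top-k principal directions) gives lower-bounding distances.
k = 8
_, _, Vt = np.linalg.svd(X - X.mean(0), full_matrices=False)
P = Vt[:k].T                                  # 64 x 8, orthonormal columns

lb = np.linalg.norm(X @ P - q @ P, axis=1)    # never exceeds the true distance

# Filter step: cheap pruning in 8-D; refine step: exact check in 64-D.
radius = 2.0
cand = np.flatnonzero(lb <= radius)
hits = cand[np.linalg.norm(X[cand] - q, axis=1) <= radius]
print(f"{len(cand)} candidates -> {len(hits)} true hits")
```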
On the Performance of Chernoff-Distance-Based Linear Dimensionality Reduction Techniques. We consider the Loog and Duin (LD) method, which aims to maximize a criterion derived from the Chernoff distance in the original space, and the one introduced by Rueda and Herrera (RH), which aims to maximize the Chernoff distance in...
Modern techniques for optimization over matrix manifolds enable a generic linear dimensionality reduction solver, which accepts as input data and an objective to be optimized, and returns, as output, an optimal low-dimensional projection of the data. This simple optimization framework further allows ...
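A stripped-down version of that framework (not the authors' implementation) is gradient ascent over matrices with orthonormal columns, using a QR retraction to stay on the constraint set; any differentiable objective can be supplied through its gradient. The PCA objective trace(W.T @ S @ W) below is just one example plug-in.

```python
import numpy as np

def solve_ldr(grad, d, k, steps=500, lr=1e-2, seed=0):
    """Generic solver: maximize an objective over d x k matrices with
    orthonormal columns, given its Euclidean gradient grad(W)."""
    rng = np.random.default_rng(seed)
    W, _ = np.linalg.qr(rng.normal(size=(d, k)))
    for _ in range(steps):
        W, _ = np.linalg.qr(W + lr * grad(W))   # gradient step + QR retraction
    return W

# Example objective: PCA, i.e. maximize trace(W.T @ S @ W).
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10)) @ rng.normal(size=(10, 10))
S = np.cov(X, rowvar=False)
W = solve_ldr(lambda W: 2 * S @ W, d=10, k=2)
Z = (X - X.mean(0)) @ W                          # low-dimensional projection
```

Swapping the gradient callback is all it takes to target a different criterion, which is the appeal of the generic-solver view.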
Arunasakthi, K. and Kamatchipriya, L. (2014). A Review on Linear and Non-Linear Dimensionality Reduction Techniques. Machine Learning and Applications: An International Journal, 1(1), 65-76. Retrieved from http://airccse.org/journal/mlaij/papers/1114mlaij06.pdf
Linear discriminant analysis (LDA) and principal component analysis (PCA) are both dimensionality reduction techniques, but they serve different purposes and are used in different contexts. Let’s discuss the differences between LDA and PCA in various aspects:...
Linear Discriminant Analysis (LDA) is a dimensionality reduction technique. As the name implies, dimensionality reduction techniques reduce the number of dimensions (i.e. variables) in a dataset while retaining as much information as possible. For instance, suppose that we plotted the relationship betwe...
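To see the contrast between the two methods on real data, the snippet below (assuming scikit-learn and its bundled Iris dataset) reduces the same points to two dimensions with each: PCA never looks at the class labels, while LDA uses them and can return at most one fewer component than there are classes.

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_iris(return_X_y=True)

# PCA: unsupervised, keeps directions of maximum variance.
X_pca = PCA(n_components=2).fit_transform(X)

# LDA: supervised, keeps directions that best separate the classes
# (at most n_classes - 1 = 2 components for Iris).
X_lda = LinearDiscriminantAnalysis(n_components=2).fit_transform(X, y)

print(X_pca.shape, X_lda.shape)   # (150, 2) (150, 2)
```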
Dimensionality reduction is concerned with the problem of mapping data points that lie on or near a low-dimensional manifold in a high-dimensional data space to a low-dimensional embedding space. Traditional techniques such as principal component analysis (PCA) and multidimensional scaling (MDS) have...
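As a refresher on how one of those traditional techniques operates, classical MDS recovers coordinates purely from pairwise distances: square the distance matrix, double-center it, and take the top eigenvectors of the result. A compact NumPy sketch:

```python
import numpy as np

def classical_mds(D, k=2):
    """Embed points in k dimensions from an n x n Euclidean distance matrix D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                # double-centered Gram matrix
    w, V = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:k]              # largest eigenvalues first
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0))

# Sanity check: distances are preserved when k matches the true dimension.
X = np.random.default_rng(0).normal(size=(30, 2))
D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
Y = classical_mds(D, k=2)
print(np.allclose(D, np.linalg.norm(Y[:, None] - Y[None, :], axis=-1)))
```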
In particular, we exploit the properties of random projections in generic sensor devices and take first steps toward introducing linear dimensionality reduction techniques in the compressed domain. We design a classification framework that consists of embedding the low-dimensional classification space ...
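A minimal stand-in for that pipeline (with synthetic data rather than real sensor measurements) uses a Gaussian random projection as the compressive step and fits a classifier entirely in the compressed domain; the dataset sizes and model below are arbitrary choices for illustration.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.random_projection import GaussianRandomProjection

X, y = make_classification(n_samples=2000, n_features=500,
                           n_informative=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Compressive measurement: project to a much lower dimension with a random matrix.
rp = GaussianRandomProjection(n_components=50, random_state=0)
Z_tr, Z_te = rp.fit_transform(X_tr), rp.transform(X_te)

# Classification is carried out entirely in the compressed domain.
clf = LogisticRegression(max_iter=1000).fit(Z_tr, y_tr)
print("accuracy in compressed domain:", clf.score(Z_te, y_te))
```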
Manifold learning can be considered the family of approaches to non-linear dimensionality reduction. Most approaches in manifold learning rest on the idea that the dimensionality of the data is only artificially high, and such datasets are very difficult to...
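The canonical example of "artificially high" dimensionality is the Swiss roll, a 2-D sheet curled up in 3-D; a linear method cannot flatten it, but a manifold-learning method such as Isomap can. A short sketch assuming scikit-learn:

```python
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap

# 3-D points that actually lie on a rolled-up 2-D sheet.
X, color = make_swiss_roll(n_samples=1500, random_state=0)

# Non-linear reduction back to the 2 intrinsic dimensions.
embedding = Isomap(n_neighbors=10, n_components=2).fit_transform(X)
print(X.shape, "->", embedding.shape)   # (1500, 3) -> (1500, 2)
```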