Document Type
Article
Publication Date
2008
Publication Title
Pattern Recognition
Volume
41
Issue
10
First Page
3138
Keywords
Linear dimensionality reduction, Pattern classification, Discriminant analysis
Last Page
3152
Abstract
Linear dimensionality reduction (LDR) techniques are important in pattern recognition because of their linear time complexity and simplicity. In this paper, we present a novel LDR technique which, though linear, aims to maximize the Chernoff distance in the transformed space, thereby augmenting class separability in that space. We present the corresponding criterion, which is maximized via a gradient-based algorithm, and provide convergence and initialization proofs. We have performed a comprehensive performance analysis of our method combined with two well-known classifiers, linear and quadratic, on synthetic and real-life data, and compared it with other LDR techniques. The results on synthetic and standard real-life data sets show that the proposed criterion outperforms the other techniques when combined with both linear and quadratic classifiers.
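To make the optimized quantity concrete, the sketch below evaluates the two-class Chernoff distance between Gaussian class-conditional densities after a linear transformation y = Ax. This is an illustrative sketch only, not the paper's exact criterion or algorithm; the function names, the NumPy implementation, and the default choice beta = 0.5 are assumptions made for this example.

```python
import numpy as np

def chernoff_distance(m1, S1, m2, S2, beta=0.5):
    """Chernoff distance between Gaussians N(m1, S1) and N(m2, S2).

    Illustrative formula for 0 < beta < 1:
      0.5*beta*(1-beta)*(m1-m2)^T Sb^{-1} (m1-m2)
      + 0.5*[log det(Sb) - beta*log det(S1) - (1-beta)*log det(S2)],
    where Sb = beta*S1 + (1-beta)*S2.
    """
    Sb = beta * S1 + (1.0 - beta) * S2
    diff = m1 - m2
    quad = 0.5 * beta * (1.0 - beta) * diff @ np.linalg.solve(Sb, diff)
    _, logdet_Sb = np.linalg.slogdet(Sb)
    _, logdet_S1 = np.linalg.slogdet(S1)
    _, logdet_S2 = np.linalg.slogdet(S2)
    return quad + 0.5 * (logdet_Sb - beta * logdet_S1 - (1.0 - beta) * logdet_S2)

def chernoff_in_transformed_space(A, m1, S1, m2, S2, beta=0.5):
    """Chernoff distance after the linear map y = A x (A is d x n, d < n):
    means become A m_i and covariances become A S_i A^T."""
    return chernoff_distance(A @ m1, A @ S1 @ A.T,
                             A @ m2, A @ S2 @ A.T, beta)
```

Maximizing this quantity over the projection matrix A is the kind of objective the paper addresses; the paper does so with an analytically derived gradient-based algorithm, whereas a sketch like the one above would rely on numerical optimization.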
DOI
10.1016/j.patcog.2008.01.016
Recommended Citation
Rueda, Luis and Herrera, Myriam. (2008). Linear Dimensionality Reduction by Maximizing the Chernoff Distance in the Transformed Space. Pattern Recognition, 41 (10), 3138-3152.
https://scholar.uwindsor.ca/computersciencepub/9
Comments
NOTICE: this is the author’s version of a work that was accepted for publication in Pattern Recognition. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms, may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Pattern Recognition, 41 (10), 2008 and is available here.