A support vector machine formulation to PCA analysis and its kernel version

Bibliographic Details
Published in: IEEE Transactions on Neural Networks, Vol. 14, No. 2, pp. 447-450
Main Authors: Suykens, J.A.K., Van Gestel, T., Vandewalle, J., De Moor, B.
Format: Journal Article
Language: English
Published: United States: IEEE, 01-03-2003
Summary: In this paper, we present a simple primal-dual support vector machine formulation of the problem of principal component analysis (PCA) in dual variables. By considering a mapping to a high-dimensional feature space and applying the kernel trick (Mercer theorem), kernel PCA is obtained as introduced by Schölkopf et al. (2002). While least squares support vector machine classifiers have a natural link with kernel Fisher discriminant analysis (minimizing the within-class scatter around targets +1 and -1), PCA can be interpreted as a one-class modeling problem with a zero target value around which one maximizes the variance. The score variables are interpreted as error variables within the problem formulation. In this way, primal-dual constrained-optimization interpretations of linear and kernel PCA are obtained in the same style as for least squares support vector machine classifiers.
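
As a rough sketch of the formulation the summary describes (reconstructed from the abstract alone; the paper's exact notation may differ), the primal problem maximizes the variance of the score variables e_k around a zero target while penalizing the norm of w:

\max_{w,e}\;\; \gamma\,\frac{1}{2}\sum_{k=1}^{N} e_k^{2} \;-\; \frac{1}{2}\, w^{\top} w
\quad \text{subject to} \quad e_k = w^{\top}\!\left(\varphi(x_k) - \hat{\mu}_{\varphi}\right), \quad k = 1,\dots,N

Eliminating w and the e_k through the Lagrangian optimality conditions leaves an eigenvalue problem \Omega_c \alpha = \lambda \alpha on the centered kernel matrix \Omega_{c,kl} = \left(\varphi(x_k)-\hat{\mu}_{\varphi}\right)^{\top}\left(\varphi(x_l)-\hat{\mu}_{\varphi}\right), which is kernel PCA. A minimal numerical sketch of that dual step, assuming an RBF kernel (the function name, kernel choice, and parameters below are illustrative, not taken from the paper):

import numpy as np

def kernel_pca_scores(X, n_components=2, gamma=1.0):
    # Gram matrix for an RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)
    # (illustrative kernel choice; any Mercer kernel works here)
    sq = np.sum(X ** 2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))
    # Center the kernel matrix, i.e., remove the feature-space mean
    N = K.shape[0]
    one_n = np.ones((N, N)) / N
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    # Dual eigenvalue problem: Kc @ alpha = lam * alpha
    lams, alphas = np.linalg.eigh(Kc)
    order = np.argsort(lams)[::-1][:n_components]
    lams, alphas = lams[order], alphas[:, order]
    # Score variables (projections) of the training points
    return alphas * np.sqrt(np.clip(lams, 0.0, None))

The leading eigenvectors alpha play the role of the dual variables, and the returned columns are the score variables that the summary interprets as error variables.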
ISSN: 1045-9227 (print)
EISSN: 1941-0093 (online)
DOI: 10.1109/TNN.2003.809414