Variable selection with neural networks

Bibliographic Details
Published in: Neurocomputing (Amsterdam), Vol. 12, No. 2-3, pp. 223-248
Main Authors: Cibas, Tautvydas, Soulié, Françoise Fogelman, Gallinari, Patrick, Raudys, Sarunas
Format: Journal Article
Language: English
Published: Elsevier B.V., 31-07-1996
Description
Summary: In this paper, we present three different neural network-based methods for variable selection. OCD (Optimal Cell Damage) is a pruning method that evaluates the usefulness of each input variable and prunes the least useful ones; it is related to the Optimal Brain Damage method of Le Cun et al. Regularization theory proposes to constrain estimators by adding a term to the cost function used to train a neural network. In the Bayesian framework, this additional term can be interpreted as the log prior of the weight distribution. We use two priors (a Gaussian and a Gaussian mixture) and show that this regularization approach allows efficient subsets of variables to be selected. Our methods are compared to conventional statistical selection procedures and are shown to improve significantly on them.
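
To make the Bayesian reading of the penalty term concrete, here is a minimal sketch of the regularized cost in standard notation (the symbols J, E_data, lambda, pi, sigma_0 and sigma_1 are ours for illustration and are not necessarily the paper's):

  J(\mathbf{w}) = E_{\text{data}}(\mathbf{w}) - \log p(\mathbf{w}) + \text{const}

  \text{Gaussian prior: } p(\mathbf{w}) \propto \exp\!\Big(-\tfrac{\lambda}{2}\sum_i w_i^2\Big)
  \;\Rightarrow\; J(\mathbf{w}) = E_{\text{data}}(\mathbf{w}) + \tfrac{\lambda}{2}\sum_i w_i^2 + \text{const}

  \text{Gaussian-mixture prior: } p(w_i) = \pi\,\mathcal{N}(w_i;\,0,\sigma_0^2) + (1-\pi)\,\mathcal{N}(w_i;\,0,\sigma_1^2)

Under this reading, a prior with a narrow component around zero drives many weights toward zero during training; when all weights fanning out of an input are driven to (near) zero, that input variable becomes a candidate for removal.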
ISSN: 0925-2312, 1872-8286
DOI: 10.1016/0925-2312(95)00121-2