Regularization Algorithms for Learning That Are Equivalent to Multilayer Networks

Bibliographic Details
Published in: Science (American Association for the Advancement of Science), Vol. 247, no. 4945, pp. 978-982
Main Authors: Poggio, T., Girosi, F.
Format: Journal Article
Language:English
Published: Washington, DC: American Association for the Advancement of Science, 23-02-1990
Description
Summary: Learning an input-output mapping from a set of examples, of the type that many neural networks have been constructed to perform, can be regarded as synthesizing an approximation of a multidimensional function (that is, solving the problem of hypersurface reconstruction). From this point of view, this form of learning is closely related to classical approximation techniques, such as generalized splines and regularization theory. A theory is reported that shows the equivalence between regularization and a class of three-layer networks called regularization networks or hyper basis functions. These networks are not only equivalent to generalized splines but are also closely related to the classical radial basis functions used for interpolation tasks and to several pattern recognition and neural network algorithms. They also have an interesting interpretation in terms of prototypes that are synthesized and optimally combined during the learning stage.
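To make the idea of a regularization network concrete, the sketch below fits the simplest form described in this line of work: one Gaussian radial basis function centered on each training example, with expansion coefficients obtained by solving the regularized linear system (G + lambda*I)c = y. This is a minimal illustration, not the paper's full hyper basis function scheme (which also adapts the centers and a weighted norm); the function names and the values of lambda and sigma are illustrative assumptions.

import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Gaussian radial basis function between two sets of points.
    d2 = (np.sum(X1**2, axis=1)[:, None]
          + np.sum(X2**2, axis=1)[None, :]
          - 2.0 * X1 @ X2.T)
    return np.exp(-d2 / (2.0 * sigma**2))

def fit_regularization_network(X, y, lam=1e-2, sigma=1.0):
    # Solve (G + lam*I) c = y for the expansion coefficients c,
    # where G_ij = G(x_i, x_j) and lam controls the smoothness penalty.
    G = rbf_kernel(X, X, sigma)
    return np.linalg.solve(G + lam * np.eye(len(X)), y)

def predict(X_train, c, X_new, sigma=1.0):
    # Evaluate f(x) = sum_i c_i G(x, x_i) at new points.
    return rbf_kernel(X_new, X_train, sigma) @ c

# Example: reconstruct a noisy 1-D function from sparse samples.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=40)
c = fit_regularization_network(X, y, lam=1e-2, sigma=0.5)
X_test = np.linspace(-3, 3, 200)[:, None]
y_hat = predict(X, c, X_test, sigma=0.5)

Larger lambda values give smoother reconstructions at the cost of fidelity to the examples; in the limit lambda -> 0 the network interpolates the data exactly, recovering classical radial basis function interpolation.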
ISSN: 0036-8075 (print), 1095-9203 (electronic)
DOI: 10.1126/science.247.4945.978