Eigenvalues of covariance matrices: Application to neural-network learning
Published in: Physical Review Letters, Vol. 66, No. 18, pp. 2396-2399
Main Authors:
Format: Journal Article
Language: English
Published: Ridge, NY: American Physical Society, 6 May 1991
Subjects:
Summary: The learning time of a simple neural-network model is obtained through an analytic computation of the eigenvalue spectrum for the Hessian matrix, which describes the second-order properties of the objective function in the space of coupling coefficients. The results are generic for symmetric matrices obtained by summing outer products of random vectors. The form of the eigenvalue distribution suggests new techniques for accelerating the learning process, and provides a theoretical justification for the choice of centered versus biased state variables. (Author)
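The effect described in the summary can be reproduced numerically. Below is a minimal sketch, not the paper's analytic calculation: it assumes a single linear unit trained with squared error, so that the Hessian reduces to a sum of outer products of the input patterns, and compares the eigenvalue spread for centered (±1) versus biased (0/1) state variables. The dimension N, pattern count P, and the two input distributions are illustrative assumptions.

```python
# Minimal numerical sketch (illustrative assumptions, not the paper's derivation):
# build H = (1/P) * sum_p x_p x_p^T, a symmetric matrix formed by summing outer
# products of random input vectors, and compare its eigenvalue spread for
# centered (+/-1) versus biased (0/1) inputs.
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 400  # input dimension and number of training patterns (illustrative)

def hessian_spectrum(X):
    """Eigenvalues of H = X^T X / P for a pattern matrix X of shape (P, N)."""
    H = X.T @ X / X.shape[0]
    return np.linalg.eigvalsh(H)

# Centered state variables: inputs drawn from {-1, +1} (zero mean).
X_centered = rng.choice([-1.0, 1.0], size=(P, N))
# Biased state variables: inputs drawn from {0, 1} (mean 1/2).
X_biased = rng.choice([0.0, 1.0], size=(P, N))

for name, X in [("centered", X_centered), ("biased", X_biased)]:
    ev = hessian_spectrum(X)
    # On a quadratic error surface the largest eigenvalue limits the usable
    # learning rate, while the smallest sets the decay rate of the slowest mode;
    # their ratio is a rough proxy for gradient-descent learning time.
    print(f"{name:8s}  lambda_max = {ev.max():7.3f}  "
          f"lambda_min = {ev.min():7.3f}  ratio = {ev.max() / ev.min():8.1f}")
```

With biased inputs the nonzero mean produces one large outlying eigenvalue, of order N times the squared mean, so the ratio of largest to smallest eigenvalue, and hence the number of gradient-descent steps needed for the slowest mode, is far larger than in the centered case.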
ISSN: 0031-9007, 1079-7114
DOI: 10.1103/PhysRevLett.66.2396