A nonmonotone supermemory gradient algorithm for unconstrained optimization

Bibliographic Details
Published in: Journal of Applied Mathematics & Computing, Vol. 46, No. 1-2, pp. 215-235
Main Authors: Ou, Yigui; Liu, Yuanwen
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg (Springer Nature B.V.), 01-10-2014
Description
Summary: This paper presents a nonmonotone supermemory gradient algorithm for unconstrained optimization problems. At each iteration, the proposed method makes full use of information from previous multi-step iterations and avoids storing and computing the matrices associated with the Hessian of the objective function; it is therefore well suited to large-scale optimization problems and converges stably. Under suitable assumptions, the convergence properties of the proposed algorithm are analyzed. Numerical results are reported to show the efficiency of the proposed method.
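
To make the general idea concrete, the sketch below shows a generic supermemory gradient iteration: the search direction combines the current negative gradient with a small memory of previous search directions, and a nonmonotone (max-of-recent-values) Armijo-type line search chooses the step size, so no Hessian information is stored or computed. This is a minimal illustrative sketch under these assumptions; the memory weights, line-search parameters, and function names are chosen for the example and do not reproduce the specific formulas of the paper.

```python
import numpy as np

def supermemory_gradient(f, grad, x0, m=3, M=5, max_iter=500, tol=1e-6,
                         delta=1e-4, sigma=0.5):
    """Generic supermemory gradient method with a nonmonotone
    (max-of-recent-values) Armijo-type line search. Illustrative sketch,
    not the specific update rules of the paper."""
    x = np.asarray(x0, dtype=float)
    dirs = []                      # memory of the last m search directions
    f_hist = [f(x)]                # recent values for the nonmonotone test
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:
            break
        # Search direction: negative gradient plus a weighted combination
        # of the previous m directions (the weights here are illustrative).
        d = -g
        for d_prev in dirs:
            beta = 0.5 * np.linalg.norm(g) / (np.linalg.norm(d_prev) + 1e-12)
            d = d + beta * d_prev
        # Fall back to steepest descent if d fails to be a descent direction.
        if g @ d >= 0.0:
            d = -g
        # Nonmonotone backtracking: accept a step once it improves on the
        # maximum of the last M function values rather than on f(x) alone.
        f_ref = max(f_hist[-M:])
        alpha = 1.0
        while f(x + alpha * d) > f_ref + delta * alpha * (g @ d):
            alpha *= sigma
        x = x + alpha * d
        f_hist.append(f(x))
        dirs.append(alpha * d)
        if len(dirs) > m:
            dirs.pop(0)            # keep only the m most recent directions
    return x

# Example usage: minimize the Rosenbrock function from a standard start point.
f = lambda x: 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2
g = lambda x: np.array([-400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
                        200.0 * (x[1] - x[0]**2)])
x_star = supermemory_gradient(f, g, np.array([-1.2, 1.0]))
```

Only the gradient and a handful of previous direction vectors are kept in memory, which is why methods of this type scale to large problems where forming or factoring Hessian-related matrices would be impractical.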
ISSN: 1598-5865, 1865-2085
DOI: 10.1007/s12190-013-0747-0