A fast non-monotone line search for stochastic gradient descent


Bibliographic Details
Published in: Optimization and Engineering, Vol. 25, No. 2, pp. 1105-1124
Main Authors: Fathi Hafshejani, Sajad, Gaur, Daya, Hossain, Shahadat, Benkoczi, Robert
Format: Journal Article
Language:English
Published: New York: Springer US, 01-06-2024 (Springer Nature B.V.)
Description
Summary: We give an improved non-monotone line search algorithm for stochastic gradient descent (SGD) for functions that satisfy interpolation conditions. We establish theoretical convergence guarantees for the algorithm for non-convex functions. We conduct a detailed empirical evaluation to validate the theoretical results.
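To illustrate the kind of method the abstract describes, here is a minimal sketch of SGD with a non-monotone (Grippo-style) Armijo backtracking line search: the sufficient-decrease test compares against the maximum of the last few stochastic losses rather than the current one. This is not the authors' algorithm; the function names, the memory window, and the toy consistent linear system (which satisfies the interpolation condition, since every component loss shares a common minimizer) are all assumptions for illustration.

```python
import numpy as np

def nonmonotone_armijo_sgd(f, grad_f, x0, n_samples, step0=1.0, beta=0.5,
                           c=1e-4, memory=10, n_iters=500, seed=0):
    """Illustrative sketch (not the paper's algorithm): SGD where each step
    size is found by backtracking until a relaxed Armijo condition holds,
    measured against the max of the last `memory` stochastic losses."""
    rng = np.random.default_rng(seed)
    x = x0.copy()
    recent = []  # sliding window of recent stochastic function values
    for _ in range(n_iters):
        i = rng.integers(n_samples)   # draw one component at random
        g = grad_f(x, i)
        recent.append(f(x, i))
        if len(recent) > memory:
            recent.pop(0)
        ref = max(recent)             # non-monotone reference value
        t = step0
        # backtrack until the relaxed (non-monotone) Armijo test passes
        while f(x - t * g, i) > ref - c * t * (g @ g) and t > 1e-10:
            t *= beta
        x = x - t * g
    return x

# Toy interpolation setting: a consistent linear system A x = b, so each
# component loss f_i(x) = 0.5 * (a_i @ x - b_i)**2 is zero at x_star.
rng = np.random.default_rng(1)
A = rng.normal(size=(20, 5))
x_star = rng.normal(size=5)
b = A @ x_star
f = lambda x, i: 0.5 * (A[i] @ x - b[i]) ** 2
grad_f = lambda x, i: (A[i] @ x - b[i]) * A[i]

x_hat = nonmonotone_armijo_sgd(f, grad_f, np.zeros(5), n_samples=20)
print(np.linalg.norm(x_hat - x_star))  # distance to the true minimizer
```

Allowing the loss to rise temporarily (relative to the window maximum) typically lets the search accept larger steps than a strictly monotone Armijo rule would, which is the practical appeal of non-monotone line searches in the interpolation regime.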
ISSN: 1389-4420 (print); 1573-2924 (electronic)
DOI: 10.1007/s11081-023-09836-6