A note on the optimal convergence rate of descent methods with fixed step sizes for smooth strongly convex functions

by A. Uschmajew and B. Vandereycken

Abstract:

Based on a recent result by de Klerk, Glineur, and Taylor (SIAM J. Optim., 30(3):2053--2082, 2020) on the attainable convergence rate of gradient descent for smooth and strongly convex functions in terms of function values, a convergence analysis for general descent methods with fixed step sizes is presented. It covers variable metric methods as well as gradient-related search directions under angle and scaling conditions. An application to inexact gradient methods is also presented.
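
As a purely illustrative aside, not taken from the paper, the following Python sketch runs gradient descent with a fixed step size on a strongly convex quadratic and checks the decrease of the function values against the classical per-step contraction factor max(|1 - gamma*mu|, |1 - gamma*L|)^2; the dimension, the constants mu and L, and the step size gamma = 2/(mu + L) are arbitrary choices made for the example.

# Gradient descent with a fixed step size on the quadratic f(x) = 0.5 x^T A x,
# whose eigenvalues lie in [mu, L], so f is L-smooth and mu-strongly convex.
import numpy as np

mu, L = 1.0, 10.0                            # illustrative constants
A = np.diag(np.linspace(mu, L, 20))          # illustrative problem data

def f(x):
    return 0.5 * x @ (A @ x)                 # minimum value 0 at x = 0

gamma = 2.0 / (mu + L)                       # fixed step size
rate = max(abs(1 - gamma * mu), abs(1 - gamma * L)) ** 2

x = np.ones(20)
f0 = f(x)
for k in range(1, 31):
    x = x - gamma * (A @ x)                  # gradient step: grad f(x) = A x
    assert f(x) <= rate**k * f0 + 1e-12      # function values contract at least by 'rate'
print("per-step contraction factor:", rate)  # equals ((L - mu)/(L + mu))^2 for this step size

Note that this sketch only covers the plain gradient step on a quadratic; the paper's analysis concerns general descent directions (variable metric, gradient-related, and inexact gradient methods), which the example does not attempt to reproduce.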

Reference:

A. Uschmajew, B. Vandereycken, "A note on the optimal convergence rate of descent methods with fixed step sizes for smooth strongly convex functions", 2021.

Bibtex Entry:

@misc{Uschmajew_V:2021,
  title = {A note on the optimal convergence rate of descent methods with fixed step sizes for smooth strongly convex functions},
  author = {Uschmajew, A. and Vandereycken, B.},
  year = {2021},
  abstract = {Based on a recent result by de Klerk, Glineur, and Taylor (SIAM J. Optim., 30(3):2053--2082, 2020) on the attainable convergence rate of gradient descent for smooth and strongly convex functions in terms of function values, a convergence analysis for general descent methods with fixed step sizes is presented. It covers variable metric methods as well as gradient-related search directions under angle and scaling conditions. An application to inexact gradient methods is also presented.},
  howpublished = {Tech. report (submitted)},
  pdf = {http://www.unige.ch/math/vandereycken/papers/preprint_Uschmajew_V_2021.pdf}
}