Volume 15 (2019) Article 4 pp. 1-32 [Research Survey]
Potential-Function Proofs for Gradient Methods
Received: March 3, 2019
Revised: August 5, 2019
Published: September 12, 2019
Keywords: convex optimization, potential function
ACM Classification: G.1.6
AMS Classification: 68Q25

Abstract:

This note discusses proofs of convergence for gradient methods (also called “first-order methods”) based on simple potential-function arguments. We cover methods like gradient descent (in both smooth and non-smooth settings), mirror descent, and some accelerated variants. We hope the structure and presentation of these amortized-analysis proofs will be useful as a guiding principle for learning and working with these methods.
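For concreteness, the following is a minimal sketch of the style of potential-function argument the survey describes, written here for gradient descent on a convex, L-smooth function f with step size 1/L and minimizer x^*; this particular potential is a standard textbook choice used for illustration and is not quoted from the article.

% Illustrative potential-function argument (assumes f convex and L-smooth,
% gradient step x_{t+1} = x_t - (1/L)\nabla f(x_t), minimizer x^*).
\[
  \Phi_t \;=\; t\,\bigl(f(x_t) - f(x^*)\bigr) \;+\; \frac{L}{2}\,\|x_t - x^*\|^2 .
\]
% Combining the descent lemma with convexity shows the potential never increases:
\[
  \Phi_{t+1} - \Phi_t \;\le\; -\,\frac{t}{2L}\,\|\nabla f(x_t)\|^2 \;\le\; 0 .
\]
% Hence T\,(f(x_T) - f(x^*)) \le \Phi_T \le \Phi_0, which gives the usual rate:
\[
  f(x_T) - f(x^*) \;\le\; \frac{L\,\|x_0 - x^*\|^2}{2\,T}.
\]

The appeal of this amortized-analysis viewpoint is that the entire convergence rate follows from a single per-step inequality on the potential, a pattern the survey reuses for mirror descent and accelerated methods.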