Electrical Engineering Systems Seminar

Thursday February 25, 2021 12:00 PM

Accelerated gradient methods on Riemannian manifolds

Speaker: Suvrit Sra, EECS, LIDS, IDSS, Massachusetts Institute of Technology
Location: Online Event

This talk lies at the interface of geometry and optimization. I'll discuss geodesically convex optimization problems, a rich class of non-convex optimization problems that nevertheless admit tractable global optimization. I'll provide some background on this class along with motivating examples. Beyond a general introduction to the topic, I will dive deeper into a recent discovery of a long-sought result: an accelerated gradient method on Riemannian manifolds. Towards developing this method, we will revisit Nesterov's (Euclidean) estimate sequence technique and present a conceptually simpler alternative. We will then generalize this alternative to the Riemannian setting and, combined with a new geometric inequality, obtain the first (global) accelerated Riemannian gradient method. I'll also comment on some very recent updates on this topic.
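For context (background added here, not part of the original abstract): geodesic convexity generalizes ordinary convexity by replacing line segments with geodesics. A function f : M -> R on a Riemannian manifold M is geodesically convex if, for every geodesic \gamma : [0,1] \to M,

    f(\gamma(t)) \le (1-t)\, f(\gamma(0)) + t\, f(\gamma(1)) \qquad \text{for all } t \in [0,1].

When M is Euclidean space and geodesics are straight lines, this reduces to the usual definition of convexity, which is why such problems, though non-convex in the Euclidean sense, remain globally tractable.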

Series: Electrical Engineering Systems Seminar Series

Contact: Caroline Murphy, caroline@caltech.edu