CSE DSI Machine Learning Seminar with Ahmet Alacaoglu (UW-Madison)

Stochastic Variance Reduction Beyond Minimization Problems

Variance reduction methods have been studied extensively over the last decade in the context of first-order algorithms for minimization problems, with significantly less focus on more general problems such as min-max optimization. An even more general problem, the monotone inclusion, provides a broad framework that captures classical problems such as matrix games and linearly constrained optimization, as well as more contemporary ones such as adversarially robust training and multi-agent reinforcement learning.
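For concreteness, a minimal sketch of the finite-sum monotone inclusion template (the particular splitting below is an illustrative assumption, not necessarily the exact formulation used in the talk): find a point $x^\star$ with

$$0 \in \frac{1}{n}\sum_{i=1}^{n} F_i(x^\star) + G(x^\star),$$

where each $F_i$ is Lipschitz, the averaged operator is monotone, and $G$ is maximally monotone. Taking $G$ to be the normal cone of a constraint set recovers constrained variational inequalities, and taking the operator to be $(\nabla_x f, -\nabla_y f)$ for a convex-concave $f$ recovers min-max optimization.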

In this talk, I will present variance-reduced algorithms for solving monotone inclusions, monotone variational inequalities, and convex-concave min-max optimization problems. These algorithms obtain optimal complexity guarantees with respect to the target accuracy, the Lipschitz constant, and the number of operators involved in the problem. The improvement over deterministic first-order algorithms can be as large as $\sqrt{n}$, where $n$ is the number of operators, mirroring similar improvements for minimization problems. I will consider both the duality gap and the operator norm (the residual, or the gradient mapping in constrained setups) as optimality measures, with the latter providing a computable progress metric. I will also provide preliminary numerical evidence on the practical merit of these algorithms.
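As a rough, hedged accounting of where the $\sqrt{n}$ factor can come from (the simplified rates below are an assumption for illustration; the precise rates and conditions are those stated in the talk): a deterministic method evaluates all $n$ operators at every iteration, for a total cost on the order of

$$O\!\left(\frac{nL}{\epsilon}\right) \quad \text{versus} \quad O\!\left(n + \frac{\sqrt{n}\,L}{\epsilon}\right)$$

operator evaluations for a variance-reduced method, where $L$ is the Lipschitz constant and $\epsilon$ the target accuracy. In the regime where the second term dominates, the ratio between the two is on the order of $\sqrt{n}$.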

Ahmet Alacaoglu is a postdoctoral research associate at the University of Wisconsin-Madison. He completed his PhD in Computer and Communication Sciences at EPFL in Switzerland. His research interests include randomized first-order algorithms for min-max problems, adaptive gradient algorithms, and their applications in machine learning and reinforcement learning.

Start date
Tuesday, Oct. 10, 2023, 11 a.m.
End date
Tuesday, Oct. 10, 2023, noon
Location
Keller 3-180 or via Zoom.
