ISyE Seminar Series: Necdet Serhat Aybat

"Primal-Dual Methods with Stepsize Search for Nonconvex Minimax Problems"

Necdet Serhat Aybat

Professor in the Department of Industrial and Manufacturing Engineering
Pennsylvania State University

About the Seminar:

In this talk, we discuss Gradient Descent Ascent (GDA) methods with adaptive stepsizes for solving nonconvex-strongly concave (NCSC) minimax problems. In the first part of the talk, assuming we only have access to noisy gradients through an unbiased stochastic oracle with finite variance, we propose a stochastic GDA method with backtracking (SGDA-B). SGDA-B is agnostic to the Lipschitz constant L, the concavity modulus μ>0, and the variance bound of the unbiased stochastic gradient estimator. Within O(1/ε⁴ log(1/p)) stochastic gradient calls, SGDA-B can compute an ε-stationary point, in terms of the gradient-map norm evaluated at the random output point, with probability at least 1-p.

In the second part of the talk, we focus on the deterministic setting and propose an alternating gradient descent ascent method, AGDA+, that can adaptively choose nonmonotone primal-dual stepsizes to compute an approximate stationary point without requiring knowledge of the global Lipschitz constant L or the concavity modulus μ. Using a nonmonotone stepsize search (backtracking) scheme, AGDA+ stands out for its ability to exploit local Lipschitz structure, eliminating the need for precise tuning of hyper-parameters. AGDA+ achieves the optimal iteration complexity of O(1/ε²), requires only 3 gradient calls per backtracking iteration, and is the first stepsize-search method for this class of NCSC minimax problems. Numerical experiments demonstrate its robustness and efficiency. Read the related work, “AGDA+: Proximal Alternating Gradient Descent Ascent Method with a Nonmonotone Adaptive Step-Size Search for Nonconvex Minimax Problems”.
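To illustrate the general idea behind the methods discussed above, here is a minimal sketch of alternating gradient descent ascent with a backtracking stepsize search on a toy NCSC problem. This is an illustrative assumption-laden sketch, not the AGDA+ algorithm from the paper: the objective, the Armijo-style acceptance condition, and all parameter values are chosen only for demonstration.

```python
import math

# Toy objective: nonconvex in x (via sin), strongly concave in y (modulus 2).
# Illustrative only -- NOT the paper's AGDA+ method or stepsize conditions.
def f(x, y):
    return math.sin(x) + x * y - y ** 2

def grad_x(x, y):
    return math.cos(x) + y

def grad_y(x, y):
    return x - 2 * y

def backtrack_step(phi, g, tau0=1.0, beta=0.5, c=1e-4, max_tries=30):
    """Shrink tau until an Armijo-style sufficient-decrease condition holds,
    where phi(tau) is the objective value after a step of size tau."""
    tau = tau0
    base = phi(0.0)
    for _ in range(max_tries):
        if phi(tau) <= base - c * tau * g * g:
            return tau
        tau *= beta
    return tau

x, y = 1.0, 0.0
for _ in range(200):
    # Ascent step in y: maximize f(x, .), i.e., descend on -f(x, .)
    gy = grad_y(x, y)
    ty = backtrack_step(lambda t: -f(x, y + t * gy), gy)
    y = y + ty * gy
    # Descent step in x on f(., y)
    gx = grad_x(x, y)
    tx = backtrack_step(lambda t: f(x - t * gx, y), gx)
    x = x - tx * gx

print(abs(grad_x(x, y)), abs(grad_y(x, y)))  # both near zero at a stationary point
```

The backtracking loop is what makes the scheme agnostic to the Lipschitz constant: each candidate stepsize is tested against a sufficient-decrease condition and halved until it passes, so no global smoothness parameter needs to be supplied in advance.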

About the Speaker:

Necdet Serhat Aybat is a Professor in the Department of Industrial and Manufacturing Engineering at the Pennsylvania State University. He received his Ph.D. degree in Operations Research from Columbia University, New York, USA in 2011, and his M.S. and B.S. degrees in industrial engineering from Bogazici University, Istanbul, Turkey, in 2005 and 2003, respectively. Aybat’s research mainly focuses on first-order methods for large-scale, constrained optimization problems arising in machine learning, and on distributed optimization. His research has been supported by NSF, ARO, and ONR research grants. He is currently an associate editor for Mathematics of Operations Research, Journal of Scientific Computing, and Journal of Global Optimization.


If you wish to be added to the ISyE Graduate Seminar Series emailing list, please email Event Coordinator Emily Rice at [email protected]

Start date
Wednesday, Dec. 3, 2025, 9 a.m.
End date
Wednesday, Dec. 3, 2025, 10 a.m.
Location

Lind Hall 325