ML Seminar: Beyond Adam: What Optimization Can Help Large Foundation Models

The UMN Machine Learning Seminar Series brings together faculty, students, and local industry partners interested in the theoretical, computational, and applied aspects of machine learning to pose problems, exchange ideas, and foster collaborations. Talks are held every Tuesday from 11 a.m. to 12 p.m. during the Spring 2024 semester.

This week's speaker, Tianbao Yang (Texas A&M University), will be giving a talk titled "Beyond Adam: What Optimization Can Help Large Foundation Models".

Abstract

Large foundation models have revolutionized AI, and optimization plays a significant role in their development. Many optimization algorithms, such as Adam, have been proposed for solving the traditional empirical risk minimization problem. However, these traditional algorithms face new challenges when training large foundation models, e.g., slow convergence and heavy GPU memory demands. In this talk, I will present our recent work on optimization innovations for large foundation models.
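
For context on the baseline the talk title refers to, below is a minimal NumPy sketch of the standard Adam update (Kingma & Ba, 2015). It is textbook Adam, not the speaker's proposed method; the function name and default hyperparameters are illustrative.

```python
import numpy as np

def adam_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update on a parameter array; t is the 1-indexed step count."""
    m = beta1 * m + (1 - beta1) * grad       # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad**2    # second-moment (uncentered variance) estimate
    m_hat = m / (1 - beta1**t)               # bias correction for early steps
    v_hat = v / (1 - beta2**t)
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v
```

Note that the optimizer keeps two extra arrays (m and v) the same size as the parameters, one reason memory footprint becomes a concern at foundation-model scale.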

Biography

Tianbao Yang is an Associate Professor and Herbert H. Richardson Faculty Fellow in the CSE Department at Texas A&M University, where he directs the Optimization for Machine Learning and AI (OptMAI) Lab. His research interests center on optimization, big data, machine learning, and responsible AI. Before joining TAMU, he was an assistant professor and then a tenured Dean's Excellence Associate Professor in the Computer Science Department at the University of Iowa from 2014 to 2022. Before that, he worked in Silicon Valley for two years as a machine learning researcher at GE Research and NEC Labs. He received the Best Student Paper Award at COLT 2012 and the NSF CAREER Award in 2019. He is the founder of the widely adopted LibAUC library.

Start date
Tuesday, April 23, 2024, 11 a.m.
End date
Tuesday, April 23, 2024, 12 p.m.
Location
Keller Hall 3-180 and via Zoom