ISyE Seminar Series: Minshuo Chen
"Diffusion Transformer Captures Spatial-Temporal Dependencies: A Theory for Gaussian Process Data"
Minshuo Chen
Assistant Professor, Department of Industrial Engineering and Management Sciences
Northwestern University
About the Seminar:
Diffusion models have emerged as a powerful generative AI technology for modeling high-dimensional complex data in various applications. The Diffusion Transformer, the backbone of Sora for video generation, further scales the capacity of diffusion models, pioneering new avenues for high-fidelity sequential data modeling. Unlike static data such as images, sequential data consists of consecutive data frames indexed by time, exhibiting rich spatial and temporal dependencies. These dependencies represent the underlying dynamic model and are critical for validating the generated data. In this talk, we aim to develop theoretical underpinnings of diffusion transformers for capturing spatial-temporal dependencies. Specifically, we establish score approximation and distribution estimation guarantees of diffusion transformers for learning Gaussian process data with covariance functions of various decay patterns. We highlight how the spatial-temporal dependencies are captured and how they affect learning efficiency. Our study proposes a novel transformer approximation theory, in which the transformer acts to unroll a gradient descent algorithm. We support our theoretical results with numerical experiments, providing strong evidence that spatial-temporal dependencies are captured within attention layers, aligning with our approximation theory.
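To illustrate the kind of data the talk studies, below is a minimal sketch of sampling sequential frames from a Gaussian process whose covariance function decays with temporal distance. The squared-exponential kernel and all parameter values here are hypothetical examples, not the specific covariance families analyzed in the talk.

```python
import numpy as np

def squared_exp_cov(ts, length_scale=0.5):
    """Covariance with squared-exponential decay in |t - s|.

    A hypothetical example kernel; the talk considers covariance
    functions with various decay patterns.
    """
    diff = ts[:, None] - ts[None, :]
    return np.exp(-(diff ** 2) / (2 * length_scale ** 2))

def sample_gp(ts, cov_fn, n_samples=4, seed=0):
    """Draw n_samples sequences from a zero-mean GP on time grid ts."""
    rng = np.random.default_rng(seed)
    K = cov_fn(ts)
    # Small jitter keeps the Cholesky factorization numerically stable.
    L = np.linalg.cholesky(K + 1e-8 * np.eye(len(ts)))
    z = rng.standard_normal((n_samples, len(ts)))
    return z @ L.T  # each row is one sequence of correlated frames

ts = np.linspace(0.0, 1.0, 32)
frames = sample_gp(ts, squared_exp_cov)
```

Nearby frames are strongly correlated while distant ones are nearly independent; these are exactly the spatial-temporal dependencies a diffusion transformer must capture from such data.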
About the Speaker:
Minshuo Chen is an assistant professor in the Department of Industrial Engineering and Management Sciences at Northwestern University. Prior to joining Northwestern, he was an associate research scholar in the Department of Electrical and Computer Engineering at Princeton University. He completed his Ph.D. in Machine Learning at the School of Industrial and Systems Engineering at Georgia Tech. His research focuses on developing principled methodologies and theoretical foundations of deep learning, with a particular interest in 1) generative models, including diffusion models, 2) foundations of machine learning, such as optimization and sample efficiency, and 3) reinforcement learning.
If you wish to be added to the ISyE Graduate Seminar Series emailing list, please email Event Coordinator Emily Rice at [email protected].