Generalization theory for diffusion models

Data Science Seminar

Frank Cole (University of Minnesota)

Abstract

Generative modeling is a powerful unsupervised learning framework in which the goal is to sample from an unknown probability distribution. Diffusion models, which use stochastic differential equations to sample the target distribution, have achieved state-of-the-art success in the generation of audio and image data. Despite their empirical triumphs, the mathematical theory of diffusion models is still limited, and a salient open question is whether the sample complexity of diffusion models suffers from a curse of dimensionality. In this talk, we introduce a measure of complexity for probability distributions and show that diffusion models can break the curse of dimensionality in learning distributions with low complexity. To illustrate the theory, we provide examples of distributions with low complexity, including certain mixtures of Gaussians.
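For orientation, the following is a standard formulation of the score-based diffusion framework the abstract alludes to; the speaker's precise setup may differ. An Ornstein-Uhlenbeck forward process gradually noises the data, and its time reversal, driven by the score function $\nabla \log p_t$, transports noise back toward the data distribution:

\begin{aligned}
\text{forward SDE:} \quad & dX_t = -X_t\,dt + \sqrt{2}\,dW_t, \qquad X_0 \sim p_{\mathrm{data}},\\
\text{reverse SDE:} \quad & dY_t = \bigl(Y_t + 2\,\nabla \log p_{T-t}(Y_t)\bigr)\,dt + \sqrt{2}\,d\bar{W}_t, \qquad Y_0 \sim p_T,
\end{aligned}

where $p_t$ denotes the law of $X_t$. In practice the score $\nabla \log p_t$ is unknown and is estimated from training samples, typically by a neural network; the sample-complexity question raised in the abstract concerns how many samples this estimation requires as the dimension grows.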

Start date
Tuesday, Feb. 13, 2024, 1:25 p.m.
End date
Tuesday, Feb. 13, 2024, 2:25 p.m.
Location

Lind Hall 325 or via Zoom
