Presented By: Department of Statistics
Statistics Department Seminar Series: Yao Xie, Coca-Cola Foundation Chair & Professor, H. Milton Stewart School of Industrial and Systems Engineering, Georgia Institute of Technology
"Generative Models for Statistical Inference: From Wasserstein Flows to Guided Generation"
Abstract: Generative models such as normalizing flows and diffusion processes have transformed how we represent complex, high-dimensional data, yet their statistical and mathematical foundations remain less well understood. In this talk, I will present a unified framework that views generative modeling as flows in probability space: continuous transformations between distributions that reveal the geometric structure underlying learning and inference. I will begin with the JKO-flow generative model, inspired by the Jordan–Kinderlehrer–Otto (JKO) scheme for Wasserstein gradient flows, which interprets density learning as proximal gradient descent in the space of probability measures. This perspective offers provable convergence guarantees and connects generative modeling with classical principles of statistical inference. Building on this foundation, I will discuss recent extensions using guided flow generative models that incorporate data- or risk-driven guidance fields to achieve robustness, domain adaptation, and inference under uncertainty. Together, these results form a framework that bridges statistics, optimization, and machine learning, bringing classical inferential ideas to bear on a principled foundation for trustworthy generative AI.
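For context, the Jordan–Kinderlehrer–Otto (JKO) scheme referenced in the abstract discretizes a Wasserstein gradient flow of an energy functional $F$ into proximal steps in the space of probability measures; this is the standard textbook form of the scheme, not notation taken from the talk itself:

```latex
% One JKO step with step size \tau > 0:
% the next density minimizes the energy F plus a Wasserstein-2
% proximity penalty to the current density \rho_k.
\rho_{k+1} \;=\; \arg\min_{\rho \in \mathcal{P}_2(\mathbb{R}^d)}
\left\{ F(\rho) \;+\; \frac{1}{2\tau}\, W_2^2(\rho, \rho_k) \right\}
```

Iterating this update mirrors proximal gradient descent: as $\tau \to 0$, the sequence $(\rho_k)$ approximates the continuous-time gradient flow of $F$ in the Wasserstein metric, which is the sense in which density learning becomes "proximal gradient descent in the space of probability measures."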