When / Where
All occurrences of this event have passed.
This listing is displayed for historical purposes.

Presented By: MCAIM - Department of Mathematics

Bridging the theory-practice gap in machine learning: new results in sampling and optimization

Neha Wadia, Ph.D.

The flyer includes a picture of Neha Wadia along with her title and the topic of her talk on Friday, April 18th. The location will be in East Hall, as well as a webinar stream with link.
Meteoric progress in machine learning over the last decade has outpaced our foundational understanding, limiting our ability to harness the technology effectively in applications that require performance guarantees, and inviting the development of theory to enable such applications. At the heart of this progress is a highly productive connection to gradient-based optimization, the efficacy of which we are so far unable to fully explain. Motivated by this issue, in the first part of the talk, I will briefly describe an interpretable and computationally efficient adaptive step-size method for gradient-based optimization that relies on ideas from the numerical analysis of ordinary differential equations. I will show how this method connects studies of popular optimizers for machine learning in continuous time—where they are often more amenable to analysis—with their practical discrete-time implementations.
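The talk abstract does not specify the method's details, but the general idea it gestures at, borrowing local-error control from ODE solvers to set the step size of gradient descent on the gradient flow dx/dt = -∇f(x), can be sketched as follows. This is an illustrative toy (step doubling on a quadratic test objective), not the speaker's actual algorithm; the function names and tolerances are assumptions.

```python
import numpy as np

def grad(x):
    # Gradient of a simple quadratic test objective f(x) = 0.5 * x^T A x
    A = np.diag([1.0, 10.0])
    return A @ x

def adaptive_gradient_descent(x, h=0.1, tol=1e-3, steps=200):
    """Step-doubling error control borrowed from ODE numerics:
    compare one forward-Euler step of size h against two steps of
    size h/2 on the gradient flow dx/dt = -grad(x), and adapt h."""
    for _ in range(steps):
        full = x - h * grad(x)           # one Euler step of size h
        mid = x - 0.5 * h * grad(x)      # two Euler steps of size h/2
        half = mid - 0.5 * h * grad(mid)
        err = np.linalg.norm(full - half)  # local discretization error estimate
        if err > tol:
            h *= 0.5                     # step too coarse: shrink and retry
        else:
            x = half                     # accept the finer two-half-step result
            if err < 0.1 * tol:
                h *= 1.5                 # error comfortably small: grow the step
    return x

x_star = adaptive_gradient_descent(np.array([1.0, 1.0]))
```

Because the error estimate tracks how far the discrete iterates stray from the continuous gradient flow, the step size stays small exactly where the continuous-time analysis would otherwise fail to describe the discrete algorithm.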

Surprising recent developments in machine learning include the ability to generate, or sample, perceptual data such as natural images and language. Markov Chain Monte Carlo (MCMC) algorithms have long provided a generic recipe for sampling from probability distributions of interest. The Gibbs sampler is a specific limit of an MCMC algorithm and is the natural choice for sampling from a simple model of image patches. In the bulk of this talk, I will focus on a new mixing time bound for Gibbs sampling from well-conditioned log-concave distributions. I will outline the proof of the bound and place it within the context of ongoing efforts in the broader community to understand the efficacy of diffusion-based image generation methods. Time permitting, I will discuss potential applications of these efforts to problems in cosmology and biophysics.
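For readers unfamiliar with the algorithm the bound concerns: a Gibbs sampler resamples one coordinate at a time from its exact conditional distribution given the others. The sketch below runs it on a toy well-conditioned log-concave target, a correlated 2D Gaussian, where the conditionals are known in closed form. This is a textbook illustration only, not the mixing-time result or the image-patch model from the talk.

```python
import numpy as np

def gibbs_gaussian(rho, n_samples=20000, burn_in=1000, seed=0):
    """Gibbs sampling from a 2D standard Gaussian with correlation rho.
    Each sweep resamples each coordinate from its exact conditional:
    x1 | x2 ~ N(rho * x2, 1 - rho^2), and symmetrically for x2."""
    rng = np.random.default_rng(seed)
    x1, x2 = 0.0, 0.0
    samples = []
    for t in range(n_samples + burn_in):
        x1 = rho * x2 + np.sqrt(1 - rho**2) * rng.standard_normal()
        x2 = rho * x1 + np.sqrt(1 - rho**2) * rng.standard_normal()
        if t >= burn_in:  # discard early, unmixed iterates
            samples.append((x1, x2))
    return np.array(samples)

samples = gibbs_gaussian(rho=0.5)
emp_cov = np.cov(samples.T)  # should approach [[1, 0.5], [0.5, 1]]
```

How quickly the empirical statistics converge to the target's is exactly what a mixing time bound quantifies; for well-conditioned targets like this one, mixing is fast.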

Livestream Information

April 18, 2025 (Friday) 10:30am

