Presented By: Department of Statistics Dissertation Defenses

Deep Learning-Assisted Approximate Bayesian Inference with Applications to Astronomy

Declan McNamara

Approximate Bayesian methods provide a principled means for inference in settings in which exact posterior inference is intractable. In this work, I present methods for variational inference, an approach to approximate Bayesian inference in which an approximation to the posterior is selected by numerical optimization. The approaches and analysis primarily consider amortized variational inference, a class of techniques that leverages deep learning to obtain a mapping from data instances to variational approximations of the posterior. First, I present SMC-Wake, a likelihood-based approach for minimizing the forward KL divergence. This algorithm uses Sequential Monte Carlo (SMC) samplers to construct inexpensive particle approximations for training an inference network. Next, I present a study of neural posterior estimation (NPE) and its objective function, the expected forward KL divergence. This likelihood-free approach to amortized inference averages over large amounts of simulated data from the model to learn mappings from data instances to variational approximations of the posterior. I present an analysis of this approach from the perspective of neural tangent kernel (NTK) theory. Under certain conditions on the variational family and neural network mapping, I show that NPE optimizes a convex functional and reliably converges to a unique solution in the asymptotic infinite-width limit, despite the highly nonconvex nature of neural network optimization landscapes. Finally, I extend these results to posit a novel class of expressive variational families based on linear combinations of basis functions, and propose a procedure to adaptively fit these basis functions to parameterize complex distributions. When targeting the forward KL divergence within this framework, the objective is convex in the variational parameters, yet still allows practitioners to fit highly multimodal variational approximations to the posterior.
I conclude with applications of these methods to difficult problems in astronomy, such as redshift estimation from astronomical images and the task of detecting blended astronomical spectra.
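To make the amortized, simulation-based objective described above concrete, here is a minimal illustrative sketch (not the dissertation's actual algorithms) of NPE-style training on a conjugate linear-Gaussian model, where the "inference network" is just an affine map and the exact posterior is known in closed form. The model, the affine family, and all parameter names are assumptions chosen for illustration.

```python
import numpy as np

# Toy model:  theta ~ N(0, 1),  x | theta ~ N(theta, sigma^2).
# Amortized variational family:  q(theta | x) = N(a*x + b, s^2),
# with the inference "network" being the affine map x -> a*x + b.
# Training minimizes E_{p(theta, x)}[-log q(theta | x)], which equals
# the expected forward KL divergence up to an additive constant,
# estimated over freshly simulated (theta, x) pairs (likelihood-free).

rng = np.random.default_rng(0)
sigma = 1.0
a, b, log_s = 0.0, 0.0, 0.0   # variational parameters
lr = 0.05

for step in range(3000):
    theta = rng.normal(0.0, 1.0, size=256)     # draw from the prior
    x = theta + sigma * rng.normal(size=256)   # simulate data
    s2 = np.exp(2.0 * log_s)
    resid = theta - (a * x + b)                # error of amortized mean
    # Gradients of the batch-mean negative log-density of q
    grad_a = np.mean(-resid * x / s2)
    grad_b = np.mean(-resid / s2)
    grad_log_s = np.mean(1.0 - resid**2 / s2)
    a -= lr * grad_a
    b -= lr * grad_b
    log_s -= lr * grad_log_s

# Exact posterior for comparison:
#   theta | x ~ N(x / (1 + sigma^2), sigma^2 / (1 + sigma^2))
print(a, b, np.exp(2.0 * log_s))
```

With sigma = 1, the learned parameters should approach the exact posterior's slope (0.5), intercept (0.0), and variance (0.5), showing how averaging over simulated pairs recovers a mapping from data to posterior approximations without ever evaluating the likelihood at inference time.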
