BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//UM//UM*Events//EN
CALSCALE:GREGORIAN
BEGIN:VTIMEZONE
TZID:America/Detroit
TZURL:http://tzurl.org/zoneinfo/America/Detroit
X-LIC-LOCATION:America/Detroit
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20070311T020000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=2SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20071104T020000
RRULE:FREQ=YEARLY;BYMONTH=11;BYDAY=1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20260109T114337Z
DTSTART;TZID=America/Detroit:20260116T100000
DTEND;TZID=America/Detroit:20260116T110000
SUMMARY:Workshop / Seminar: Statistics Department Seminar Series: Jian Kang\, Professor & Associate Chair for Research\, Biostatistics\, University of Michigan
DESCRIPTION:Abstract: Deep generative models\, such as Variational Autoencoders (VAEs) and diffusion probabilistic models\, have transformed high-dimensional data modeling. However\, these approaches often rely on variational approximations or computationally intensive ordinary differential equation (ODE) solvers\, trading exact Bayesian inference for scalability. In this talk\, I present the Bayesian Deep Noise Neural Network (B-DeepNoise)\, a framework originally developed for density regression that possesses inherent yet under-explored generative capabilities. Unlike standard Bayesian neural networks that place priors only on network weights\, the B-DeepNoise framework injects stochastic noise into every hidden layer of a deep architecture. We show that this construction is mathematically equivalent to a deep hierarchical latent variable model\, yielding rich conditional distributions through layer-wise noise propagation. By exploiting piecewise-linear activation functions\, specifically the ReLU function\, we derive a closed-form Gibbs sampling algorithm that enables asymptotically exact posterior inference\, avoiding the approximation errors commonly associated with variational methods. I will demonstrate how this framework unifies three closely related tasks: (1) uncertainty quantification in regression\, (2) density regression for complex conditional distributions\, and (3) extensions to generative modeling\, where layer-wise noise injection enables flexible sample generation and data imputation. These results bridge flexible deep learning architectures with rigorous Bayesian inference and computational statistics\, providing a principled approach to density learning and generative modeling.
UID:143254-21892556@events.umich.edu
URL:https://events.umich.edu/event/143254
CLASS:PUBLIC
STATUS:CONFIRMED
CATEGORIES:seminar
LOCATION:West Hall - 340
END:VEVENT
BEGIN:VEVENT
DTSTAMP:20260116T060134Z
DTSTART;TZID=America/Detroit:20260116T100000
DTEND;TZID=America/Detroit:20260116T160000
SUMMARY:Other:Woven Horizons - National Parks Fundraiser
DESCRIPTION:Join Sierra Club + Crafts for Conservation: VIPs Club as we table about the importance of National Parks & Land Back Movements while raising money for the Ranger Relief & Land Back Legal funds. \nStop by any time between 10a-4p on Friday\, January 16th at Misfits + 1-4p on January 30th outside of Joe's Pizza (if weather permits)!
UID:143053-21891986@events.umich.edu
URL:https://events.umich.edu/event/143053
CLASS:PUBLIC
STATUS:CONFIRMED
LOCATION:Misfits Coffee Club
END:VEVENT
END:VCALENDAR