BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//UM//UM*Events//EN
CALSCALE:GREGORIAN
BEGIN:VTIMEZONE
TZID:America/Detroit
TZURL:http://tzurl.org/zoneinfo/America/Detroit
X-LIC-LOCATION:America/Detroit
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20070311T020000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=2SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20071104T020000
RRULE:FREQ=YEARLY;BYMONTH=11;BYDAY=1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20240909T093912Z
DTSTART;TZID=America/Detroit:20240920T100000
DTEND;TZID=America/Detroit:20240920T110000
SUMMARY:Workshop / Seminar: Statistics Department Seminar Series: Snigdha Panigrahi\, Assistant Professor\, Department of Statistics\, University of Michigan
DESCRIPTION:Abstract. Group lasso penalties are commonly used to estimate sparse models by simultaneously setting groups of variables to zero. When paired with a suitable loss function\, these penalties enable model selection with different types of data\, such as categorical\, count\, and continuous data. However\, carrying out inference in the selected model\, also known as selective inference or post-selection inference\, is quite challenging. There are two main reasons for this. First\, previous selective inference methods for the lasso penalty do not extend to selection with a group lasso penalty\, as there is no straightforward way to describe the selection event. Second\, even for the lasso\, these previous methods are restricted to normal data. As a result\, analysts are limited to less efficient options like data splitting\, where each sample is used for only one of the two tasks\, model selection or inference. \n\nIn this talk\, I will present a selective inference method for models selected with group lasso penalties. The method involves using a new form of randomization with Gaussian noise variables\, resulting in a randomized group lasso estimator. Unlike traditional data splitting\, this method makes use of all available samples for both model selection and inference\, resulting in significantly shorter intervals that adapt to the signal strength in the data. Moreover\, this method provides a unified way to make distribution-free selective inference with a wide class of loss functions\, as opposed to more recent variants of data splitting\, such as data fission\, which are tailored to specific parametric families.\n\nhttps://snigdha-panigrahi.netlify.app/
UID:124533-21853150@events.umich.edu
URL:https://events.umich.edu/event/124533
CLASS:PUBLIC
STATUS:CONFIRMED
CATEGORIES:seminar
LOCATION:West Hall - 340
END:VEVENT
END:VCALENDAR