All occurrences of this event have passed; this listing is displayed for historical purposes.

Presented By: Department of Statistics

Statistics Department Seminar Series: Snigdha Panigrahi, Assistant Professor, Department of Statistics, University of Michigan

"Selective inference using randomized group lasso estimators"

Abstract. Group lasso penalties are commonly used to estimate sparse models by simultaneously setting groups of variables to zero. When paired with a suitable loss function, these penalties enable model selection with different types of data, such as categorical, count, and continuous data. However, carrying out inference in the selected model, also known as selective inference or post-selection inference, is quite challenging. There are two main reasons for this. First, previous selective inference methods for the lasso penalty do not extend to selection with a group lasso penalty, as there is no straightforward way to describe the selection event. Second, even for the lasso, these previous methods are restricted to normal data. As a result, analysts are limited to less efficient options like data splitting, where each sample is used for only one of the two tasks, model selection or inference.
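For reference, the group lasso estimator in its standard form (a textbook formulation, not specific to this talk) solves

\[
\hat{\beta} \;=\; \arg\min_{\beta}\; \ell(\beta; X, y) \;+\; \lambda \sum_{g \in G} w_g \,\|\beta_g\|_2,
\]

where \(\ell\) is a loss matched to the data type (squared error for continuous responses, a GLM negative log-likelihood for categorical or count data), \(G\) partitions the coefficients into groups, and the \(\ell_2\) penalty on each block \(\beta_g\) either zeroes out or retains an entire group of coefficients jointly.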

In this talk, I will present a selective inference method for models selected with group lasso penalties. The method uses a new form of randomization with Gaussian noise variables, resulting in a randomized group lasso estimator. Unlike traditional data splitting, it uses all available samples for both model selection and inference, yielding significantly shorter intervals that adapt to the signal strength in the data. Moreover, the method provides a unified way to perform distribution-free selective inference with a wide class of loss functions, in contrast to more recent variants of data splitting, such as data fission, which are tailored to specific parametric families.
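To make the randomization concrete, here is a minimal numpy sketch of a randomized group lasso with squared-error loss, solved by proximal gradient descent. This is an illustration under assumptions, not the speaker's implementation: the noise scale tau, the solver, the step size, and the helper names (group_soft_threshold, randomized_group_lasso) are choices made for this example.

import numpy as np

def group_soft_threshold(z, t):
    # Block soft-thresholding: the proximal operator of t * ||z||_2.
    norm = np.linalg.norm(z)
    if norm <= t:
        return np.zeros_like(z)
    return (1.0 - t / norm) * z

def randomized_group_lasso(X, y, groups, lam, tau=1.0, n_iter=500, seed=None):
    # Illustrative proximal-gradient solver for
    #   min_beta  0.5*||y - X beta||^2 + lam * sum_g ||beta_g||_2 - omega' beta,
    # where omega ~ N(0, tau^2 I) is the added Gaussian randomization.
    # `groups` is a list of index arrays partitioning the columns of X.
    rng = np.random.default_rng(seed)
    p = X.shape[1]
    omega = tau * rng.standard_normal(p)          # randomization noise
    beta = np.zeros(p)
    step = 1.0 / np.linalg.norm(X, 2) ** 2        # 1/L for the smooth part
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y) - omega       # gradient of the smooth part
        z = beta - step * grad
        for g in groups:                          # groupwise prox step
            beta[g] = group_soft_threshold(z[g], step * lam)
    return beta, omega

# Example: three groups of two coefficients each; only the first group is active.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 6))
y = X[:, :2] @ np.array([2.0, -1.5]) + rng.standard_normal(100)
groups = [np.arange(0, 2), np.arange(2, 4), np.arange(4, 6)]
beta_hat, omega = randomized_group_lasso(X, y, groups, lam=5.0)

Selective inference would then condition on the set of selected groups, exploiting the known Gaussian distribution of omega; that conditioning step is the substance of the talk and is not sketched here.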

https://snigdha-panigrahi.netlify.app/

