Presented By: Department of Statistics

Statistics Department Seminar Series: Tianyu Zhang, Postdoctoral Research Fellow, Department of Statistics & Data Science, Carnegie Mellon University.

"Adaptive and Scalable Nonparametric Estimation via Stochastic Optimization"

Abstract: Nonparametric procedures are frequently employed in predictive and inferential modeling to relate random variables without imposing specific parametric forms. In supervised learning, for instance, our focus is often on the conditional mean function that links predictive covariates to a numerical outcome of interest. While many existing statistical learning methods achieve this with optimal statistical performance, their computational expenses often do not scale favorably with increasing sample sizes. This challenge is exacerbated in certain “online settings,” where data is continuously collected and estimates require frequent updates.

In this talk, I will discuss a class of nonparametric stochastic optimization methods. The estimates are constructed using stochastic gradient descent (SGD) over a function space of varying capacity. When this computational approach is combined with compact function approximation strategies, such as eigenfunction expansions in a reproducing kernel Hilbert space, certain nonparametric estimators can attain both optimal statistical properties and minimal memory (space) expense. Additionally, I will introduce a rolling validation procedure, an online adaptation of cross-validation designed for hyperparameter tuning. This model selection process integrates naturally with incremental SGD algorithms and imposes a negligible additional computational burden.

https://terrytianyuzhang.github.io/
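
As a rough illustration of the kind of estimator the abstract describes, below is a minimal sketch of single-pass functional SGD for nonparametric regression. It assumes a cosine eigenbasis on [0, 1], a fixed number of basis functions J, and a simple 1/sqrt(t) step-size schedule; these choices, and the function names, are illustrative placeholders rather than the speaker's actual method, and the rolling validation step is not implemented here.

```python
import numpy as np

def cosine_basis(x, J):
    """Evaluate the first J functions of the orthonormal cosine basis on [0, 1]."""
    j = np.arange(J)
    phi = np.sqrt(2.0) * np.cos(np.pi * j * x)
    phi[0] = 1.0  # the j = 0 term is the constant function
    return phi

def functional_sgd(stream, J=50, lr0=0.5):
    """One pass of SGD on squared error; O(J) work and memory per observation."""
    theta = np.zeros(J)  # coefficients of the estimate in the cosine basis
    for t, (x, y) in enumerate(stream, start=1):
        phi = cosine_basis(x, J)
        resid = phi @ theta - y                    # current prediction error
        theta -= (lr0 / np.sqrt(t)) * resid * phi  # stochastic gradient step
    return theta

# Toy usage: recover f(x) = sin(2*pi*x) from noisy streaming observations.
rng = np.random.default_rng(0)
xs = rng.uniform(size=20000)
ys = np.sin(2 * np.pi * xs) + 0.1 * rng.normal(size=xs.size)
theta_hat = functional_sgd(zip(xs, ys))
x0 = 0.3
print(cosine_basis(x0, 50) @ theta_hat, np.sin(2 * np.pi * x0))
```

In this setting, a rolling validation step would maintain several candidate configurations (for example, different values of J or of the learning rate), score each on its one-step-ahead prediction error before updating it with the new observation, and select the configuration with the smallest running error, which is what makes the tuning cost negligible relative to the SGD updates themselves.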