Presented By: Colloquium Series - Department of Mathematics

Colloquium: Gaussian kernelized graph Laplacian: Bi-stochastic normalization and eigen-convergence

Xiuyuan Cheng (Duke)

Abstract: Eigen-data of graph Laplacian matrices are widely used in data analysis and machine learning, for example in dimension reduction by spectral embedding. Many graph Laplacian methods start by building a kernelized affinity matrix from high-dimensional data points, which may lie on an unknown low-dimensional manifold embedded in the ambient space. When clean manifold data are corrupted by high-dimensional noise, the performance of graph Laplacian methods can degrade. In this talk, we first introduce the use of bi-stochastic normalization to improve the robustness of the graph Laplacian to high-dimensional outlier noise, possibly heteroskedastic, with a proven convergence guarantee in the manifold data setting. Next, for the important question of eigen-convergence (namely, the convergence of eigenvalues and eigenvectors to the spectrum of the Laplace-Beltrami operator), we show that choosing a smooth kernel function leads to improved theoretical convergence rates compared to prior results. The proof proceeds by analyzing convergence of the Dirichlet form and by constructing candidate approximate eigenfunctions via convolution with the manifold heat kernel. When the data density is non-uniform on the manifold, we prove the same rates for the density-corrected graph Laplacian. The theory is supported by numerical results. Joint work with Boris Landa and Nan Wu.

Talk will be in-person and on Zoom: https://umich.zoom.us/j/98734707290
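
To make the construction in the abstract concrete, here is a minimal NumPy sketch, not the speaker's implementation: it builds a Gaussian kernelized affinity matrix, bi-stochastically normalizes it with a symmetric Sinkhorn-style fixed-point iteration (one common way to obtain the normalization), and extracts the low-lying eigen-data of the resulting Laplacian. All names, the bandwidth value, the square-root damping scheme, and the convention L = I - P are illustrative assumptions.

```python
import numpy as np

def gaussian_affinity(X, bandwidth):
    """Gaussian kernel affinity W_ij = exp(-||x_i - x_j||^2 / bandwidth)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0)
    return np.exp(-d2 / bandwidth)

def bistochastic_normalize(W, n_iter=500, tol=1e-10):
    """Symmetric Sinkhorn scaling: find eta > 0 such that
    diag(eta) W diag(eta) has unit row sums (and, by symmetry, unit
    column sums). Uses a geometric-mean damped fixed-point iteration,
    which is stable for strictly positive W."""
    eta = np.ones(W.shape[0])
    for _ in range(n_iter):
        eta_next = np.sqrt(eta / (W @ eta))
        if np.max(np.abs(eta_next - eta)) < tol:
            eta = eta_next
            break
        eta = eta_next
    return eta[:, None] * W * eta[None, :]

def laplacian_eigendata(X, bandwidth, k=6):
    """Low-lying eigenvalues/eigenvectors of L = I - P, where P is the
    bi-stochastically normalized Gaussian affinity matrix."""
    P = bistochastic_normalize(gaussian_affinity(X, bandwidth))
    L = np.eye(X.shape[0]) - P
    vals, vecs = np.linalg.eigh(L)  # L is symmetric; eigenvalues ascending
    return vals[:k], vecs[:, :k]

# Toy example: a circle (1-D manifold) embedded in 100-D ambient space,
# corrupted by isotropic high-dimensional noise.
rng = np.random.default_rng(0)
t = rng.uniform(0.0, 2.0 * np.pi, size=500)
clean = np.stack([np.cos(t), np.sin(t)], axis=1)
X = np.concatenate([clean, np.zeros((500, 98))], axis=1)
X += 0.05 * rng.standard_normal(X.shape)
vals, vecs = laplacian_eigendata(X, bandwidth=0.5)
print(vals)  # vecs[:, 1:3] gives a 2-D spectral embedding of the circle
```

In this sketch the bi-stochastic normalization plays the role the abstract describes: unlike the usual degree normalization, the Sinkhorn scaling can absorb point-wise (possibly heteroskedastic) noise levels into the scaling factors, which is the robustness effect the talk analyzes.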
