
Presented By: Department of Statistics Seminar Series

Statistics Department Seminar Series: Jun Zhang, Professor, Departments of Psychology and Mathematics, University of Michigan

"Information Geometry and Maximal Entropy Inference"

Information geometry is the differential-geometric study of manifolds of probability models, where each probability distribution is a point on the manifold. Instead of a metric for measuring distances on such manifolds, one often uses "divergence functions" to measure the proximity of two points; these need not satisfy symmetry or the triangle inequality. Examples include the Kullback-Leibler divergence, Bregman divergences, and f-divergences. Divergence functions are tied to the generalized entropy functions (for instance, Tsallis entropy, Rényi entropy, and phi-entropy) and cross-entropy functions widely used in machine learning and the information sciences. After a brief introduction to information geometry, I illustrate the geometry of maximum entropy inference and the exponential family. I then introduce a general form of entropy/cross-entropy/divergence function and show how the geometry of the underlying probability manifold (the deformed exponential family) reveals an "escort statistics" that is hidden from the standard exponential family.
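
As a concrete illustration of the link between divergence functions and entropy mentioned in the abstract, the short numerical sketch below (not part of the talk; a minimal example assuming NumPy) checks that the Kullback-Leibler divergence coincides with the Bregman divergence generated by negative Shannon entropy on the probability simplex, and that it fails symmetry.

import numpy as np

def kl_divergence(p, q):
    # Kullback-Leibler divergence D(p || q) for discrete distributions.
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0  # convention: 0 * log(0 / q) = 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

def bregman(F, grad_F, x, y):
    # Bregman divergence B_F(x, y) = F(x) - F(y) - <grad F(y), x - y>
    return F(x) - F(y) - np.dot(grad_F(y), x - y)

# Negative Shannon entropy F(p) = sum_i p_i log p_i is convex on the
# simplex; its Bregman divergence reduces to KL because the linear
# terms cancel when sum(p) = sum(q) = 1.
F = lambda p: np.sum(p * np.log(p))
grad_F = lambda p: np.log(p) + 1.0

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

print(kl_divergence(p, q))       # D(p || q)
print(bregman(F, grad_F, p, q))  # same value as above
print(kl_divergence(q, p))       # different value: no symmetry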
