Presented By: Department of Statistics Seminar Series
Statistics Department Seminar Series: Jun Zhang, Professor, Departments of Psychology and Mathematics, University of Michigan
"Information Geometry and Maximal Entropy Inference"
Information Geometry (IG) is the differential-geometric study of the manifold of probability models, where each probability distribution is a point on the manifold. Instead of a metric for measuring distances on such manifolds, one often uses “divergence functions” to measure the proximity of two points; these do not impose symmetry or the triangle inequality, and include the Kullback-Leibler divergence, Bregman divergences, f-divergences, etc. Divergence functions are tied to the generalized entropy (for instance, Tsallis entropy, Rényi entropy, phi-entropy) and cross-entropy functions widely used in machine learning and the information sciences. After a brief introduction to IG, I illustrate the geometry of maximum entropy inference and the exponential family. I then introduce a general form of entropy/cross-entropy/divergence function, and show how the geometry of the underlying probability manifold (a deformed exponential family) reveals an “escort statistics” that is hidden in the standard exponential family.
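For reference, the objects named in the abstract can be written out in standard form; this is a textbook sketch of the definitions, not material supplied by the talk itself. The Kullback-Leibler divergence and the Bregman divergence induced by a strictly convex function \(\Phi\) are

    D_{\mathrm{KL}}(p \,\|\, q) = \int p(x)\,\log\frac{p(x)}{q(x)}\,dx,
    \qquad
    D_\Phi(p, q) = \Phi(p) - \Phi(q) - \langle \nabla\Phi(q),\, p - q \rangle,

and maximizing Shannon entropy subject to moment constraints \(\mathbb{E}_p[T_i(X)] = \mu_i\) yields an exponential-family solution

    p_\theta(x) = \exp\!\Big(\sum_i \theta_i T_i(x) - \psi(\theta)\Big),

where \(\psi(\theta)\) is the normalizing (cumulant) function. Both divergences above are asymmetric in their arguments, which is why they measure proximity rather than distance.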
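The “escort statistics” mentioned in the abstract can also be illustrated numerically. The Python sketch below builds a q-Gaussian member of the Tsallis-deformed exponential family and its escort distribution (the renormalized q-th power of the density); the parameter values q = 1.3 and beta = 0.5 are arbitrary choices for illustration, not values from the talk.

    import numpy as np

    def q_exp(u, q):
        """Tsallis q-exponential exp_q(u) = [1 + (1-q)u]_+^{1/(1-q)}; reduces to exp(u) at q = 1."""
        if np.isclose(q, 1.0):
            return np.exp(u)
        return np.maximum(1.0 + (1.0 - q) * u, 0.0) ** (1.0 / (1.0 - q))

    # Discretized support and illustrative (hypothetical) parameters
    x = np.linspace(-10.0, 10.0, 4001)
    q, beta = 1.3, 0.5

    # q-Gaussian member of the deformed exponential family, normalized numerically
    p = q_exp(-beta * x**2, q)
    p /= np.trapz(p, x)

    # Escort distribution: renormalized q-th power of p
    escort = p**q / np.trapz(p**q, x)

    # Expectations under p and under its escort differ whenever q != 1
    print("E_p[x^2]      =", np.trapz(x**2 * p, x))
    print("E_escort[x^2] =", np.trapz(x**2 * escort, x))

At q = 1 the two expectations coincide; for q ≠ 1 they differ, which is the sense in which escort expectations carry structure that is invisible in the standard (q = 1) exponential family.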