
Presented By: Algebraic Geometry Reading Seminar - Department of Mathematics

The principle of ‘maximum entropy’

Tim Hoheisel (McGill University)

The principle of ‘maximum entropy’ states that the probability distribution which best represents the current state of knowledge about a system is the one with largest entropy with respect to a given prior (data) distribution. It was first formulated in the context of statistical physics in two seminal papers by E. T. Jaynes (Physical Review, Series II, 1957), and thus constitutes an information-theoretic manifestation of Occam’s razor. We bring the idea of maximum entropy to bear in the context of linear inverse problems in that we solve for the probability measure which is close to the (learned or chosen) prior and whose expectation has small residual with respect to the observation. Duality leads to tractable, finite-dimensional (dual) problems. A core tool, which we then show to be useful beyond the linear inverse problem setting, is the ‘MEMM functional’: it is an infimal projection of the Kullback-Leibler divergence and a linear equation, which coincides with Cramér’s function (ubiquitous in the theory of large deviations) in most cases, and is paired in duality with the cumulant generating function of the prior measure. Numerical examples underline the efficacy of the presented framework.
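
To fix ideas, a schematic version of the formulation sketched above (the symbols \mu for the prior measure, A for the linear forward operator, b for the observation, and \lambda for a fidelity weight are illustrative choices and do not appear in the abstract) is the functional

    \kappa_\mu(y) = \inf_P \{ \mathrm{KL}(P \,\|\, \mu) : \mathbb{E}_P[X] = y \},
    \qquad
    \kappa_\mu^*(z) = \log \int e^{\langle z, x \rangle} \, d\mu(x),

an infimal projection of the Kullback-Leibler divergence over the linear constraint \mathbb{E}_P[X] = y, whose convex conjugate \kappa_\mu^* is the cumulant generating function of \mu. The linear inverse problem Ax \approx b can then be treated, for instance, via

    \min_y \; \tfrac{1}{2} \| A y - b \|^2 + \lambda \, \kappa_\mu(y),

whose Fenchel dual is a finite-dimensional problem expressed through \kappa_\mu^*.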

The talk encompasses joint work with Rustum Choksi (McGill), Ariel Goodwin (Cornell), Carola-Bibiane Schönlieb (Cambridge), and Yakov Vaisbourd (McGill).

Livestream Information

Livestream
February 17, 2023 (Friday), 9:00 am
