BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//UM//UM*Events//EN
CALSCALE:GREGORIAN
BEGIN:VTIMEZONE
TZID:America/Detroit
TZURL:http://tzurl.org/zoneinfo/America/Detroit
X-LIC-LOCATION:America/Detroit
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20070311T020000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=2SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20071104T020000
RRULE:FREQ=YEARLY;BYMONTH=11;BYDAY=1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20210120T142636Z
DTSTART;TZID=America/Detroit:20210129T100000
DTEND;TZID=America/Detroit:20210129T110000
SUMMARY:Workshop / Seminar: Statistics Department Seminar Series: Weijie Su\, Assistant Professor\, Wharton Statistics Department\, University of Pennsylvania
DESCRIPTION:Abstract: Privacy-preserving data analysis has been put on a firm mathematical foundation since the introduction of differential privacy (DP) in 2006. This privacy definition\, however\, has some well-known weaknesses: notably\, it does not tightly handle composition. In this talk\, we propose a relaxation of DP that we term "f-DP"\, which has a number of appealing properties and avoids some of the difficulties associated with prior relaxations. First\, f-DP preserves the hypothesis testing interpretation of differential privacy\, which makes its guarantees easily interpretable. It allows for lossless reasoning about composition and post-processing\, and notably\, a direct way to analyze privacy amplification by subsampling. We define a canonical single-parameter family of definitions within our class that is termed "Gaussian Differential Privacy"\, based on hypothesis testing of two shifted normal distributions. We prove that this family is focal to f-DP by introducing a central limit theorem\, which shows that the privacy guarantees of any hypothesis-testing based definition of privacy (including differential privacy) converge to Gaussian differential privacy in the limit under composition. This central limit theorem also gives a tractable analysis tool. We demonstrate the use of the tools we develop by giving an improved analysis of the privacy guarantees of noisy stochastic gradient descent. \n\nThis is joint work with Jinshuo Dong and Aaron Roth.\n\nhttp://www-stat.wharton.upenn.edu/~suw/\n\nThis seminar will be livestreamed via Zoom https://umich.zoom.us/j/94350208889\nThere will be a virtual reception to follow
UID:80541-20738138@events.umich.edu
URL:https://events.umich.edu/event/80541
CLASS:PUBLIC
STATUS:CONFIRMED
CATEGORIES:seminar
LOCATION:Off Campus Location
END:VEVENT
END:VCALENDAR