Kantorovich Initiative Seminar: Jason Altschuler

  • Date: 10/12/2023
  • Time: 10:00
Lecturer(s): Jason Altschuler, University of Pennsylvania

Location: Online

Topic: Shifted divergences for sampling, privacy, and beyond

Description: 

Shifted divergences provide a principled way of making information-theoretic divergences (e.g., KL) geometrically aware via optimal transport smoothing. In this talk, I will argue that shifted divergences provide a powerful approach toward unifying optimization, sampling, privacy, and beyond. For concreteness, I will demonstrate these connections via three recent highlights: (1) characterizing the differential privacy of Noisy-SGD, the standard algorithm for private convex optimization; (2) characterizing the mixing time of the Langevin Algorithm to its stationary distribution for log-concave sampling; and (3) developing the fastest high-accuracy algorithm for sampling from log-concave distributions. A recurring theme is a certain notion of algorithmic stability, and the central technique for establishing this is shifted divergences. Based on joint work with Kunal Talwar, and with Sinho Chewi.
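For context, a minimal sketch of the central object, assuming the W-infinity-shifted Rényi divergence formulation common in the privacy-amplification literature; the notation below (D_alpha, W_infty, and the shift parameter z) is illustrative and not taken from the abstract:

% Sketch of a shifted Renyi divergence, assuming the W_infty-based
% formulation from the privacy-amplification literature: optimize
% over an optimal-transport ball around the first argument
% (z = 0 recovers the usual Renyi divergence of order alpha).
\[
  D_\alpha^{(z)}(\mu \,\|\, \nu)
    \;=\; \inf_{\mu' \,:\, W_\infty(\mu', \mu) \le z}
          D_\alpha(\mu' \,\|\, \nu)
\]
% For item (2), the Langevin Algorithm is the standard iteration
% with step size eta and i.i.d. Gaussian noise xi_k:
\[
  X_{k+1} \;=\; X_k - \eta \, \nabla f(X_k) + \sqrt{2\eta}\, \xi_k,
  \qquad \xi_k \sim \mathcal{N}(0, I_d),
\]
% whose stationary distribution approximates pi \propto e^{-f} when
% f is convex (the log-concave setting of the talk).

Intuitively, the shift z lets one track how the Gaussian noise injected at each iteration of such algorithms contracts the divergence between neighboring trajectories, which is one way to read the notion of algorithmic stability the abstract alludes to.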

Other Information:

On Zoom (registration required):
https://ubc.zoom.us/meeting/register/u5wtdOGoqj0rH9CiYNIDC5HqEGckTaJtca89

A recording of this event is available on mathtube.org.