Schedule for: 25w5430 - Wasserstein Gradient Flows in Math and Machine Learning
Beginning on Sunday, June 29 and ending on Friday, July 4, 2025
All times in Banff, Alberta time, MDT (UTC-6).
Sunday, June 29 | |
---|---|
16:00 - 17:30 | Check-in begins at 16:00 on Sunday and is open 24 hours (Front Desk - Professional Development Centre) |
17:30 - 19:30 |
Dinner ↓ A buffet dinner is served daily between 5:30pm and 7:30pm in Vistas Dining Room, top floor of the Sally Borden Building. (Vistas Dining Room) |
20:00 - 22:00 |
Informal gathering ↓ Meet and Greet at the BIRS Lounge (PDC building , 2nd floor). (PDC BIRS Lounge) |
Monday, June 30 | |
---|---|
07:00 - 08:45 |
Breakfast ↓ Breakfast is served daily between 7 and 9am in the Vistas Dining Room, the top floor of the Sally Borden Building. (Vistas Dining Room) |
08:45 - 09:00 |
Introduction and Welcome by BIRS Staff ↓ A brief introduction to BIRS with important logistical information, technology instruction, and opportunity for participants to ask questions. (TCPL 201) |
09:00 - 10:00 | Matthias Erbar (TCPL 201) |
10:00 - 10:30 | Coffee Break (TCPL Foyer) |
10:30 - 11:15 |
Beatrice Acciaio: Absolutely continuous curves of stochastic processes ↓ We study absolutely continuous curves in the adapted
Wasserstein space of filtered processes. We provide a probabilistic
representation of such curves as flows of adapted processes on a common
filtered probability space, extending classical superposition results to
the adapted setting. We characterize geodesics in this space and derive
an adapted Benamou--Brenier-type formula, obtaining, as an application,
a Skorokhod-type representation for sequences of filtered processes
under the adapted weak topology. Finally, we provide an adapted version
of the continuity equation characterizing absolutely continuous curves
of filtered processes. (TCPL 201) |
11:30 - 13:00 |
Lunch ↓ Lunch is served daily between 11:30am and 1:30pm in the Vistas Dining Room, the top floor of the Sally Borden Building. (Vistas Dining Room) |
13:15 - 14:00 |
Giulia Cavagnari: Stochastic approximation as dissipative Wasserstein flow: a measure-theoretic perspective ↓ We propose a unified probabilistic framework to study the convergence of stochastic Euler schemes—such as stochastic gradient descent—in a separable Hilbert space X. These algorithms approximate deterministic ODEs driven by dissipative vector fields arising from stochastic superposition.
By interpreting their evolution in the Wasserstein space over X through Probability Vector Fields, we establish convergence to an implicit limit dynamics governed by a maximal dissipative extension of the underlying barycentric field. As a direct result, our work recovers the well-known convergence of classical stochastic schemes in X to the unique solution of the underlying deterministic ODE.
This is a joint work with Giuseppe Savaré (Bocconi University - Italy) and Giacomo Enrico Sodini (Universität Wien - Austria). (TCPL 201) |
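As a minimal one-dimensional illustration of the convergence mentioned above (not the authors' measure-theoretic framework, and in R rather than a general Hilbert space): a stochastic Euler scheme with zero-mean gradient noise tracks the deterministic gradient-flow ODE as the step size shrinks. The quadratic objective and Gaussian noise below are illustrative assumptions.

```python
import numpy as np

# Stochastic Euler scheme x_{k+1} = x_k - eta * (grad f(x_k) + noise)
# for f(x) = x^2 / 2. The zero-mean noise averages out, so the iterates
# track the deterministic ODE x'(t) = -x(t), i.e. x(t) = x(0) e^{-t}.
rng = np.random.default_rng(0)

eta, T = 1e-3, 2.0          # step size and time horizon
n_steps = int(T / eta)
x = 1.0                     # initial condition x(0) = 1

for _ in range(n_steps):
    noisy_grad = x + rng.normal()   # true gradient x plus zero-mean noise
    x -= eta * noisy_grad

print(x)                    # close to e^{-2} ≈ 0.135
```

Shrinking `eta` further tightens the agreement with the ODE solution, which is the classical special case the abstract says the framework recovers.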
14:00 - 14:30 | Rentian Yao (TCPL 201) |
14:30 - 15:00 | Andrew Warren (TCPL 201) |
15:00 - 15:30 | Coffee Break (TCPL Foyer) |
15:30 - 16:00 |
Omar Abdul Halim: Multi- to one-dimensional screening and semi-discrete optimal transport ↓ We study the monopolist's screening problem with a multi-dimensional distribution of consumers and a one-dimensional space of goods. We establish general conditions under which solutions satisfy a structural condition known as nestedness, which greatly simplifies their analysis and characterization. Under these assumptions, we go on to develop a general method to solve the problem, either in closed form or with relatively simple numerical computations, and illustrate it with examples. These results are established both when the monopolist has access to only a discrete subset of the one-dimensional space of products, as well as when the entire continuum is available. (TCPL 201) |
16:00 - 16:30 | Forest Kobayashi (TCPL 201) |
17:30 - 19:30 |
Dinner ↓ A buffet dinner is served daily between 5:30pm and 7:30pm in Vistas Dining Room, top floor of the Sally Borden Building. (Vistas Dining Room) |
20:00 - 21:00 |
Social gathering ↓ We will introduce ourselves and get to know each other. (PDC BIRS Lounge) |
Tuesday, July 1 | |
---|---|
07:00 - 08:45 |
Breakfast ↓ Breakfast is served daily between 7 and 9am in the Vistas Dining Room, the top floor of the Sally Borden Building. (Vistas Dining Room) |
09:00 - 10:00 | Bharath Sriperumbudur (TCPL 201) |
10:00 - 10:30 | Coffee Break (TCPL Foyer) |
10:30 - 11:15 | Youssef Mroueh (TCPL 201) |
11:15 - 11:30 |
Group Photo ↓ Meet in foyer of TCPL to participate in the BIRS group photo. The photograph will be taken outdoors, so dress appropriately for the weather. Please don't be late, or you might not be in the official group photo! (TCPL Foyer) |
11:30 - 13:00 |
Lunch ↓ Lunch is served daily between 11:30am and 1:30pm in the Vistas Dining Room, the top floor of the Sally Borden Building. (Vistas Dining Room) |
13:15 - 14:00 | Adil Salim (TCPL 201) |
14:00 - 14:30 |
Lauren Conger: Monotonicity of Coupled Multispecies Wasserstein-2 Gradient Flows ↓ We present a notion of λ-monotonicity for an n-species system of PDEs governed by flow dynamics, extending monotonicity in Banach spaces to the Wasserstein-2 metric space. We show that monotonicity implies the existence of and convergence to a unique steady state. In the special setting of Wasserstein-2 gradient descent of different energies for each species, we prove convergence to the unique Nash equilibrium of the associated energies, and discuss the relationship between monotonicity and displacement convexity. This extends known zero-sum (min-max) results in infinite-dimensional game theory to the general-sum setting. We provide examples of monotone coupled gradient flow systems, including cross-diffusion, nonlocal interaction, and linear and nonlinear diffusion. Numerically, we demonstrate convergence of a four-player economic model for market competition, and an optimal transport problem. This is joint work with Ricardo Baptista, Franca Hoffmann, Eric Mazumdar, and Lillian Ratliff. (TCPL 201) |
14:30 - 15:00 |
Clément Bonet: Flowing Datasets with Wasserstein over Wasserstein Gradient Flows ↓ Many applications in machine learning involve data represented as probability distributions. The emergence of such data requires radically novel techniques to design tractable gradient flows on probability distributions over this type of (infinite-dimensional) object. For instance, being able to flow labeled datasets is a core task for applications ranging from domain adaptation to transfer learning or dataset distillation. In this setting, we propose to represent each class by the associated conditional distribution of features, and to model the dataset as a mixture distribution supported on these classes (which are themselves probability distributions), meaning that labeled datasets can be seen as probability distributions over probability distributions. We endow this space with a metric structure from optimal transport, namely the Wasserstein over Wasserstein (WoW) distance, derive a differential structure on this space, and define WoW gradient flows. The latter enable us to design dynamics over this space that decrease a given objective functional. We apply our framework to transfer learning and dataset distillation tasks, leveraging our gradient flow construction as well as novel tractable functionals that take the form of Maximum Mean Discrepancies with Sliced-Wasserstein-based kernels between probability distributions. Joint work with Christophe Vauthier and Anna Korba. (TCPL 201) |
15:00 - 15:30 | Coffee Break (TCPL Foyer) |
15:30 - 16:00 | Jakwang Kim (TCPL 201) |
16:00 - 16:30 | Sibylle Marcotte (TCPL 201) |
17:30 - 19:30 |
Dinner ↓ A buffet dinner is served daily between 5:30pm and 7:30pm in Vistas Dining Room, top floor of the Sally Borden Building. (Vistas Dining Room) |
Wednesday, July 2 | |
---|---|
07:00 - 08:45 |
Breakfast ↓ Breakfast is served daily between 7 and 9am in the Vistas Dining Room, the top floor of the Sally Borden Building. (Vistas Dining Room) |
09:15 - 10:00 | Flavien Leger (TCPL 201) |
10:00 - 10:30 | Coffee Break (TCPL Foyer) |
10:30 - 11:15 |
Austin Stromme: Asymptotic log-Sobolev constants and the Polyak-Łojasiewicz gradient domination condition ↓ The Polyak-Łojasiewicz (PL) constant for a given function exactly characterizes the exponential rate of convergence of gradient flow uniformly over initializations, and has been of major recent interest in optimization and machine learning because it is strictly weaker than strong convexity yet implies many of the same results. In the world of sampling, the log-Sobolev inequality plays an analogous role, governing the convergence of Langevin dynamics from arbitrary initialization in Kullback-Leibler divergence. In this talk, we present a new connection between optimization and sampling by showing that the PL constant is exactly the low temperature limit of the re-scaled log-Sobolev constant, under mild assumptions. Based on joint work with Sinho Chewi. (TCPL 201) |
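For reference, the two conditions the abstract compares can be written in their standard forms (these are the textbook statements, not notation specific to this talk):

```latex
% Polyak-Lojasiewicz (PL) inequality with constant \mu > 0:
% gradient flow of f converges exponentially from any initialization.
\frac{1}{2}\,\|\nabla f(x)\|^2 \;\ge\; \mu\,\bigl(f(x) - \inf f\bigr)
\qquad \text{for all } x.

% Log-Sobolev inequality (LSI) with constant \lambda > 0 for target \pi:
% Langevin dynamics converge exponentially in KL divergence.
\mathrm{KL}(\rho \,\|\, \pi) \;\le\; \frac{1}{2\lambda}\, I(\rho \,\|\, \pi),
\qquad
I(\rho \,\|\, \pi) = \int \Bigl\|\nabla \log \tfrac{\rho}{\pi}\Bigr\|^2 \, d\rho .
```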
11:15 - 12:00 |
Daniel Lacker: Geodesic convexity and strengthened functional inequalities on submanifolds of Wasserstein space ↓ We study geodesic convexity properties of various functionals on submanifolds of Wasserstein spaces with their induced geometry. We obtain short new proofs of several known results, such as the strong convexity of entropy on sphere-like submanifolds due to Carlen-Gangbo, as well as new ones, such as the $\lambda$-convexity of entropy on the space of couplings of $\lambda$-log-concave marginals. The arguments revolve around a simple but versatile principle, which crucially requires no knowledge of the structure or regularity of geodesics in the submanifold (and which is valid in general metric spaces): If the EVI($\lambda$) gradient flow of a functional exists and leaves a submanifold invariant, then the restriction of the functional to the submanifold is geodesically $\lambda$-convex. In these settings, we derive strengthened forms of Talagrand and HWI inequalities on submanifolds, which we show to be related to large deviation bounds for conditioned empirical measures. This is joint work with Louis-Pierre Chaintron. (TCPL 201) |
11:30 - 13:00 |
Lunch ↓ Lunch is served daily between 11:30am and 1:30pm in the Vistas Dining Room, the top floor of the Sally Borden Building. (Vistas Dining Room) |
13:30 - 17:30 | Free Afternoon (Banff National Park) |
17:30 - 19:30 |
Dinner ↓ A buffet dinner is served daily between 5:30pm and 7:30pm in Vistas Dining Room, top floor of the Sally Borden Building. (Vistas Dining Room) |
Thursday, July 3 | |
---|---|
07:00 - 08:45 |
Breakfast ↓ Breakfast is served daily between 7 and 9am in the Vistas Dining Room, the top floor of the Sally Borden Building. (Vistas Dining Room) |
09:00 - 10:00 |
Katy Craig: Gradient Flows with Different Gradients: Wasserstein, Hellinger-Kantorovich, and Vector Valued Gradient Flows ↓ Following an overview of Wasserstein and Hellinger-Kantorovich gradient flows, I will introduce the notion of vector valued gradient flows, which arise in applications including multispecies PDEs and classification of vector valued measures. Our main result is a unified framework that connects four existing notions of vector valued optimal transport, along with a sharp inequality relating the four notions. I will close by comparing and contrasting the properties of each metric from the perspective of gradient flows and linearization. (TCPL 201) |
10:00 - 10:30 | Coffee Break (TCPL Foyer) |
10:30 - 11:15 | Jose Carrillo (TCPL 201) |
11:30 - 13:00 |
Lunch ↓ Lunch is served daily between 11:30am and 1:30pm in the Vistas Dining Room, the top floor of the Sally Borden Building. (Vistas Dining Room) |
13:30 - 14:15 |
Sinho Chewi: Toward ballistic acceleration for log-concave sampling ↓ The underdamped (or kinetic) Langevin dynamics are conjectured to provide a diffusive-to-ballistic speed-up for log-concave sampling. This was recently established in continuous time via the space-time Poincaré inequality of Cao, Lu, Wang, and placed in the context of non-reversible lifts by Eberle and Lörler. However, these results have so far not led to accelerated iteration complexities for numerical discretizations. In this talk, I will describe a framework for establishing KL divergence bounds for SDE discretizations based on local error computations. We apply this to show the first algorithmic result for log-concave sampling with sublinear dependence on the condition number (i.e., a partial result toward the conjectured acceleration phenomenon). At the heart of this result is a technique to use coupling arguments to control information-theoretic divergences. This technique, which we call “shifted composition”, builds on works developed with my co-authors Jason M. Altschuler and Matthew S. Zhang. (TCPL 201) |
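The dynamics in question can be written in their standard form (assumed here; the talk's notation may differ):

```latex
% Underdamped (kinetic) Langevin dynamics with friction \gamma > 0,
% targeting \pi(x) \propto e^{-f(x)}:
dX_t = V_t \, dt, \qquad
dV_t = -\nabla f(X_t)\, dt - \gamma V_t \, dt + \sqrt{2\gamma}\, dB_t .
% Stationary distribution: \pi(x, v) \propto \exp\bigl(-f(x) - \|v\|^2/2\bigr),
% whose x-marginal is the target \pi(x).
```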
14:15 - 15:00 |
Li Wang: Learning-enhanced particle methods for gradient flow PDEs ↓ In the current stage of numerical methods for PDE, the primary challenge lies in addressing the complexities of high dimensionality while maintaining physical fidelity in our solvers. In this presentation, I will introduce deep learning assisted particle methods aimed at addressing some of these challenges. These methods combine the benefits of traditional structure-preserving techniques with the approximation power of neural networks, aiming to handle high dimensional problems with minimal training. I will begin with a discussion of general Wasserstein-type gradient flows and then extend the concept to the Landau equation in plasma physics. If time allows, I will also mention our recent progress in extending this framework to operator learning. (TCPL 201) |
15:00 - 15:30 | Coffee Break (TCPL Foyer) |
15:30 - 16:00 |
Wenjun Zhao: Data analysis through Wasserstein barycenter with general factors ↓ In this talk, we introduce an extension of the Wasserstein barycenter to general types of factors. To showcase its applicability in data analysis, we propose a general framework using the barycenter problem for simulating conditional distributions and beyond. Real-world examples on meteorological time series within purely data-driven settings will be presented to demonstrate our methodology.
This talk is based on joint work with the group of Esteban Tabak (NYU). (TCPL 201) |
16:00 - 16:30 |
Aram-Alexandre Pooladian: Wasserstein Flow Matching: Generative modeling over families of distributions ↓ The task of generative modeling typically concerns the transport of a single source distribution to a single target distribution on the basis of samples. Transport can take the form of an optimal transport map, but recent work has shown that it is more tractable to learn simple probability flows through what is known as "flow matching", a framework which suggests performing regression onto simple probability flows. In this work, we study the task of learning flows between families of distributions (i.e., many source measures to many target measures). Examples include families of mean-covariance pairs (e.g., Gaussians), or families of point-clouds --- in both of these examples, the geometry of the data is relevant to the generative task, and we wish to preserve this along the flows. We introduce Wasserstein flow matching (WFM), which appropriately lifts flow matching onto families of distributions by appealing to the Riemannian nature of the Wasserstein geometry. Our algorithm leverages theoretical and computational advances in entropic optimal transport, as well as the attention mechanism in our neural network architecture. As applications, we demonstrate how to generate representations of granular cell states from single-cell genomics data (via Bures--Wasserstein FM) and synthesize cellular microenvironments from spatial transcriptomics datasets. Code is available in the WassersteinFlowMatching repository. This is joint work with Doron Haviv, Brandon Amos, and Dana Pe'er. (TCPL 201) |
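The single-source, single-target flow matching that WFM builds on can be sketched in a few lines. This is a minimal Euclidean illustration, not WFM itself; the one-dimensional Gaussian data and the linear velocity model are illustrative assumptions.

```python
import numpy as np

# Flow matching in its simplest form: regress a velocity model v(x, t)
# onto the conditional target x1 - x0 along linear interpolations
# x_t = (1 - t) x0 + t x1, then integrate the learned ODE dx/dt = v(x, t)
# to transport source samples toward the target distribution.
rng = np.random.default_rng(0)
n = 512
x0 = rng.normal(0.0, 1.0, size=n)   # source samples: N(0, 1)
x1 = rng.normal(4.0, 1.0, size=n)   # target samples: N(4, 1)
t = rng.uniform(size=n)
xt = (1 - t) * x0 + t * x1

# Least-squares fit of v(x, t) = a*x + b*t + c to the regression target.
A = np.stack([xt, t, np.ones(n)], axis=1)
(a, b, c), *_ = np.linalg.lstsq(A, x1 - x0, rcond=None)

# Euler integration of the learned flow from t = 0 to t = 1.
x, dt = x0.copy(), 1e-2
for k in range(100):
    s = k * dt
    x += dt * (a * x + b * s + c)

print(x.mean())   # close to the target mean 4
```

WFM replaces the Euclidean points here with entire distributions (e.g., Gaussians under the Bures--Wasserstein geometry), so the regression and the ODE both live on a Riemannian manifold of measures rather than in R^d.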
16:30 - 17:00 | Matthew (Shunshi) Zhang (TCPL 201) |
17:30 - 19:30 |
Dinner ↓ A buffet dinner is served daily between 5:30pm and 7:30pm in Vistas Dining Room, top floor of the Sally Borden Building. (Vistas Dining Room) |
Friday, July 4 | |
---|---|
07:00 - 08:45 |
Breakfast ↓ Breakfast is served daily between 7 and 9am in the Vistas Dining Room, the top floor of the Sally Borden Building. (Vistas Dining Room) |
08:45 - 09:30 |
Hugo Lavenant: Gradient flows of potential energies in the geometry of Sinkhorn divergences ↓ What happens to Wasserstein gradient flows if one uses entropic optimal transport in the JKO scheme instead of plain optimal transport? I will explain why it may be relevant to use Sinkhorn divergences, built on entropic optimal transport, as they allow the regularization parameter to remain fixed. This approach leads to a new flow on the space of probability measures: a gradient flow with respect to the Riemannian geometry induced by Sinkhorn divergences. I will discuss the intriguing structure and features of this flow. This is joint work with Mathis Hardion. (TCPL 201) |
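The two ingredients the abstract combines can be stated in their standard forms (assumed here, not notation from the talk):

```latex
% Classical JKO scheme with step size \tau > 0:
\rho_{k+1} \in \operatorname*{arg\,min}_{\rho}\;
  \mathcal{F}(\rho) + \frac{1}{2\tau}\, W_2^2(\rho, \rho_k).

% Sinkhorn divergence built from entropic optimal transport
% OT_\varepsilon; the debiasing terms let \varepsilon stay fixed:
S_\varepsilon(\mu, \nu) = \mathrm{OT}_\varepsilon(\mu, \nu)
  - \tfrac{1}{2}\,\mathrm{OT}_\varepsilon(\mu, \mu)
  - \tfrac{1}{2}\,\mathrm{OT}_\varepsilon(\nu, \nu).
```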
09:30 - 10:00 | Garrett Mulcahy (TCPL 201) |
10:00 - 10:30 | Coffee Break (TCPL Foyer) |
10:30 - 11:00 |
Checkout by 11AM ↓ 5-day workshop participants are welcome to use BIRS facilities (TCPL) until 3 pm on Friday, although participants are still required to check out of the guest rooms by 11AM. (Front Desk - Professional Development Centre) |
10:30 - 11:15 | Jan Maas (TCPL 201) |
12:00 - 13:30 | Lunch from 11:30 to 13:30 (Vistas Dining Room) |