Theme: Generative Models and Inverse Problems
Date: Friday, November 15, 2024
Time: 1:30 PM to 5:00 PM
Location: SCAI (Sorbonne University, Pierre and Marie Curie Campus)
Program:
1:30 PM - 2:30 PM Stanislas Strasman (Sorbonne Université) - An analysis of the noise schedule for score-based generative models
Abstract: Score-based generative models (SGMs) aim at estimating a target data distribution by learning score functions using only noise-perturbed samples from the target. Recent literature has focused extensively on assessing the error between the target and estimated distributions, gauging the generative quality through the Kullback-Leibler (KL) divergence and Wasserstein distances. Under mild assumptions on the data distribution, we establish an upper bound for the KL divergence between the target and the estimated distributions, explicitly depending on any time-dependent noise schedule. Under additional regularity assumptions, taking advantage of favorable underlying contraction mechanisms, we provide a tighter error bound in Wasserstein distance compared to state-of-the-art results. In addition to being tractable, this upper bound jointly incorporates properties of the target distribution and SGM hyperparameters that need to be tuned during training.
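To make the setting concrete, here is a minimal sketch of denoising score matching under a variance-preserving forward process with a linear, time-dependent noise schedule. The schedule parameters and the network interface score_net(x_t, t) are illustrative assumptions for this sketch, not the speaker's code.

```python
import torch

def alpha_bar(t, beta_min=0.1, beta_max=20.0):
    # Signal retention exp(-0.5 * int_0^t beta(s) ds) for the illustrative
    # linear schedule beta(s) = beta_min + s * (beta_max - beta_min); this
    # schedule is the hyperparameter the error bounds depend on.
    integral = beta_min * t + 0.5 * (beta_max - beta_min) * t ** 2
    return torch.exp(-0.5 * integral)

def dsm_loss(score_net, x0):
    # Denoising score matching: perturb data through the forward process and
    # regress the score of the Gaussian perturbation kernel
    # N(sqrt(alpha_bar) * x0, (1 - alpha_bar) * I).
    t = torch.rand(x0.shape[0], device=x0.device)
    a = alpha_bar(t).view(-1, *([1] * (x0.dim() - 1)))
    eps = torch.randn_like(x0)
    xt = a.sqrt() * x0 + (1.0 - a).sqrt() * eps
    target = -eps / (1.0 - a).sqrt()            # exact conditional score
    return ((score_net(xt, t) - target) ** 2).mean()
```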
2:30 PM - 3:30 PM Geneviève Robin (Owkin) - Generative methods for sampling transition paths in molecular dynamics
Abstract: Molecular systems often remain trapped for long times around some local minimum of the potential energy function before switching to another one -- a behavior known as metastability. Simulating transition paths that link one metastable state to another is difficult with direct numerical methods. In view of the promise of machine learning techniques, this work explores two approaches to generate transition paths more efficiently: sampling methods based on generative models such as variational autoencoders, and importance sampling methods based on reinforcement learning.
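The variational-autoencoder route can be pictured with a minimal sketch like the one below: a discretized transition path is compressed into a low-dimensional latent code, and new candidate paths are decoded from latent-prior samples. All dimensions, layer sizes, and names are illustrative assumptions, not the speaker's model.

```python
import torch
import torch.nn as nn

class PathVAE(nn.Module):
    # Encode a discretized transition path (flattened to a vector of length
    # path_dim) into a latent code, and decode latents back to paths.
    def __init__(self, path_dim, latent_dim=8, hidden=128):
        super().__init__()
        self.latent_dim = latent_dim
        self.enc = nn.Sequential(nn.Linear(path_dim, hidden), nn.SiLU(),
                                 nn.Linear(hidden, 2 * latent_dim))
        self.dec = nn.Sequential(nn.Linear(latent_dim, hidden), nn.SiLU(),
                                 nn.Linear(hidden, path_dim))

    def loss(self, x):
        # Negative ELBO with a Gaussian decoder: reconstruction + KL to N(0, I).
        mu, log_var = self.enc(x).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * (0.5 * log_var).exp()  # reparameterization trick
        recon = self.dec(z)
        kl = 0.5 * (mu ** 2 + log_var.exp() - 1.0 - log_var).sum(-1).mean()
        return ((recon - x) ** 2).sum(-1).mean() + kl

    @torch.no_grad()
    def sample_paths(self, n):
        # Draw candidate transition paths by decoding latent-prior samples.
        return self.dec(torch.randn(n, self.latent_dim))
```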
3:30 PM - 4:00 PM Break
4:00 PM - 5:00 PM Gabriel Victorino Cardoso (École des Mines) - Solving inverse problems with score-based priors
Abstract: Solving ill-posed (Bayesian) inverse problems generally relies on the power of the prior distribution (or data fidelity term). In this talk, we focus on how to use an off-the-shelf score-based generative model as a prior and how to modify the inner sampling procedure of the generative model to sample (approximately) from the posterior distribution. This is done without retraining the off-the-shelf generative model. We then present how we have used this procedure to solve inverse problems that arise in electrocardiogram analysis. (A generic, simplified sketch of this guidance idea is given after the references below.)
References:
[1] Gabriel Cardoso, Yazid Janati, Sylvain Le Corff, and Eric Moulines. Monte Carlo guided Denoising Diffusion models for Bayesian linear inverse problems. In The Twelfth International Conference on Learning Representations (ICLR), 2024.
[2] G. V. Cardoso, L. Bedin, J. Duchateau, R. Dubois, and E. Moulines. Bayesian ECG reconstruction using denoising diffusion generative models. To appear in NeurIPS 2024.
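As a rough illustration of steering a frozen score-based prior toward the posterior, the sketch below follows a generic guidance recipe for a linear observation y = A x + noise: each reverse step adds an analytic likelihood gradient to the pretrained prior score, with no retraining. This is a simplified stand-in, not the Monte Carlo guided procedure of [1]; the sampler, step count, and noise level are illustrative assumptions.

```python
import torch

def posterior_sample(score_net, A, y, shape, n_steps=500, sigma_y=0.05, guidance=1.0):
    # Annealed Langevin-type loop: the frozen score model supplies the prior
    # score, and an analytic likelihood gradient for y = A x + eps steers the
    # samples toward the posterior -- score_net is never retrained.
    x = torch.randn(shape)
    dt = 1.0 / n_steps
    for i in range(n_steps, 0, -1):
        t = torch.full((shape[0],), i / n_steps)
        with torch.no_grad():
            prior_score = score_net(x, t)              # pretrained, off-the-shelf prior
        lik_grad = (y - x @ A.T) @ A / sigma_y ** 2    # grad_x log p(y | x)
        x = x + dt * (prior_score + guidance * lik_grad) \
              + (2 * dt) ** 0.5 * torch.randn_like(x)
    return x
```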