
Reminder: All About That Bayes Seminar Series - Generative Models and Inverse Problems, November 15, 2024


All About That Bayes

Nov 7, 2024, 3:07:26 AM11/7/24
to All About That Bayes


Dear Colleagues,

A brief message to remind you about the upcoming session of our thematic afternoon, Generative Models and Inverse Problems, on Friday, November 15. The session will be held at SCAI (access map) from 1:30 PM to 5:00 PM (see program below).

The event is open to all, but please confirm your attendance by registering via the following link: https://forms.office.com/e/fuVzYurNRY.

Mark your calendars for the upcoming 2025 dates: January 24 and April 18!

Looking forward to seeing many of you there!

Best regards,
Julien Stoehr and Sylvain Le Corff
For the Bayesian Statistics Group of the SFdS


Theme: Generative Models and Inverse Problems
Date: Friday, November 15, 2024
Time: 1:30 PM to 5:00 PM
Location: SCAI (Sorbonne University, Pierre and Marie Curie Campus)


Program:


1:30 PM - 2:30 PM  Stanislas Strasman (Sorbonne Université) - An analysis of the noise schedule for score-based generative models

Abstract: Score-based generative models (SGMs) aim at estimating a target data distribution by learning score functions using only noise-perturbed samples from the target. Recent literature has focused extensively on assessing the error between the target and estimated distributions, gauging the generative quality through the Kullback-Leibler (KL) divergence and Wasserstein distances. Under mild assumptions on the data distribution, we establish an upper bound for the KL divergence between the target and the estimated distributions, explicitly depending on any time-dependent noise schedule. Under additional regularity assumptions, taking advantage of favorable underlying contraction mechanisms, we provide a tighter error bound in Wasserstein distance compared to state-of-the-art results. In addition to being tractable, this upper bound jointly incorporates properties of the target distribution and SGM hyperparameters that need to be tuned during training.
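The role of the time-dependent noise schedule discussed above can be illustrated with a toy sketch (our own construction, not the speaker's code): a discretized variance-preserving forward process in which a linearly increasing schedule beta_t carries the data distribution toward a standard Gaussian.

```python
import numpy as np

# Toy sketch (illustration only): variance-preserving forward process
#   x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * eps,
# where alpha_bar_t = prod_{s<=t} (1 - beta_s). The time-dependent noise
# schedule beta_t controls how quickly the data are pushed toward N(0, 1).

def linear_beta_schedule(n_steps, beta_min=1e-4, beta_max=0.02):
    """Linearly increasing noise levels beta_1..beta_T (a common choice)."""
    return np.linspace(beta_min, beta_max, n_steps)

def perturb(x0, t, betas, rng):
    """Sample x_t from the forward process at discrete time t (0-indexed)."""
    alpha_bar = np.prod(1.0 - betas[: t + 1])
    eps = rng.standard_normal(x0.shape)
    return np.sqrt(alpha_bar) * x0 + np.sqrt(1.0 - alpha_bar) * eps

rng = np.random.default_rng(0)
betas = linear_beta_schedule(1000)
x0 = rng.standard_normal(5000)       # stand-in "data" samples
xT = perturb(x0, 999, betas, rng)    # at the final step, close to N(0, 1)
print(round(float(np.var(xT)), 2))
```

Different schedules (cosine, geometric, learned) change how the perturbation strength is allocated over time, which is exactly the quantity the KL and Wasserstein bounds in the abstract depend on.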

2:30 PM - 3:30 PM  Geneviève Robin (Owkin) - Generative methods for sampling transition paths in molecular dynamics

Abstract: Molecular systems often remain trapped for long times around some local minimum of the potential energy function, before switching to another one -- a behavior known as metastability. Simulating transition paths linking one metastable state to another one is difficult by direct numerical methods. In view of the promises of machine learning techniques, we explore in this work two approaches to more efficiently generate transition paths: sampling methods based on generative models such as variational autoencoders, and importance sampling methods based on reinforcement learning.
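The metastability problem described above can be seen in a minimal sketch (ours, not the speaker's code): overdamped Langevin dynamics in a one-dimensional double-well potential V(x) = (x² - 1)², where at low temperature the trajectory stays trapped near one minimum for long stretches, making direct simulation of transition paths expensive.

```python
import numpy as np

# Minimal illustration of metastability: overdamped Langevin dynamics
#   x_{k+1} = x_k - dV/dx * dt + sqrt(2 * T * dt) * xi_k
# in the double well V(x) = (x^2 - 1)^2, with minima at x = -1 and x = +1
# separated by a barrier of height 1 at x = 0.

def grad_V(x):
    return 4.0 * x * (x * x - 1.0)  # derivative of (x^2 - 1)^2

def langevin(x0, n_steps, dt=1e-3, temperature=0.2, seed=0):
    rng = np.random.default_rng(seed)
    x, traj = x0, np.empty(n_steps)
    noise_scale = np.sqrt(2.0 * temperature * dt)
    for i in range(n_steps):
        x = x - grad_V(x) * dt + noise_scale * rng.standard_normal()
        traj[i] = x
    return traj

traj = langevin(-1.0, 200_000)                        # start in the left well
crossings = int(np.sum(np.diff(np.sign(traj)) != 0))  # barrier crossings
print(crossings)
```

Crossings are rare relative to the 200,000 steps simulated; this rarity is what motivates generative samplers (e.g. variational autoencoders) and reinforcement-learning-based importance sampling as alternatives to brute-force simulation.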

3:30 PM - 4:00 PM  Break

4:00 PM - 5:00 PM  Gabriel Victorino Cardoso (École des Mines) - Solving inverse problems with score-based priors

Abstract: Solving ill-posed (Bayesian) inverse problems generally relies on the power of the prior distribution (or data-fidelity term). In this talk, we focus on how to use an off-the-shelf score-based generative model as a prior, and how to modify the inner sampling procedure of the generative model to sample (approximately) from the posterior distribution. This is done without retraining the off-the-shelf generative model. We then present how we have used this procedure to solve inverse problems that arise in electrocardiogram analysis.
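The idea of reusing a prior score inside the sampler can be sketched in a hedged toy example (our construction, not the speaker's method): for a linear inverse problem y = A x + noise with a Gaussian prior, the posterior score decomposes as ∇log p(x|y) = ∇log p(x) + ∇log p(y|x), so adding the data-fidelity gradient to the prior score inside a Langevin sampler targets the posterior without retraining anything. Here both scores are analytic, so the samples can be checked against the exact posterior.

```python
import numpy as np

# Toy linear inverse problem: y = A*x + N(0, sigma2), prior x ~ N(0, 1).
# The posterior score is the sum of the prior score and the
# data-fidelity (likelihood) gradient; we plug it into unadjusted
# Langevin dynamics. Exact posterior: N(4/3, 1/3) for the values below.

A, sigma2 = 1.0, 0.5        # forward operator and noise variance
y = 2.0                     # observed data

def prior_score(x):         # score of the N(0, 1) prior
    return -x

def likelihood_score(x):    # gradient of log N(y; A*x, sigma2) in x
    return A * (y - A * x) / sigma2

def posterior_langevin(n_steps=200_000, step=1e-2, seed=0):
    rng = np.random.default_rng(seed)
    x, samples = 0.0, np.empty(n_steps)
    for i in range(n_steps):
        score = prior_score(x) + likelihood_score(x)
        x = x + step * score + np.sqrt(2.0 * step) * rng.standard_normal()
        samples[i] = x
    return samples[n_steps // 2:]   # discard burn-in

samples = posterior_langevin()
print(round(float(samples.mean()), 2), round(float(samples.var()), 2))
```

In the talk's setting the analytic prior score is replaced by a pretrained score network and the sampler is the generative model's own (modified) inner loop, but the decomposition being exploited is the same.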

References:
[1] Cardoso, G., Janati, Y., Le Corff, S., & Moulines, E. Monte Carlo guided denoising diffusion models for Bayesian linear inverse problems. The Twelfth International Conference on Learning Representations, 2024.
[2] Cardoso, G. V., Bedin, L., Duchateau, J., Dubois, R., & Moulines, E. Bayesian ECG reconstruction using denoising diffusion generative models. To appear in NeurIPS 2024.
