
REMINDER: All About That... Seminar Series - Simulation-based inference, January 24, 2025


All About That Bayes

Jan 10, 2025, 6:13:28 AM
to All About That Bayes



Dear colleagues,


We wish you a happy New Year! As previously announced, the next session of the "All About That..." seminar series will take place on January 24, 2025, at SCAI (Access map). The program is below!


The event is open to all, but please confirm your attendance by registering via the following link: https://forms.office.com/e/me7PJmkQDm



We look forward to seeing many of you there!


Best regards,

Julien Stoehr and Sylvain Le Corff

For the Bayesian Statistics Group of the Société Française de Statistique (SFdS)





Theme: Simulation-based inference

Date: Friday, January 24, 2025

Time: 13:30 to 17:00

Location: SCAI (Sorbonne Université, Campus Pierre et Marie Curie)


Program:


13:30 - 14:30 Joshua Bon (Université Paris Dauphine) - Bayesian score calibration for approximate models

 
Abstract: Scientists continue to develop increasingly complex mechanistic models to reflect their knowledge more realistically. Statistical inference using these models can be challenging since the corresponding likelihood function is often intractable and model simulation may be computationally burdensome. Fortunately, in many of these situations, it is possible to adopt a surrogate model or approximate likelihood function. It may be convenient to conduct Bayesian inference directly with the surrogate, but this can result in bias and poor uncertainty quantification. In this paper (https://arxiv.org/abs/2211.05357) we propose a new method for adjusting approximate posterior samples to reduce bias and produce more accurate uncertainty quantification. We do this by optimizing a transform of the approximate posterior that maximizes a scoring rule. Our approach requires only a (fixed) small number of complex model simulations and is numerically stable. We demonstrate beneficial corrections to several approximate posteriors using our method on several examples of increasing complexity.
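To make the idea concrete, here is a minimal, hypothetical sketch of the kind of score-based correction the abstract describes: an under-dispersed surrogate posterior is adjusted by an affine transform whose scale and shift are chosen to minimise the average energy score against the parameter that generated each simulated dataset. The toy model, the surrogate, and the optimiser are illustrative assumptions, not the method from the paper.

```python
# Minimal, illustrative sketch (toy model, not the authors' implementation):
# an under-dispersed surrogate posterior is corrected by an affine transform
# theta -> mean + a * (theta - mean) + b, with (a, b) chosen to minimise the
# average energy score against the parameter that generated each dataset.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def simulate_data(theta, n=20):
    # assumed true model: y_1, ..., y_n ~ N(theta, 1)
    return rng.normal(theta, 1.0, size=n)

def surrogate_posterior(y, m=100):
    # deliberately mis-calibrated surrogate: roughly right mean, variance too small
    return rng.normal(y.mean(), 0.1, size=m)

def energy_score(samples, theta_true):
    # negatively oriented energy score; smaller is better
    return (np.mean(np.abs(samples - theta_true))
            - 0.5 * np.mean(np.abs(samples[:, None] - samples[None, :])))

# training set: parameter draws, simulated data, surrogate posterior samples
thetas = rng.normal(0.0, 2.0, size=40)
post = [surrogate_posterior(simulate_data(t)) for t in thetas]

def loss(params):
    a, b = params
    transformed = [s.mean() + a * (s - s.mean()) + b for s in post]
    return np.mean([energy_score(s, t) for s, t in zip(transformed, thetas)])

res = minimize(loss, x0=[1.0, 0.0], method="Nelder-Mead")
print("learned scale and shift:", res.x)  # scale > 1 widens the too-narrow surrogate
```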


14:30 - 15:30 Giacomo Zanella (Bocconi University) - Entropy contraction of the Gibbs sampler under log-concavity


Abstract: In this talk I will present recent work (https://arxiv.org/abs/2410.00858) on the non-asymptotic analysis of the Gibbs sampler, a classical and popular MCMC algorithm for sampling. In particular, under the assumption that the probability measure π of interest is strongly log-concave, we show that the random scan Gibbs sampler contracts in relative entropy, and provide a sharp characterization of the associated contraction rate. The result implies that, under appropriate conditions, the number of full evaluations of π required for the Gibbs sampler to converge is independent of the dimension. If time permits, I will also discuss connections and applications of the above results to the problem of zero-order parallel sampling, as well as extensions to Hit-and-Run and Metropolis-within-Gibbs. 


Based on joint work with Filippo Ascolani and Hugo Lavenant.
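For readers less familiar with the algorithm being analysed, here is a small self-contained sketch (a toy illustration, not code from the paper) of a random-scan Gibbs sampler on a strongly log-concave target, namely a multivariate Gaussian with precision matrix Q, where each step refreshes one uniformly chosen coordinate from its Gaussian full conditional.

```python
# Toy illustration (not from the paper): random-scan Gibbs sampler targeting
# a strongly log-concave distribution, here N(0, Q^{-1}) with precision Q.
import numpy as np

rng = np.random.default_rng(1)

d = 5
A = rng.normal(size=(d, d))
Q = A @ A.T + d * np.eye(d)            # positive-definite precision matrix

def gibbs_step(x, Q):
    i = rng.integers(len(x))           # random scan: pick one coordinate uniformly
    # full conditional: x_i | x_{-i} ~ N(-sum_{j != i} Q_ij x_j / Q_ii, 1 / Q_ii)
    cond_mean = -(Q[i] @ x - Q[i, i] * x[i]) / Q[i, i]
    x = x.copy()
    x[i] = rng.normal(cond_mean, 1.0 / np.sqrt(Q[i, i]))
    return x

x = np.zeros(d)
chain = []
for _ in range(20000):
    x = gibbs_step(x, Q)
    chain.append(x)

chain = np.array(chain[5000:])         # drop burn-in
print("empirical covariance (should approach Q^{-1}):")
print(np.round(np.cov(chain.T), 2))
print(np.round(np.linalg.inv(Q), 2))
```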


15:30 - 16:00 Break


16:00 - 17:00 Paul Bastide (Université Paris Cité) - Goodness of Fit for Bayesian Generative Models with Applications in Population Genetics


Abstract: In population genetics, inference about intractable likelihood models is common, and simulation methods, including Approximate Bayesian Computation (ABC) and Simulation-Based Inference (SBI), are essential. ABC/SBI methods work by simulating instrumental data sets of the models under study and comparing them with the observed data set $y_{obs}$. Advanced machine learning tools are used for tasks such as model selection and parameter inference. The present work focuses on model criticism. This type of analysis, called goodness of fit (GoF), is important for model validation. It can also be used for model pruning when the number of candidates to be considered is excessive, especially in the context where data simulation is expensive. We introduce two new GoF tests based on the local outlier factor (LOF), an indicator that was initially defined for outlier and novelty detection. We test whether $y_{obs}$ is distributed from the prior predictive distribution (pre-inference GoF) and whether there is a parameter value such that $y_{obs}$ is distributed from the likelihood with that value (post-inference GoF). We evaluate the performance of our two GoF tests on simulated datasets from three different model settings of varying complexity, and on a dataset of single nucleotide polymorphism (SNP) markers for the evaluation of complex evolutionary scenarios of modern human populations.


Joint work with Guillaume Le Mailloux, Jean-Michel Marin and Arnaud Estoup.
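As a rough illustration of the pre-inference test described above, the sketch below (a toy example with assumed summary statistics, not the authors' implementation) simulates a reference set from the prior predictive, fits a local outlier factor model to it, and calibrates a p-value for the observed summaries against held-out prior-predictive draws.

```python
# Toy sketch of a LOF-based pre-inference goodness-of-fit check
# (illustrative assumptions throughout; not the authors' code).
import numpy as np
from sklearn.neighbors import LocalOutlierFactor

rng = np.random.default_rng(2)

def prior_predictive_summaries(n_sim, n_obs=50):
    theta = rng.normal(0.0, 1.0, size=n_sim)                     # prior draws
    data = rng.normal(theta[:, None], 1.0, size=(n_sim, n_obs))  # simulated data
    return np.column_stack([data.mean(axis=1), data.std(axis=1)])  # summaries

ref = prior_predictive_summaries(2000)    # reference set for fitting the LOF
calib = prior_predictive_summaries(500)   # held-out set for calibrating the p-value

lof = LocalOutlierFactor(n_neighbors=20, novelty=True).fit(ref)

# "observed" data generated under a different model (shifted, heavy-tailed),
# so the check should flag a poor fit
y_obs = rng.standard_t(df=2, size=50) + 3.0
s_obs = np.array([[y_obs.mean(), y_obs.std()]])

# score_samples returns the negated LOF: lower means more outlying
score_obs = lof.score_samples(s_obs)[0]
score_calib = lof.score_samples(calib)
p_value = np.mean(score_calib <= score_obs)
print("pre-inference GoF p-value:", p_value)
```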
