
Reminder - October 10, with Kaniav Kamary and some other news


All About That Bayes

Oct 9, 2023, 2:54:23 AM
to All About That Bayes
Dear all,

a gentle reminder that tomorrow (October 10, 2023) at 16:00 we will have a talk by Kaniav Kamary (CentraleSupélec) on Bayesian principal component analysis.

Abstract. The technique of principal component analysis (PCA) has recently been expressed as the maximum likelihood solution for a generative latent variable model. In this talk, I'll first present the probabilistic reformulation that forms the basis for a Bayesian treatment of PCA. Then, my focus will be on showing that the effective dimensionality of the latent space (equivalent to the number of retained principal components) can be determined automatically as part of the Bayesian inference procedure.
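
For those who would like to experiment before the talk, below is a rough sketch of EM-style Bayesian PCA with an automatic relevance determination (ARD) prior, in the spirit of Bishop's Bayesian PCA; the update rules, thresholds, and function names are illustrative assumptions, not the speaker's exact algorithm.

# A rough sketch of Bayesian PCA with an ARD prior; illustrative only.
import numpy as np

def bayesian_pca(X, q, n_iter=200, seed=0):
    """X: (n, d) data matrix; q: maximum number of latent components."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Xc = X - X.mean(axis=0)              # centre the data
    W = rng.standard_normal((d, q))      # loading matrix
    sigma2 = 1.0                         # isotropic noise variance
    alpha = np.ones(q)                   # ARD precisions, one per column of W
    for _ in range(n_iter):
        # E-step: posterior moments of the latent variables z | x
        Minv = np.linalg.inv(W.T @ W + sigma2 * np.eye(q))
        Ez = Xc @ W @ Minv                       # E[z_n], stacked as rows
        Ezz = n * sigma2 * Minv + Ez.T @ Ez      # sum_n E[z_n z_n^T]
        # M-step; the ARD term sigma2 * diag(alpha) shrinks unneeded columns
        W = Xc.T @ Ez @ np.linalg.inv(Ezz + sigma2 * np.diag(alpha))
        sigma2 = (np.sum(Xc**2) - 2 * np.sum(Ez * (Xc @ W))
                  + np.trace(Ezz @ W.T @ W)) / (n * d)
        # ARD update: precisions of superfluous columns grow without bound
        alpha = d / np.maximum(np.sum(W**2, axis=0), 1e-12)
    effective_dim = int(np.sum(np.sum(W**2, axis=0) > 1e-8))
    return W, sigma2, effective_dim

Columns of W whose ARD precision diverges are driven to zero, so the number of surviving columns plays the role of the automatically determined effective dimensionality mentioned in the abstract.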

The seminar will take place at SCAI (Sorbonne Université, Campus Pierre et Marie Curie) and will be available online via the Zoom link: https://cnrs.zoom.us/j/99452802078?pwd=aWV1V3cyYzB3cHl5emY3a0ZKczNuQT09
(Meeting ID: 994 5280 2078, Passcode: ic51nf)

Another piece of information that may be of interest: a monthly seminar on the theory and practice of Monte Carlo in statistics and data science starts on Friday, October 13th. It will take place at PariSanté Campus (Room 7) from 16:00 to 18:00. You can find the program below.

Best regards,
The All About That Bayes organising team

-------------------------------------------------------------------

4 p.m. Piecewise deterministic sampling with splitting schemes. Andrea Bertazzi, CMAP - École Polytechnique

Piecewise deterministic Markov processes (PDMPs) have received substantial interest in recent years as an alternative to classical Markov chain Monte Carlo algorithms. While the theoretical properties of PDMPs have been studied extensively, their practical implementation remains limited to specific applications in which bounds on the gradient of the negative log-target can be derived. To address this problem, we propose to approximate PDMPs using splitting schemes, that is, by simulating the deterministic dynamics and the random jumps in two separate stages. We show that symmetric splittings of PDMPs are of second order. We then focus on the Zig-Zag sampler (ZZS) and show how to remove the bias of the splitting scheme with a skew-reversible Metropolis filter. Finally, we illustrate with numerical simulations the advantages of our proposed scheme over competitors.
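
As a rough illustration of the splitting idea, the sketch below alternates half steps of the deterministic Zig-Zag transport with a velocity-flip step whose rates are frozen at the current point, for a standard Gaussian target. This transport-flip-transport step is one plausible symmetric splitting under our own assumptions; it is not claimed to be the exact scheme (or the Metropolis correction) from the talk.

# A rough sketch of a splitting-scheme approximation to the Zig-Zag sampler.
import numpy as np

def grad_U(x):
    return x          # gradient of U(x) = ||x||^2 / 2, i.e. a N(0, I) target

def zigzag_splitting(x0, n_steps=50_000, delta=0.1, seed=1):
    rng = np.random.default_rng(seed)
    d = x0.size
    x = x0.astype(float).copy()
    v = rng.choice([-1.0, 1.0], size=d)        # Zig-Zag velocities
    samples = np.empty((n_steps, d))
    for k in range(n_steps):
        x += 0.5 * delta * v                   # half step of transport
        lam = np.maximum(v * grad_U(x), 0.0)   # switching rates, frozen at x
        flip = rng.random(d) < 1.0 - np.exp(-delta * lam)
        v = np.where(flip, -v, v)              # coordinate-wise velocity flips
        x += 0.5 * delta * v                   # second half step of transport
        samples[k] = x
    return samples

Because no Metropolis filter is applied here, the chain retains a discretisation bias; the skew-reversible filter discussed in the talk is what removes it.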

5 p.m. Bayesian score calibration for approximate models. Joshua Bon, Ceremade - Université Paris Dauphine-PSL

Scientists continue to develop increasingly complex mechanistic models to reflect their knowledge more realistically. Statistical inference using these models can be challenging since the corresponding likelihood function is often intractable and model simulation may be computationally burdensome.  Fortunately, in many of these situations, it is possible to adopt a surrogate model or approximate likelihood function.  It may be convenient to base Bayesian inference directly on the surrogate, but this can result in bias and poor uncertainty quantification.  In this paper we propose a new method for adjusting approximate posterior samples to reduce bias and produce more accurate uncertainty quantification.  We do this by optimizing a transform of the approximate posterior that maximizes a scoring rule.  Our approach requires only a (fixed) small number of complex model simulations and is numerically stable.  We demonstrate good performance of the new method on several examples of increasing complexity.
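
To make the calibration idea concrete, here is a heavily simplified one-dimensional sketch: given calibration pairs of a true parameter and approximate-posterior samples, it optimises a scale-and-shift adjustment by minimising the (negatively oriented) energy score. The affine family, the optimiser, and all names below are illustrative assumptions, not the paper's actual implementation.

# A simplified sketch of adjusting approximate-posterior samples by
# optimising a scoring rule; illustrative only.
import numpy as np

def energy_score(samples, y):
    """Negatively oriented energy score for 1-D samples: smaller is better."""
    term1 = np.mean(np.abs(samples - y))
    term2 = 0.5 * np.mean(np.abs(samples[:, None] - samples[None, :]))
    return term1 - term2

def calibrate(pairs, n_iter=500, lr=0.01, eps=1e-4):
    """pairs: list of (true_theta, approx_posterior_samples) tuples."""
    params = np.array([1.0, 0.0])          # scale a and shift b

    def loss(p):
        a, b = p
        return np.mean([energy_score(a * s + b, t) for t, s in pairs])

    for _ in range(n_iter):                # crude finite-difference descent
        grad = np.array([(loss(params + eps * e) - loss(params - eps * e))
                         / (2.0 * eps) for e in np.eye(2)])
        params -= lr * grad
    return params                          # adjusted samples: a * theta + b

Per the abstract, the calibration pairs would come from a fixed, small number of complex-model simulations, each paired with an approximate posterior fitted to the simulated data; the sketch only conveys the optimise-a-transform-by-scoring-rule structure.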