In Bayesian inference, we must specify a model for the data (a likelihood) and a model for parameters (a prior). Consider two questions:
Why is it more complicated to specify the likelihood than the prior?
In order to specify the prior, how can we move between the theoretical literature (invariance, normality assumptions, ...) and the applied literature (expert elicitation, robustness, ...)?
I will discuss these questions in the domain of causal inference: prior distributions for causal effects, regression coefficients, and the other parameters in causal models.
Topic: All About That Bayes seminar - Andrew Gelman - Prior distributions for causal inference
Time: Oct 11, 2022, 02:00 PM Paris
Join Zoom Meeting
https://univ-grenoble-alpes-fr.zoom.us/j/93236504120?pwd=WngySU9FSzJGYVE3b253Ympubk9kdz09
Meeting ID: 932 3650 4120
Passcode: 369362