Normalizing constant / marginal likelihood in TMB?


Katie Paulson

Dec 23, 2025, 11:51:42 AM
to us...@tmb-project.org
Hello,

Is it possible to use TMB to estimate the normalizing constant (marginal likelihood) for a Bayesian model?

To be clear, I have something like:
p(theta | data) = p(data | theta) p(theta) / p(data)

with a likelihood defined for p(data | theta), and latent model defined for theta. I've been using TMB to fit this model and approximate the posterior distribution p(theta | data).

Now, I would like to also get the normalizing constant p(data). With other tools I know how to extract this value (e.g. R-INLA via mlik in the model output) -- is this also possible in TMB?

Thank you!
Katie Paulson


Paul vdb

Feb 17, 2026, 1:20:12 PM
to TMB Users
`aghq` is an R package that uses quadrature to approximate the posterior from a TMB object. It is similar in spirit to INLA: it uses a Laplace approximation for the inner marginalization over the latent variables, then adaptive Gauss-Hermite quadrature for the outer marginalization to get p(data).
awstringer1/aghq: Adaptive Gauss Hermite Quadrature for Bayesian Inference

See the vignette: `normalized_posterior$lognormconst` on the fitted object gives the log normalizing constant... I think, but double-check.
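A rough sketch of how this could look, under some assumptions: the template file name (`model.cpp`), data object (`dat`), parameter names, and starting value below are all placeholders, and the TMB template is assumed to return the negative log posterior, -log p(data | theta) - log p(theta), per the usual TMB sign convention.

```r
library(TMB)
library(aghq)

## Hypothetical TMB template "model.cpp" returning the
## NEGATIVE log posterior: -log p(data | theta) - log p(theta).
compile("model.cpp")
dyn.load(dynlib("model"))
obj <- TMB::MakeADFun(data = dat, parameters = list(theta = 0), DLL = "model")

## aghq wants the LOG posterior, so flip the sign of the TMB functions.
ff <- list(
  fn = function(x) -obj$fn(x),
  gr = function(x) -obj$gr(x),
  he = function(x) -obj$he(x)
)

## Adaptive Gauss-Hermite quadrature with k = 5 points per dimension.
fit <- aghq::aghq(ff, k = 5, startingvalue = 0)

## Approximate log normalizing constant, log p(data):
fit$normalized_posterior$lognormconst
```

For models with latent variables declared as random effects in the TMB template, I believe `aghq::marginal_laplace_tmb()` takes the `MakeADFun` object directly and handles the inner Laplace step, with the same `normalized_posterior$lognormconst` slot on the result, but check the package vignette.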
