Efficient joint marginal covariance recovery from iSAM2 / Bayes tree for JCBB?

Ovar

Mar 22, 2026, 7:34:30 PM
to gtsam users

Hi,

I’m using GTSAM/iSAM2 for landmark-based SLAM, and I need joint marginal covariances for subsets of variables in order to run JCBB data association.
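For context, the core operation JCBB needs these joint covariances for is a joint compatibility gate: a squared Mahalanobis distance of the stacked innovation under the joint innovation covariance, compared against a chi-squared threshold. A minimal numpy sketch (names and the toy numbers are my own, not from any GTSAM API):

```python
import numpy as np

def joint_compatibility(innovation, S, chi2_threshold):
    """Squared Mahalanobis distance of the stacked innovation under the
    joint innovation covariance S, gated against a chi-squared threshold
    with dim(innovation) degrees of freedom."""
    d2 = float(innovation @ np.linalg.solve(S, innovation))
    return d2, d2 < chi2_threshold

# Toy 2-DOF check; 5.991 is the 95% chi-squared quantile for 2 dof.
nu = np.array([0.3, -0.2])
S = np.diag([0.1, 0.1])
d2, compatible = joint_compatibility(nu, S, 5.991)
```

The joint innovation covariance S is where the joint marginal covariance of the queried variables enters, which is why the extraction below dominates the runtime.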

At the moment, covariance extraction is the major runtime bottleneck. Profiling on the Victoria Park dataset shows that this step alone accounts for more than 80% of total runtime. My current approach rebuilds a Marginals object from the full factor graph at every measurement step, and then uses Marginals::jointMarginalInformation() to obtain the covariance (or information matrix) for the variables of interest.
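To make the cost concrete, here is the linear-algebra identity behind a batch joint marginal, in a small numpy sketch (all variable names are illustrative): the joint marginal covariance of a subset is the corresponding block of the full covariance, equivalently the inverse of the Schur complement that marginalizes out the remaining variables. Either route touches the full system, which is why batch recovery at every step is expensive.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8
A = rng.standard_normal((n, n))
Lam = A @ A.T + n * np.eye(n)    # a full (dense) information matrix
idx = [1, 4, 6]                  # variables of interest

# Joint marginal covariance = corresponding block of the full covariance.
Sigma_full = np.linalg.inv(Lam)
Sigma_block = Sigma_full[np.ix_(idx, idx)]

# Equivalently, the joint marginal *information* is the Schur complement
# marginalizing out the remaining variables.
rest = [i for i in range(n) if i not in idx]
Laa = Lam[np.ix_(idx, idx)]
Lab = Lam[np.ix_(idx, rest)]
Lbb = Lam[np.ix_(rest, rest)]
Info_marg = Laa - Lab @ np.linalg.solve(Lbb, Lab.T)

assert np.allclose(np.linalg.inv(Info_marg), Sigma_block)
```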

Since the optimization itself is incremental, this batch-style covariance recovery seems poorly matched to iSAM2: a significant amount of work (linearization, multifrontal elimination) is redone at every step just to recover marginals, which erodes much of the practical benefit of using an incremental solver.

I noticed that:

  • the iSAM2 paper mentions that marginals may be recovered recursively from the Bayes tree, in a way that is formally equivalent to the dynamic programming ideas in the “Covariance Recovery from a Square Root Information Matrix for Data Association” paper
  • BayesTree in GTSAM already appears to support marginal factors for one or two variables, but not for larger subsets
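For reference, the recursive recovery mentioned in that paper can be written as a back-substitution recurrence on the upper-triangular square-root information matrix R, with Sigma = (R^T R)^{-1}. A dense toy version in numpy (the real Bayes-tree formulation exploits sparsity and the clique structure, which this sketch does not):

```python
import numpy as np

def covariance_from_R(R):
    """Dense back-substitution recurrence recovering Sigma = (R^T R)^{-1}
    from an upper-triangular square-root information matrix R, filling
    one column at a time from the last variable backwards."""
    n = R.shape[0]
    S = np.zeros((n, n))
    for l in range(n - 1, -1, -1):
        # Diagonal: sigma_ll = (1/r_ll) * (1/r_ll - sum_{j>l} r_lj * sigma_jl)
        S[l, l] = (1.0 / R[l, l]) * (1.0 / R[l, l] - R[l, l + 1:] @ S[l + 1:, l])
        for i in range(l - 1, -1, -1):
            # Off-diagonal: sigma_il = -(1/r_ii) * (sum_{i<j<=l} r_ij sigma_jl
            #                                       + sum_{j>l}   r_ij sigma_lj)
            S[i, l] = -(R[i, i + 1:l + 1] @ S[i + 1:l + 1, l]
                        + R[i, l + 1:] @ S[l, l + 1:]) / R[i, i]
            S[l, i] = S[i, l]  # symmetry
    return S

# Verify against a direct inverse on a small random information matrix.
rng = np.random.default_rng(1)
A = rng.standard_normal((6, 6))
Lam = A @ A.T + 6 * np.eye(6)    # information matrix
R = np.linalg.cholesky(Lam).T    # upper-triangular square root: R^T R = Lam
assert np.allclose(covariance_from_R(R), np.linalg.inv(Lam))
```

The appeal of the dynamic-programming formulation is that only the entries actually needed for a query (plus their dependencies in the tree) have to be computed, rather than the full inverse as here.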

So my question is:

Is there any efficient way in GTSAM/iSAM2 to recover joint marginal covariances for an arbitrary subset of variables directly from the Bayes tree, without reconstructing full batch marginals each time?

More specifically:

  • Has this already been explored in GTSAM or related work?
  • Would extending the existing Bayes tree marginal-factor machinery beyond two variables be a reasonable direction?
  • Or is there a better way to exploit the iSAM2 / Bayes tree structure for this kind of query?

I’m also considering trying to implement something along these lines myself in the GTSAM source, but I’m still getting acquainted with the internal structure of the library, so I’d be very grateful for any pointers or advice.

Dellaert, Frank

Mar 22, 2026, 8:07:33 PM
to Ovar, gtsam users
You might be in luck: I’m about to open a PR that dramatically speeds up covariance queries when they involve more than one or two variables.

If you’re only interested in marginals on single variables, then maybe your bottleneck is constructing the Marginals object. That isn’t needed; the Bayes tree already has:

```
/** Return marginal on any variable. Note that this actually returns a conditional, for which a
 * solution may be directly obtained by calling .solve() on the returned object.
 * Alternatively, it may be directly used as its factor base class. For example, for Gaussian
 * systems, this returns a GaussianConditional, which inherits from JacobianFactor and
 * GaussianFactor. */
sharedConditional marginalFactor(Key j, const Eliminate& function = EliminationTraitsType::DefaultEliminate) const;

/**
 * Return joint on two variables.
 * Limitation: can only calculate joint if cliques are disjoint or one of them is root.
 */
sharedFactorGraph joint(Key j1, Key j2, const Eliminate& function = EliminationTraitsType::DefaultEliminate) const;
```

Both of these methods will become a tiny bit faster, but are already close to as good as they’re going to get. 

Best
Frank



Dellaert, Frank

Mar 23, 2026, 12:53:00 AM
to Ovar, gtsam users

Mostly Codex 5.4, but under my strong supervision 😊

I would love it if you could try it and report on whether you see similar speedups.

Best
Frank

Ovar

Mar 23, 2026, 7:16:15 AM
to gtsam users
This is exactly what I was looking for!

I'll give it a try on my setup and report back my findings.

Thanks a lot!

Ovar

Mar 24, 2026, 11:50:48 AM
to gtsam users

Hi Frank,

I tried the PR on my Victoria Park landmark-SLAM setup, and I do see a very substantial speedup.

The full run went from about 1441 s (~24:01) to 322 s (~5:22), roughly a 4.5× end-to-end speedup. Based on visual inspection, I’d estimate the average query involves around 20–30 variables, with the queried subsets moderately localized.

Since my setup is in Python, I tested this by adding a `Matrix ISAM2::jointMarginalCovariance(const KeyVector&)` method and exposing it through the Python wrapper.

Thanks again — very promising for my use case.

Dellaert, Frank

Mar 24, 2026, 2:06:03 PM
to Ovar, gtsam users
Excellent!

Good point about the wrapper, I’ll try to include that in the PR.

Best!
Frank


Ovar

Mar 24, 2026, 6:41:34 PM
to gtsam users
One last note: while reading CovarianceRecovery.pdf in the PR, I wondered if there might be a couple of notation issues. Around Eq. 21, unless I’m misunderstanding it, it seems that the conditionals of the queried cliques should also be included as factors. And in Eq. 10, shouldn’t the integration be over (? Just wanted to mention it in case it’s helpful.