Hi,
I’m using GTSAM/iSAM2 for landmark-based SLAM, and I need joint marginal covariances for subsets of variables in order to run JCBB data association.
At the moment, covariance extraction is a major runtime bottleneck: profiling on the Victoria Park dataset shows that this step accounts for more than 80% of total runtime. My current approach rebuilds a Marginals object from the full factor graph at every measurement step and then calls Marginals::jointMarginalInformation() to obtain the covariance (or information) matrix for the variables of interest.
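For concreteness, the quantity I need is what Marginals::jointMarginalInformation() returns: the information matrix of a subset S after marginalizing out the remaining variables R, i.e. the Schur complement Lam_SS - Lam_SR Lam_RR^-1 Lam_RS. A minimal NumPy sketch of that relationship (the small dense matrices here are just stand-ins for the sparse system GTSAM maintains, not GTSAM code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the full joint information matrix Lambda (symmetric positive definite).
A = rng.standard_normal((6, 6))
Lam = A @ A.T + 6 * np.eye(6)

S = [0, 1]        # subset of interest (the keys I'd pass to jointMarginalInformation)
R = [2, 3, 4, 5]  # remaining variables, to be marginalized out

# Joint marginal information of S: Schur complement of Lam_RR in Lam.
info_S = Lam[np.ix_(S, S)] - Lam[np.ix_(S, R)] @ np.linalg.solve(
    Lam[np.ix_(R, R)], Lam[np.ix_(S, R)].T)

# Its inverse is exactly the (S, S) block of the full covariance Lam^{-1},
# which is the joint marginal covariance JCBB needs.
cov_S = np.linalg.inv(Lam)[np.ix_(S, S)]
assert np.allclose(np.linalg.inv(info_S), cov_S)
```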
Since the optimization itself is incremental, this batch-style covariance recovery seems poorly matched to iSAM2: a significant amount of work (relinearization, multifrontal elimination) is redone at every step just to recover marginals, which erodes much of the practical benefit of using an incremental solver.
So my question is:
Is there any efficient way in GTSAM/iSAM2 to recover joint marginal covariances for an arbitrary subset of variables directly from the Bayes tree, without reconstructing full batch marginals each time?
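What I am hoping is possible, roughly: since the Bayes tree already encodes a square-root factor R of the information matrix (Lam = R^T R), the covariance block for a subset should be recoverable by forward/back-substitution on the selected columns only, instead of re-eliminating the whole graph. A NumPy sketch of the linear algebra, with a dense Cholesky factor standing in for the Bayes tree (purely illustrative, not GTSAM internals):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((8, 8))
Lam = A @ A.T + 8 * np.eye(8)   # stand-in for the full information matrix

# Stand-in for the square-root factor the Bayes tree encodes: Lam = R^T R.
R = np.linalg.cholesky(Lam).T   # upper-triangular

keys = [2, 5, 6]  # subset we want the joint marginal covariance for

# Solve Lam X = E for the selected columns only: two triangular solves
# instead of a full inversion / full batch elimination.
E = np.eye(8)[:, keys]
Y = np.linalg.solve(R.T, E)     # forward substitution
X = np.linalg.solve(R, Y)       # back substitution
cov_keys = X[keys, :]           # the (keys, keys) block of Lam^{-1}

assert np.allclose(cov_keys, np.linalg.inv(Lam)[np.ix_(keys, keys)])
```

With a sparse triangular factor and a localized subset, these solves should touch only part of the tree, which is the efficiency I'm after.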
I’m also considering trying to implement something along these lines myself in the GTSAM source, but I’m still getting acquainted with the internal structure of the library, so I’d be very grateful for any pointers or advice.
Hi Frank,
I tried the PR on my Victoria Park landmark-SLAM setup, and I do see a very substantial speedup.
The full run went from about 1441 s (~23:21) to 322 s (~4:47), roughly a 4.5× end-to-end speedup. From visual inspection, I'd estimate the average query size at around 20–30 variables, with the queried subsets moderately localized.
Since my setup is in Python, I tested this by adding a Matrix ISAM2::jointMarginalCovariance(const KeyVector&) method and exposing it through the wrapper.
Thanks again — very promising for my use case.