FixedLag Visual-Inertial Odometry


Thomas Mörwald

Mar 21, 2019, 3:19:39 AM
to gtsam users
Hi there,

I'm trying to set up a sliding window Visual-Inertial Odometry pipeline using the IncrementalFixedLagSmoother in gtsam_unstable/nonlinear.
Here's a sketch of the factor graph for a better understanding:

[attachment: VIO.png, the factor graph sketch]

When it comes to the marginalization of the first pose (x0), iSAM2 complains about (x0) not being a leaf node:
Exception: "Requesting to marginalize variables that are not leaves, the ISAM2 object is now in an inconsistent state so should no longer be used."

Debugging the code and saving the Bayes tree before and after the reordering for marginalization, I found that all landmarks are still leaves below (x0) even after the reordering.

So, why are there still landmarks below (x0) after reordering (in IncrementalFixedLagSmoother::update() line 130)?

Do I have to provide the factor indices of the ProjectionFactors between (l0-l5) and (x0) to iSAM2::update()?

If so, does the reordering account for the marginals of the removed ProjectionFactors, or are they just cut off?


Do you have some papers or other literature dealing with marginalization using iSAM2?


Thanks, and best regards,

Thomas

Frank Dellaert

Mar 21, 2019, 10:02:40 AM
to gtsam users
Thomas, *perhaps* the issue is that x0 is still connected (by you) to landmarks *after* it has been pushed outside of the lag? It's hard to see without code. In cases like this I'd create a new test in testIncrementalFixedLagSmoother and make it fail, then try to solve it. If you create a PR with such a new test (as minimal as possible, fake data, just BetweenFactors between Point3s), but with your sequence of updates, I'll take a look at what is going on.

As far as papers go: we don't have a good paper showing the awesome power of marginalization in the Bayes tree :-(. If anyone wants to write it with me, I'm game :-) These come closest:

Thomas Mörwald

Mar 25, 2019, 2:26:08 PM
to gtsam users
Thanks for your hint and the paper references. There was indeed some connection to the state that was supposed to be marginalized.
I fixed it and now it's working like a charm :)

best wishes,
Thomas

Stefan Gächter

Aug 26, 2020, 8:56:46 AM
to gtsam users
I do not fully understand all the above comments. I studied in detail the paper Williams et al., "Concurrent Filtering and Smoothing", IJRR, 2014, and, in my view, it is a key paper showing the power of Bayes trees. However, to my understanding, there is no marginalization done there, in the sense of removing a variable and leaving a factor behind. There are two conditionally independent branches, one for filtering and one for smoothing, but rebalancing the tree and moving variables from the filtering branch to the smoothing branch is not marginalization. Or do I misunderstand this? So what is meant by the power of marginalization in the Bayes tree?

Actually, I am looking for the GTSAM implementation of the concepts in Williams et al.'s paper. Is it implemented in ConcurrentFilteringAndSmoothing.cpp? If yes, is there some example available?

Kind regards
Stefan 

Frank Dellaert

Aug 26, 2020, 9:20:24 AM
to gtsam users
Stefan

Marginalization in a Bayes tree is super-simple and automatic, in a way, as at a high level the Bayes tree encodes the posterior in chain-rule form, e.g. P(distant past|past)P(past|present)P(present). Of course, it's not a chain but a tree, but that's really the same for the purpose of this discussion. If you decide you no longer care about the distant past, you could simply *drop* the P(distant past|past) subtree. But you don't even have to do this explicitly if you have enough RAM/disk space: as long as you don't connect any factor to the "distant past", it will never be swapped into memory again.
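Frank's "just drop the factor" point can be checked numerically on the smallest possible case: two scalar Gaussian variables x0 and x1 (the numbers below are arbitrary, not from GTSAM). Eliminating x0 first factors the posterior as P(x0|x1)P(x1); discarding the P(x0|x1) conditional leaves exactly the marginal on x1 that explicit marginalization would compute:

```python
# Joint information (precision) matrix over (x0, x1):
#     Lambda = [[a, b],
#               [b, c]]
a, b, c = 4.0, 1.0, 3.0

# Route 1: eliminate x0 first, giving P(x0|x1) * P(x1). Dropping the
# P(x0|x1) conditional leaves P(x1), whose precision is the Schur
# complement of a in Lambda.
precision_after_drop = c - b * b / a

# Route 2: explicit marginalization via the covariance. Invert the
# 2x2 precision matrix by hand, read off the variance of x1, and
# invert that back to a precision.
det = a * c - b * b
var_x1 = a / det            # entry (1,1) of Lambda^{-1}
precision_marginal = 1.0 / var_x1

print(precision_after_drop, precision_marginal)  # 2.75 2.75
```

Both routes agree, which is why dropping a leaf conditional in the Bayes tree *is* marginalization, obtained for free from the factorization.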

Indeed, ConcurrentIncrementalSmoother (and other variants) is the code we developed back in the day. We never released the front-end software, though. The best alternative is the unit tests for those classes.

Frank

Stefan Gächter

Aug 26, 2020, 11:27:29 AM
to gtsam users
Thanks for the quick reply. I will have a closer look at the implementation and unit tests. Is there a reason why concurrent filtering and smoothing has not become *the* standard in the SLAM community? It is the most elegant solution in my opinion.

I understand your example. Yes, subtrees can easily be dropped without repercussions on the rest of the tree, because of the Bayes tree structure, and yes, that is then marginalization.
 
Personally, I am cautious with the term marginalization. For a while, marginalization was proposed as the solution for sliding-window approaches to keep the problem bounded. But bounding the problem by marginalization, i.e. factoring out and culling some variables and excluding them from the optimization, often meant losing sparsity. Then many heuristics were introduced for sparsification, each paper proposing a different scheme, often contradicting the others.

But to my understanding, this is different for a Bayes tree. There is factorization, but that is not yet marginalization. Variable reordering is necessary to keep sparsity local, but that is not marginalization either. No variables are dropped from the optimization; variables are simply not considered, because they are conditionally independent or are not affected beyond a certain margin. That again is not marginalization in my view. Marginalization tends to destroy sparsity. Thus, to highlight the beauty of Bayes trees and to distinguish them from a sliding-window approach with marginalization and heuristic sparsification, I avoid the term marginalization when speaking of Bayes trees.
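The sparsity concern can be made concrete on a toy adjacency structure (plain Python, made-up variable names): marginalizing a variable induces a factor over *all* of its neighbors, connecting them pairwise (fill-in). Here, one pose p observed by four otherwise-unconnected landmarks:

```python
# Star-shaped factor graph: pose p connected to landmarks l1..l4,
# which are not connected to each other.
adjacency = {
    "p": {"l1", "l2", "l3", "l4"},
    "l1": {"p"}, "l2": {"p"}, "l3": {"p"}, "l4": {"p"},
}

def marginalize(adj, v):
    """Remove v and pairwise-connect its neighbors (Schur-complement fill-in)."""
    neighbors = adj.pop(v)
    for n in neighbors:
        adj[n].discard(v)
        adj[n] |= neighbors - {n}

marginalize(adjacency, "p")

# l1..l4 now form a dense 4-clique where there were only four sparse
# pose-landmark edges before.
print(sorted(adjacency["l1"]))  # ['l2', 'l3', 'l4']
```

This is exactly the fill-in that the sliding-window heuristics mentioned above try to sparsify away, and that dropping a whole leaf subtree of the Bayes tree avoids by construction.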

Stefan

Dellaert, Frank

Aug 26, 2020, 11:31:34 AM
to Stefan Gächter, gtsam users
Agreed with your thoughts, but when you drop a leaf subtree in the Bayes tree, you are *actually* doing marginalization, albeit at zero cost: no new edges are added, anywhere. Of course, the set of variables you can easily marginalize in this way is completely determined by the ordering, and is not arbitrary at all.

Frank


Stefan Gächter

Aug 26, 2020, 12:29:37 PM
to gtsam users
Completely agree with you. The last point is an important one. The message I would like to convey is that with a Bayes tree one can avoid marginalization, or postpone it until it no longer does any harm. I am somewhat stubborn about this.

Thanks for your comments. Appreciate it.
Stefan
