Hello,
Since it's my first post here: thanks for this nice piece of software! Really helpful.
I ran into the problem of Eigen failing on OBJECT_ALLOCATED_ON_STACK_IS_TOO_BIG at compile time.
The cause is that I'm setting up an AutoDiffCostFunction with P parameters and R residuals, where P = 131K and R = 4K.
I suppose Ceres allocates things on the stack in proportion to P and that Eigen complains about that.
I see three possibilities: increase the stack, use the heap, or use static memory allocation:
1) Increase the hard stack limit Eigen complains about, but I suppose that won't scale very well (I'm planning to increase P).
2) Have Ceres allocate on the heap instead of the stack. If I'm right, that can be achieved with DynamicAutoDiffCostFunction. But is that the right thing to do here, or is there a better way? It was built for when the size is not known at compile time (cf. the documentation), not for my case. So how does it deal with dynamically allocating large amounts of memory?
I suppose I'll find out, but if anyone here has a recommendation/suggestion about the right way to go, I'd be happy to benefit from it.
3) Have Ceres use static memory allocation. In theory this might be possible, since the amount of required memory is known at compile time when using the non-dynamic version. But I don't know if this feature has been built in (and I suppose it may be an implementation choice not to offer it).
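[For reference, a hedged sketch of what the option-2 setup would look like with DynamicAutoDiffCostFunction, using the P and R values from above. `BigResidual`, `problem`, and `x` are placeholder names, and this is not runnable without Ceres installed.]

```cpp
#include "ceres/dynamic_autodiff_cost_function.h"
#include "ceres/problem.h"

struct BigResidual {
  template <typename T>
  bool operator()(T const* const* parameters, T* residuals) const {
    // parameters[0] points at the P parameter values;
    // fill residuals[0..R-1] here.
    return true;
  }
};

void BuildProblem(ceres::Problem& problem, double* x) {
  // Stride of 4: derivatives are evaluated four parameters per pass,
  // so the per-pass jets stay small regardless of P.
  auto* cost =
      new ceres::DynamicAutoDiffCostFunction<BigResidual, 4>(new BigResidual);
  cost->AddParameterBlock(131072);  // P
  cost->SetNumResiduals(4096);      // R
  problem.AddResidualBlock(cost, nullptr, x);  // x: double array of size P
}
```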
Thank you very much for your comments on this!
Best,
Kenneth
PS: the problem is dense, and not separable into several ResidualBlocks (or rather, it could be, but at a dramatic increase in computation).
--
You received this message because you are subscribed to the Google Groups "Ceres Solver" group.
To unsubscribe from this group and stop receiving emails from it, send an email to ceres-solver...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/ceres-solver/883fbed7-6f07-424c-b1d1-de155553e590%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.
Kenneth,

My replies are inline.

On Tue, Dec 22, 2015 at 9:44 AM Kenneth V <kenneth...@gmail.com> wrote:
> Hello,
> Since it's my first post here: thanks for this nice piece of software! Really helpful.
> I ran into the problem of Eigen failing on OBJECT_ALLOCATED_ON_STACK_IS_TOO_BIG at compile time.
> The cause is that I'm setting up an AutoDiffCostFunction with P parameters and R residuals, where P = 131K and R = 4K.
> I suppose Ceres allocates things on the stack in proportion to P and that Eigen complains about that.
> I see three possibilities: increase the stack, use the heap, or use static memory allocation:
> 1) Increase the hard stack limit Eigen complains about, but I suppose that won't scale very well (I'm planning to increase P).
Allocating on the heap has a significant cost in runtime for AutoDiffCostFunction.
Though to be honest with P = 131K and R = 4K, I wonder if you are better off computing your derivatives analytically.
> 2) Have Ceres allocate on the heap instead of the stack. If I'm right, that can be achieved with DynamicAutoDiffCostFunction. But is that the right thing to do here, or is there a better way? It was built for when the size is not known at compile time (cf. the documentation), not for my case. So how does it deal with dynamically allocating large amounts of memory?

It may work for you, because what it does is use small chunks of the parameter space to do the computation in multiple passes.
> I suppose I'll find out, but if anyone here has a recommendation/suggestion about the right way to go, I'd be happy to benefit from it.
> 3) Have Ceres use static memory allocation. In theory this might be possible, since the amount of required memory is known at compile time when using the non-dynamic version. But I don't know if this feature has been built in (and I suppose it may be an implementation choice not to offer it).

This, I believe, has to do with the stack inside your autodiff functor; internally, Ceres does allocate on the stack and/or the heap as the size of these objects grows.
--
Sameer
> Thank you very much for your comments on this!
> Best,
> Kenneth
> PS: the problem is dense, and not separable into several ResidualBlocks (or rather, it could be, but at a dramatic increase in computation).