stack overflow


Brent Morgan

Mar 24, 2018, 5:40:23 PM
to Ceres Solver
Hello,

I am getting a stack overflow crash at the initial evaluation of the cost/gradient because my input data arrays are too large; I know this because everything works fine when I make the arrays smaller. The arrays are not that large: fewer than 10 of them, each with fewer than 1 million double values. I load the data as globals and pass it in for Ceres to evaluate.

Is memory being added to the stack as Ceres evaluates? I guess my question is: how are others efficiently dealing with large data inputs for Ceres?

Best,
Brent

Sameer Agarwal

Mar 24, 2018, 5:42:25 PM
to ceres-...@googlegroups.com
Brent,
Why are you allocating on the stack? Why not allocate on the heap and pass the pointers to the cost function?
Sameer
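[Editor's note: a minimal sketch of the heap-allocation approach Sameer suggests, not code from the thread. The model and names (LargeDataResidual, the averaging residual) are hypothetical; the point is that the observations live in a std::vector (heap storage) and the cost functor keeps only a pointer to them, so nothing large ends up on the stack.]

```cpp
#include <vector>
#include "ceres/ceres.h"

struct LargeDataResidual {
  // Store only a pointer and a count; the functor itself stays tiny.
  LargeDataResidual(const double* observations, int num_observations)
      : observations_(observations), num_observations_(num_observations) {}

  template <typename T>
  bool operator()(const T* const params, T* residual) const {
    // Hypothetical model: mean deviation of a single scalar parameter
    // from each observation.
    T sum = T(0.0);
    for (int i = 0; i < num_observations_; ++i) {
      sum += params[0] - T(observations_[i]);
    }
    residual[0] = sum / T(num_observations_);
    return true;
  }

  const double* observations_;
  int num_observations_;
};

int main() {
  // Heap-allocated observations: std::vector's buffer lives on the heap,
  // so ~1 million doubles per array is not a problem.
  std::vector<double> observations(1000000, 2.0);

  double x = 0.5;  // parameter to optimize

  ceres::Problem problem;
  problem.AddResidualBlock(
      new ceres::AutoDiffCostFunction<LargeDataResidual, 1, 1>(
          new LargeDataResidual(observations.data(),
                                static_cast<int>(observations.size()))),
      nullptr, &x);

  ceres::Solver::Options options;
  ceres::Solver::Summary summary;
  ceres::Solve(options, &problem, &summary);
  return 0;
}
```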



Brent Morgan

Mar 24, 2018, 6:00:47 PM
to Ceres Solver
Sameer,

Yep, I'm doing that now. Thanks,

Best,
Brent