Predicting memory requirements for covariance computation (Bundle adjustment)

Niko Haaraniemi

Nov 26, 2021, 3:16:22 AM
to Ceres Solver
Hello again,

I'm working on bundle adjustment. I'm trying to compute covariances for a fairly large set of landmarks (500,000 landmarks) and I'm getting this error (the computation works fine on a smaller dataset):

"CHOLMOD error: out of memory. file: ../Core/cholmod_memory.c line: 146
CHOLMOD error: out of memory. file: ../Core/cholmod_memory.c line: 146
WARNING: Logging before InitGoogleLogging() is written to STDERR
F1126 09:53:44.154214 10831 covariance_impl.cc:680] Check failed: 'permutation' Must be non NULL"
 
My question is: is there a way to predict how much memory the covariance computation requires?

Thanks
Niko

Sameer Agarwal

Nov 26, 2021, 11:20:16 AM
to ceres-...@googlegroups.com
Niko,

The memory usage is hard to predict. It depends on the sparsity structure of your Jacobian and on how well the COLAMD ordering that SuiteSparse uses is able to reduce fill-in during the QR factorization.

Sameer
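To see why fill-in makes memory hard to predict, here is a small sketch of the effect. It uses SciPy's SuperLU interface as a stand-in for the SuiteSparse QR that Ceres actually calls (an illustrative assumption, not Ceres internals); the same matrix can produce a nearly dense factor or an almost fill-free one depending purely on the column ordering:

```python
# Sketch: how a fill-reducing column ordering (COLAMD, as used by
# SuiteSparse) changes the memory needed to factor a sparse matrix.
# SciPy's SuperLU LU stands in here for the sparse QR inside Ceres'
# covariance computation; the fill-in phenomenon is the same.
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 200
# Arrowhead matrix: dense first row and column, otherwise diagonal.
# Structurally similar to one parameter coupled to every landmark.
A = sp.lil_matrix((n, n))
A.setdiag(2.0)
A[0, :] = 1.0
A[:, 0] = 1.0
A[0, 0] = n  # keep the first pivot on the diagonal
A = A.tocsc()

# Natural ordering: eliminating the dense column first fills in the
# entire remaining block, so the factors are ~n^2 nonzeros.
lu_nat = spla.splu(A, permc_spec="NATURAL")
nnz_nat = lu_nat.L.nnz + lu_nat.U.nnz

# COLAMD ordering: the dense column is eliminated last, so the factors
# stay about as sparse as the input.
lu_amd = spla.splu(A, permc_spec="COLAMD")
nnz_amd = lu_amd.L.nnz + lu_amd.U.nnz

print(f"factor nonzeros, natural ordering: {nnz_nat}")
print(f"factor nonzeros, COLAMD ordering:  {nnz_amd}")
```

Because the achievable fill reduction depends on the whole sparsity pattern, the peak memory of the factorization (and hence of the covariance computation) cannot be read off from the problem size alone.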


Niko Haaraniemi

Nov 29, 2021, 7:59:38 AM
to Ceres Solver
Okay. Thanks for explaining.
I added 16 GB more memory to my laptop and got the computation done! :D

-Niko