Understanding Ceres output


Jon Zubizarreta Gorostidi

unread,
Oct 9, 2018, 9:48:22 AM10/9/18
to Ceres Solver
Hello,

I am solving a non-linear optimization problem using Levenberg-Marquardt and Sparse Normal Cholesky. I first initialize the parameter values close to the solution and then run the iterative least-squares
optimization to refine them (using analytic derivatives). However, if I let Ceres use its own termination criteria, it performs many iterations, some of which are unsuccessful.
I want to understand why Ceres behaves this way and correct it. I do not know whether it comes from a wrong use of Ceres or from my cost function implementation (residuals and Jacobians).

I obtain the following output from ceres:

iter      cost      cost_change  |gradient|   |step|    tr_ratio  tr_radius  ls_iter  iter_time  total_time
   0  3.054717e+05    0.00e+00    2.60e+05   0.00e+00   0.00e+00  1.00e+04        0    1.54e-02    1.69e-02
   1  1.589883e+05    1.46e+05    3.07e+05   1.76e+00   1.48e+00  3.00e+04        1    2.42e-02    4.53e-02
   2  1.474446e+05    1.15e+04    7.49e+04   3.60e-01   8.37e-01  4.32e+04        1    2.18e-02    6.90e-02
   3  1.454798e+05    1.96e+03    6.62e+04   1.85e-01   3.52e-01  4.21e+04        1    2.17e-02    9.33e-02
   4  1.451735e+05    3.06e+02    5.10e+04   2.01e-01   5.79e-02  2.49e+04        1    2.16e-02    1.18e-01
   5  1.453881e+05   -2.15e+02    0.00e+00   1.88e-01  -5.37e-02  1.24e+04        1    8.29e-03    1.29e-01
   6  1.453879e+05   -2.14e+02    0.00e+00   1.88e-01  -5.36e-02  3.11e+03        1    7.85e-03    1.39e-01
   7  1.453863e+05   -2.13e+02    0.00e+00   1.88e-01  -5.32e-02  3.89e+02        1    7.88e-03    1.49e-01
   8  1.453712e+05   -1.98e+02    0.00e+00   1.87e-01  -4.95e-02  2.43e+01        1    8.02e-03    1.60e-01
   9  1.450673e+05    1.06e+02    5.69e+04   1.79e-01   2.66e-02  1.31e+01        1    2.15e-02    1.84e-01
  10  1.430847e+05    1.98e+03    4.41e+04   1.52e-01   4.88e-01  1.31e+01        1    2.18e-02    2.08e-01
  11  1.418846e+05    1.20e+03    2.88e+04   1.44e-01   3.93e-01  1.30e+01        1    2.13e-02    2.33e-01
  12  1.424717e+05   -5.87e+02    0.00e+00   1.36e-01  -2.04e-01  6.51e+00        1    8.34e-03    2.43e-01
  13  1.419731e+05   -8.85e+01    0.00e+00   1.27e-01  -3.11e-02  1.63e+00        1    7.91e-03    2.53e-01
  14  1.402563e+05    1.63e+03    3.95e+04   9.05e-02   6.58e-01  1.68e+00        1    2.11e-02    2.77e-01
  15  1.399674e+05    2.89e+02    2.15e+04   5.58e-02   3.55e-01  1.64e+00        1    2.12e-02    3.01e-01
  16  1.397803e+05    1.87e+02    9.30e+03   4.96e-02   4.24e-01  1.63e+00        1    2.14e-02    3.24e-01
  17  1.398164e+05   -3.60e+01    0.00e+00   4.94e-02  -1.50e-01  8.17e-01        1    7.88e-03    3.35e-01
  18  1.396007e+05    1.80e+02    2.21e+04   3.58e-02   9.18e-01  1.97e+00        1    2.18e-02    3.59e-01
  19  1.396780e+05   -7.73e+01    0.00e+00   3.80e-02  -5.78e-01  9.84e-01        1    8.15e-03    3.70e-01
  20  1.396153e+05   -1.46e+01    0.00e+00   2.84e-02  -1.30e-01  2.46e-01        1    8.36e-03    3.81e-01
  21  1.395953e+05    5.36e+00    1.12e+04   1.13e-02   1.00e-01  1.63e-01        1    2.18e-02    4.05e-01
  22  1.395893e+05    6.05e+00    2.30e+03   3.48e-03   7.24e-01  1.79e-01        1    2.10e-02    4.29e-01
  23  1.395867e+05    2.61e+00    2.06e+03   1.62e-03   3.45e+00  5.37e-01        1    2.20e-02    4.53e-01
  24  1.395823e+05    4.40e+00    1.70e+03   2.09e-03   5.04e+00  1.61e+00        1    2.13e-02    4.77e-01
  25  1.395793e+05    2.93e+00    2.66e+03   3.58e-03   2.29e+00  4.83e+00        1    2.46e-02    5.05e-01
  26  1.396116e+05   -3.23e+01    0.00e+00   2.33e-02  -3.65e+00  2.42e+00        1    1.20e-02    5.19e-01
  27  1.396081e+05   -2.88e+01    0.00e+00   1.99e-02  -3.52e+00  6.04e-01        1    8.04e-03    5.29e-01
  28  1.395851e+05   -5.77e+00    0.00e+00   1.06e-02  -1.07e+00  7.55e-02        1    7.85e-03    5.39e-01
  29  1.395785e+05    8.88e-01    2.25e+03   1.97e-03   7.51e-01  8.64e-02        1    2.12e-02    5.62e-01

Solver Summary (v 2.0.0-eigen-(3.2.8)-lapack-suitesparse-(5.1.0)-cxsparse-(3.1.9)-eigensparse-no_openmp)

                                     Original                  Reduced
Parameter blocks                         1590                     1555
Parameters                               1604                     1562
Effective parameters                     1602                     1561
Residual blocks                          1553                     1553
Residuals                               12424                    12424

Minimizer                        TRUST_REGION

Sparse linear algebra library    SUITE_SPARSE
Trust region strategy     LEVENBERG_MARQUARDT

                                        Given                     Used
Linear solver          SPARSE_NORMAL_CHOLESKY   SPARSE_NORMAL_CHOLESKY
Threads                                     1                        1
Linear solver ordering               1586,2,2                 1553,1,1

Cost:
Initial                          3.054717e+05
Final                            1.395785e+05
Change                           1.658932e+05

Minimizer iterations                       30
Successful steps                           18
Unsuccessful steps                         12

Time (in seconds):
Preprocessor                         0.001458

  Residual only evaluation           0.167858 (30)
  Jacobian & residual evaluation     0.238637 (18)
  Linear solver                      0.064791 (30)
Minimizer                            0.570930

Postprocessor                        0.000203
Total                                0.572592

Termination:                      CONVERGENCE (Function tolerance reached. |cost_change|/cost: 8.654149e-08 <= 1.000000e-06)


My questions are:

1) Why is Ceres doing so many iterations if the parameters are so close to the solution? It should converge in a few iterations.
2) What does |gradient| mean in each iteration? A local minimum maybe?
3) How could I avoid unsuccessful iterations and reach the minimum directly?
4) Is Ceres jumping from local minimum to local minimum? If so, should I stop at the first local minimum? How can I make Ceres stop at the first local minimum and avoid finding other minima?

Thanks in advance.

Jon

Sameer Agarwal

unread,
Oct 9, 2018, 10:15:44 AM10/9/18
to ceres-...@googlegroups.com
Jon,

My answers are inline.

1) Why is Ceres doing so many iterations if the parameters are so close to the solution? It should converge in a few iterations.

Two things.

a. Your solution does not seem all that close, since the objective function still changes a fair bit.
b. The rate at which Ceres converges to a solution depends on the conditioning and the structure of your objective function. Just because your parameters start near the solution does not mean much by itself.

2) What does |gradient| mean in each iteration? A local minimum maybe?

|gradient| is the max norm of the gradient vector. It is zero for steps where we do not make progress because it is not computed.
 
3) How could I avoid unsuccessful iterations and reach the minimum directly?

Unless you have an analytical expression for the optimum, this is not possible.
 
4) Is Ceres jumping from local minimum to local minimum? If so, should I stop at the first local minimum? How can I make Ceres stop at the first local minimum and avoid finding other minima?

No, Ceres can't really jump from local minimum to local minimum: if it found a local minimum, its gradient convergence criterion would be met and it would not make any further progress. Put another way, Ceres stops at the first local minimum it finds.
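If the goal is simply to stop earlier, the usual knobs are the convergence tolerances on Solver::Options. A sketch of the relevant settings (these are real Ceres option names; the values shown are the library defaults, for illustration rather than recommendations):

```cpp
ceres::Solver::Options options;
// Stop when |cost_change| / cost falls below this (the criterion
// that fired in the log above).
options.function_tolerance = 1e-6;
// Stop when the max norm of the gradient falls below this.
options.gradient_tolerance = 1e-10;
// Stop when the relative change in the parameters falls below this.
options.parameter_tolerance = 1e-8;
// Or simply cap the number of iterations.
options.max_num_iterations = 50;
```

Loosening function_tolerance makes the solver stop sooner at the cost of a less refined solution; it does not eliminate unsuccessful trust-region steps along the way.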

HTH,
Sameer

 


--
You received this message because you are subscribed to the Google Groups "Ceres Solver" group.
To unsubscribe from this group and stop receiving emails from it, send an email to ceres-solver...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/ceres-solver/2477c671-081b-4ff2-b4fb-5ffd4d09455b%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.

Jon Zubizarreta Gorostidi

unread,
Oct 9, 2018, 11:48:02 AM10/9/18
to Ceres Solver
Yes! Thanks for the help Sameer.

Following this topic, how can I obtain the step of each parameter block separately? I want to measure the change in my parameters.
I came up with two options, but there may be a better solution:

1) Use an individual LocalParameterization for each parameter block and store the delta values for each one. Could this work? By the way, when is LocalParameterization::Plus called? Using an IterationCallback, I would then be able to measure the delta values. This would not be a good solution if LocalParameterization::Plus is only called on successful iterations, since I want to measure the step for all iterations.

2) Keep a backup of the parameter state and compute the delta values against the current state in each iteration using an IterationCallback. However, this approach also fails if the current state is not updated after an unsuccessful iteration.

Ceres has to compute the step values and apply them to check whether the cost is reduced. So, how can I obtain the computed step values for each parameter block regardless of the iteration type (successful or unsuccessful)?

Thanks,

Jon




Sameer Agarwal

unread,
Oct 9, 2018, 2:25:02 PM10/9/18
to ceres-...@googlegroups.com
Jon,
Before we get into all this, what is it that you are trying to do?
Sameer



Jon Zubizarreta Gorostidi

unread,
Oct 9, 2018, 3:25:52 PM10/9/18
to ceres-...@googlegroups.com
I want to check if some of my parameters have already converged to break the optimization early.

Jon



Sameer Agarwal

unread,
Oct 10, 2018, 10:28:29 AM10/10/18
to ceres-...@googlegroups.com
I do not recommend doing this, but if you must, IterationCallback + Solver::Options::update_state_every_iteration is your friend.
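To make that concrete, here is a hedged sketch of that combination. IterationCallback, IterationSummary, update_state_every_iteration, and SOLVER_CONTINUE are real Ceres APIs; the StepMonitor class and its delta bookkeeping are illustrative. Note the limitation discussed in this thread: after a rejected step the user-visible state is unchanged, so the deltas observed for an unsuccessful iteration are zero, not the rejected trial step.

```cpp
#include <utility>
#include <vector>
#include "ceres/ceres.h"

// Illustrative callback that compares each parameter block against a
// snapshot taken at the previous iteration.
class StepMonitor : public ceres::IterationCallback {
 public:
  StepMonitor(std::vector<double*> blocks, std::vector<int> sizes)
      : blocks_(std::move(blocks)), sizes_(std::move(sizes)) {
    Snapshot();
  }

  ceres::CallbackReturnType operator()(
      const ceres::IterationSummary& summary) override {
    // summary.step_is_successful tells you which kind of iteration this is.
    for (size_t b = 0; b < blocks_.size(); ++b) {
      for (int i = 0; i < sizes_[b]; ++i) {
        const double delta = blocks_[b][i] - previous_[b][i];
        (void)delta;  // record it however you like
      }
    }
    Snapshot();
    return ceres::SOLVER_CONTINUE;
  }

 private:
  void Snapshot() {
    previous_.clear();
    for (size_t b = 0; b < blocks_.size(); ++b)
      previous_.emplace_back(blocks_[b], blocks_[b] + sizes_[b]);
  }

  std::vector<double*> blocks_;
  std::vector<int> sizes_;
  std::vector<std::vector<double>> previous_;
};

// Usage (sketch):
//   StepMonitor monitor(/*blocks=*/{x, y}, /*sizes=*/{3, 4});
//   options.update_state_every_iteration = true;
//   options.callbacks.push_back(&monitor);
```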

Jon Zubizarreta Gorostidi

unread,
Oct 15, 2018, 6:02:18 AM10/15/18
to Ceres Solver
Is it possible?

Thanks,

Jon

Sameer Agarwal

unread,
Oct 15, 2018, 10:11:40 AM10/15/18
to ceres-...@googlegroups.com
yes.


Jon Zubizarreta Gorostidi

unread,
Oct 15, 2018, 10:40:12 AM10/15/18
to ceres-...@googlegroups.com
Would either of my two options from above work? Which one do you recommend? Any alternatives?
I was wondering if they would only work for successful iterations. I want to obtain the calculated parameter steps for each parameter block in every iteration (successful or not).

Thanks,

Jon


Sameer Agarwal

unread,
Oct 15, 2018, 1:08:52 PM10/15/18
to ceres-...@googlegroups.com
Jon,
Use approach 2 with an IterationCallback. The IterationSummary passed to the callback tells you whether an iteration was successful or not.
Sameer


Jon Zubizarreta Gorostidi

unread,
Oct 15, 2018, 1:59:00 PM10/15/18
to ceres-...@googlegroups.com
The second option will work for successful iterations; however, I also want to obtain the parameter steps for unsuccessful ones.
Is it possible to obtain them with Ceres?

Thanks,

Jon


Sameer Agarwal

unread,
Oct 15, 2018, 1:59:49 PM10/15/18
to ceres-...@googlegroups.com