Why didn't the cost decrease as much as the Jacobians indicate?


Yilan Chen

Feb 26, 2015, 11:34:09 PM
to ceres-...@googlegroups.com
Hi all,

I'm using Ceres to optimize a cost function for which I analytically derived the Jacobian matrix. However, the cost does not decrease as much as I expected before the optimization terminates, even though the Jacobians indicate there is still room for the cost to decrease.


--------------------------------Example-------------------------------------------------------------------------

In the series of tests below, I initialized the parameter block of each test with the optimized result of the previous one and collected some information.

(1)
Initialized parameters: -43.226380, 14.553339, -83.994619
Optimized parameters: -47.370476, 15.958336, -82.959197
Corresponding Jacobians: -0.202512, 0.611803, 0.893090
Cost:
Initial                         1.607741e-002
Final                           1.171152e-002
Change                          4.365892e-003
Termination:                      CONVERGENCE (Parameter tolerance reached. Relative step_norm: 0.000000e+000 <= 0.000000e+000.)

(2)
Initialized parameters: -47.370476, 15.958336, -82.959197
Optimized parameters: -50.400579, 17.033272, -82.056126
Corresponding Jacobians: -0.275986, 0.762078, 0.856621
Cost:
Initial                         1.171152e-002
Final                           8.765033e-003
Change                          2.946487e-003
Termination:                      CONVERGENCE (Parameter tolerance reached. Relative step_norm: 0.000000e+000 <= 0.000000e+000.)

(3)
Initialized parameters: -50.400579, 17.033272, -82.056126
Optimized parameters: -52.769339, 17.903986, -81.248989
Corresponding Jacobians: -0.299470, 0.801149, 0.826341
Cost:
Initial                         8.765034e-003
Final                           6.678449e-003
Change                          2.086585e-003
Termination:                      CONVERGENCE (Parameter tolerance reached. Relative step_norm: 0.000000e+000 <= 0.000000e+000.)

(4) The result I expected is (-57.693560, 20.961377, -49.711772), with nearly zero cost and nearly zero Jacobian values, but the actual results are far from that expectation.


----------------------------------Settings of Ceres Solver-----------------------------------------------

Solver Summary (v 1.10.0-no_lapack-openmp)

                                     Original                  Reduced
Parameter blocks                            1                        1
Parameters                                 40                       40
Residual blocks                             1                        1
Residual                                    2                        2

Minimizer                        TRUST_REGION

Dense linear algebra library            EIGEN
Trust region strategy     LEVENBERG_MARQUARDT

                                        Given                     Used
Linear solver                        DENSE_QR                 DENSE_QR
Threads                                     1                        1
Linear solver threads                       1                        1

options.function_tolerance = 0;
options.parameter_tolerance = 0;

(I only used 3 of the parameters and 1 of the residuals in my experiments; the remaining residual and Jacobian entries were set to zero.)
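For completeness, the configuration above corresponds to something like the following (a minimal sketch using standard ceres::Solver::Options fields; MakeOptions is just an illustrative wrapper and the cost functor itself is omitted):

```cpp
#include "ceres/ceres.h"

// Illustrative helper: builds the options reported in the summary above.
ceres::Solver::Options MakeOptions() {
  ceres::Solver::Options options;
  // Tolerances disabled, as in the tests above.
  options.function_tolerance = 0.0;
  options.parameter_tolerance = 0.0;
  // The defaults shown in the summary: LM trust region with dense QR.
  options.trust_region_strategy_type = ceres::LEVENBERG_MARQUARDT;
  options.linear_solver_type = ceres::DENSE_QR;
  return options;
}
```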

--------------------------------------------------------------------------------------------------------------------

Can anyone give me some idea about what's going on? How can I minimize the cost as much as possible?

Thanks in advance!

Yilan

Sameer Agarwal

Feb 27, 2015, 12:14:22 AM
to ceres-...@googlegroups.com
Yilan,
Before I say more, I have a couple of questions.

1. Is your objective function non-linear? I am guessing that it is, but it is worth confirming.
2. Can you dump out the execution log and Summary::FullReport output for each problem?
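(Both can be captured with standard solver options — a minimal sketch assuming the Ceres 1.x API; SolveAndReport is just an illustrative wrapper. The check_gradients flag additionally compares the user-supplied Jacobian against finite differences, which is a useful sanity check here:)

```cpp
#include <iostream>
#include "ceres/ceres.h"

// Illustrative helper: runs the solver with per-iteration logging,
// gradient checking, and a full report dump.
void SolveAndReport(ceres::Problem* problem, ceres::Solver::Options options) {
  options.minimizer_progress_to_stdout = true;  // execution log per iteration
  options.check_gradients = true;  // compare analytic Jacobian vs. finite differences
  ceres::Solver::Summary summary;
  ceres::Solve(options, problem, &summary);
  std::cout << summary.FullReport() << "\n";
}
```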

Sameer


--
You received this message because you are subscribed to the Google Groups "Ceres Solver" group.
To unsubscribe from this group and stop receiving emails from it, send an email to ceres-solver...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/ceres-solver/ad68f8b3-51a6-4f0d-866d-38fe38329211%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.

Yilan Chen

Feb 27, 2015, 12:38:53 AM
to ceres-...@googlegroups.com
Hi Sameer,

Yes, my objective function is non-linear. The execution log and FullReport for each test follow.
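(As an aside, for anyone wanting to compare an analytic Jacobian against numeric differentiation by hand, here is a self-contained central-difference check. This is a generic sketch; the residual function is a made-up stand-in, not the cost function from this thread:)

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

// Made-up example residual: r(x) = x0 * exp(x1). Its analytic Jacobian is
// dr/dx0 = exp(x1), dr/dx1 = x0 * exp(x1).
double Residual(const std::vector<double>& x) {
  return x[0] * std::exp(x[1]);
}

std::vector<double> AnalyticJacobian(const std::vector<double>& x) {
  return {std::exp(x[1]), x[0] * std::exp(x[1])};
}

// Central differences: dr/dx_i ~= (r(x + h*e_i) - r(x - h*e_i)) / (2h).
std::vector<double> NumericJacobian(const std::vector<double>& x,
                                    double h = 1e-6) {
  std::vector<double> jac(x.size());
  for (std::size_t i = 0; i < x.size(); ++i) {
    std::vector<double> xp = x, xm = x;
    xp[i] += h;
    xm[i] -= h;
    jac[i] = (Residual(xp) - Residual(xm)) / (2.0 * h);
  }
  return jac;
}

// Largest absolute difference between analytic and numeric Jacobians;
// a large value flags a bug in the hand-derived Jacobian.
double MaxJacobianError(const std::vector<double>& x) {
  const std::vector<double> a = AnalyticJacobian(x);
  const std::vector<double> n = NumericJacobian(x);
  double max_err = 0.0;
  for (std::size_t i = 0; i < a.size(); ++i)
    max_err = std::max(max_err, std::fabs(a[i] - n[i]));
  return max_err;
}
```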

----------------------------------------------------------------------------------------------------------
(1)
iter      cost      cost_change  |gradient|   |step|    tr_ratio  tr_radius  ls_
iter  iter_time  total_time
   0  1.607741e-002    0.00e+000    1.60e-001   0.00e+000   0.00e+000  1.00e+004
        0    2.78e-002    2.84e-002
   1  1.575819e-002    3.19e-004    1.59e-001   3.18e-001   1.99e-002  5.30e+003
        1    9.03e-002    1.31e-001
   2  1.544756e-002    3.11e-004    1.57e-001   3.11e-001   1.97e-002  2.81e+003
        1    8.73e-002    2.26e-001
   3  1.514518e-002    3.02e-004    1.55e-001   3.04e-001   1.96e-002  1.49e+003
        1    8.68e-002    3.20e-001
   4  1.485075e-002    2.94e-004    1.54e-001   2.97e-001   1.94e-002  7.89e+002
        1    8.68e-002    4.15e-001
   5  1.456399e-002    2.87e-004    1.52e-001   2.90e-001   1.93e-002  4.18e+002
        1    8.67e-002    5.10e-001
   6  1.428467e-002    2.79e-004    1.51e-001   2.84e-001   1.92e-002  2.21e+002
        1    8.80e-002    6.06e-001
   7  1.401261e-002    2.72e-004    1.49e-001   2.78e-001   1.90e-002  1.17e+002
        1    8.64e-002    6.99e-001
   8  1.374769e-002    2.65e-004    1.48e-001   2.72e-001   1.89e-002  6.19e+001
        1    8.68e-002    7.93e-001
   9  1.348997e-002    2.58e-004    1.46e-001   2.66e-001   1.87e-002  3.27e+001
        1    8.68e-002    8.89e-001
  10  1.323972e-002    2.50e-004    1.45e-001   2.60e-001   1.86e-002  1.73e+001
        1    8.71e-002    9.83e-001
  11  1.299765e-002    2.42e-004    1.43e-001   2.52e-001   1.83e-002  9.13e+000
        1    8.62e-002    1.08e+000
  12  1.276520e-002    2.32e-004    1.42e-001   2.44e-001   1.79e-002  4.81e+000
        1    8.57e-002    1.17e+000
  13  1.254503e-002    2.20e-004    1.40e-001   2.32e-001   1.73e-002  2.53e+000
        1    8.65e-002    1.27e+000
  14  1.234152e-002    2.04e-004    1.39e-001   2.15e-001   1.64e-002  1.33e+000
        1    8.64e-002    1.36e+000
  15  1.216112e-002    1.80e-004    1.38e-001   1.92e-001   1.52e-002  6.96e-001
        1    8.62e-002    1.46e+000
  16  1.201133e-002    1.50e-004    1.37e-001   1.60e-001   1.38e-002  3.63e-001
        1    8.63e-002    1.55e+000
  17  1.189765e-002    1.14e-004    1.36e-001   1.22e-001   1.23e-002  1.88e-001
        1    8.60e-002    1.65e+000
  18  1.181984e-002    7.78e-005    1.36e-001   8.36e-002   1.11e-002  9.72e-002
        1    8.67e-002    1.74e+000
  19  1.177151e-002    4.83e-005    1.35e-001   5.20e-002   1.02e-002  5.01e-002
        1    8.56e-002    1.83e+000
  20  1.174368e-002    2.78e-005    1.35e-001   3.00e-002   9.68e-003  2.58e-002
        1    8.31e-002    1.93e+000
  21  1.172842e-002    1.53e-005    1.35e-001   1.64e-002   9.38e-003  1.33e-002
        1    8.42e-002    2.02e+000
  22  1.172031e-002    8.11e-006    1.35e-001   8.74e-003   9.22e-003  6.81e-003
        1    8.48e-002    2.11e+000
  23  1.171606e-002    4.25e-006    1.35e-001   4.57e-003   9.14e-003  3.50e-003
        1    8.40e-002    2.20e+000
  24  1.171386e-002    2.20e-006    1.35e-001   2.37e-003   9.09e-003  1.80e-003
        1    8.36e-002    2.29e+000
  25  1.171272e-002    1.14e-006    1.35e-001   1.23e-003   9.07e-003  9.24e-004
        1    8.41e-002    2.38e+000
  26  1.171214e-002    5.86e-007    1.35e-001   6.31e-004   9.06e-003  4.75e-004
        1    8.37e-002    2.48e+000
  27  1.171184e-002    3.01e-007    1.35e-001   3.25e-004   9.05e-003  2.44e-004
        1    8.34e-002    2.57e+000
  28  1.171168e-002    1.55e-007    1.35e-001   1.67e-004   9.05e-003  1.25e-004
        1    8.29e-002    2.66e+000
  29  1.171160e-002    7.96e-008    1.35e-001   8.57e-005   9.05e-003  6.43e-005
        1    8.49e-002    2.76e+000
  30  1.171156e-002    4.09e-008    1.35e-001   4.40e-005   9.05e-003  3.30e-005
        1    8.37e-002    2.85e+000
  31  1.171154e-002    2.10e-008    1.35e-001   2.26e-005   9.05e-003  1.70e-005
        1    8.38e-002    2.94e+000
  32  1.171153e-002    1.08e-008    1.35e-001   1.16e-005   9.04e-003  8.72e-006
        1    8.42e-002    3.03e+000
  33  1.171152e-002    5.54e-009    1.35e-001   5.97e-006   9.04e-003  4.48e-006
        1    8.40e-002    3.12e+000
  34  1.171152e-002    2.85e-009    1.35e-001   3.07e-006   9.04e-003  2.30e-006
        1    8.39e-002    3.21e+000
  35  1.171152e-002    1.46e-009    1.35e-001   1.58e-006   9.04e-003  1.18e-006
        1    8.48e-002    3.30e+000
  36  1.171152e-002    7.51e-010    1.35e-001   8.09e-007   9.04e-003  6.07e-007
        1    8.38e-002    3.39e+000
  37  1.171152e-002    3.86e-010    1.35e-001   4.16e-007   9.04e-003  3.12e-007
        1    8.41e-002    3.49e+000
  38  1.171152e-002    1.98e-010    1.35e-001   2.14e-007   9.04e-003  1.60e-007
        1    8.35e-002    3.58e+000
  39  1.171152e-002    1.02e-010    1.35e-001   1.10e-007   9.04e-003  8.23e-008
        1    8.29e-002    3.66e+000
  40  1.171152e-002    5.23e-011    1.35e-001   5.64e-008   9.04e-003  4.23e-008
        1    8.41e-002    3.76e+000
  41  1.171152e-002    2.69e-011    1.35e-001   2.90e-008   9.04e-003  2.17e-008
        1    8.67e-002    3.85e+000
  42  1.171152e-002    1.38e-011    1.35e-001   1.49e-008   9.04e-003  1.12e-008
        1    8.58e-002    3.95e+000
  43  1.171152e-002    7.09e-012    1.35e-001   7.64e-009   9.04e-003  5.73e-009
        1    8.59e-002    4.04e+000
  44  1.171152e-002    3.64e-012    1.35e-001   3.92e-009   9.04e-003  2.94e-009
        1    8.63e-002    4.14e+000
  45  1.171152e-002    1.87e-012    1.35e-001   2.02e-009   9.04e-003  1.51e-009
        1    8.49e-002    4.23e+000
  46  1.171152e-002    9.61e-013    1.35e-001   1.04e-009   9.05e-003  7.77e-010
        1    8.46e-002    4.32e+000
  47  1.171152e-002    4.94e-013    1.35e-001   5.32e-010   9.04e-003  3.99e-010
        1    8.49e-002    4.42e+000
  48  1.171152e-002    2.54e-013    1.35e-001   2.73e-010   9.04e-003  2.05e-010
        1    8.48e-002    4.51e+000
  49  1.171152e-002    1.30e-013    1.35e-001   1.40e-010   9.05e-003  1.05e-010
        1    8.57e-002    4.60e+000
  50  1.171152e-002    6.68e-014    1.35e-001   7.21e-011   9.03e-003  5.41e-011
        1    8.59e-002    4.69e+000
  51  1.171152e-002    3.44e-014    1.35e-001   3.70e-011   9.05e-003  2.78e-011
        1    8.56e-002    4.79e+000
  52  1.171152e-002    1.77e-014    1.35e-001   1.90e-011   9.06e-003  1.43e-011
        1    8.55e-002    4.88e+000
  53  1.171152e-002    9.06e-015    1.35e-001   9.78e-012   9.03e-003  7.33e-012
        1    8.62e-002    4.97e+000
  54  1.171152e-002    4.62e-015    1.35e-001   5.02e-012   8.97e-003  3.77e-012
        1    8.56e-002    5.07e+000
  55  1.171152e-002    2.43e-015    1.35e-001   2.58e-012   9.19e-003  1.93e-012
        1    8.50e-002    5.16e+000
  56  1.171152e-002    1.21e-015    1.35e-001   1.32e-012   8.87e-003  9.93e-013
        1    8.46e-002    5.25e+000
  57  1.171152e-002    6.80e-016    1.35e-001   6.77e-013   9.74e-003  5.11e-013
        1    8.52e-002    5.35e+000
  58  1.171152e-002    3.05e-016    1.35e-001   3.49e-013   8.50e-003  2.62e-013
        1    8.55e-002    5.44e+000
  59  1.171152e-002    1.37e-016    1.35e-001   1.78e-013   7.44e-003  1.34e-013
        1    8.58e-002    5.53e+000
  60  1.171152e-002    8.50e-017    1.35e-001   9.43e-014   9.02e-003  6.89e-014
        1    8.40e-002    5.63e+000
  61  1.171152e-002    5.03e-017    1.35e-001   4.71e-014   1.04e-002  3.55e-014
        1    8.53e-002    5.72e+000
  62  1.171152e-002    0.00e+000    0.00e+000   2.25e-014   0.00e+000  1.78e-014
        1    5.91e-002    5.79e+000
  63  1.171152e-002    0.00e+000    0.00e+000   1.46e-014   0.00e+000  4.44e-015
        1    5.73e-002    5.85e+000
  64  1.171152e-002   -1.73e-017    0.00e+000   1.78e-015  -5.56e-002  5.55e-016
        1    5.75e-002    5.92e+000

Solver Summary (v 1.10.0-no_lapack-openmp)

                                     Original                  Reduced
Parameter blocks                            1                        1
Parameters                                 40                       40
Residual blocks                             1                        1
Residual                                    2                        2

Minimizer                        TRUST_REGION

Dense linear algebra library            EIGEN
Trust region strategy     LEVENBERG_MARQUARDT

                                        Given                     Used
Linear solver                        DENSE_QR                 DENSE_QR
Threads                                     1                        1
Linear solver threads                       1                        1

Cost:
Initial                         1.607741e-002
Final                           1.171152e-002
Change                          4.365892e-003

Minimizer iterations                       64
Successful steps                           61
Unsuccessful steps                          3

Time (in seconds):
Preprocessor                           0.0006

  Residual evaluation                  1.7395
    Line search cost evaluation        0.0000
  Jacobian evaluation                  3.3738
    Line search gradient evaluation    1.7157
  Linear solver                        0.3237
  Line search polynomial minimization  0.0002
Minimizer                              5.9827

Postprocessor                          0.0000
Total                                  5.9834

Termination:                      CONVERGENCE (Parameter tolerance reached. Relative step_norm: 0.000000e+000 <= 0.000000e+000.)

-----------------------------------------------------------------------------------------------------------------------
(2)
iter      cost      cost_change  |gradient|   |step|    tr_ratio  tr_radius  ls_
iter  iter_time  total_time
   0  1.171152e-002    0.00e+000    1.35e-001   0.00e+000   0.00e+000  1.00e+004
        0    2.76e-002    2.79e-002
   1  1.150064e-002    2.11e-004    1.33e-001   2.28e-001   1.80e-002  5.27e+003
        1    8.64e-002    1.27e-001
   2  1.129455e-002    2.06e-004    1.32e-001   2.24e-001   1.79e-002  2.78e+003
        1    8.36e-002    2.18e-001
   3  1.109310e-002    2.01e-004    1.31e-001   2.21e-001   1.78e-002  1.47e+003
        1    8.48e-002    3.10e-001
   4  1.089616e-002    1.97e-004    1.29e-001   2.17e-001   1.78e-002  7.73e+002
        1    8.48e-002    4.02e-001
   5  1.070363e-002    1.93e-004    1.28e-001   2.14e-001   1.77e-002  4.07e+002
        1    8.40e-002    4.92e-001
   6  1.051539e-002    1.88e-004    1.27e-001   2.10e-001   1.76e-002  2.15e+002
        1    8.42e-002    5.84e-001
   7  1.033139e-002    1.84e-004    1.25e-001   2.07e-001   1.75e-002  1.13e+002
        1    8.39e-002    6.75e-001
   8  1.015162e-002    1.80e-004    1.24e-001   2.03e-001   1.74e-002  5.95e+001
        1    8.44e-002    7.66e-001
   9  9.976169e-003    1.75e-004    1.23e-001   2.00e-001   1.73e-002  3.13e+001
        1    8.26e-002    8.56e-001
  10  9.805294e-003    1.71e-004    1.21e-001   1.96e-001   1.71e-002  1.65e+001
        1    8.51e-002    9.49e-001
  11  9.639571e-003    1.66e-004    1.20e-001   1.91e-001   1.69e-002  8.67e+000
        1    8.50e-002    1.04e+000
  12  9.480111e-003    1.59e-004    1.19e-001   1.85e-001   1.66e-002  4.55e+000
        1    8.59e-002    1.13e+000
  13  9.328907e-003    1.51e-004    1.18e-001   1.76e-001   1.60e-002  2.39e+000
        1    8.44e-002    1.23e+000
  14  9.189243e-003    1.40e-004    1.17e-001   1.64e-001   1.52e-002  1.25e+000
        1    8.55e-002    1.32e+000
  15  9.065846e-003    1.23e-004    1.16e-001   1.46e-001   1.41e-002  6.51e-001
        1    8.61e-002    1.42e+000
  16  8.964072e-003    1.02e-004    1.15e-001   1.21e-001   1.27e-002  3.38e-001
        1    8.62e-002    1.51e+000
  17  8.887587e-003    7.65e-005    1.14e-001   9.11e-002   1.13e-002  1.75e-001
        1    8.59e-002    1.60e+000
  18  8.835826e-003    5.18e-005    1.14e-001   6.18e-002   1.02e-002  9.01e-002
        1    8.54e-002    1.70e+000
  19  8.804013e-003    3.18e-005    1.14e-001   3.81e-002   9.47e-003  4.64e-002
        1    8.57e-002    1.79e+000
  20  8.785836e-003    1.82e-005    1.14e-001   2.18e-002   9.01e-003  2.38e-002
        1    8.55e-002    1.88e+000
  21  8.775933e-003    9.90e-006    1.14e-001   1.19e-002   8.75e-003  1.22e-002
        1    8.54e-002    1.97e+000
  22  8.770686e-003    5.25e-006    1.13e-001   6.29e-003   8.61e-003  6.27e-003
        1    8.53e-002    2.07e+000
  23  8.767949e-003    2.74e-006    1.13e-001   3.28e-003   8.53e-003  3.22e-003
        1    8.62e-002    2.16e+000
  24  8.766533e-003    1.42e-006    1.13e-001   1.70e-003   8.49e-003  1.65e-003
        1    8.63e-002    2.26e+000
  25  8.765803e-003    7.30e-007    1.13e-001   8.75e-004   8.47e-003  8.46e-004
        1    8.61e-002    2.35e+000
  26  8.765428e-003    3.75e-007    1.13e-001   4.50e-004   8.46e-003  4.34e-004
        1    8.51e-002    2.44e+000
  27  8.765236e-003    1.93e-007    1.13e-001   2.31e-004   8.46e-003  2.22e-004
        1    8.57e-002    2.54e+000
  28  8.765137e-003    9.88e-008    1.13e-001   1.18e-004   8.46e-003  1.14e-004
        1    8.51e-002    2.63e+000
  29  8.765086e-003    5.07e-008    1.13e-001   6.08e-005   8.45e-003  5.85e-005
        1    8.55e-002    2.72e+000
  30  8.765060e-003    2.60e-008    1.13e-001   3.12e-005   8.45e-003  3.00e-005
        1    8.48e-002    2.82e+000
  31  8.765047e-003    1.33e-008    1.13e-001   1.60e-005   8.45e-003  1.54e-005
        1    8.44e-002    2.91e+000
  32  8.765040e-003    6.83e-009    1.13e-001   8.20e-006   8.45e-003  7.88e-006
        1    8.27e-002    3.00e+000
  33  8.765037e-003    3.50e-009    1.13e-001   4.20e-006   8.45e-003  4.04e-006
        1    8.40e-002    3.09e+000
  34  8.765035e-003    1.80e-009    1.13e-001   2.16e-006   8.45e-003  2.07e-006
        1    8.37e-002    3.18e+000
  35  8.765034e-003    9.22e-010    1.13e-001   1.11e-006   8.45e-003  1.06e-006
        1    8.26e-002    3.27e+000
  36  8.765033e-003    4.73e-010    1.13e-001   5.67e-007   8.45e-003  5.45e-007
        1    8.23e-002    3.36e+000
  37  8.765033e-003    2.42e-010    1.13e-001   2.91e-007   8.45e-003  2.80e-007
        1    8.30e-002    3.45e+000
  38  8.765033e-003    1.24e-010    1.13e-001   1.49e-007   8.45e-003  1.43e-007
        1    8.26e-002    3.54e+000
  39  8.765033e-003    6.37e-011    1.13e-001   7.64e-008   8.45e-003  7.35e-008
        1    8.27e-002    3.63e+000
  40  8.765033e-003    3.27e-011    1.13e-001   3.92e-008   8.45e-003  3.77e-008
        1    8.23e-002    3.72e+000
  41  8.765033e-003    1.68e-011    1.13e-001   2.01e-008   8.45e-003  1.93e-008
        1    8.28e-002    3.81e+000
  42  8.765033e-003    8.59e-012    1.13e-001   1.03e-008   8.45e-003  9.91e-009
        1    8.26e-002    3.90e+000
  43  8.765033e-003    4.41e-012    1.13e-001   5.28e-009   8.45e-003  5.08e-009
        1    8.40e-002    3.99e+000
  44  8.765033e-003    2.26e-012    1.13e-001   2.71e-009   8.45e-003  2.61e-009
        1    8.26e-002    4.09e+000
  45  8.765033e-003    1.16e-012    1.13e-001   1.39e-009   8.45e-003  1.34e-009
        1    8.31e-002    4.18e+000
  46  8.765033e-003    5.94e-013    1.13e-001   7.12e-010   8.45e-003  6.85e-010
        1    8.26e-002    4.27e+000
  47  8.765033e-003    3.05e-013    1.13e-001   3.65e-010   8.45e-003  3.51e-010
        1    8.32e-002    4.36e+000
  48  8.765033e-003    1.56e-013    1.13e-001   1.87e-010   8.45e-003  1.80e-010
        1    8.35e-002    4.45e+000
  49  8.765033e-003    8.01e-014    1.13e-001   9.61e-011   8.46e-003  9.24e-011
        1    8.35e-002    4.54e+000
  50  8.765033e-003    4.11e-014    1.13e-001   4.93e-011   8.46e-003  4.74e-011
        1    8.36e-002    4.63e+000
  51  8.765033e-003    2.10e-014    1.13e-001   2.53e-011   8.45e-003  2.43e-011
        1    8.21e-002    4.72e+000
  52  8.765033e-003    1.08e-014    1.13e-001   1.29e-011   8.42e-003  1.25e-011
        1    8.27e-002    4.81e+000
  53  8.765033e-003    5.56e-015    1.13e-001   6.64e-012   8.48e-003  6.39e-012
        1    8.40e-002    4.90e+000
  54  8.765033e-003    2.87e-015    1.13e-001   3.40e-012   8.53e-003  3.28e-012
        1    8.60e-002    5.00e+000
  55  8.765033e-003    1.43e-015    1.13e-001   1.75e-012   8.28e-003  1.68e-012
        1    8.59e-002    5.09e+000
  56  8.765033e-003    7.36e-016    1.13e-001   8.92e-013   8.33e-003  8.61e-013
        1    8.61e-002    5.18e+000
  57  8.765033e-003    3.68e-016    1.13e-001   4.57e-013   8.12e-003  4.41e-013
        1    8.50e-002    5.27e+000
  58  8.765033e-003    2.20e-016    1.13e-001   2.38e-013   9.50e-003  2.27e-013
        1    8.48e-002    5.37e+000
  59  8.765033e-003    1.02e-016    1.13e-001   1.17e-013   8.58e-003  1.16e-013
        1    8.43e-002    5.46e+000
  60  8.765033e-003    1.39e-017    1.13e-001   6.23e-014   2.27e-003  5.86e-014
        1    8.46e-002    5.55e+000
  61  8.765033e-003    4.51e-017    1.13e-001   3.35e-014   1.46e-002  3.06e-014
        1    8.44e-002    5.64e+000
  62  8.765033e-003    0.00e+000    0.00e+000   1.46e-014   0.00e+000  1.53e-014
        1    5.88e-002    5.71e+000
  63  8.765033e-003    0.00e+000    0.00e+000   7.94e-015   0.00e+000  3.82e-015
        1    5.91e-002    5.78e+000

Solver Summary (v 1.10.0-no_lapack-openmp)

                                     Original                  Reduced
Parameter blocks                            1                        1
Parameters                                 40                       40
Residual blocks                             1                        1
Residual                                    2                        2

Minimizer                        TRUST_REGION

Dense linear algebra library            EIGEN
Trust region strategy     LEVENBERG_MARQUARDT

                                        Given                     Used
Linear solver                        DENSE_QR                 DENSE_QR
Threads                                     1                        1
Linear solver threads                       1                        1

Cost:
Initial                         1.171152e-002
Final                           8.765033e-003
Change                          2.946487e-003

Minimizer iterations                       63
Successful steps                           61
Unsuccessful steps                          2

Time (in seconds):
Preprocessor                           0.0003

  Residual evaluation                  1.6841
    Line search cost evaluation        0.0000
  Jacobian evaluation                  3.3217
    Line search gradient evaluation    1.6925
  Linear solver                        0.3187
  Line search polynomial minimization  0.0000
Minimizer                              5.8411

Postprocessor                          0.0000
Total                                  5.8414

Termination:                      CONVERGENCE (Parameter tolerance reached. Relative step_norm: 0.000000e+000 <= 0.000000e+000.)

--------------------------------------------------------------------------------------------------------------------------
(3)
iter      cost      cost_change  |gradient|   |step|    tr_ratio  tr_radius  ls_
iter  iter_time  total_time
   0  8.765034e-003    0.00e+000    1.13e-001   0.00e+000   0.00e+000  1.00e+004
        0    2.72e-002    2.75e-002
   1  8.617538e-003    1.47e-004    1.12e-001   1.78e-001   1.68e-002  5.26e+003
        1    8.39e-002    1.23e-001
   2  8.473034e-003    1.45e-004    1.11e-001   1.75e-001   1.68e-002  2.76e+003
        1    8.32e-002    2.13e-001
   3  8.331447e-003    1.42e-004    1.10e-001   1.73e-001   1.67e-002  1.45e+003
        1    8.34e-002    3.03e-001
   4  8.192711e-003    1.39e-004    1.09e-001   1.71e-001   1.67e-002  7.63e+002
        1    8.32e-002    3.93e-001
   5  8.056764e-003    1.36e-004    1.08e-001   1.68e-001   1.66e-002  4.01e+002
        1    8.26e-002    4.83e-001
   6  7.923562e-003    1.33e-004    1.06e-001   1.66e-001   1.65e-002  2.10e+002
        1    8.43e-002    5.74e-001
   7  7.793079e-003    1.30e-004    1.05e-001   1.64e-001   1.65e-002  1.10e+002
        1    8.28e-002    6.63e-001
   8  7.665332e-003    1.28e-004    1.04e-001   1.62e-001   1.64e-002  5.80e+001
        1    8.27e-002    7.53e-001
   9  7.540407e-003    1.25e-004    1.03e-001   1.59e-001   1.63e-002  3.04e+001
        1    8.28e-002    8.44e-001
  10  7.418519e-003    1.22e-004    1.02e-001   1.57e-001   1.62e-002  1.60e+001
        1    8.26e-002    9.34e-001
  11  7.300117e-003    1.18e-004    1.01e-001   1.53e-001   1.60e-002  8.37e+000
        1    8.30e-002    1.02e+000
  12  7.186050e-003    1.14e-004    1.00e-001   1.49e-001   1.56e-002  4.39e+000
        1    8.24e-002    1.11e+000
  13  7.077830e-003    1.08e-004    9.91e-002   1.42e-001   1.51e-002  2.29e+000
        1    8.18e-002    1.20e+000
  14  6.977932e-003    9.99e-005    9.82e-002   1.32e-001   1.43e-002  1.20e+000
        1    8.25e-002    1.29e+000
  15  6.889883e-003    8.80e-005    9.74e-002   1.17e-001   1.32e-002  6.23e-001
        1    8.35e-002    1.38e+000
  16  6.817606e-003    7.23e-005    9.68e-002   9.66e-002   1.19e-002  3.23e-001
        1    8.21e-002    1.47e+000
  17  6.763658e-003    5.39e-005    9.63e-002   7.24e-002   1.07e-002  1.67e-001
        1    8.33e-002    1.56e+000
  18  6.727427e-003    3.62e-005    9.60e-002   4.88e-002   9.65e-003  8.57e-002
        1    8.34e-002    1.66e+000
  19  6.705312e-003    2.21e-005    9.57e-002   2.98e-002   8.95e-003  4.40e-002
        1    8.34e-002    1.75e+000
  20  6.692744e-003    1.26e-005    9.56e-002   1.70e-002   8.53e-003  2.26e-002
        1    8.49e-002    1.84e+000
  21  6.685922e-003    6.82e-006    9.56e-002   9.21e-003   8.30e-003  1.16e-002
        1    8.55e-002    1.93e+000
  22  6.682318e-003    3.60e-006    9.55e-002   4.87e-003   8.17e-003  5.93e-003
        1    8.47e-002    2.03e+000
  23  6.680442e-003    1.88e-006    9.55e-002   2.54e-003   8.11e-003  3.04e-003
        1    8.58e-002    2.12e+000
  24  6.679473e-003    9.69e-007    9.55e-002   1.31e-003   8.07e-003  1.56e-003
        1    8.52e-002    2.21e+000
  25  6.678974e-003    4.98e-007    9.55e-002   6.74e-004   8.05e-003  7.97e-004
        1    8.54e-002    2.31e+000
  26  6.678718e-003    2.56e-007    9.55e-002   3.46e-004   8.04e-003  4.08e-004
        1    8.51e-002    2.40e+000
  27  6.678587e-003    1.31e-007    9.55e-002   1.77e-004   8.04e-003  2.09e-004
        1    8.56e-002    2.49e+000
  28  6.678520e-003    6.72e-008    9.55e-002   9.09e-005   8.04e-003  1.07e-004
        1    8.53e-002    2.58e+000
  29  6.678486e-003    3.44e-008    9.55e-002   4.65e-005   8.04e-003  5.48e-005
        1    8.63e-002    2.68e+000
  30  6.678468e-003    1.76e-008    9.55e-002   2.38e-005   8.04e-003  2.81e-005
        1    8.62e-002    2.77e+000
  31  6.678459e-003    9.04e-009    9.55e-002   1.22e-005   8.03e-003  1.44e-005
        1    8.64e-002    2.87e+000
  32  6.678454e-003    4.63e-009    9.55e-002   6.25e-006   8.03e-003  7.36e-006
        1    8.61e-002    2.96e+000
  33  6.678452e-003    2.37e-009    9.55e-002   3.20e-006   8.03e-003  3.77e-006
        1    8.74e-002    3.06e+000
  34  6.678451e-003    1.21e-009    9.55e-002   1.64e-006   8.03e-003  1.93e-006
        1    8.50e-002    3.15e+000
  35  6.678450e-003    6.22e-010    9.55e-002   8.40e-007   8.03e-003  9.89e-007
        1    8.42e-002    3.24e+000
  36  6.678450e-003    3.18e-010    9.55e-002   4.30e-007   8.03e-003  5.07e-007
        1    8.54e-002    3.33e+000
  37  6.678450e-003    1.63e-010    9.55e-002   2.20e-007   8.03e-003  2.59e-007
        1    8.47e-002    3.42e+000
  38  6.678449e-003    8.35e-011    9.55e-002   1.13e-007   8.03e-003  1.33e-007
        1    8.53e-002    3.52e+000
  39  6.678449e-003    4.28e-011    9.55e-002   5.78e-008   8.03e-003  6.80e-008
        1    8.52e-002    3.61e+000
  40  6.678449e-003    2.19e-011    9.55e-002   2.96e-008   8.03e-003  3.48e-008
        1    8.55e-002    3.70e+000
  41  6.678449e-003    1.12e-011    9.55e-002   1.52e-008   8.03e-003  1.78e-008
        1    8.50e-002    3.79e+000
  42  6.678449e-003    5.75e-012    9.55e-002   7.77e-009   8.03e-003  9.14e-009
        1    8.38e-002    3.89e+000
  43  6.678449e-003    2.94e-012    9.55e-002   3.98e-009   8.03e-003  4.68e-009
        1    8.28e-002    3.98e+000
  44  6.678449e-003    1.51e-012    9.55e-002   2.04e-009   8.03e-003  2.40e-009
        1    8.30e-002    4.07e+000
  45  6.678449e-003    7.72e-013    9.55e-002   1.04e-009   8.03e-003  1.23e-009
        1    8.33e-002    4.16e+000
  46  6.678449e-003    3.95e-013    9.55e-002   5.34e-010   8.03e-003  6.29e-010
        1    8.29e-002    4.25e+000
  47  6.678449e-003    2.02e-013    9.55e-002   2.74e-010   8.04e-003  3.22e-010
        1    8.28e-002    4.34e+000
  48  6.678449e-003    1.04e-013    9.55e-002   1.40e-010   8.03e-003  1.65e-010
        1    8.20e-002    4.43e+000
  49  6.678449e-003    5.31e-014    9.55e-002   7.18e-011   8.03e-003  8.45e-011
        1    8.22e-002    4.52e+000
  50  6.678449e-003    2.72e-014    9.55e-002   3.68e-011   8.04e-003  4.33e-011
        1    8.30e-002    4.61e+000
  51  6.678449e-003    1.39e-014    9.55e-002   1.88e-011   8.04e-003  2.22e-011
        1    8.33e-002    4.70e+000
  52  6.678449e-003    7.15e-015    9.55e-002   9.64e-012   8.05e-003  1.13e-011
        1    8.27e-002    4.79e+000
  53  6.678449e-003    3.67e-015    9.55e-002   4.94e-012   8.07e-003  5.81e-012
        1    8.35e-002    4.88e+000
  54  6.678449e-003    1.85e-015    9.55e-002   2.53e-012   7.93e-003  2.98e-012
        1    8.18e-002    4.97e+000
  55  6.678449e-003    9.37e-016    9.55e-002   1.30e-012   7.85e-003  1.52e-012
        1    8.38e-002    5.06e+000
  56  6.678449e-003    5.13e-016    9.55e-002   6.65e-013   8.40e-003  7.81e-013
        1    8.32e-002    5.15e+000
  57  6.678449e-003    2.57e-016    9.55e-002   3.39e-013   8.20e-003  4.00e-013
        1    8.30e-002    5.24e+000
  58  6.678449e-003    1.15e-016    9.55e-002   1.76e-013   7.19e-003  2.04e-013
        1    8.24e-002    5.33e+000
  59  6.678449e-003    6.42e-017    9.55e-002   8.79e-014   7.83e-003  1.05e-013
        1    8.27e-002    5.43e+000
  60  6.678449e-003    3.90e-017    9.55e-002   4.71e-014   9.31e-003  5.38e-014
        1    8.21e-002    5.51e+000
  61  6.678449e-003    3.82e-017    9.55e-002   2.66e-014   1.77e-002  2.84e-014
        1    8.22e-002    5.61e+000
  62  6.678449e-003    1.30e-017    9.55e-002   1.46e-014   1.15e-002  1.47e-014
        1    8.23e-002    5.69e+000
  63  6.678449e-003    0.00e+000    0.00e+000   7.94e-015   0.00e+000  7.33e-015
        1    5.80e-002    5.76e+000

Solver Summary (v 1.10.0-no_lapack-openmp)

                                     Original                  Reduced
Parameter blocks                            1                        1
Parameters                                 40                       40
Residual blocks                             1                        1
Residual                                    2                        2

Minimizer                        TRUST_REGION

Dense linear algebra library            EIGEN
Trust region strategy     LEVENBERG_MARQUARDT

                                        Given                     Used
Linear solver                        DENSE_QR                 DENSE_QR
Threads                                     1                        1
Linear solver threads                       1                        1

Cost:
Initial                         8.765034e-003
Final                           6.678449e-003
Change                          2.086585e-003

Minimizer iterations                       63
Successful steps                           62
Unsuccessful steps                          1

Time (in seconds):
Preprocessor                           0.0003

  Residual evaluation                  1.6697
    Line search cost evaluation        0.0000
  Jacobian evaluation                  3.3280
    Line search gradient evaluation    1.6794
  Linear solver                        0.3156
  Line search polynomial minimization  0.0000
Minimizer                              5.8250

Postprocessor                          0.0000
Total                                  5.8253

Termination:                      CONVERGENCE (Parameter tolerance reached. Relative step_norm: 0.000000e+000 <= 0.000000e+000.)

------------------------------------------------------------------------------------------------------

Thanks,

Yilan

--
You received this message because you are subscribed to a topic in the Google Groups "Ceres Solver" group.
To unsubscribe from this topic, visit https://groups.google.com/d/topic/ceres-solver/X47lYYbg2vs/unsubscribe.
To unsubscribe from this group and all its topics, send an email to ceres-solver...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/ceres-solver/CABqdRUC5GKGSvrhGLk_ise7xSfmuy6m7Rd9E%2B777yPmAuh1T%2BQ%40mail.gmail.com.

Yilan Chen

Feb 27, 2015, 12:41:48 AM2/27/15
to ceres-...@googlegroups.com
Solver Summary (v 1.10.0-no_lapack-openmp)

                                     Original                  Reduced
Parameter blocks                            1                        1
Parameters                                 40                       40
Residual blocks                             1                        1
Residual                                    2                        2

Minimizer                        TRUST_REGION

Dense linear algebra library            EIGEN
Trust region strategy     LEVENBERG_MARQUARDT

                                        Given                     Used
Linear solver                        DENSE_QR                 DENSE_QR
Threads                                     1                        1
Linear solver threads                       1                        1

Cost:
Initial                         1.607741e-002
Final                           1.171152e-002
Change                          4.365892e-003

Minimizer iterations                       64
Successful steps                           61
Unsuccessful steps                          3

Time (in seconds):
Preprocessor                           0.0006

  Residual evaluation                  1.7395
    Line search cost evaluation        0.0000
  Jacobian evaluation                  3.3738
    Line search gradient evaluation    1.7157
  Linear solver                        0.3237
  Line search polynomial minimization  0.0002
Minimizer                              5.9827

Postprocessor                          0.0000
Total                                  5.9834
Termination:                      CONVERGENCE (Parameter tolerance reached. Relative step_norm: 0.000000e+000 <= 0.000000e+000.)

Solver Summary (v 1.10.0-no_lapack-openmp)

                                     Original                  Reduced
Parameter blocks                            1                        1
Parameters                                 40                       40
Residual blocks                             1                        1
Residual                                    2                        2

Minimizer                        TRUST_REGION

Dense linear algebra library            EIGEN
Trust region strategy     LEVENBERG_MARQUARDT

                                        Given                     Used
Linear solver                        DENSE_QR                 DENSE_QR
Threads                                     1                        1
Linear solver threads                       1                        1

Cost:
Initial                         1.171152e-002
Final                           8.765033e-003
Change                          2.946487e-003

Minimizer iterations                       63
Successful steps                           61
Unsuccessful steps                          2

Time (in seconds):
Preprocessor                           0.0003

  Residual evaluation                  1.6841
    Line search cost evaluation        0.0000
  Jacobian evaluation                  3.3217
    Line search gradient evaluation    1.6925
  Linear solver                        0.3187
  Line search polynomial minimization  0.0000
Minimizer                              5.8411

Postprocessor                          0.0000
Total                                  5.8414
Termination:                      CONVERGENCE (Parameter tolerance reached. Relative step_norm: 0.000000e+000 <= 0.000000e+000.)

Solver Summary (v 1.10.0-no_lapack-openmp)

                                     Original                  Reduced
Parameter blocks                            1                        1
Parameters                                 40                       40
Residual blocks                             1                        1
Residual                                    2                        2

Minimizer                        TRUST_REGION

Dense linear algebra library            EIGEN
Trust region strategy     LEVENBERG_MARQUARDT

                                        Given                     Used
Linear solver                        DENSE_QR                 DENSE_QR
Threads                                     1                        1
Linear solver threads                       1                        1

Cost:
Initial                         8.765034e-003
Final                           6.678449e-003
Change                          2.086585e-003

Minimizer iterations                       63
Successful steps                           62
Unsuccessful steps                          1

Time (in seconds):
Preprocessor                           0.0003

  Residual evaluation                  1.6697
    Line search cost evaluation        0.0000
  Jacobian evaluation                  3.3280
    Line search gradient evaluation    1.6794
  Linear solver                        0.3156
  Line search polynomial minimization  0.0000
Minimizer                              5.8250

Postprocessor                          0.0000
Total                                  5.8253
Termination:                      CONVERGENCE (Parameter tolerance reached. Relative step_norm: 0.000000e+000 <= 0.000000e+000.)

------------------------------------------------------------------------------------------------------

Thanks,
Yilan

Sameer Agarwal

Feb 27, 2015, 1:15:35 AM2/27/15
to ceres-...@googlegroups.com
Yilan,

Two things to note here. One, you are using bounds constraints, which means you cannot just look at the gradient; instead, one has to look at the projected gradient, defined as:

projected_gradient = x - \Pi( x - g)

For the case where no bounds constraints are active, \Pi is a no-op and projected_gradient = g.

But if you are on the boundary, then we cannot always move along the gradient. With this in mind, notice that in the execution log the norm of the projected gradient falls to 1e-15, so we are converging to a solution.

That does not mean it is the globally optimal solution, but it is locally optimal, and that is the best that can be hoped for from a gradient-based algorithm.
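The projection Sameer describes can be sketched for a single scalar parameter with box bounds. The function name, signature, and values below are illustrative only, not part of the Ceres API:

```cpp
#include <algorithm>

// Hypothetical sketch of the projected gradient for one scalar
// parameter x with gradient g and box bounds [lo, hi]. \Pi clamps its
// argument back into the feasible interval, so
//   projected_gradient = x - \Pi(x - g).
double projected_gradient(double x, double g, double lo, double hi) {
  const double projected = std::min(std::max(x - g, lo), hi);  // \Pi(x - g)
  return x - projected;
}
```

In the interior the clamp is a no-op and the result equals g; but if x sits on a bound and g pushes it outside the box, the projection cancels the move and the projected gradient is 0 even though |g| > 0, which is why the solver can legitimately declare convergence there.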

Sameer


Yilan Chen

Feb 27, 2015, 7:10:01 AM2/27/15
to ceres-...@googlegroups.com
Hi Sameer,

Thanks a lot! Really helpful notes. But I still have one question..
After reading your reply, I commented out the lines that apply the bound constraints and kept using the trust-region method, but the optimization result was still the same as what I posted. Then I tried the line-search method instead, and it converged to a good result. Why is that?

Thank you,
Yilan
...

Sameer Agarwal

Feb 27, 2015, 12:40:04 PM2/27/15
to ceres-...@googlegroups.com
Yilan,

Again, an execution log helps for questions like these. The best I can imagine is that the function is too non-linear for the Gauss-Newton approximation to the Hessian used by the Levenberg-Marquardt algorithm, so it gets stuck in a local optimum - looking at the gradient norm should confirm this.

When you switch to the line search algorithm, it moves in a different direction - either accidentally or because of the BFGS approximation - and avoids the local minimum.
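For reference, the switch Yilan describes needs only two solver options. A minimal sketch, assuming the option names as they appear in Ceres 1.10, with the rest of the problem setup unchanged:

```cpp
#include "ceres/ceres.h"

// Ask Ceres for the line search minimizer with a BFGS search direction
// instead of the default TRUST_REGION / LEVENBERG_MARQUARDT.
void UseLineSearchBfgs(ceres::Solver::Options* options) {
  options->minimizer_type = ceres::LINE_SEARCH;
  options->line_search_direction_type = ceres::BFGS;
}
```

Note that the line search minimizer in Ceres does not support bounds constraints, which is consistent with having to comment them out first.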

Sameer




Yilan Chen

Feb 27, 2015, 10:21:55 PM2/27/15
to ceres-...@googlegroups.com
Hi Sameer,

Yeah, you are right - my cost function really is too non-linear, and the gradient norm did reduce to zero. Now I understand the reasons.
Thank you for your help!

Yilan
...

Alex Stewart

Mar 2, 2015, 1:22:17 PM3/2/15
to ceres-...@googlegroups.com
Yilan,

To add to Sameer’s points, another factor is that in the problem you posted you have 40 parameters but only 2 residuals, so the problem is heavily under-determined.  In this case, LM is going to be very close to pure gradient descent; if you had only a single residual, it would be exactly scaled gradient descent.
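That last remark can be checked numerically: with one residual r and Jacobian row g, the LM normal equations (g g^T + \lambda I) dx = -r g are solved exactly by dx = -r g / (|g|^2 + \lambda), i.e. a pure scaled gradient step. A small self-contained sketch (the dimensions and values are illustrative):

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>

// Residual of the LM normal equations (g g^T + lambda I) dx = -r g
// when dx is taken to be the scaled gradient step -r g / (|g|^2 + lambda).
// Returns the max absolute component of (g g^T + lambda I) dx + r g,
// which should be ~0 if the step solves the system exactly.
double lm_single_residual_error(const double* g, std::size_t n,
                                double r, double lambda) {
  double g2 = 0.0;
  for (std::size_t i = 0; i < n; ++i) g2 += g[i] * g[i];
  const double c = -r / (g2 + lambda);  // dx = c * g
  double max_err = 0.0;
  for (std::size_t i = 0; i < n; ++i) {
    double lhs = lambda * c * g[i];  // lambda * I contribution
    for (std::size_t j = 0; j < n; ++j) {
      lhs += g[i] * g[j] * c * g[j];  // (g g^T) dx contribution
    }
    max_err = std::max(max_err, std::fabs(lhs + r * g[i]));
  }
  return max_err;
}
```

The error is zero up to rounding for any g, r, and lambda > 0, confirming that the single-residual LM step is exactly a scaled gradient step.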

At every iteration, the Gauss-Newton component (i.e. before the addition of \lambda*I) of the LM-approximated Hessian of the linear system formed by linearising the problem about x_k is going to be at best rank-2 (after the addition of \lambda*I, it will be full rank).  This is likely to result in slow convergence: when determining the step to take next, the optimiser is in effect blind to the curvature of 38 dimensions of your parameter space (the null space of the linear system), and the descent direction will be dominated by the gradient.

The reason BFGS can work better for under-constrained problems like this is that it aggregates information about the curvature of the problem over all (BFGS) / multiple (L-BFGS) previous iterations into its current approximation of the Hessian.

Although at each iteration BFGS only ever performs a rank-2 update to its Hessian approximation (two rank-1 updates - not just for your problem, but in general), it aggregates these updates over many iterations, and loosely, each rank-2 update can provide information about the sensitivity of the problem to different parts of the parameter space.  Thus BFGS can select a (potentially) more informed descent direction, as it has at least some idea about the curvature of the problem in the 38 dimensions of the parameter space that LM is blind to.

In fact, MATLAB uses BFGS by default for fminunc & fmincon, because it is solving a general (i.e. not necessarily least-squares) problem with a single scalar residual and an arbitrary number of parameters.

Depending upon the problem you are solving, sometimes it _may_ be that L-BFGS works better than BFGS, because it uses only information from the last Solver::Options::max_lbfgs_rank iterations and thus 'forgets' information about the problem 'far away' from the current position (which may no longer be accurate / helpful), whereas in BFGS this information is always there, only attenuated by more recent information.  However, this is not the primary purpose of L-BFGS (which is to limit memory usage in problems with very large numbers of parameters), and BFGS usually works better, as the information L-BFGS drops is often still useful.
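If you want to experiment with the trade-off Alex describes, the relevant knob is Solver::Options::max_lbfgs_rank. A minimal sketch, assuming the option names as in Ceres 1.10:

```cpp
#include "ceres/ceres.h"

// Use L-BFGS, keeping curvature information from at most the last 20
// iterations. Larger values behave more like full BFGS; smaller values
// forget old curvature sooner but use less memory.
void UseLbfgs(ceres::Solver::Options* options) {
  options->minimizer_type = ceres::LINE_SEARCH;
  options->line_search_direction_type = ceres::LBFGS;
  options->max_lbfgs_rank = 20;
}
```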

-Alex


Yilan Chen

Mar 5, 2015, 6:43:14 AM3/5/15
to ceres-...@googlegroups.com
Hi Alex,

Thank you for your detailed explanation, it really helps me understand the problem better. I'm trying to split one of my residuals into several residuals and make some adjustments - actually I just added them up for convenience before; I didn't know that it matters.
Again, thanks for your time!

Yilan
...