Convergence failures when solving a linear decomposition problem with a moderate number of dimensions


Pedro Rodrigues

Apr 9, 2018, 8:54:33 AM
to Ceres Solver
Hi everyone

I am solving a large set of linear decomposition problems of the following form.
In each case I have a composition Y (n-dimensional) that is supposed to be a linear combination of m compositions A1, ..., Am (each also n-dimensional), with non-negative weights X (m-dimensional).
That is to say, we expect Y = AX and we want to determine the X that gives the closest match.
The n residuals are r_i = sum_k(A_ik * X_k) - Y_i.
It works pretty well up to m of around 25. For bigger problems the solver still reports convergence but often fails to reach the absolute minimum. This is easy to test by using one of the base compositions Ai as the target composition Y, since the correct X is then known exactly.
The solver options being used are:
    options.line_search_interpolation_type = ceres::BISECTION;
    options.minimizer_type = ceres::TRUST_REGION;
    options.use_nonmonotonic_steps = true;
    options.max_consecutive_nonmonotonic_steps = 10;
    options.trust_region_strategy_type = ceres::DOGLEG;
    options.dogleg_type = ceres::SUBSPACE_DOGLEG;
    options.linear_solver_type = ceres::DENSE_QR;
    options.use_explicit_schur_complement = true;
    options.max_num_iterations = 400;

The m weights are constrained to be non-negative by calling SetParameterLowerBound(X, i, 0) for each index i.
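For concreteness, here is a sketch of the corresponding cost functor. The fixed sizes kN and kM and the row-major layout of A are illustrative assumptions, not my exact code:

    #include <ceres/ceres.h>

    constexpr int kN = 30;  // n: number of residuals (rows of A)
    constexpr int kM = 25;  // m: number of weights (columns of A)

    struct DecompositionResidual {
      DecompositionResidual(const double* A, const double* Y) : A_(A), Y_(Y) {}

      template <typename T>
      bool operator()(const T* X, T* residuals) const {
        for (int i = 0; i < kN; ++i) {
          T r = T(-Y_[i]);
          for (int k = 0; k < kM; ++k) {
            r += A_[i * kM + k] * X[k];  // A stored row-major, n x m
          }
          residuals[i] = r;
        }
        return true;
      }

      const double* A_;  // n x m matrix of base compositions
      const double* Y_;  // n-dimensional target composition
    };

    // problem.AddResidualBlock(
    //     new ceres::AutoDiffCostFunction<DecompositionResidual, kN, kM>(
    //         new DecompositionResidual(A, Y)),
    //     nullptr, X);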

Do you have any suggestions for improving the results?

Many thanks in advance for any help

Pedro

Sameer Agarwal

Apr 9, 2018, 1:10:43 PM
to ceres-...@googlegroups.com
Pedro,
Ceres can only find local minima. It is not guaranteed to find the globally optimal solution.
Sameer



Pedro Rodrigues

Apr 12, 2018, 10:58:10 AM
to Ceres Solver
Hi Sameer

Thank you for your clarification.

However, I was able to solve the problem by using AutoDiffLocalParameterization to keep X in the hyperplane defined by sum_i(X_i) = 1.
Without the parameterization, this condition is repeatedly violated and spontaneously reacquired during convergence to the solution, but only for numbers of dimensions below about 25. Apparently, above that, the number of degrees of freedom is so high that the algorithm can become trapped in one of the many local minima that exist outside that hyperplane, despite the TRUST_REGION variant being in use.
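For illustration, here is a minimal sketch of such a Plus(): the last coordinate absorbs the negated sum of the other m-1 increments, so sum_i(X_i) is preserved exactly. The sizes and names are illustrative, not my exact code:

    #include <ceres/ceres.h>

    constexpr int kGlobalSize = 100;             // m
    constexpr int kLocalSize = kGlobalSize - 1;  // tangent-space dimension

    struct SumPreservingPlus {
      template <typename T>
      bool operator()(const T* x, const T* delta, T* x_plus_delta) const {
        T delta_sum = T(0);
        for (int i = 0; i < kLocalSize; ++i) {
          x_plus_delta[i] = x[i] + delta[i];
          delta_sum += delta[i];
        }
        // The last component compensates, so sum(x_plus_delta) == sum(x).
        x_plus_delta[kGlobalSize - 1] = x[kGlobalSize - 1] - delta_sum;
        return true;
      }
    };

    // Registration:
    // problem.SetParameterization(
    //     X, new ceres::AutoDiffLocalParameterization<SumPreservingPlus,
    //                                                 kGlobalSize, kLocalSize>);

With this parameterization the sum constraint holds by construction, so the solver never has to reacquire it.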

Pedro

Sameer Agarwal

Apr 12, 2018, 1:43:05 PM
to ceres-...@googlegroups.com
Pedro,
are you trying to find a low rank factorization of a large matrix with missing entries?
Sameer


Pedro Rodrigues

Apr 12, 2018, 6:58:57 PM
to Ceres Solver
Hi Sameer

It can be seen as something similar to a rank factorization of a given n x 1 matrix, but in the present case one of the factors (the n x m matrix) is also known, not to be determined. The only unknown to be determined is the other factor (the m x 1 matrix), whose elements must be non-negative.
In fact, it has the form of a linear system of equations Y = AX (where Y and A are known and X is to be determined), but where the number of equations is, in general, not the same as the number of unknowns. Yet the solution X is unique, because the columns of A are linearly independent and because of the non-negativity constraint applied to the elements of X.

With the modified version of LocalParameterization::Plus() that I referred to in my previous post, Ceres is able to find the correct solutions in a small number of iterations, even for the largest cases I am dealing with (A dimensions 30x100).

Thank you very much

Pedro

Sameer Agarwal

Apr 23, 2018, 3:28:37 PM
to ceres-...@googlegroups.com
Pedro,
If one of the factors is known, then you have a non-negative linear least squares problem.
This is a convex problem, and there are specialized solvers that will perform better (in both convergence and speed) than Ceres.
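For reference, the equivalent quadratic program is: minimize (1/2) X^T A^T A X - (A^T Y)^T X subject to X >= 0. The Hessian A^T A is positive semidefinite, which is what makes the problem convex.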
There is also CVXOPT.
Sameer



Pedro Rodrigues

May 2, 2018, 8:22:43 AM
to Ceres Solver
Hi Sameer

Many thanks for your suggestions.
I spent some time experimenting with tsnnls, but unfortunately my problems do not seem to fit it. I only obtain convergence for the simplest cases (and the results produced in those cases are almost correct, apart from the presence of some small negative values) with the unconstrained solver function t_lsqr(). The constrained solver function t_snnls() never converged in my tests.
As for CVXOPT, it is not convenient for me to use, since I need a library I can call from inside my C++ program.

On the other hand, I managed to improve the computation speed of Ceres by a factor of two by using analytic derivatives. Since the Jacobian is always constant, it could be improved even further if there were some way to make Ceres take advantage of this fact by evaluating the Jacobian only once. Do you have a suggestion for where in the Ceres source code I could make a tweak to this end?

Thanks again

Pedro

Sameer Agarwal

May 4, 2018, 7:44:41 PM
to ceres-...@googlegroups.com
Hi Pedro,
My comments are inline.

On Wed, May 2, 2018 at 5:22 AM Pedro Rodrigues <re...@fc.ul.pt> wrote:
Hi Sameer

Many thanks for your suggestions.
I spent some time experimenting with tsnnls, but unfortunately my problems do not seem to fit it. I only obtain convergence for the simplest cases (and the results produced in those cases are almost correct, apart from the presence of some small negative values) with the unconstrained solver function t_lsqr(). The constrained solver function t_snnls() never converged in my tests.

I am surprised. But good to know.

I am not trying to get you off of Ceres, but I think a QP solver is really what you need here :). So, a few more suggestions:

If you are looking for a high-quality convex optimization library, I can recommend Mosek (www.mosek.com); I have used it in the past and it is absolutely first rate, both in quality and in support. Depending on what you are doing, you may be able to use it for free.

Another, more recent option that I have heard good things about is OSQP (https://github.com/oxfordcontrol/osqp).

As for CVXOPT, it is not convenient for me to use, since I need a library I can call from inside my C++ program.

On the other hand, I managed to improve the computation speed of Ceres by a factor of two by using analytic derivatives. Since the Jacobian is always constant, it could be improved even further if there were some way to make Ceres take advantage of this fact by evaluating the Jacobian only once. Do you have a suggestion for where in the Ceres source code I could make a tweak to this end?

Short of modifying Ceres itself, computing the Jacobian once (at construction) and just copying it whenever it is asked for would be the simplest way.
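Something along these lines would do it; a rough, untested sketch (names are illustrative):

    #include <cstring>
    #include <utility>
    #include <vector>
    #include <ceres/ceres.h>

    // Since r = A*X - Y, the Jacobian is the constant matrix A:
    // store it once and copy it whenever Ceres asks for it.
    class ConstantJacobianCost : public ceres::CostFunction {
     public:
      ConstantJacobianCost(std::vector<double> A, std::vector<double> Y,
                           int n, int m)
          : A_(std::move(A)), Y_(std::move(Y)), n_(n), m_(m) {
        set_num_residuals(n_);
        mutable_parameter_block_sizes()->push_back(m_);
      }

      bool Evaluate(double const* const* parameters, double* residuals,
                    double** jacobians) const override {
        const double* X = parameters[0];
        for (int i = 0; i < n_; ++i) {
          double r = -Y_[i];
          for (int k = 0; k < m_; ++k) r += A_[i * m_ + k] * X[k];
          residuals[i] = r;
        }
        if (jacobians != nullptr && jacobians[0] != nullptr) {
          // dr_i/dX_k = A_ik, row-major, which is the layout Ceres expects.
          std::memcpy(jacobians[0], A_.data(), sizeof(double) * n_ * m_);
        }
        return true;
      }

     private:
      std::vector<double> A_;  // n x m, row-major
      std::vector<double> Y_;  // n
      int n_, m_;
    };

The residual evaluation is still O(nm) per iteration, but everything beyond the memcpy of the Jacobian goes away.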

It is possible to modify Ceres internally, but it is not going to be pretty and I am hesitant to recommend that path to you. You would have to modify trust_region_minimizer to not evaluate the Jacobian after iteration zero. I forget which linear solver you are using, but the linear solver would also have to be modified to compute the factorization of the normal equations only once... though you would have to set the trust region radius to be very large for this to work.

Sameer




Thanks again

Pedro
