Re-Optimization


Ignacio

Dec 7, 2012, 6:19:08 AM
to yal...@googlegroups.com
Hello.

I am dealing with a quadratic non-convex problem. With some changes I have converted it to a convex problem and obtained a solution. The solution is not bad, but it is not as good as I think it could be.

My question is: can I solve the initial quadratic non-convex problem to "re-optimize" the solution I have just obtained? I know it is NP-hard and that I will obtain local solutions, but if I am lucky the local solution will be the global one, because the initial point I am providing is close enough to the global solution.

What do you think?

Thanks in advance,
Ignacio

Johan Löfberg

Dec 7, 2012, 3:10:50 PM
to
Yes, use the 'usex0' option to give the solver an initial guess. Since you have just solved a problem, a solution is already assigned internally and can thus be used (otherwise, use the command assign).

sdpvar x
RealProblem = [x^2 == 1];
SimpleProblem = [x^2 <= 1];
objective = 1;

solvesdp(SimpleProblem,objective);
solvesdp(RealProblem,objective,sdpsettings('usex0',1));



Ignacio

Dec 10, 2012, 6:43:05 AM
to yal...@googlegroups.com
Hello,

I have run it and it has been running for three days without finishing. I guess something is wrong.

I am wondering how much I am helping the solver by providing a good initial point. The initial guess I am providing is actually a solution of the problem, but I would like to refine it. (The problem I want to solve is non-convex.)

Is it a good idea to re-optimize the solution like that?

Thanks a lot!

Johan Löfberg

Dec 10, 2012, 8:02:08 AM
to yal...@googlegroups.com
It depends on the problem. In many cases it does not help much. Mixed-integer solvers are based on computing upper bounds (from feasible solutions) and lower bounds (from relaxations), and then tightening these bounds. In some models it is easy to find feasible solutions, or even the global solution, so an initial guess does not contribute much; the hard part is to close the gap from below, i.e., to compute good lower bounds.
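
To make the point concrete, here is a hedged sketch (not YALMIP's actual internals) of the branch & bound loop a solver like BMIBNB runs; pop, local_solve, branch and best_relaxation are hypothetical helpers:

```matlab
upper = inf; lower = -inf;               % upper from feasible points, lower from relaxations
queue = {root};                          % open nodes (hypothetical root node)
while ~isempty(queue) && upper - lower > tol
    node  = pop(queue);                  % pick an open node
    [xf, fval] = local_solve(node);      % feasible point -> candidate upper bound
    upper = min(upper, fval);            % a good initial guess only improves this line
    queue = branch(queue, node);         % split node; prune nodes whose relaxation > upper
    lower = best_relaxation(queue);      % best lower bound over remaining nodes
end
```

A warm start lowers upper at iteration one, but termination still has to wait for lower to climb up to it.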

If you show the display from the solver, it might be possible to see what happens in your case. 

BTW, which solver are you talking about? Neither CPLEX nor Gurobi support initial guesses in the current version of YALMIP.

Ignacio

Dec 10, 2012, 9:33:48 AM
to yal...@googlegroups.com
Which one should I use to solve a problem like this?

Alfa = sdpvar(1);
x = sdpvar(M*N,1,'full','complex');
assign(x,wo.');
ripple=0.5;
Constraints=[];
for i=1:I
    Constraints=[Constraints, (10^(-ripple/20))^2 <=abs(x'*Ac(i,:)'*Ac(i,:)*x) <= (10^(+ripple/20))^2];
end
for j=1:J
    Constraints=[Constraints, -Alfa <= real(A(j,:)*x) <= Alfa,-Alfa <= imag(A(j,:)*x) <= Alfa];
end
solvesdp(Constraints,Alfa,sdpsettings('usex0',1))

Johan Löfberg

Dec 10, 2012, 9:39:09 AM
to yal...@googlegroups.com
Ah, that's right, you have a nonconvex QCQP. The only global solver available through YALMIP is the internal solver BMIBNB.

You say you have been waiting for several days, so which solver did you use? That sounds strange, since YALMIP doesn't select the global solver by default. Maybe your problem is so massive that even a local solver chokes on it?

Attach your data so that I can run the code.

Johan Löfberg

Dec 10, 2012, 9:41:47 AM
to yal...@googlegroups.com
This

abs(x'*Ac(i,:)'*Ac(i,:)*x)

makes no sense. The quadratic expression is non-negative by construction. However, when you use the abs operator like this, YALMIP most likely introduces binary variables to allow for nonconvex use of abs.
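
Concretely (a sketch reusing the loop from your post), the abs can simply be dropped; a real() cast keeps the symbolic quadratic real-valued:

```matlab
for i = 1:I
    q = real(x'*Ac(i,:)'*Ac(i,:)*x);   % = |Ac(i,:)*x|^2, nonnegative by construction
    Constraints = [Constraints, (10^(-ripple/20))^2 <= q <= (10^(+ripple/20))^2];
end
```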

Ignacio

Dec 10, 2012, 10:06:24 AM
to yal...@googlegroups.com
Thanks,

The code is attached; the data comes from previous routines.
Code.rar

Ignacio

Dec 10, 2012, 10:35:04 AM
to yal...@googlegroups.com
Sorry, there was a mistake.
I attach the correct one.
Code.rar

Johan Löfberg

Dec 10, 2012, 2:23:37 PM
to yal...@googlegroups.com


The problem is simply way too large to be solved using a general global branch & bound solver. To have any chance at all, you would have to look at a global solver developed with this specific kind of problem in mind, although I doubt it would be tractable even then.

Now, it looks like a simple filter design problem, and surely there is a vast literature on methods to solve these problems efficiently, using various changes of variables and/or heuristics.

Just for reference, the model didn't even start to compute on my machine before I cancelled it, i.e., it spends an eternity just setting up the problem. You can improve this with two tricks. To begin with, as it is now, you force YALMIP to symbolically define dense quadratic polynomials with 242 variables, 414 times. This takes a lot of time. Better: create a matrix containing all involved monomials once, and then rearrange the constraints to

X = x*x';
for i=1:ThetaC
    Constraints=[Constraints, (10^(-ripple/20))^2 <=real(Ac(i,:)*X*Ac(i,:)') <= (10^(+ripple/20))^2];
end

Also, in the second for-loop you waste time, since you perform a very simple vectorizable operation. Replace it with

Constraints=[Constraints, -Alfa <= real(A*x) <= Alfa,-Beta <= imag(A*x) <= Beta];

Now bmibnb starts within 30 s or so. But then it will go on forever.

BTW, usex0 will not work, since you haven't defined initial values for Alfa and Beta.
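
One way to complete the initial point (a sketch, assuming wo is the stored solution from the convex problem) is to assign the tightest Alfa and Beta consistent with that x:

```matlab
assign(x, wo.');
assign(Alfa, max(abs(real(A*wo.'))));   % smallest feasible Alfa for this x
assign(Beta, max(abs(imag(A*wo.'))));   % smallest feasible Beta for this x
```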

Moreover, your initial solution is not feasible at all.

 real((wo.')'*Ac(end,:)'*Ac(end,:)*(wo.'))

yields 0.62, whereas your lower bound is 0.89.

Finally, you are making the classical mistake of creating a huge problem and then just watching it go up in smoke. Reduce and simplify until you get something you actually can solve. Below, you see that the model is hard to solve already with 3 variables in x.

yalmip('clear')
clear all
load A;
load Ac;
load wo;

% Create a tiny version
A = A(:,1:3);
Ac = Ac(:,1:3);
[ThetaC,~]=size(Ac);
[ThetaSLL1,~] = size(A);
Alfa = sdpvar(1);
Beta = sdpvar(1);
x = sdpvar(size(A,2),1,'full','complex');
ripple=0.5;
Constraints=[];
X = x*x';
for i=1:ThetaC  
    Constraints=[Constraints, (10^(-ripple/20))^2 <=real(Ac(i,:)*X*Ac(i,:)') <= (10^(+ripple/20))^2];
end
Constraints=[Constraints, -Alfa <= real(A*x) <= Alfa,-Beta <= imag(A*x) <= Beta];

% Try different local solvers
% Takes 10 seconds when we keep 3 columns using ipopt
% Takes 20 seconds when we keep 6 columns
% Takes 80 seconds when we keep 12 columns
% Takes 65 seconds when we keep 20 columns
% Takes 107 seconds when we keep 30 columns
% Takes 172 seconds when we keep 40 columns
solvesdp(Constraints,Alfa+Beta,sdpsettings('solver','ipopt'))
solvesdp(Constraints,Alfa+Beta,sdpsettings('solver','fmincon'))

% Try global solver
% Uses undocumented hack to turn off all bound-propagation
solvesdp([Constraints, -100 <= [real(x);imag(x);Alfa;Beta] <= 100],Alfa+Beta,sdpsettings('solver','bmibnb','bmibnb.strengthscheme',[]))




