Nonlinear convex programming


Jane

Aug 28, 2015, 9:37:08 AM
to YALMIP

Hello,

I am trying to maximize the following function, which is concave. However, when I try to solve it in YALMIP with mosek or sdpt3, it says “Solver not applicable”. I have implemented it in CVX, which has no issue with the convexity of the problem.


My function:    max   x*log(1+y/x)


X and Y are real positive variables with bounds. I am sure that my optimization problem is feasible and bounded. I am new to YALMIP and do not know what the problem is.


My code in YALMIP:


X = sdpvar(a,b,c);
Y = sdpvar(a,b,c);
K = sdpvar(a,b,c);

Constraint = [K <= X.*log(1+Y./X)];
Constraint = [Constraint, some other “linear” constraints];

solvesdp(Constraint,-1*sum(sum(sum(K,1),2),3))

 

I would be very thankful if you could help me.

Jane

Johan Löfberg

Aug 28, 2015, 9:52:46 AM
to YALMIP
sdpt3 does not support this type of function. However, cvx implements a simple approximation of logarithms for conic solvers.

In general, I think it is weird to try to solve standard nonlinear convex programs using a conic approximation; I would just tell you to use, e.g., fmincon in yalmip (and I think there is some kind of entropy operator for this specific function in YALMIP which will be slightly faster than simply typing the nonlinear expression directly; I don't remember its name though... perspective_slog or something). Perhaps some of your other "linear" constraints are conic, since yalmip says there is no suitable solver? Please post the complete model.

Johan Löfberg

Aug 28, 2015, 9:59:21 AM
to YALMIP
An operator (I think...) is -kullbackleibler(x,x+y) (the benefit of using this in yalmip/fmincon is that the division is avoided, and derivatives are evaluated directly on this operator, avoiding any intermediate variables). But as I said, it is only applicable with a nonlinear solver.
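For example, a minimal sketch along those lines (the bounds here are just placeholders, not from your model):

% Maximize x*log(1+y/x) by minimizing kullbackleibler(x,x+y) = x*log(x/(x+y)),
% which equals -x*log(1+y/x). Placeholder bounds keep the problem bounded.
x = sdpvar(1); y = sdpvar(1);
Constraints = [0.1 <= x <= 10, 0 <= y <= 10];
optimize(Constraints, kullbackleibler(x,x+y), sdpsettings('solver','fmincon'))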

Johan Löfberg

Aug 28, 2015, 10:41:01 AM
to YALMIP
BTW, if it is conic, install scs and describe your model with the kullbackleibler operator; YALMIP will automatically put the model in a form suitable for the exponential-cone solver scs.
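Something like this, with the same kind of placeholder bounds (just switching the solver):

x = sdpvar(1); y = sdpvar(1);
Constraints = [0.1 <= x <= 10, 0 <= y <= 10];
% YALMIP derives the exponential-cone representation of kullbackleibler for scs
optimize(Constraints, kullbackleibler(x,x+y), sdpsettings('solver','scs'))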

Jane

Sep 2, 2015, 3:35:21 AM
to YALMIP
Dear Johan,
Thank you very much for your response. It solved my problem. I described my model with the kullbackleibler operator and solved it with fmincon.

I have another question.

X is a binary variable and my function is

max   x*log(1+y)


My code in YALMIP for this function is:


X = binvar(a,b,c);
Y = sdpvar(a,b,c);
K = sdpvar(a,b,c);

Constraint = [K <= -1*crossentropy(X,(1+Y))];
Constraint = [Constraint, Y>=0, X.*Y <= T];

solvesdp(Constraint,-1*sum(sum(sum(K,1),2),3))

 


I am sure that my problem is feasible. However, bnb cannot solve it. Would you please tell me whether I described my model correctly and which solver can solve my problem? I would greatly appreciate it.

Johan Löfberg

Sep 2, 2015, 3:38:23 AM
to YALMIP
Can you supply code which actually runs? As I see it, this should not go through due to a bug in the Kullback operator (which I am fixing now).

Johan Löfberg

Sep 2, 2015, 3:40:36 AM
to YALMIP
I meant a bug in crossentropy.

Johan Löfberg

Sep 2, 2015, 3:45:54 AM
to YALMIP
Your code doesn't make sense. K is a matrix (nd-array), but the result from crossentropy is a scalar (up to sign, it is sum(X.*log(1+Y))), so all those constraints are redundant. You probably want to sum up crossentropy evaluated on each element.

Jane

Sep 2, 2015, 4:06:35 AM
to YALMIP
Sorry, I tried to summarize my code, which made it confusing. I did sum up crossentropy evaluated on each element. It runs, but I get "Infeasible problem" with bnb. Is there another way, apart from crossentropy, to describe the model?

Johan Löfberg

Sep 2, 2015, 4:16:32 AM
to YALMIP
Is this what you want to solve?

Constraint = [Y>=0, X.*Y <= T];
objective = 0;
for i = 1:numel(X)
    objective = objective + crossentropy(X(i),(1+Y(i)));
end
optimize(Constraint,objective)

Don't be shy to show the code you actually want to solve. The code above has the problem that you have a bilinear constraint in X and Y, making the problem nonconvex, and thus bnb is really not applicable.

Johan Löfberg

Sep 2, 2015, 4:17:36 AM
to YALMIP
The non-crossentropy version is simply

for i = 1:numel(X)
    objective = objective - X(i)*log(1+Y(i));
end


Johan Löfberg

Sep 2, 2015, 4:23:33 AM
to YALMIP
To get rid of the nonconvexities (X.*Y, x*log(1+y)), you can linearize this model.

The nonconvex constraint can be automatically linearized (a bounded domain in Y is required):
Constraint = [Y>=0, X.*Y <= T];
Constraint = binmodel([Constraint, Y <= 100]);


The objective, if you want to maximize sum X(i)*log(1+Y(i)), can be linearized by introducing new variables Z(i) and using the objective -sum(Z) with the convex constraints

[log(1+Y) >= Z - M*(1-X), Z <= M*X]

where M is the big-M constant (too tired to work out the value, let's say 100).

This model should perform significantly better in bnb.
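A compact sketch of that objective linearization (the vector dimensions, the bound Y <= 100, and the tightened M value are just for illustration):

% Maximize sum X.*log(1+Y) via Z and big-M. With Y <= 100 we have
% log(1+Y) <= log(101) < 5, so M = 5 is already a valid, well-scaled choice.
M = 5;
X = binvar(4,1); Y = sdpvar(4,1); Z = sdpvar(4,1);
Constraints = [0 <= Y <= 100, Z <= M*X, log(1+Y) >= Z - M*(1-X)];
optimize(Constraints, -sum(Z), sdpsettings('solver','bnb'))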


Jane

Sep 3, 2015, 4:03:47 AM
to YALMIP

Thank you for your great hints. The binmodel command did not work for me, so I linearized the constraints myself. But I still cannot solve the problem. My optimization problem and YALMIP code are as follows:


My optimization problem is:


Max    sum(sum(sum(X.*log(1+Y))))

S.t.   sum(X.*log(1+Y), 3) >= A
       sum(sum(X,2),1) == 1
       sum(sum(X.*Y,3),2) <= T
       Y >= 0


Running code in YALMIP:



a = 2;
b = 2;
c = 4;
A = [1 3; 3 1];
M1 = 100;
M2 = 10000;
T = [2000; 2000];

X = binvar(a,b,c);
Y = sdpvar(a,b,c);
Z = sdpvar(a,b,c);
K = sdpvar(a,b,c);

Constraints = [Z <= M1*X, Z>=0];

for i=1:a
    for j=1:b
        for l=1:c
            Constraints = [Constraints, Z(i,j,l) - M1*(1-X(i,j,l)) <= log(1+ Y(i,j,l))];
        end
    end
end

Constraints = [Constraints, sum(Z,3)>=A];
Constraints = [Constraints, sum(sum(X,2),1) == ones(1,1,c)];
Constraints = [Constraints, sum(sum(K,3),2)<=T, K<=Y+M2*(1-X), K>=Y-M2*(1-X), K>=0, K<=M2*X];
Constraints = [Constraints, Y>=0];

optimize(Constraints,-1*sum(sum(sum(Z,1),2),3))

 

 

M1 is the big-M constant and M2 is an upper bound on the variable K.

Johan Löfberg

Sep 3, 2015, 4:08:56 AM
to yal...@googlegroups.com
Why didn't binmodel work?

Note that your X,Y,Z matrices are symmetric in the first two dimensions.
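In other words (with the dimensions from your model):

Ysym  = sdpvar(2,2,4);          % each 2x2 slice is symmetric: Ysym(1,2,k) == Ysym(2,1,k)
Yfull = sdpvar(2,2,4,'full');   % fully parameterized slices, no implicit coupling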

Jane

Sep 3, 2015, 5:04:27 AM
to YALMIP
I ran the binmodel command two times on a constraint, and the number of resulting constraints was different each time. I guess I was confused, so I linearized the constraints myself.

Even when I change all the variables to:

X = binvar(a,b,c,'full');
Y = sdpvar(a,b,c,'full');
Z = sdpvar(a,b,c,'full');
K = sdpvar(a,b,c,'full');

I still get "Infeasible problem" with the bnb solver. However, I am sure that the problem is feasible. One feasible solution is:
X(:,:,1)=[1 0;0 0];  X(:,:,2)=[0 1; 0 0];  X(:,:,3)=[0 0;1 0];  X(:,:,4)=[0 0;0 1];
Y(:,:,1)=[900 0;0 0];  Y(:,:,2)=[0 900; 0 0];  Y(:,:,3)=[0 0;900 0];  Y(:,:,4)=[0 0;0 900];

Johan Löfberg

Sep 3, 2015, 5:17:21 AM
to YALMIP
Please show me the non-working code so I can check that.

Your proposed solution is not feasible, as it isn't symmetric in the first 2D slices. You want to use sdpvar(a,b,c,'full').

Jane

Sep 3, 2015, 5:35:47 AM
to YALMIP
Thank you for your response. The non-working code is:


a = 2;
b = 2;
c = 4;
A = [1 3; 3 1];
M1 = 100;
M2 = 10000;
T = [2000; 2000];

X = binvar(a,b,c,'full');
Y = sdpvar(a,b,c,'full');
Z = sdpvar(a,b,c,'full');
K = sdpvar(a,b,c,'full');


Johan Löfberg

Sep 3, 2015, 6:33:07 AM
to YALMIP
I meant the code which doesn't work with binmodel.

This code works, but bnb fails as the lower solver fmincon fails to solve the relaxation well, for some reason. I will investigate that.

I recommend installing SCIP. The global MINLP solver in SCIP performs really well here. For a silly reason (which I will fix), you have to replace log(1+..) with log(1.00001+...): YALMIP replaces log(1+x) with an internal function slog(x) to speed up some things, but the slog operator is not supported in scip, which only supports a limited set of operators and is not based on function/derivative callbacks.
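Concretely, the constraint inside your loop would then become something like this (just a sketch of that workaround):

Constraints = [Constraints, Z(i,j,l) - M1*(1-X(i,j,l)) <= log(1.00001 + Y(i,j,l))];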

Johan Löfberg

Sep 3, 2015, 6:34:29 AM
to YALMIP
...

optimize(Constraints,-1*sum(sum(sum(Z,1),2),3),sdpsettings('solver','scip'))

Johan Löfberg

Sep 3, 2015, 6:41:41 AM
to YALMIP
Most likely the problems are due to poor scaling (M2 being so large).

Johan Löfberg

Sep 3, 2015, 9:33:09 AM
to YALMIP
Note, with scip, you can use

Constraints = [Z(:) == X(:).*log(1+Y(:))];
Constraints = [Constraints, sum(Z,3)>=A];
Constraints = [Constraints, sum(sum(X,2),1) == ones(1,1,c)];
Constraints = [Constraints, sum(sum(K,3),2)<=T, K<=Y+M2*(1-X), K>=Y-M2*(1-X), K>=0, K<=M2*X];
Constraints = [Constraints, Y>=0];

scip performs spatial branching on the nonconvex products. Keeping the nonlinear X.*Y seems to deteriorate performance massively, so I kept the linearization here.

Jane

Sep 12, 2015, 12:39:41 AM
to YALMIP
Replacing log(1+..) with log(1.00001+...) solved my problem. I greatly appreciate your help and support.