Dear John:
Sorry to disturb you. I have recently run into a problem when using the minimum variance (equivalently, the standard deviation) of reserve as the objective function. The model is for balancing the reserve of a power system over time.
A simplified version of the model is as follows:
% n_gen is the number of generators; n_T is the number of time periods under study
n_gen = 3; n_T = 3;
UnitState = binvar(n_gen, n_T);
GenOutput = sdpvar(n_gen, n_T);
Load = [0.8,1.5,2.1];
Capacity = [1 2 3];
C = [];
for t = 1:n_T
    % Each unit's output is limited by its on/off state times its capacity
    C = [C, UnitState(1,t)*Capacity(1) >= GenOutput(1,t), ...
            UnitState(2,t)*Capacity(2) >= GenOutput(2,t), ...
            UnitState(3,t)*Capacity(3) >= GenOutput(3,t)];
    % Total output must cover the load in every period
    C = [C, GenOutput(1,t) + GenOutput(2,t) + GenOutput(3,t) >= Load(t)];
end
AverageReserve = sdpvar(1,1);
Reserve = sdpvar(1, n_T);
% AverageReserve is the mean of the per-period relative reserves
C = [C, AverageReserve*n_T == Reserve(1,1) + Reserve(1,2) + Reserve(1,3)];
for t = 1:n_T
    % Relative reserve: committed capacity above the load, normalized by the load
    C = [C, Reserve(1,t) == (UnitState(1,t)*Capacity(1) + UnitState(2,t)*Capacity(2) ...
                             + UnitState(3,t)*Capacity(3) - Load(t))/Load(t)];
end
% Objective: variance of the relative reserve across the time periods
Obj = 0;
for t = 1:n_T
    Obj = Obj + (Reserve(1,t) - AverageReserve)^2/n_T;
end
result = optimize(C, Obj);
When the problem scale increases, the lower bound of this MIQP stays at 0 and hardly improves as the solver runs, which leads to very long solution times. I have tried different solvers, such as Gurobi and CPLEX, but that does not help.
Is there any way to deal with this problem?
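One workaround I am wondering about (just a sketch; Dev is a new auxiliary variable that is not in my original model) is replacing the variance with the mean absolute deviation of the reserve, so that the problem stays a MILP instead of an MIQP:

```matlab
% Hypothetical linear alternative: minimize the mean absolute deviation
% of the reserve instead of its variance (keeps the problem a MILP).
Dev = sdpvar(1, n_T);              % Dev(t) will bound |Reserve(t) - AverageReserve|
for t = 1:n_T
    C = [C, Dev(1,t) >= Reserve(1,t) - AverageReserve, ...
            Dev(1,t) >= AverageReserve - Reserve(1,t)];
end
ObjMAD = sum(Dev)/n_T;             % mean absolute deviation
result = optimize(C, ObjMAD);
```

Would this be a reasonable way to avoid the weak quadratic bound, or is there a better reformulation of the variance objective?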
Yours,
Wang Yanping