Alternative Approaches to Decrease Run Time Requested


Taner Cokyasar

Jul 16, 2017, 10:23:21 AM7/16/17
to Gurobi Optimization
Hello everyone,

I am using Gurobi 7.0.2 via Python 2.7.12 with Anaconda 4.2.0 (64-bit). I have a large (1M x 1M) MIP model with more than 5M nonzeros in the constraint matrix. I ran the code and waited more than 2.5 hours for results, but it was still iterating, so I left the PC in my office running to see the results on Monday. I am looking for a way to decrease the computation time.

I came across Benders decomposition, which may help me, although I have not yet researched it deeply enough to understand how it really works. I also read that FortSP uses Benders decomposition to solve problems, but FortSP is written in C, a language I have no experience with. Do you have any suggestions for solving my problem?

Thanks,

Tobias Achterberg

Jul 17, 2017, 3:54:58 AM7/17/17
to gur...@googlegroups.com
This is a pretty general question, so it is hard to give a specific answer.

First, a MIP with 1 million variables and 1 million constraints is pretty big.
Gurobi can solve some models of this size, but only if they are "easy" in a
combinatorial sense (i.e., the problem structure is simple and the LP relaxation
is very close to the integer polyhedron).

MIP in general is, as you probably know, NP-hard, and (to no surprise) the
algorithm to solve MIPs implemented in Gurobi has exponential worst case
run-time complexity. Hence, there are even very small problems (with 50
variables and just a few constraints) that Gurobi is unable to solve in
reasonable time. This is also true for other MIP solvers that currently exist,
and it will remain true until someone proves that P = NP. But most of the time,
models for practical problems have certain structure that the solver can exploit
so that the exponential worst case behavior does not kick in.

There are a few applications where the simple approach of formulating a model
and calling Gurobi as a black box does not work. These are usually areas of
scientific research, and one approach that emerges often in such research
projects is Benders decomposition. But there are other ways to tackle
complicated problems. In my view, the most important aspect is the modeling
part. Maybe you can simplify the model and make it smaller by aggregating
aspects of the real world, by using a coarser time or space discretization, by
disaggregating parts of your problem into several independent models, and so on.
This all depends on the problem you want to solve.
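To illustrate the idea behind Benders decomposition mentioned above, here is a minimal pure-Python sketch on a tiny toy problem (not the salary model, and not using Gurobi): the master problem keeps the complicating variable y and a value-function estimate theta, while the subproblem prices the continuous variable x for a fixed y and returns an optimality cut from its dual.

```python
# Minimal Benders decomposition sketch on a toy problem (illustration only):
#
#   min  y + x
#   s.t. x >= 3 - 2*y,  x >= 0,  y in {0, 1}

def subproblem(y_hat):
    """For fixed y: min x s.t. x >= 3 - 2*y, x >= 0.
    Returns the optimal value and the dual multiplier of the constraint."""
    rhs = 3 - 2 * y_hat
    if rhs > 0:
        return rhs, 1.0   # constraint binding, dual = 1
    return 0.0, 0.0       # constraint slack, dual = 0

def solve_master(cuts):
    """min y + theta s.t. theta >= lam*(3 - 2*y) for each cut, theta >= 0.
    The master here is tiny, so we simply enumerate y in {0, 1}."""
    best = None
    for y in (0, 1):
        theta = max([0.0] + [lam * (3 - 2 * y) for lam in cuts])
        if best is None or y + theta < best[0]:
            best = (y + theta, y)
    return best  # (lower bound, incumbent y)

cuts, upper = [], float("inf")
while True:
    lower, y_hat = solve_master(cuts)
    sub_val, lam = subproblem(y_hat)
    upper = min(upper, y_hat + sub_val)  # value of a feasible solution
    if upper - lower < 1e-9:             # bounds have met: optimal
        break
    cuts.append(lam)                     # add an optimality cut

print(y_hat, upper)  # optimal y and objective
```

In a real application the master would be a MIP and the subproblem an LP solved by Gurobi, with cuts added via lazy-constraint callbacks; the loop structure, however, is the same.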

Finally, you could provide a link to your problem file (in mps format) so that
we can take a look. Maybe there is something like parameter settings that we can
recommend.


Regards,

Taner Cokyasar

Jul 18, 2017, 10:35:40 AM7/18/17
to Gurobi Optimization
Dr. Achterberg,

First of all, thank you very much for your interest. I am doing research to develop a model that adjusts salary increases based on some assumptions. Basically, we minimize the differences between a company's employees' salaries and the sector average for each employee's position to determine the new (increased) salary of every employee in the company. In addition, we determine the best month to make the increase for each employee.

You can see my code (Salary Data Generator for Banking Industry.ipynb) and the model (Formulation.docx) in the attached files. There is also an Excel file (Salary Comparison for Banking.xlsx) with the input data for the generator section of the code. The second cell of the code takes the generated data and solves the model.

Recently, Gurobi solved the problem in almost 12 hours. The size of the problem depends on i, m, and t; however, m=5 and t=12 are fixed, so i (the number of employees) determines the size. In this example, I used i=121,941. Gurobi could solve i=10,000 in 3 minutes. If you can suggest anything to decrease this time (12 hrs), I would really appreciate it.

Best regards,
Salary Data Generator for Banking Industry.ipynb
Salary Comparison for Banking.xlsx
Formulation.docx

Tobias Achterberg

Jul 21, 2017, 7:42:52 AM7/21/17
to gur...@googlegroups.com
I looked at your model. It does look quite reasonable.

I can imagine that constraints (7) and (8) make the model hard to solve. They
implement some kind of scheduling structure (earliest and latest "start time"),
which is always hard to deal with for MIP solvers. What is the value of the
big-M you are using in these constraints?
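For readers following along: the actual constraints (7) and (8) are in the attached Formulation.docx, so the form below is only a hypothetical illustration of the kind of big-M "earliest start" constraint being discussed, and of why a tight M matters.

```python
# Hypothetical big-M "earliest start" constraint of the form
#   s_i >= e_i - M * (1 - y_i)
# which enforces s_i >= e_i only when the binary y_i is 1. The actual
# constraints (7)-(8) in the attached model may differ; M = 6 is used
# here, matching the value reported later in this thread.

M = 6

def lower_bound_on_start(e_i, y_i):
    """Effective lower bound that the constraint imposes on s_i."""
    return e_i - M * (1 - y_i)

# With y_i = 1 the earliest-start bound e_i is active:
assert lower_bound_on_start(4, 1) == 4

# With y_i = 0 the bound drops to e_i - M; the constraint is vacuous
# provided e_i - M lies below the smallest feasible start value:
assert lower_bound_on_start(4, 0) == -2

# The LP relaxation can set y_i fractional; the smaller M is, the less
# the bound degrades, which is why a tight big-M helps the solver:
for y_frac in (0.0, 0.5, 1.0):
    print(y_frac, lower_bound_on_start(4, y_frac))
```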

Regards,

Tobias

Taner Cokyasar

Jul 21, 2017, 8:42:13 AM7/21/17
to gur...@googlegroups.com
Thank you very much Dr. Achterberg.

The value of big M is 6. It is the smallest possible value we could use.

Best regards,



Tobias Achterberg

Jul 24, 2017, 12:56:36 PM7/24/17
to gur...@googlegroups.com
Well, 6 is not really large, so I guess this should not be a big problem.

You could generate a number of smaller instances of your problem (with fewer
employees) and then run the Gurobi tuning tool on this set in order to find
parameter settings that are better suited to your problem.
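One way to build that tuning set is to subsample employees from the full data. The sketch below uses a hypothetical data layout (a dict mapping employee id to a salary record); the actual format in Salary Data Generator for Banking Industry.ipynb will differ, so adapt accordingly.

```python
# Sketch: generate smaller instances by sampling employees, so the Gurobi
# tuning tool can be run on a representative set. The data layout here
# (employee id -> current salary) is hypothetical.
import random

def sample_instance(employees, size, seed=0):
    """Return a reduced instance with `size` randomly chosen employees."""
    rng = random.Random(seed)
    chosen = rng.sample(sorted(employees), size)
    return {i: employees[i] for i in chosen}

# Full (toy) data set: employee id -> current salary.
full = {i: 30000 + 10 * i for i in range(100)}

# A few shrinking instances for tuning. Each would be built into a model
# and either tuned in-process or written out as an .mps file for the
# grbtune command-line tool.
instances = [sample_instance(full, n) for n in (50, 20, 10)]
print([len(inst) for inst in instances])  # -> [50, 20, 10]
```

The models built from these instances can then be passed to the tuning tool, which searches for parameter settings that outperform the defaults on the whole set.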


Regards,

Tobias