MIP solver performance question


Feng

Jun 29, 2023, 2:22:37 PM
to or-tools-discuss
Hi,

I have a general question about OR-Tools MIP solver performance:

If the MIP problem is a combination of multiple disjoint subproblems, how efficient will the solver be? There are two options:

1. Just throw the whole problem at the solver.
2. Preprocess the problem, split it into multiple smaller subproblems, and use the solver to solve each of them.

Will there be a significant difference in performance between 1 and 2?

Thanks!

Bruno De Backer

Jun 30, 2023, 4:40:22 AM
to or-tools...@googlegroups.com
Option 2 is far better.
In general, linear programming algorithms don't "see" that problems are disjoint, and just "mix up" everything. This is due to fill-in when computing with sparse matrices. 
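For illustration, here is a minimal sketch of option 2 with pywraplp: group the variables into connected components with a union-find over the constraints, then build and solve one small MIP per component. The data layout (constraints as (coefficients, lower bound, upper bound) tuples, non-negative integer variables, a minimized linear objective) is made up for the example, not an OR-Tools convention.

from ortools.linear_solver import pywraplp

def find(parent, i):
    # Union-find "find" with path halving.
    while parent[i] != i:
        parent[i] = parent[parent[i]]
        i = parent[i]
    return i

def solve_by_components(num_vars, constraints, objective):
    # constraints: list of (coeffs, lb, ub) with coeffs = {var_index: coefficient};
    # objective: {var_index: coefficient}, to be minimized.
    # First, union variables that appear in the same constraint.
    parent = list(range(num_vars))
    for coeffs, _lb, _ub in constraints:
        ids = list(coeffs)
        for j in ids[1:]:
            parent[find(parent, ids[0])] = find(parent, j)

    # Group constraints by the component of (any of) their variables.
    groups = {}
    for c in constraints:
        groups.setdefault(find(parent, next(iter(c[0]))), []).append(c)

    # Build and solve one independent MIP per component.
    total = 0.0
    for group in groups.values():
        solver = pywraplp.Solver.CreateSolver('SCIP')
        comp_vars = sorted({j for c in group for j in c[0]})
        x = {j: solver.IntVar(0, solver.infinity(), 'x%d' % j) for j in comp_vars}
        for coeffs, lb, ub in group:
            ct = solver.Constraint(lb, ub)
            for j, a in coeffs.items():
                ct.SetCoefficient(x[j], a)
        obj = solver.Objective()
        for j in comp_vars:
            obj.SetCoefficient(x[j], objective.get(j, 0.0))
        obj.SetMinimization()
        if solver.Solve() == pywraplp.Solver.OPTIMAL:
            total += obj.Value()
    return total

(Variables that appear in no constraint are skipped here, and a real implementation would check solver creation and handle non-optimal statuses.)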

--
BdB

Feng

Jun 30, 2023, 1:26:06 PM
to or-tools-discuss
Thanks Bruno. I just wanted to check whether the current implementation detects disjoint subproblems and optimizes for them under the hood. So the solver is a general one, and we are better off preprocessing ourselves.

Feng

Jun 30, 2023, 1:29:58 PM
to or-tools-discuss
And by the way, is there a guideline on the limits of the MIP solver?

Runtime is usually within 1 s in my trial tests, but it increases drastically as the system gets large. Do we have an estimate such as: if the number of variables is larger than some N, the runtime will be long, say > 1 min?

Bruno De Backer

Jul 3, 2023, 5:06:11 AM
to or-tools...@googlegroups.com
No guideline, really. 
There are very small real problems (a few dozen variables and constraints) for which the optimum has still not been proven.
One can build almost arbitrarily small problems that are arbitrarily hard. 
That's the general case: exponential in the worst case.
Then each subclass of problems behaves differently. But the algorithm is still exponential, and the underlying simplex takes O(number of constraints * log(number of variables)) iterations, with each iteration being O(n^2) or O(n^3).
This gives you a lower bound on the complexity of MIP. 
So, if you multiply the time spent by 60, I'd expect no more than a possible 5x increase in size (with total work growing roughly as the cube of size, 60x more time buys about 60^(1/3) ≈ 4x).
However, you may be able to discover dominating constraints, apply some clever preprocessing, or abandon the need for a proven optimum, and you may get better speed in the end.
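For the last point, here is a small sketch of trading the proven optimum for speed with pywraplp: cap the wall-clock time and accept, say, a 5% relative MIP gap. The 10-second limit and the 5% gap are arbitrary illustration values.

from ortools.linear_solver import pywraplp

solver = pywraplp.Solver.CreateSolver('SCIP')
# ... build variables, constraints and objective as usual ...

solver.SetTimeLimit(10000)  # wall-clock limit, in milliseconds

params = pywraplp.MPSolverParameters()
params.SetDoubleParam(pywraplp.MPSolverParameters.RELATIVE_MIP_GAP, 0.05)

status = solver.Solve(params)
# FEASIBLE means the solver stopped early with a solution but no proven optimum.
if status in (pywraplp.Solver.OPTIMAL, pywraplp.Solver.FEASIBLE):
    print('objective =', solver.Objective().Value())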




--
BdB

Feng

Jul 3, 2023, 4:34:30 PM
to or-tools-discuss
Hi Bruno, thanks for the detailed information!