parallelize or tools


Salman Yousaf

Feb 9, 2017, 4:45:02 AM2/9/17
to or-tools-discuss
I have used the integer programming implementation from Google Optimization Tools, but my number of variables is so high that the program crashes on my machine. Can we run the solver in parallel mode, or does anybody in this group have an idea of how to solve a linear programming problem in a parallel fashion?


Thanks.

csm...@bytefoods.co

unread,
Feb 9, 2017, 7:50:18 PM2/9/17
to or-tools-discuss
I've also looked into this; I think you need to do it yourself, so something like:

- partition the space
- solve each subspace on a separate thread
- combine the solutions

Maybe you could use Spark for it?
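A minimal sketch of that partition/solve/combine pattern. The `solve_subproblem` function here is a hypothetical stand-in for a real solver call (a real OR-tools solver holds native state, so you would want one solver object per worker, not a shared one):

```python
from concurrent.futures import ThreadPoolExecutor

def solve_subproblem(variables):
    # Hypothetical stand-in for an OR-tools solve on one subspace;
    # here each "solve" just sums its chunk of variables.
    return sum(variables)

def solve_in_parallel(space, n_parts=4):
    # 1. Partition the space into roughly equal chunks.
    chunks = [space[i::n_parts] for i in range(n_parts)]
    # 2. Solve each subspace on a separate thread.
    with ThreadPoolExecutor(max_workers=n_parts) as pool:
        partial = list(pool.map(solve_subproblem, chunks))
    # 3. Combine the partial solutions.
    return sum(partial)

print(solve_in_parallel(list(range(100))))  # 4950
```

Note that this only gives a valid answer when the objective really does decompose over the partition; the hard part, as discussed below in the thread, is finding a partition where the subspaces don't share constraints.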

Salman Yousaf

unread,
Feb 14, 2017, 7:38:21 AM2/14/17
to or-tools...@googlegroups.com
Can you please describe how to partition the space here?
At the moment, my implementation has one solver object, and I add all my constraints to that object to solve for the optimum value. How can I divide this task among different threads? I appreciate your help in this regard.

It would also be awesome if you could share some useful resources on using Spark in this scenario.

Thanks for the help.

Regards!
Salman Yousaf


csm...@bytefoods.co

Feb 18, 2017, 4:11:07 PM2/18/17
to or-tools-discuss
I think it requires a little bit of creativity, because it depends on your specific problem. Basically, you want to find an abstraction layer at which the grouping of variables involves the least interaction between groups. Maybe you have to optimize the contents of some trucks, but the variables for the contents of one truck don't really share constraints with the variables of another truck. Then you can solve them separately and add a combination step for whatever supplies the trucks.
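A toy illustration of that truck decomposition, assuming the trucks genuinely share no constraints. The greedy loading routine is a hypothetical stand-in for a real MIP solve of one truck's subproblem:

```python
def load_truck(capacity, items):
    """Greedy stand-in for solving one truck's packing subproblem."""
    chosen, used = [], 0
    for name, weight in sorted(items, key=lambda x: x[1]):
        if used + weight <= capacity:
            chosen.append(name)
            used += weight
    return chosen

# Hypothetical data: each truck is an independent subproblem.
trucks = {
    "truck_a": (10, [("crate1", 4), ("crate2", 7), ("crate3", 5)]),
    "truck_b": (8,  [("crate4", 3), ("crate5", 6)]),
}

# Solve each independent subproblem, then combine the results.
plan = {t: load_truck(cap, items) for t, (cap, items) in trucks.items()}
print(plan)  # {'truck_a': ['crate1', 'crate3'], 'truck_b': ['crate4']}
```

Because no constraint mentions variables from two trucks at once, solving the subproblems separately (even on separate machines) loses nothing; the combination step only has to reconcile whatever shared resource feeds the trucks.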

The idea with Spark, or any of these cluster-computing platforms, is that it forces you to partition your problem and give each piece to a different machine on the cluster. Usually you are partitioning a database table or something like that, so maybe it doesn't fit your situation.

Andres Collart

Mar 1, 2017, 9:06:34 AM3/1/17
to or-tools-discuss
Salman, like CSM said, there are ways to partition it. If you post a bit more about the problem you're trying to solve, we may be able to be of more assistance.

-Andres

Salman Yousaf

Mar 9, 2017, 7:11:47 AM3/9/17
to or-tools...@googlegroups.com
First of all, thanks for your reply. Yes, my problem is:
We are trying to optimize a network which has 40K nodes and millions of edges. If I convert my graph problem into an LP, the Google linear optimization solver runs out of memory, which is why I thought I might be able to run it on a cluster using Spark or something like that. I would appreciate any help.

Thanks

Regards!
Salman Yousaf


Chris Smith

Mar 9, 2017, 2:44:20 PM3/9/17
to or-tools...@googlegroups.com
You might look around here

I think you'll have to find a reasonably good partition of your graph into smaller subgraphs that aren't very connected to each other, so there isn't a lot of work in the reduce step. Maybe you could do something like this: pick some random vertices, find the minimum cut for each pair of them, take some of those cuts, and repeat until you have your partitions; remember which edges you cut, and put them back in when you reassemble. It almost certainly won't give you an optimal solution, but maybe you can get something reasonably good if the subsolutions can be combined well.
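A pure-Python sketch of the cut-and-partition step described above. The min-cut computation itself is omitted; the cut edges here are assumed given, and the function just splits the graph into connected components once those edges are removed, remembering them for reassembly:

```python
from collections import defaultdict, deque

def partition_graph(edges, cut_edges):
    """Split an undirected graph into connected components after
    removing cut_edges; the cuts are kept for the reassembly step."""
    cut = set(cut_edges) | {(b, a) for a, b in cut_edges}
    adj = defaultdict(set)
    nodes = set()
    for a, b in edges:
        nodes.update((a, b))
        if (a, b) not in cut:
            adj[a].add(b)
            adj[b].add(a)
    seen, parts = set(), []
    for start in nodes:
        if start in seen:
            continue
        comp, queue = set(), deque([start])
        while queue:  # BFS over one component
            v = queue.popleft()
            if v in comp:
                continue
            comp.add(v)
            queue.extend(adj[v] - comp)
        seen |= comp
        parts.append(comp)
    return parts

# Hypothetical 6-node path graph, cut in the middle.
edges = [(1, 2), (2, 3), (3, 4), (4, 5), (5, 6)]
parts = partition_graph(edges, cut_edges=[(3, 4)])
# Two subgraphs: {1, 2, 3} and {4, 5, 6}. Solve each separately,
# then stitch the solutions back across the remembered cut (3, 4).
```

The quality of the final answer then depends entirely on how cheap the cut was: the fewer (and lighter) the edges you sever, the less the reassembly step has to repair.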

Good luck, sounds like a hard problem.
