Gurobi with Apache Spark?


user34

Mar 15, 2019, 11:40:46 AM
to Gurobi Optimization
Is anyone (else) using Gurobi with Apache Spark?

We have an involved application that requires over 800,000 invocations of the Gurobi optimizer. Some of these runs take seconds; others take considerably longer. We're using Apache Spark running on Amazon Elastic MapReduce (EMR) to distribute the load.

We've found some problems with our setup, including:

1. The Gurobi license server (token server) does not handle hundreds of simultaneous token requests well. We've had to implement a random back-off. We also suspect that we sometimes exceed our license limits, so we implemented a wait-and-retry as well.
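For anyone facing the same problem, here is a minimal sketch of the random back-off we described. The `acquire` callable is a placeholder for whatever checks out a token in your setup (e.g. constructing a Gurobi environment); the attempt counts and delays are illustrative, not tuned values.

```python
import random
import time

def acquire_with_backoff(acquire, max_attempts=8, base_delay=1.0, cap=60.0):
    """Call `acquire()` until it succeeds, sleeping a randomized,
    exponentially growing delay between failed attempts.

    `acquire` is assumed to raise RuntimeError when no token is
    available (e.g. the token server is overloaded or all licenses
    are in use).
    """
    for attempt in range(max_attempts):
        try:
            return acquire()
        except RuntimeError:
            if attempt == max_attempts - 1:
                raise  # give up after the final attempt
            # Full jitter: sleep a random time in [0, min(cap, base * 2^attempt)],
            # so hundreds of workers don't retry in lockstep.
            delay = random.uniform(0, min(cap, base_delay * 2 ** attempt))
            time.sleep(delay)
```

The random jitter is the important part: a fixed retry interval just moves the thundering herd a few seconds later.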

2. Spark will queue more work than it uses. If Spark has a free node, it will occasionally launch duplicate attempts of the same task on the same partition and keep whichever finishes first. If you are paying for Gurobi by the hour, this can result in charges for work that is then discarded.
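For what it's worth, the duplicate attempts come from Spark's speculative execution, which re-launches slow-running tasks on idle executors. If the re-runs cost you more in Gurobi hours than they save in wall-clock time, it can be disabled (the exact place you set this depends on how your EMR cluster is configured):

```
# spark-defaults.conf: don't launch speculative duplicates of slow tasks
spark.speculation    false
```

Note that with speculation off, a single straggling node can hold up the whole stage, so this is a trade-off rather than a pure win.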

We've learned a lot about running Gurobi in this configuration, and we'd be happy to share our experience if others would find it useful.

