Multi-threaded?


Colin

unread,
Feb 12, 2013, 11:02:12 PM2/12/13
to opentrippl...@googlegroups.com
Is OTP meant to be multi-threaded when planning trips and building graphs? I've noticed that when planning longer trips it only uses one core (sometimes briefly touching a second), and it's the same when building graphs. I use the -server argument both when building graphs and when running the packaged Winstone server.

Is this a problem with my configuration, or is it just how OTP runs? It seems a waste to have so many cores available but only use one. Can I do anything to utilise more of the CPU?

I've used OTP for a while but haven't looked very closely at performance; with larger graphs, though, I'd like to refine the trip planning process as much as I can. I'm sourcing some faster memory to run benchmarks with a tool I wrote: the plan is to take a random sample of trips and time how long OTP takes at different memory speeds.

David Turner

unread,
Feb 13, 2013, 1:26:36 AM2/13/13
to opentrippl...@googlegroups.com
OTP is multi-threaded -- but it only uses one core per plan.
Multi-threaded planning is fairly complicated, IIRC.

Colin

unread,
Feb 13, 2013, 2:34:43 AM2/13/13
to opentrippl...@googlegroups.com
OK thanks David, that confirms my thinking. I'll continue to favour core speed over number of cores for hardware purchases.

Andrew Byrd

unread,
Feb 13, 2013, 2:44:12 AM2/13/13
to opentrippl...@googlegroups.com
On 02/13/2013 08:34 AM, Colin wrote:
> OK thanks David, that confirms my thinking. I'll continue to favour core speed over number of cores for hardware purchases.

Of course this depends on what you're doing with OTP -- a public-facing
trip planning server will often be handling many simultaneous requests,
and many analysis tasks are embarrassingly parallel. If your problem
naturally breaks down into multiple parallel requests, OTP scales very
well with the number of cores. For example, the Analyst batch processor
will run about four times faster with four cores.
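To illustrate that fan-out pattern, a batch of independent plan requests can be issued concurrently against a running OTP instance. This is only a sketch: the base URL and the origin/destination pairs below are made up, and it assumes the OTP 1.x REST endpoint `/otp/routers/default/plan` is reachable on localhost.

```python
from concurrent.futures import ThreadPoolExecutor
import json
import os
import urllib.parse
import urllib.request

# Base URL of a locally running OTP instance (an assumption; adjust as needed).
OTP_PLAN_URL = "http://localhost:8080/otp/routers/default/plan"

def build_params(from_place, to_place, date, time):
    """Build the query parameters for one /plan request."""
    return {
        "fromPlace": from_place,
        "toPlace": to_place,
        "date": date,
        "time": time,
        "mode": "TRANSIT,WALK",
    }

def plan_trip(od_pair):
    """Fire a single plan request and return the parsed JSON response."""
    params = build_params(*od_pair)
    url = OTP_PLAN_URL + "?" + urllib.parse.urlencode(params)
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Hypothetical (origin, destination, date, time) tuples.
    od_pairs = [
        ("60.17,24.94", "60.20,24.96", "11-21-2016", "08:00"),
        ("60.18,24.93", "60.21,24.95", "11-21-2016", "08:15"),
    ]
    # One worker per core keeps the planner saturated without oversubscribing.
    with ThreadPoolExecutor(max_workers=os.cpu_count()) as pool:
        results = list(pool.map(plan_trip, od_pairs))
```

Since each request is handled on its own server thread, the client-side thread count (not the GIL) is what limits throughput here: the threads spend their time waiting on the network.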

-Andrew

Rafael Pereira

unread,
Nov 20, 2016, 12:46:36 PM11/20/16
to OpenTripPlanner Users
Hi Andrew,

I'm using a Python script to call OTP and estimate several travel time matrices at different departure times (script attached). This seems to be an embarrassingly parallel task, but I'm very new to Python.

Is there any chance you could give me a hand making this short script parallel using multi-threading?

best wishes,

Rafael
sto_traveltimematrix_pt parallel.py

Andrew Byrd

unread,
Nov 21, 2016, 1:30:38 AM11/21/16
to Rafael Pereira, OpenTripPlanner Users
Hi Rafael,

This particular problem is specific to Python itself rather than OTP. Fortunately there’s a huge Python community and many answers to this question are already available on Stack Overflow, blogs, etc. around the web.


From a cursory glance at these answers, it looks like you’ll want to use the threading, multiprocessing, or concurrent.futures libraries.

In my experience OTP scales almost linearly with the number of cores, so you’ll want to start as many requesting threads/processes as your machine has cores.
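For the travel-time-matrix case, one way to apply the multiprocessing library Andrew mentions is to treat each departure time as an independent job and hand it to a pool of worker processes. This is a sketch under stated assumptions: `compute_matrix` is a placeholder standing in for whatever one run of the existing script does for a single departure time.

```python
from multiprocessing import Pool
import os

def departure_times(start_hour, end_hour, step_minutes):
    """Enumerate HH:MM departure times at a fixed interval."""
    times = []
    for h in range(start_hour, end_hour):
        for m in range(0, 60, step_minutes):
            times.append(f"{h:02d}:{m:02d}")
    return times

def compute_matrix(dep_time):
    # Placeholder: call the existing single-departure-time routine here,
    # e.g. one invocation of the current script or one web API sweep.
    return (dep_time, "matrix-for-" + dep_time)

if __name__ == "__main__":
    # One process per core; each departure time is an independent job,
    # so there is no shared state to worry about.
    with Pool(processes=os.cpu_count()) as pool:
        results = pool.map(compute_matrix, departure_times(7, 9, 30))
```

Processes rather than threads sidestep CPython's GIL entirely, which matters if the per-departure-time work is CPU-bound on the client side.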

Andrew

Andrew Byrd

unread,
Nov 21, 2016, 1:36:34 AM11/21/16
to Rafael Pereira, OpenTripPlanner Users
Hi again Rafael,

I had originally assumed that by “using a python script to call OTP” you meant you were calling the web API. After looking at your script in more detail I now realize you are using the OTP scripting extensions, which I’ve never used. I have no idea how that will interact with Python multi-processing, so you may want to wait for an answer from someone who is more familiar with this system.

Andrew


Laurent Grégoire

unread,
Nov 21, 2016, 5:56:59 AM11/21/16
to OpenTripPlanner Users, correiod...@gmail.com

Hi Rafael,

There is nothing that prevents you from parallelizing the script. The expensive operation is spt = router.plan(req), so that is the operation you probably want to parallelize first. The Python scripting API has never been tested in multi-threaded scripts, but if you take care to synchronize I/O access and use each SPT in a single thread, all should be fine (the cached sample evaluation method is synchronized on the Java side, as it may be accessed from different threads).

As for how to do this in Python, there are indeed lots of resources on the web to help you; it is not really OTP-specific.
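The structure Laurent describes (planning in parallel, each SPT consumed only by the thread that produced it, shared output guarded by a lock) might be sketched as follows. I have not used the scripting API myself, so the planning call is stood in for by a `plan` callable and `summarize` is a placeholder for the SPT evaluation step; note that under Jython, which the OTP scripting extension embeds, Python threads are Java threads and do run in parallel.

```python
import threading

write_lock = threading.Lock()
results = []

def summarize(spt):
    # Placeholder for evaluating the SPT (done in the producing thread only).
    return spt

def worker(plan, requests):
    """Plan each request and evaluate its SPT in this thread only,
    then append the row to the shared output under the I/O lock."""
    for req in requests:
        spt = plan(req)       # the expensive call: router.plan(req) in OTP
        row = summarize(spt)  # use the SPT in this single thread
        with write_lock:      # synchronize access to the shared output
            results.append(row)

def run(plan, all_requests, n_threads):
    """Partition requests across n_threads workers and wait for them all."""
    chunks = [all_requests[i::n_threads] for i in range(n_threads)]
    threads = [threading.Thread(target=worker, args=(plan, c))
               for c in chunks]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```

The lock only wraps the append, so the planning calls themselves still proceed concurrently; output row order is nondeterministic, so each row should carry its own request identifier.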

HTH,

--Laurent



Rafael Pereira

unread,
Nov 21, 2016, 10:17:34 AM11/21/16
to OpenTripPlanner Users
Hi Andrew and Laurent,

thank you for your suggestions, I will look into them. I understand this is a Python question, sorry for posting it here; I thought someone might have already done this before. In any case, I opened a SO question.


ps. I'm familiar with R but I'm illiterate with Python :] 

best,

Rafael