What is the easy way to deploy ruote in a production environment?


Diego Moreno

Jun 2, 2009, 11:54:35 AM
to openwfe...@googlegroups.com
Hi!

In my project I started with ruote-web and added some features. In development mode I use a single Mongrel as the web server.

Now I am trying to deploy this service in a "more production" mode, supporting concurrent requests. I read on this list that Phusion Passenger is a bad solution because of its problems with threads.

I am thinking about two possibilities:

* Using nginx as a request dispatcher to route requests towards a Mongrel cluster. This is a good deployment for Rails applications and it surely helps with concurrent requests. But I don't know how to make the Mongrels share a single workflow engine instance (ruote instance).

* Using a JRuby/Tomcat deployment for my Rails app. I don't have experience with this architecture, so I don't know whether it can serve concurrent requests.

What is the best choice? Is there another possibility?

Best Regards,

Diego.

John Mettraux

Jun 2, 2009, 7:30:02 PM
to openwfe...@googlegroups.com
On Wed, Jun 3, 2009 at 12:54 AM, Diego Moreno <dmo...@dit.upm.es> wrote:
> Hi!
>
> in my project I started with ruote-web and I added some features. In
> development mode I use a single mongrel as a web server.
>
> Now I am trying to deploy this service in a "more production" mode,
> supporting concurrent requests. I read on this list that Phusion Passenger
> is a bad solution because of its problems with threads.

Hi Diego,

Well, Passenger is quite aggressive with the threads spawned by hosted
Rails applications.

There is a mention of threads in the Passenger guide:

http://www.modrails.com/documentation/Users%20guide.html#_smart_spawning_gotcha_2_the_need_to_revive_threads

But I never had time to study that.


> I am thinking about two possibilities:
>
> * Using nginx as a request dispatcher to serve requests towards a mongrel
> cluster. This is a good deployment for rails applications and it is surely
> useful with concurrent requests. But I don’t know how to make mongrels share
> a single workflow engine instance (ruote instance).

Others, like Kenneth, have taken the ruote-rest way (and later
ruote-kit) to let the workflow engine sit in a web application behind
the front web application[s].


> * Using a JRuby/Tomcat deployment for my rails app. I don’t have experience
> in this architecture, and thus I don’t know if it is valid for serving
> concurrent requests.
>
> What is the best choice? Is there another possibility?

Do you really need to serve "concurrent requests"? Do you really have
that many users? With bottleneck patterns?

Have you taken a look at Thin? It's a successor to Mongrel:

http://code.macournoyer.com/thin/


Best regards,

--
John Mettraux - http://jmettraux.wordpress.com

Kenneth Kalmer

Jun 3, 2009, 2:54:07 AM
to openwfe...@googlegroups.com
On Wed, Jun 3, 2009 at 1:30 AM, John Mettraux <jmet...@openwfe.org> wrote:

On Wed, Jun 3, 2009 at 12:54 AM, Diego Moreno <dmo...@dit.upm.es> wrote:
> I am thinking about two possibilities:
>
> * Using nginx as a request dispatcher to serve requests towards a mongrel
> cluster. This is a good deployment for rails applications and it is surely
> useful with concurrent requests. But I don’t know how to make mongrels share
> a single workflow engine instance (ruote instance).

Others, like Kenneth, have taken the ruote-rest way (and later
ruote-kit) to let the workflow engine sit in a web application behind
the front web application[s].

Hi Diego

We've discussed this issue on the list before at http://groups.google.com/group/openwferu-users/browse_thread/thread/bbea87de0e7b5bce/3b8d3cb6998b73c3, but I'll summarize it for you again.

The problem with running multiple instances of the engine (inside Mongrels/Passenger) is that you'll have the schedulers trampling all over each other, amongst other things. John and I have discussed some conceptual enhancements to ruote that would allow multiple instances to run together, but those chats were more of a "wow, that would be awesome!" nature than anything serious.

We're using ruote-rest in production, and a few other members of the group do too. Rails then polls ruote-rest for workitems, launches processes, etc., much the same way it would use a database. Actually, ruote-rest can be thought of as a workitem database :)
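To make the "workitem database" idea concrete, here is a minimal sketch of a Rails-side client polling a ruote-rest instance over HTTP. The base URL, the `/workitems` resource path, the `participant` query parameter, and the JSON shapes are assumptions for illustration, not the actual ruote-rest API; check the linked client code for the real routes.

```ruby
require 'net/http'
require 'json'
require 'uri'

# Hypothetical sketch of a client that treats ruote-rest as a
# "workitem database": poll for workitems over HTTP, parse the JSON.
class RuoteRestClient
  def initialize(base_url)
    @base = base_url
  end

  # Fetch pending workitems for a participant (assumed endpoint and
  # query parameter -- adapt to your ruote-rest routes).
  def workitems(participant)
    uri = URI("#{@base}/workitems?participant=#{participant}")
    response = Net::HTTP.get_response(uri)
    parse_workitems(response.body)
  end

  # Parse a JSON payload of workitems; tolerant of both a bare array
  # and an { "elements" => [...] } envelope.
  def parse_workitems(json)
    data = JSON.parse(json)
    data.is_a?(Array) ? data : (data['elements'] || [])
  end
end
```

A Rails app would call something like `RuoteRestClient.new('http://localhost:9292').workitems('accounting')` from a controller or a background poller, exactly as it would query a model.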

I've extracted some of our production code and put it up on GitHub [1] to show how we go about this. The code is a first iteration and will evolve into a Ruby gem that will be the client for ruote-kit.

In closing: if your user base is small and request volumes are low, stick to a single evented server like thin and possibly enable threadsafe mode inside Rails (making sure your own code is thread-safe). That should give you decent mileage before needing ruote-rest. If, however, this is not the case and you need to handle a lot of requests, strip the engine out of Rails and get ruote-rest behind it as quickly as possible.
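For the single-server route described above, the threadsafe switch is a one-line config change in Rails 2.2+ (a sketch; it only removes the global request lock, so every piece of your own code must itself be thread-safe):

```ruby
# config/environments/production.rb (Rails 2.2+)
# Disables the per-request global lock so one evented/threaded server
# such as thin can serve requests concurrently.
config.threadsafe!
```

The app would then be started with something like `thin start -e production`.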

[1] http://github.com/kennethkalmer/ruote-rest-rails-client


--
Kenneth Kalmer
kenneth...@gmail.com
http://opensourcery.co.za
@kennethkalmer

Diego Moreno

Jun 10, 2009, 5:53:56 AM
to openwfe...@googlegroups.com
Thanks John & Kenneth!

I will use the thin gem for a while and try to run some performance tests. I hope it will be enough. At the moment it seems the easy way :)