On Fri, 21 Dec 2012, Matt Wynne wrote:
>
> On 19 Dec 2012, at 17:58, R. Tyler Croy wrote:
>
> >
> > On Wed, 19 Dec 2012, Matt Wynne wrote:
> >
> >>
> >> On 19 Dec 2012, at 06:32, R. Tyler Croy wrote:
> >>
> >>> A tool might exist that does this already, but I'd like to write a Cucumber
> >>> scenario runner which will run multiple scenarios concurrently inside of one
> >>> process.
> >>>
> >>> For most of my Scenarios, they're actually backed by Capybara and the Selenium
> >>> Web Driver, which means there's a lot of I/O waiting going on.
> >>>
> >>> Using the parallel_tests gem isn't a great option because of the memory
> >>> heaviness of our Rails environment which is loaded into every Ruby process.
> >>>
> >>> Ideally I'd like to instantiate a new "runner" object for each scenario, and
> >>> run a few at a time in the same process, is this feasible?
> >>
> >> The problem you'd have is with shared mutable state. If you have multiple scenarios hitting the app concurrently, they'll fight over the state of the database and probably trip each other up. Or did you already think about that?
> >
> >
> > This has already been accounted for elsewhere, we're already running scenarios
> > in parallel with parallel_tests, the only problem is that we max out the memory
> > LONG before we max out the CPU on a test machine :)
> >
> > All scenarios generate the state they need when hitting the app server as part
> > of their Given blocks.
>
> I've never used parallel_tests, but it looks as though it creates multiple databases for you and configures each separate process to use its own database. So I can't see how you'd be able to hit a single running Rails app but have different requests be served by different databases. It seems to me it would be more straightforward to buy some more RAM for your build machine!
It's really unfortunate that the parallel_tests gem documents all that junk.
That's not what I'm using, nor what I advocate to others for running Cucumber
in parallel. Forget you read about any of that Rails mess :)
What we use parallel_tests for right now is to effectively fork multiple
cucumber processes, each of which runs one .feature file. There is ZERO shared
state here; everything is independent, and the Selenium underpinnings all hit a
remote server.
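In case it helps, here's roughly what that forking approach looks like. This is a sketch, not our actual harness: `run_features_in_processes`, the `command:` parameter, and the worker count are all placeholders, and each worker is just a spawned process running one feature file with nothing shared between them.

```ruby
# Sketch: fork one process per .feature file, throttled to a fixed
# number of concurrent workers. No shared state between processes.
def run_features_in_processes(feature_files, command: 'cucumber', workers: 4)
  queue   = feature_files.dup
  pids    = {}   # pid => feature file currently running in that process
  results = {}   # feature file => did the process exit successfully?

  until queue.empty? && pids.empty?
    # Top up the worker pool while we have capacity and pending features.
    while pids.size < workers && !queue.empty?
      feature = queue.shift
      pids[Process.spawn(command, feature)] = feature
    end
    # Block until any child exits, then record its result.
    pid, status = Process.wait2
    results[pids.delete(pid)] = status.success?
  end
  results
end
```

Since the children are separate processes, each one loads its own copy of the environment, which is exactly the memory cost we're trying to avoid.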
> Have you profiled the memory use? It's possible it's Firefox rather than Rails. Maybe you need to try out one of the headless drivers like Capybara-webkit.
We use Sauce Labs, so the browser itself isn't what's causing the memory
usage; it's just a side effect of our gigantic Rails environment :-/
What I really would like to do is something like this:
    features = load_feature_files
    features.each do |feature|
      threads = []
      feature.scenarios.each do |scenario|
        threads << Thread.new do
          Cucumber::Runner.run(scenario)
        end
      end
      threads.each do |thread|
        thread.join
      end
    end
Then I would expect the World object for each scenario to be isolated and
separate from the others; does this make sense and seem feasible?
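To be concrete about the isolation I'm expecting, here's a toy sketch where each thread builds its own throwaway world object. The `World` struct below is just a stand-in, not Cucumber's actual World class, and `run_scenarios_concurrently` is a made-up name:

```ruby
# Stand-in for a per-scenario World: name plus some mutable state.
World = Struct.new(:scenario_name, :state)

def run_scenarios_concurrently(scenario_names)
  threads = scenario_names.map do |name|
    Thread.new do
      world = World.new(name, {})     # fresh object, local to this thread
      world.state[:visited] = name    # mutate only our own world
      world                           # thread's return value
    end
  end
  threads.map(&:value)                # join each thread, collect its world
end
```

Since each thread only ever touches the world it created, the scenarios can't trip over each other's state even though they run in one process, and the GIL isn't a problem for us because the Selenium calls are all I/O-bound.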
Perhaps it's time for me to go code spelunking again :)
Cheers