Play framework poor performance (tested based on community feedback)


designerd

Apr 5, 2013, 11:42:42 AM
to play-fr...@googlegroups.com
Our friends at TechEmpower have released version 2 of the benchmarks they released earlier, this time after incorporating the community's pull requests.

This is sad :( I was expecting Play Scala to rocket to the top, but this is really disappointing :(



Link:


Derek Williams

Apr 5, 2013, 11:54:41 AM
to play-fr...@googlegroups.com
I certainly wish Play! was the fastest, but as mentioned previously the benchmarks are quite simplistic.  I'm not asserting that they are meaningless, but they may not reflect how real world (more complex) applications would fare in comparison.  I'm feeling really sorry for the Sinatra-JRuby fans.




--
You received this message because you are subscribed to the Google Groups "play-framework" group.
To unsubscribe from this group and stop receiving emails from it, send an email to play-framewor...@googlegroups.com.
For more options, visit https://groups.google.com/groups/opt_out.
 
 



--
Derek Williams
Cell: 970.214.8928

Luis Ángel Vicente Sánchez

Apr 5, 2013, 12:01:22 PM
to play-fr...@googlegroups.com
Not only are they simplistic... if you pay attention to the graphs, there is a point where the throughput stays constant. That point changes from framework to framework, and I bet it is related to the thread pool / database pool. Some frameworks do much more work than others before hitting the method/function that creates the result.

In a real situation, if you are using the top 5 ones, after adding a few more things like authentication/authorization, request body parsing and validation, ... you will see the same kind of performance. And then you can also increase the thread pool / database pool to fine-tune your application.

What is a bit disturbing is that in this iteration of the benchmark... play-java has lower marks...



Derek Williams

Apr 5, 2013, 12:09:20 PM
to play-fr...@googlegroups.com
I had a quick look at both the java and scala implementations in the benchmark repo, and they both appear to be using 'parallelism-factor = 1.0' for the default dispatcher, which I believe is the one that Play uses for executing Futures by default (I could be wrong here). That will limit the threadpool size to the number of cores of the system.


Note that 'parallelism-max = 50', but that just puts a cap on the parallelism. With the factor being 1.0, the cap would only be reached if you were using over 50 cores.

I'll try to take a look at this later to see if I can bump up the speed, at least for the db tests.
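For context, the setting Derek mentions lives in application.conf. A sketch of the relevant block, assuming Play 2.1's Akka-based dispatcher configuration (the exact values in the benchmark repo may differ):

```
play {
  akka {
    actor {
      default-dispatcher = {
        fork-join-executor {
          # pool size = ceil(available cores * factor), clamped by min/max
          parallelism-factor = 1.0  # as in the benchmark: one thread per core
          parallelism-max = 50      # cap; only reached above 50 cores at factor 1.0
        }
      }
    }
  }
}
```

Raising parallelism-factor (say, to 4.0) would be one way to give the default dispatcher more threads on a typical benchmark box, at the cost of more context switching for CPU-bound work.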

Luis Ángel Vicente Sánchez

Apr 5, 2013, 12:20:37 PM
to play-fr...@googlegroups.com
The database pool is 2x5 = 10. Compojure is using a pool of 256, and the J2EE frameworks are using the Resin default size (probably 64)... that's something to take into account. You would also have to tune the thread pool used by the DB futures to match the database pool size.
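For reference, the 2x5 figure corresponds to Play 2.1's BoneCP defaults in application.conf: the total pool size is partitionCount times maxConnectionsPerPartition. A sketch (the larger values below are purely illustrative, not a recommendation):

```
# total pool size = partitionCount * maxConnectionsPerPartition
db.default.partitionCount=2              # default: 2 * 5 = 10 connections
db.default.maxConnectionsPerPartition=5

# illustrative values closer to the other frameworks' pool sizes:
# db.default.partitionCount=4
# db.default.maxConnectionsPerPartition=16
```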



Skamander

Apr 5, 2013, 12:59:05 PM
to play-fr...@googlegroups.com
The new tests don't include the newest play-java and play-scala versions. For example, they used this .scala version: https://github.com/Skamander/FrameworkBenchmarks/blob/5b3a886a81115685993db2ce0aded45b35059d8e/play-scala/app/controllers/Application.scala Because of that they couldn't run the db test for the play-scala version, since it is not working in that revision.

Nilanjan Raychaudhuri

Apr 5, 2013, 1:10:24 PM
to play-fr...@googlegroups.com
I will take a look at the database issue and why it's not working. But for JSON there is a fix in the Play master now which will improve the latency.

Nilanjan, Developer & Consultant
Typesafe Inc.
Twitter: @nraychaudhuri

Nilanjan Raychaudhuri

Apr 5, 2013, 1:15:26 PM
to play-fr...@googlegroups.com
Derek,

I wonder whether we should use Future.blocking here:


So that the fork-join pool can resize based on demand?
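For anyone unfamiliar with it, scala.concurrent.blocking wraps a blocking call so the fork-join pool can compensate by spawning extra worker threads instead of starving. A minimal self-contained sketch (findRow and the sleep are made-up stand-ins, not the benchmark code):

```scala
import scala.concurrent.{Await, Future, blocking}
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global

// `blocking` marks the enclosed section as a blocking call, letting the
// fork-join pool spawn a replacement worker instead of starving the pool.
def findRow(id: Int): Future[String] = Future {
  blocking {
    Thread.sleep(10) // stand-in for a synchronous database query
    s"row-$id"
  }
}

val result = Await.result(findRow(42), 5.seconds)
println(result) // prints "row-42"
```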

Nilanjan, Developer & Consultant
Typesafe Inc.
Twitter: @nraychaudhuri

Skamander

Apr 5, 2013, 1:16:14 PM
to play-fr...@googlegroups.com
The db test works with the new version I sent, but that version didn't make it in time for the tests.

https://github.com/TechEmpower/FrameworkBenchmarks/tree/master/play-scala

Nilanjan Raychaudhuri

Apr 5, 2013, 1:36:49 PM
to play-fr...@googlegroups.com
Thanks for fixing it.

Nilanjan, Developer & Consultant
Typesafe Inc.
Twitter: @nraychaudhuri

Skamander

Apr 5, 2013, 4:19:10 PM
to play-fr...@googlegroups.com
You're welcome. :)

btw: does someone have the benchmark set up on their machine (plus the time, a feel for it, etc.) and could compare a new version I committed a moment ago to my fork - https://github.com/Skamander/FrameworkBenchmarks/tree/master/play-scala - with the master branch of the benchmark - https://github.com/TechEmpower/FrameworkBenchmarks/tree/master/play-scala

I could only test it with 3 friends of mine (+ System.nanoTime() as a stopwatch), who would rapidly hammer F5, but it seems 2-3 times faster.

Brian Hauer

Apr 5, 2013, 8:30:40 PM
to play-fr...@googlegroups.com
Hi everyone.  Thanks again for helping tune the Play tests!  I have quickly grown to love the Play community.

I noticed before I left the office that we had received a note about the database connection pool.  I know one of you (or probably more than that!) had pointed that out before and we failed to fix it in this second-round run.  I want to make sure that number is properly configured for our third-round tests.  I believe a pull request is in the works.

Also, and I may have mentioned this before: we want to add a fourth test to the benchmark suite soon that will exercise more computational functionality such as filtering, sorting, and server-side rendering via templates.  If you have any thoughts or recommendations for that test ("be sure to include X"), please let me know!

-Brian

Christopher Hunt

Apr 5, 2013, 9:32:58 PM
to play-fr...@googlegroups.com
On Saturday, 6 April 2013 11:30:40 UTC+11, Brian Hauer wrote:

> I noticed before I left the office that we had received a note about the database connection pool.  I know one of you (or probably more than that!) had pointed that out before and we failed to fix it in this second-round run.  I want to make sure that number is properly configured for our third-round tests.  I believe a pull request is in the works.
 

Kind regards,
Christopher

Martin Grotzke

Apr 6, 2013, 2:58:33 AM
to play-fr...@googlegroups.com

On 05.04.2013 at 18:01, "Luis Ángel Vicente Sánchez" <langel...@gmail.com> wrote:
>
> What is a bit disturbing is that in this iteration of the benchmark... play-java has lower marks...

I'm also wondering what's the reason for this, has anybody an idea / possible explanation?

Cheers,
Martin


Christopher Hunt

Apr 6, 2013, 3:50:44 AM
to play-fr...@googlegroups.com
I believe that the results were from last Tuesday. We've submitted 3 PRs since then, the latest one focused on connection pools (the connection pool configuration before was very low in comparison to the other frameworks').

play-java was using async code in its json test while the play-scala version was not (for short-lived operations, async adds overhead in this regard). For the db tests, play-java was performing n db finds synchronously while play-scala was fully asynchronous. play-java has now been brought into line with play-scala.
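For readers following along, the synchronous-vs-asynchronous difference being described is roughly the one below: firing all n finds as futures and combining them, rather than awaiting each result before issuing the next (findOne here is a hypothetical stand-in for a real database lookup, not the benchmark code):

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global

// Hypothetical stand-in for a single-row database lookup.
def findOne(id: Int): Future[Int] = Future { id * 2 }

// Asynchronous style: start all n lookups, then combine the futures,
// instead of blocking on each result before issuing the next query.
def findAll(ids: Seq[Int]): Future[Seq[Int]] =
  Future.sequence(ids.map(findOne))

val results = Await.result(findAll(1 to 3), 5.seconds)
println(results.mkString(",")) // prints "2,4,6"
```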

Suffice to say, the results as they stand at the moment are not a good reflection of Play's performance. I'm hoping that next week will look quite different.

Kind regards,
Christopher



-- 
Christopher Hunt
Bit twiddler

See you at Scala Days 2013 in NYC!
June 10th - June 12th
http://www.scaladays.org

Twitter: @huntchr

Pascal Voitot Dev

Apr 6, 2013, 3:53:59 AM
to play-fr...@googlegroups.com
Great job Christopher ;)


Andy

Apr 6, 2013, 7:51:50 AM
to play-framework

> play-java was using async code in its json test while the play-scala version was not (for short lived operations async will be an overhead in this regard).

Vert.x is async but it's still many times faster than Play in the JSON test. The async overhead can't be that big.


> For the db tests play-java was performing n db finds synchronously while the play-scala was fully asynchronous. play-java has now been brought into line with play-scala.

Where are the play-scala results in the db tests? I don't see any.

Tom Carchrae

Apr 6, 2013, 8:15:26 AM
to play-framework
> Suffice to say, the results as they stand at the moment are not a good reflection of Play's performance. I'm hoping that next week will look quite different.

I disagree.  This is a reflection of Play's performance using default settings.  But sure, perhaps Play has potential for great performance.  And you and others are to be commended for working to improve this.  

But 'out-of-the-box' performance is important - it is what most developers will start with (and sometimes stop with).  Now, you can protest and say, "no situation is the same, there are no best defaults", to which I would respond "what scenario, if any, are the current defaults optimized for?".  

If Play has become so complex as to require core developers to tune it to get performance similar to other platforms' out-of-the-box configurations, then it has failed in its goal of "focusing back on simplicity".

Just some nuts from the peanut gallery.  Keep up the good work.

Tom



Skamander

Apr 6, 2013, 8:20:12 AM
to play-fr...@googlegroups.com

Christopher Hunt

Apr 6, 2013, 9:05:51 AM
to play-fr...@googlegroups.com
Hi Tom

Please be assured that I have kept to idiomatic Play. There should be nothing in my contributions that requires specialised knowledge. Everything that is there is documented. Please let me know if you think otherwise.

Connection pool and thread pool config should always be tuned with the application in mind. The other platforms have similarly configured these in the benchmark project for the purposes of handling the load. That said, I believe that the defaults we have in Play are a reasonable starting point.

Kind regards
Christopher

Sent from my iPhone
--
You received this message because you are subscribed to a topic in the Google Groups "play-framework" group.
To unsubscribe from this topic, visit https://groups.google.com/d/topic/play-framework/SX_pg7fVQ0A/unsubscribe?hl=en-US.
To unsubscribe from this group and all its topics, send an email to play-framewor...@googlegroups.com.

Luis Ángel Vicente Sánchez

Apr 6, 2013, 9:31:29 AM
to play-fr...@googlegroups.com
I disagree. Using default settings would be acceptable if the other frameworks in the benchmark were also using default settings. Tuning Play's settings to match the other frameworks' settings is mandatory if you want to compare apples to apples; and I really think that is not against the goal of "focusing back on simplicity".



Tom Carchrae

Apr 6, 2013, 9:50:13 AM
to play-framework
Please don't take my comment the wrong way: my intent was constructive.  I'm just asking what kind of knowledge is required to get better performance, and whether it is easy/evident to a new user of Play how to do that.  Is it really true that Play's defaults are worse than other frameworks'?

The effort required to get the tuned performance would be a very interesting part for TechEmpower to report on - not just for Play, but for each framework.  How much time/effort/skill was required.  

And Luis, while I agree somewhat about the level playing field, I don't think expert tuning of each framework is a realistic or honest comparison.  I think there are two realistic tests that are important to most of us: out-of-the-box, and "basic tuning".  In reality you are not comparing apples to apples - you are comparing different kinds of fruit in how they handle a basic function under load.

Tom

Luis Ángel Vicente Sánchez

Apr 6, 2013, 9:58:48 AM
to play-fr...@googlegroups.com
> The effort required to get the tuned performance would be a very interesting part for TechEmpower to report on - not just for Play, but for each framework.  How much time/effort/skill was required.


That would actually be something I would like to see as part of the next round of the benchmark... but I believe there is no easy way to make a fair comparison.

Brian Hauer

Apr 6, 2013, 1:34:06 PM
to play-fr...@googlegroups.com
We have considered providing that sort of information, and in fact, we have a lot of it collected internally.  However, we have resisted including that because of precisely the reasons you probably already have in mind:
  1. It's not a fair measurement of the framework but rather us specifically working with the framework.  Grails was really easy for us to set up, for example, because the idiomatic code ends up looking almost identical to the code we use with our in-house framework.  But does that mean Grails is similarly easy for everyone?  Maybe; maybe not.  I can't say either way with any authority.  All I could provide is anecdotes.
  2. Because of that--the data is only anecdotal--we worry that including it would taint the rest of the exercise as not being as impartial as we want it to appear.  Perhaps with enough time to let the dust settle, we could add some anecdotal evidence once the community as a whole has grown comfortable with the approach we've taken to collecting the numeric data.
For Play specifically, I will say that it took us very little time to put together our initial play-java test.  But on the flip-side, the initial feedback was that we made several mistakes that made the code and deployment configuration sub-optimal.  (Thank goodness for pull requests!)

Even if we do not end up publishing our internal "level of effort" data, one opportunity for learning that we want to share with all framework developers is this: across nearly all frameworks, we felt the documentation was often sparse on production-grade deployments.  The best documentation tends to be very detailed about how to set up development environments and get rolling with code.  But concerning planning for and executing a production deployment--again across many frameworks--the documentation was often quite shallow.

In communities that are highly fragmented (e.g., the PHP community), this comes in part from the fragmentation itself: there are a variety of opinions about what settings are "best" for a production environment.

Christian Papauschek

Apr 10, 2013, 1:44:15 PM
to play-fr...@googlegroups.com
Since we're using Play 2.1.1 (scala) in production, it was quite interesting to read about the results.

We also ran a few tests of our own using our production pages (instead of synthetic tests).

If you're interested, read about the results here:

In short, our production code is not much slower than the synthetic results from the TechEmpower article, which is good news ;)

Brian Hauer

Apr 10, 2013, 8:50:50 PM
to play-fr...@googlegroups.com
Christian: I just read your blog entry and it's great.  Thanks for putting that together.  It's very useful for people to have some real-world data--data collected from an actual operational web site--to consider as part of any decision-making process.

Christian Papauschek

Apr 11, 2013, 4:58:07 AM
to play-fr...@googlegroups.com
Brian, thanks for your feedback! We also plan on sharing more details about our experience with Play in the future :)