bad vertx performance compared to pure java

E. Ulrich Kriegel

Dec 11, 2015, 10:51:04 AM
to vert.x
Hi there,
I have a Java 8 REST service implemented with Grizzly and Jersey that queries a Postgres database. It takes about 2 ms per query after a ramp-up phase.
I rewrote the service as a verticle in the hope of gaining more speed, but it takes about 5-8 ms after ramp-up. The c3p0 connection pool is used for the database.
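For reference, a minimal sketch of roughly what such a verticle might look like in Vert.x 3 (class name, connection URL, and query are placeholders; the Vert.x 3 JDBC client wraps the c3p0 pool by default):

import io.vertx.core.AbstractVerticle;
import io.vertx.core.json.JsonObject;
import io.vertx.ext.jdbc.JDBCClient;

public class QueryVerticle extends AbstractVerticle {
  @Override
  public void start() {
    // Shared JDBC client backed by a connection pool (c3p0 by default in Vert.x 3)
    JDBCClient client = JDBCClient.createShared(vertx, new JsonObject()
        .put("url", "jdbc:postgresql://localhost:5432/mydb")   // placeholder URL
        .put("driver_class", "org.postgresql.Driver")
        .put("max_pool_size", 10));

    vertx.createHttpServer().requestHandler(req -> {
      client.getConnection(connRes -> {
        if (connRes.failed()) {
          req.response().setStatusCode(500).end();
          return;
        }
        connRes.result().query("SELECT name FROM items LIMIT 10", queryRes -> {  // placeholder query
          connRes.result().close();
          if (queryRes.succeeded()) {
            req.response()
               .putHeader("content-type", "application/json")
               .end(queryRes.result().toJson().encode());
          } else {
            req.response().setStatusCode(500).end();
          }
        });
      });
    }).listen(8080);
  }
}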

Has anyone had similar experiences where a verticle is so much slower than a plain Java service?

Thanks in advance
--Ulrich

David Bush

Dec 11, 2015, 5:50:27 PM
to vert.x
I haven't done any benchmarking of Eclipse Vert.x versus anything else. But, having used it for quite a while, I think you will see it shine under heavy load with a very high number of transactions. I'm using it for quantitative trading and machine learning. My previous implementation was straight Groovy with the GPars library. It ended up being unworkable due to the large number of events that needed to be processed. A single JVM just couldn't handle it, and coordinating the work among multiple JVMs was going to be difficult and not something I wanted to tackle.

The Vert.x solution is working great using the same infrastructure I was using with straight Groovy. Now I've switched from Groovy verticles to Java 8 and it's even better.

That's anecdotal and not scientific. But, it works for me.

I think you will find plain Java 8 is always faster for a small number of transactions, as Vert.x is sending messages over sockets whilst Java is making direct function or method calls.

A better comparison would be with another framework like Akka or maybe Apache Ignite.

I'd suggest trying your test again by pounding your REST service with maybe 10,000 requests and seeing what happens. You may have to increase the number of verticle instances deployed to get the best performance.
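In case it helps, a minimal sketch of deploying multiple instances of a verticle in Vert.x 3 (the verticle class name is a placeholder):

import io.vertx.core.DeploymentOptions;
import io.vertx.core.Vertx;

public class Main {
  public static void main(String[] args) {
    Vertx vertx = Vertx.vertx();
    // Deploy several instances so requests are spread across event-loop threads
    DeploymentOptions options = new DeploymentOptions().setInstances(4);
    vertx.deployVerticle("com.example.QueryVerticle", options);  // placeholder class name
  }
}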

Guido Medina

Dec 14, 2015, 5:45:42 PM
to vert.x
I have to agree that for light load you will not see Vert.x or Akka being faster than pure Java; an asynchronous design comes with a little overhead, but it shines under heavy load and it can scale horizontally. Also, though not directly related, try switching your data source implementation. Here is a good article that I'm quite sure will convince you:


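For what it's worth, a minimal sketch assuming the suggested swap is to a pool such as HikariCP as a replacement for c3p0 (connection details are placeholders):

import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;
import java.sql.Connection;

public class PoolSetup {
  public static HikariDataSource createDataSource() {
    // Placeholder connection settings; HikariCP is one commonly suggested alternative to c3p0
    HikariConfig config = new HikariConfig();
    config.setJdbcUrl("jdbc:postgresql://localhost:5432/mydb");
    config.setUsername("app");
    config.setPassword("secret");
    config.setMaximumPoolSize(10);
    return new HikariDataSource(config);
  }

  public static void main(String[] args) throws Exception {
    try (Connection conn = createDataSource().getConnection()) {
      System.out.println("Connected: " + !conn.isClosed());
    }
  }
}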
Hope that helps,

Guido.

E. Ulrich Kriegel

Jan 12, 2016, 6:04:59 AM
to vert.x

Thanks for all the hints. I finally finished migrating my project from the Grizzly/Jersey framework to Vert.x 3.2 and ran some stress tests using Gatling.
A REST service receives a query and
- validates the query using Drools,
- forwards the query to another REST service (sketched below), which performs a lookup in a Postgres database with 100k entries,
- presents the results.
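A rough sketch of the forwarding step with the Vert.x 3 HttpClient, assuming the downstream lookup service listens locally on port 8081 (host, port, and path are placeholders):

import io.vertx.core.Vertx;
import io.vertx.core.http.HttpClient;
import io.vertx.core.http.HttpClientOptions;

public class ForwardExample {
  public static void main(String[] args) {
    Vertx vertx = Vertx.vertx();
    // Placeholder host/port for the downstream lookup service
    HttpClient client = vertx.createHttpClient(
        new HttpClientOptions().setDefaultHost("localhost").setDefaultPort(8081));

    client.getNow("/lookup?q=example", response ->
        response.bodyHandler(body ->
            System.out.println("Lookup returned: " + body.toString())));
  }
}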

The test was run on a MacBook Pro with 16 GB RAM. Each service was started with -Xmx2g -Xms2g.
For a run with 30 users over 100 seconds I got the following results:

Grizzly
> request count                                       3000 (OK=3000   KO=0     )
> min response time                                     11 (OK=11     KO=-     )
> max response time                                    457 (OK=457    KO=-     )
> mean response time                                    27 (OK=27     KO=-     )
> std deviation                                         45 (OK=45     KO=-     )
> response time 50th percentile                         15 (OK=15     KO=-     )
> response time 75th percentile                         19 (OK=19     KO=-     )
> mean requests/sec                                 29.998 (OK=29.998 KO=-     )
---- Response Time Distribution ------------------------------------------------
> t < 800 ms                                          3000 (100%)
> 800 ms < t < 1200 ms                                   0 (  0%)
> t > 1200 ms                                            0 (  0%)
> failed                                                 0 (  0%)


Vert.x

> request count                                       3000 (OK=3000   KO=0     )
> min response time                                      6 (OK=6      KO=-     )
> max response time                                    373 (OK=373    KO=-     )
> mean response time                                    10 (OK=10     KO=-     )
> std deviation                                         24 (OK=24     KO=-     )
> response time 50th percentile                          7 (OK=7      KO=-     )
> response time 75th percentile                          8 (OK=8      KO=-     )
> mean requests/sec                                 30.001 (OK=30.001 KO=-     )
---- Response Time Distribution ------------------------------------------------
> t < 800 ms                                          3000 (100%)
> 800 ms < t < 1200 ms                                   0 (  0%)
> t > 1200 ms                                            0 (  0%)
> failed                                                 0 (  0%)


Vert.x is faster and scales better. Whereas my Grizzly implementation runs into problems with more than 30 users/sec, Vert.x can handle 100 users!
Great software, thanks to all developers.

--Ulrich
