Hey guys,
Over the past year or so we have released a couple of servers to production using Vert.x 2.1, and we are extremely happy with it. The TPS and response times we are getting are pretty impressive, and we are not seeing any stability issues.
I wanted to share our experience.
My team was very experienced with NodeJS, but as a company requirement we had to use pure Java. After doing some research we decided to go with Vert.x because of its similarity to NodeJS as a stack and its use of Java as the main language. It all worked out nicely: we are easily beating the speeds we were seeing on our previous NodeJS stacks.
Most of our services are heavy on network I/O (they make many external calls to databases, other external REST APIs, and Redis), do little CPU processing beyond organizing data and JSON handling, and don't have much in the way of memory requirements.
Speed was one of the biggest requirements because of our large customer base.
Here are some of the approaches we took (I'm noticing some of these are drastically different from the approaches people take on the Vert.x forums):
* We use Yoke as our main router and middleware handler. It is very similar to ExpressJS, and we got used to it pretty quickly. Can't recommend it enough - awesome library.
* In our services we always use a single Verticle and rarely use the EventBus (except in some tests). Coming from a NodeJS mentality, we didn't need an actor-based solution. (We do use the "--instances" option to scale the server to the available core count.)
* We ported NodeJS's Async library to Java. This helped us a lot; we were so used to it from our NodeJS days that we couldn't live without it, and we use it pretty much everywhere.
* We use Vert.x's HTTP client for all internal calls.
* We wrote our own Redis client in Java using Vert.x's TCP client (instead of mod-redis).
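For anyone curious about the Async port mentioned above: here is a minimal, hypothetical sketch of what a node-style "waterfall" helper can look like in pure Java. This is not our actual library, just the general shape of the idea (all names are made up):

```java
import java.util.Iterator;
import java.util.List;

// Hypothetical sketch (not our real port): a node-async-style "waterfall"
// where each task receives the previous result plus an (error, result) callback.
public final class AsyncSketch {

    // Node-style completion callback: err is null on success.
    public interface Callback<T> {
        void handle(Throwable err, T result);
    }

    // One async step: consume the previous value, signal completion via cb.
    public interface Task<T> {
        void run(T input, Callback<T> cb);
    }

    // Run tasks in order, feeding each result into the next task;
    // the first error short-circuits straight to the final callback.
    public static <T> void waterfall(List<Task<T>> tasks, T seed, Callback<T> done) {
        step(tasks.iterator(), seed, done);
    }

    private static <T> void step(Iterator<Task<T>> it, T value, Callback<T> done) {
        if (!it.hasNext()) {
            done.handle(null, value);
            return;
        }
        it.next().run(value, (err, result) -> {
            if (err != null) {
                done.handle(err, null);   // stop the chain on the first error
            } else {
                step(it, result, done);   // otherwise continue with the new value
            }
        });
    }

    private AsyncSketch() { }
}
```

In practice each task would wrap an HTTP, DB, or Redis call and fire its callback from the response handler, so a chain of I/O calls reads top-to-bottom instead of nesting.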
That last decision was a bit drastic. At the beginning we did use mod-redis, but we noticed that having a separate Verticle and communicating with it through the EventBus was slowing things down. Normally the EventBus overhead is not much, but Redis is really fast, and at those speeds the EventBus overhead becomes quite noticeable.
We also needed to add Redis Sentinel support, which requires a lot of changes at the low level of the Redis client, so we decided to write our own version.
We are getting about 80% better speeds compared to mod-redis. (As far as we can tell, the main slowness in mod-redis is the JSON serialization/deserialization during EventBus communication - so nothing wrong with Mr. Lopes's awesome library internally.)
Since most of our overhead was in Redis, this approach gave us tremendous speed gains.
I know that with our approach we lose the polyglot nature of Vert.x, but since we are Java-only it doesn't affect us. I do feel, though, that we are diverging a bit from many Vert.x developers' approach.
So my thoughts, and questions:
* For pure Java applications, I think libraries (e.g. mod-redis) that run as verticles actually hurt speed due to the Verticle and EventBus overhead. I wish library builders had the option to expose a pure Java API (or one in other languages) alongside an API over the EventBus (for polyglot language support). The reflection features of the language could be used to discover the API and expose it through the EventBus automatically.
* The Vert.x module registry is awesome, but it would be even better if there were a way to share language-specific libraries with the community. For example, we would love to share our Java port of Async: it is Java-only and just a utility library, not a module, and I'm not sure what the best way to share it is currently.
I guess what I'm looking for is something more like npm, where people share utility libraries and everything else. IMO the lack of something like that hurts the Vert.x community a bit.
Well, the polyglot nature of Vert.x makes this part a bit hard.
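To make the reflection idea from my first point a bit more concrete, here is a minimal, hypothetical sketch (all names invented for illustration): a wrapper discovers the public methods of a plain Java API, so Java callers can use the API directly while an EventBus handler could route message "actions" through the same dispatch:

```java
import java.lang.reflect.Method;

// Hypothetical sketch of the reflection idea: discover a library's plain
// Java API at runtime so the same methods could also be exposed over the
// EventBus for other languages. Names here are made up for illustration.
public final class ReflectiveBridge {

    private final Object target;

    public ReflectiveBridge(Object target) {
        this.target = target;
    }

    // Route an "action" (as an EventBus message body might carry it) to the
    // matching public single-String-argument method on the wrapped API.
    public Object dispatch(String action, String arg) {
        try {
            Method m = target.getClass().getMethod(action, String.class);
            return m.invoke(target, arg);
        } catch (ReflectiveOperationException e) {
            throw new IllegalArgumentException("unknown action: " + action, e);
        }
    }

    // Example plain-Java service that the bridge could expose.
    public static class EchoService {
        public String upper(String s) { return s.toUpperCase(); }
    }
}
```

Java-only users would call `new EchoService().upper(...)` directly with zero overhead, while polyglot users would go through a registered EventBus handler that delegates to `dispatch` - both paths backed by the same code.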
Anyways, thanks a lot,
- Tolga