Hi there,

Very interesting project. I stumbled on it from a StackOverflow answer yesterday.
Question about the benchmark at the bottom of http://deftserver.appspot.com/: Deft appears to be ~2x faster running one instance than running 4 instances behind nginx. Am I reading that correctly? Is that due to the increased latency introduced by the nginx reverse proxy?
Was the deft server CPU bound in either benchmark?
Just curious, as I would think exploiting all 4 cores would increase performance. Or perhaps I'm misunderstanding the benchmark.

Thank you!

-- James
I will probably replace the second (rightmost) graph with a benchmark that illustrates how Deft, Tornado, and Node.js handle requests while ~100k idle connections are hanging on the servers.

The reason for the non-intuitive benchmark result is that nginx (at least the version we used for the latest benchmark) does not support HTTP 1.1 keep-alive connections to its backends, i.e. there is no connection keep-alive between nginx and the four Deft/Tornado/Node.js instances, and the TCP handshake procedure is pretty expensive relative to the rest of the work in the hello world example.

(N.b. nginx does support connection keep-alive between its clients and itself.)
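For reference, newer nginx releases (1.1.4 and later) did add upstream keep-alive support. A minimal config sketch of what that looks like; the backend address and upstream name here are made up for illustration, not taken from the benchmark setup:

```nginx
# Sketch only: upstream keep-alive requires nginx >= 1.1.4.
upstream deft_backends {
    server 127.0.0.1:8080;   # hypothetical Deft instance; add one line per instance
    keepalive 32;            # max idle keep-alive connections kept per worker
}

server {
    listen 80;
    location / {
        proxy_pass http://deft_backends;
        proxy_http_version 1.1;          # keep-alive needs HTTP/1.1 upstream
        proxy_set_header Connection "";  # clear the default "Connection: close"
    }
}
```

With this in place the proxy reuses TCP connections to the backends, so the handshake cost that dominated the hello-world benchmark would mostly disappear.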
Thanks Roger for your comment.

I guess I did not state my question clearly enough. Based on the benchmark you published, 4 instances of Deft fronted by nginx actually performed worse than a single instance in terms of total throughput. So what's the point of this deployment model, other than fault tolerance?
I also noticed that load-balanced Node.js and Tornado improved throughput, even if not by much, leaving Deft as the only one that suffered from load balancing. Is that observation correct? So the real question is why the "competitors" can take advantage of load balancing to scale up but Deft cannot.
Adding a thread pool in the handler might work, but wouldn't that make the architecture look similar to Netty, for example?
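To make the thread-pool idea concrete: the sketch below is not Deft's actual API (the handler and pool are hypothetical); it only shows the general pattern of pushing blocking work off a single-threaded event loop onto a worker pool:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class WorkerPoolSketch {
    // Shared pool, sized to the machine's cores; the event loop stays single-threaded.
    private static final ExecutorService workers = Executors.newFixedThreadPool(4);

    // Stand-in for a blocking handler body (not a real Deft handler).
    static String handleBlocking(String request) {
        return "echo: " + request;
    }

    public static void main(String[] args) throws Exception {
        // The event loop would submit the blocking work and continue serving I/O;
        // here we just block on the Future to show the round trip.
        Future<String> result = workers.submit(() -> handleBlocking("hello"));
        System.out.println(result.get());
        workers.shutdown();
    }
}
```

This is roughly the worker-pool model Netty-style servers use, which is why the architectures start to converge once handlers stop being purely non-blocking.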
Keep up the nice work!
Bing
Hi,

Sorry I missed your answer to a previous message.
Yes, it would be interesting to see the result without the client-side keep-alive. I was just curious why the other two responded to nginx positively while Deft responded negatively.