Unfortunately I have no near-term plans for POST support, because it would complicate the code and it is really hard to satisfy all possible use cases with POST caching. The main problem is determining what to use as the cache key. With GET this is really simple: use the Request URI, i.e. the string passed in the first line of the HTTP GET request:
GET /RequestURI HTTP/1.x
Since go-cdn-booster is currently a single-host proxy, there is no need to add the Host header value to the cache key.
Using the RequestURI as a cache key for POST requests won't satisfy the majority of use cases, since different POST requests usually share the same RequestURI and differ only in the request body. Using (RequestURI + POST body) as the cache key looks better, but it fails when certain HTTP headers must also be included in the key. Moreover, the majority of HTTP proxy cache users neither need nor expect POST caching.
But you can easily hack go-cdn-booster for your particular use case thanks to Go's simplicity and expressiveness. I believe the result will be better in terms of clarity, performance and maintainability than a solution built on the complex configuration files required by Varnish and/or Nginx.
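For illustration, here is a minimal sketch of what such a hack could look like: a cache-key function that keys GET requests on RequestURI (the current behaviour) and POST requests on RequestURI plus the body plus one application-specific header. The function name, the X-App-Variant header and the way it would be wired into go-cdn-booster are all hypothetical, not part of the actual code:

package main

import (
	"bytes"
	"fmt"
	"io/ioutil"
	"net/http"
)

// cacheKey builds the cache lookup key. GET requests are keyed by
// RequestURI alone; POST requests additionally mix in the request body
// and one application-specific header. "X-App-Variant" is only an
// example header name - pick whatever actually distinguishes your
// POST responses.
func cacheKey(r *http.Request) (string, error) {
	if r.Method != "POST" {
		return r.RequestURI, nil
	}
	body, err := ioutil.ReadAll(r.Body)
	if err != nil {
		return "", err
	}
	// Restore the body so the request can still be proxied upstream.
	r.Body = ioutil.NopCloser(bytes.NewReader(body))
	return r.RequestURI + "\x00" + r.Header.Get("X-App-Variant") + "\x00" + string(body), nil
}

func main() {
	req, _ := http.NewRequest("POST", "http://example.com/api", bytes.NewBufferString(`{"q":"test"}`))
	req.RequestURI = "/api" // set by the server for incoming requests
	req.Header.Set("X-App-Variant", "v2")
	key, _ := cacheKey(req)
	fmt.Printf("%q\n", key)
}

If keys get too long, hashing the body (e.g. with crypto/sha1) before concatenation keeps them bounded.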
Keepalive on, 10 workers:
ab: 27469 qps
go-cdn-booster-bench: 34912 qps
Keepalive on, 100 workers:
ab: 24722 qps
go-cdn-booster-bench: 38118 qps
Keepalive on, 1000 workers:
ab: 22525 qps
go-cdn-booster-bench: 35945 qps
Keepalive on, 10000 workers:
ab: 18828 qps
go-cdn-booster-bench: 25055 qps
Keepalive off (ab numbers only, since go-cdn-booster-bench doesn't support disabling keepalive):
10 workers: 9461 qps
100 workers: 8507 qps
1000 workers: 7404 qps
10000 workers: 5235 qps
These numbers lead to the following conclusions:
- go-cdn-booster easily handles 10K concurrent connections without significant performance degradation (thanks to Go's excellent http.Server implementation).
- Issuing multiple requests over a single keepalive connection is 3-5x faster than the 'new connection per request' strategy. This is good news, because all modern browsers actively use keepalive connections.
- go-cdn-booster-bench performs better than the ab tool, probably because it gathers far fewer stats and has dead-simple, short code - currently 127 lines :) (a rough sketch of such a benchmark loop follows below).
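For readers who want to reproduce the keepalive numbers, here is a rough sketch of such a benchmark loop in Go. It is not the actual go-cdn-booster-bench source; the flags and the default URL (localhost:8080) are assumptions, and the custom Transport is there only so that each worker can actually reuse a keepalive connection:

package main

import (
	"flag"
	"fmt"
	"io"
	"io/ioutil"
	"net/http"
	"sync"
	"sync/atomic"
	"time"
)

var (
	workers  = flag.Int("workers", 100, "number of concurrent workers")
	requests = flag.Int("requests", 100000, "total number of requests")
	url      = flag.String("url", "http://localhost:8080/some/file", "URL to fetch")
)

func main() {
	flag.Parse()

	// Raise the idle-connection limit so every worker can keep its own
	// keepalive connection to the proxy instead of reopening sockets.
	tr := &http.Transport{MaxIdleConnsPerHost: *workers}
	client := &http.Client{Transport: tr}

	var done int64
	var wg sync.WaitGroup
	perWorker := *requests / *workers
	start := time.Now()
	for i := 0; i < *workers; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for j := 0; j < perWorker; j++ {
				resp, err := client.Get(*url)
				if err != nil {
					continue
				}
				// Drain and close the body so the connection is reusable.
				io.Copy(ioutil.Discard, resp.Body)
				resp.Body.Close()
				atomic.AddInt64(&done, 1)
			}
		}()
	}
	wg.Wait()
	elapsed := time.Since(start).Seconds()
	total := atomic.LoadInt64(&done)
	fmt.Printf("%d requests in %.2fs: %.0f qps\n", total, elapsed, float64(total)/elapsed)
}

Since net/http keeps connections alive by default, a 'keepalive off' comparison would additionally require setting DisableKeepAlives on the Transport.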
Let's kick Nginx's ass! Test your Nginx, Varnish or any other caching HTTP proxy with the go-cdn-booster-bench tool and post a performance comparison with go-cdn-booster here :)
P.S. Don't forget to build go-cdn-booster and go-cdn-booster-bench with Go 1.1 - this gives a 30-40% performance boost compared to Go 1.
+1 for POST support, using some header value as the key (we currently cannot tell Varnish to do this).
--
Best Regards,
Aliaksandr