Hello!
On Wed, Feb 10, 2016 at 3:15 AM, Sunny GUPTA wrote:
> I got to know that os.execute is not Lua-friendly and is a thread-blocking call.
>
Yeah, it's horrible.
>
> Second Tries :
>
> ###nginx_ss.lua
> local packets = '{"some" : "data"}'
> res = ngx.location.capture('/foo/bar',
>     { method = ngx.HTTP_POST, body = packets })
>
> ##nginx.conf
> location /foo/bar {
>     proxy_pass http://remote_server_endpoint;
> }
>
Using the lua-resty-http-simple library can often be more efficient
than ngx.location.capture + ngx_proxy:
https://github.com/bakins/lua-resty-http-simple
Also, ensure you have enabled backend connection pooling in both your
proxy_pass and lua-resty-http-simple.
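For example, a minimal sketch (untested; the upstream address, the path,
and the request() signature are taken from the library's README as I
recall it, so please double-check them against the docs):

###nginx.conf

upstream backend {
    server 127.0.0.1:8080;
    keepalive 32;  # pool up to 32 idle connections per worker
}

location /foo/bar {
    proxy_pass http://backend;
    proxy_http_version 1.1;
    proxy_set_header Connection "";  # allow keepalive to the backend
}

###nginx_ss.lua

local http = require "resty.http.simple"

-- "remote_server_endpoint" and the path below are just placeholders
local res, err = http.request("remote_server_endpoint", 80, {
    method = "POST",
    path = "/foo/bar",
    body = packets,
})
if not res then
    ngx.log(ngx.ERR, "request failed: ", err)
end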
> But the above method is also not giving good performance: after benchmarking,
> the new results are only 300 req/sec with 1 core and 2 GB RAM, while without
> nginx_ss.lua it's 1900 req/sec.
>
When talking about performance, it's always recommended to generate
flame graphs to get insight into any differences:
https://openresty.org/#Profiling
There are so many ways to get things wrong in a (blind) benchmark.
> So considering my requirement, is there any fire-and-forget approach within
> the Lua boundary that can help in achieving the above threshold? Can
> https://github.com/liseen/lua-resty-http help
> me through this? Is there any other hack to achieve the performance?
>
I think you need to use the
ngx.timer.at() API to do async processing.
Please ensure that you have read the documentation for
ngx.timer.at() carefully, since there are some caveats about async
processing models (in general).
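Roughly something like the sketch below (the handler name, the upstream
address, and the path are placeholders, and I haven't run this):

###fire-and-forget sketch

local function flush(premature, packets)
    -- premature is true when the worker is shutting down
    if premature then
        return
    end
    local http = require "resty.http.simple"
    local res, err = http.request("remote_server_endpoint", 80, {
        method = "POST",
        path = "/foo/bar",
        body = packets,
    })
    if not res then
        ngx.log(ngx.ERR, "failed to post data: ", err)
    end
end

-- schedule the callback with zero delay; the current request
-- finishes immediately while flush() runs in the background
local ok, err = ngx.timer.at(0, flush, packets)
if not ok then
    ngx.log(ngx.ERR, "failed to create timer: ", err)
end

Also keep an eye on the lua_max_pending_timers and
lua_max_running_timers directives, since every pending timer takes up
resources in the worker process.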
Best regards,
-agentzh