[ANN] Yet Another HTTP Client for OpenResty

tokers

Apr 25, 2018, 8:38:13 AM
to openresty-en
Hello!

I have created an HTTP client library named lua-resty-requests, modeled after the Python Requests library.
I hope lua-resty-requests is human-friendly, although the library is still young.
Please see the README for the details and have fun with it! PRs and issues are welcome.

Regards

Alex Zhang
 

Sreekanth Madhavan

Jun 6, 2018, 10:46:02 PM
to openresty-en
Hello!

Thanks for writing the lua-resty-requests library. I have the following queries about the API.

- Does the lua-resty-requests API block the nginx worker?

- How does the requests.get(url) API return a response object without blocking the nginx worker?

Sample code:

local requests = require "resty.requests"

-- example url
local url = "http://example.com/index.html"

local r, err = requests.get(url)
if not r then
    ngx.log(ngx.ERR, err)
    return
end

-- read all body
local body = r:body()
ngx.print(body)

Thanks,
Sreekanth

tokers

Jun 6, 2018, 11:44:27 PM
to openresty-en
Hello!

> - Does the lua-resty-requests API block the nginx worker?

It will not block the nginx worker, since lua-resty-requests is implemented on top of the cosocket API, which is 100% non-blocking.

> - How does the requests.get(url) API return a response object without blocking the nginx worker?

I guess you are curious about the cosocket APIs: https://github.com/openresty/lua-nginx-module#ngxsockettcp

Cosockets are driven by the nginx event loop and nginx timers: whenever a read or write would block, the current request's Lua code yields back to the event loop and is resumed once the socket is ready, so the worker keeps serving other requests in the meantime.
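
To make this concrete, here is a minimal sketch of a GET request written directly against the cosocket API (an illustration only, not lua-resty-requests' actual code; the host, port and path below are placeholders):

local sock = ngx.socket.tcp()
sock:settimeout(5000)  -- milliseconds

-- connect() yields the current request's Lua code; the worker keeps
-- handling other requests until the connection is established
local ok, err = sock:connect("example.com", 80)
if not ok then
    ngx.log(ngx.ERR, "connect failed: ", err)
    return
end

local bytes, err = sock:send("GET /index.html HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
if not bytes then
    ngx.log(ngx.ERR, "send failed: ", err)
    return
end

-- each receive() also yields while waiting for data to arrive
local status_line, err = sock:receive("*l")
if not status_line then
    ngx.log(ngx.ERR, "receive failed: ", err)
    return
end

sock:close()

lua-resty-requests wraps this kind of flow behind requests.get() and the response object's methods, which is why the call can return a result without tying up the worker.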

jona...@findmeon.com

Jun 7, 2018, 6:58:39 PM
to openresty-en
are the main benefits over the existing library just:

* persistent session
* interface design

or are there any performance improvements?

tokers

Jun 7, 2018, 10:00:14 PM
to openresty-en
Hello!

I haven't done any performance benchmarks yet. But I was careful while writing it, for instance:

* avoid table resizes (see the sketch below)
* use JIT-compilable functions
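
As a hedged illustration of the first point (not code taken from lua-resty-requests): LuaJIT's table.new extension lets you allocate a table at its final size up front, so it does not have to be grown and rehashed as elements are added.

local ok, new_tab = pcall(require, "table.new")
if not ok then
    -- table.new is a LuaJIT extension; fall back to plain tables without it
    new_tab = function(narr, nrec) return {} end
end

-- pre-size a headers table for roughly 8 hash entries instead of
-- letting it rehash as keys are added one by one
local headers = new_tab(0, 8)
headers["Host"] = "example.com"
headers["Connection"] = "keep-alive"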

Maybe I really need to do some benchmarks :)

jona...@findmeon.com

Jun 8, 2018, 12:29:47 AM
to openresty-en

On Thursday, June 7, 2018 at 10:00:14 PM UTC-4, tokers wrote:

Maybe I really need to do some benchmarks :)

yes!

we mostly use the existing lua-resty-http module to pull routing information from an internal api (then cache it in nginx for a bit). for this application, speed is most important.

i'm thinking of building an auto-cert module for letsencrypt, and this package might be better suited for that, because speed isn't as much of a concern there.

tokers

Jun 8, 2018, 6:12:03 AM
to openresty-en
> yes!
>
> we mostly use the existing lua-resty-http module to pull routing information from an internal api (then cache it in nginx for a bit). for this application, speed is most important.
> i'm thinking of building an auto-cert module for letsencrypt, and this package might be better suited for that, because speed isn't as much of a concern there.

OK. I will do some benchmarks and paste the results in the README.md (and post them here as well). BTW, PRs are welcome if you think some code paths are not efficient enough :).

Gene Unigovski

Jun 13, 2018, 10:17:34 AM
to openresty-en
I did some performance testing comparing lua-resty-requests with lua-resty-http: for requests/responses with small bodies (under 1K), lua-resty-requests can be up to 10x faster. I also have a hacked version of lua-resty-http in which I removed the coroutine usage from the request_uri() function; it is also more than 10x faster than the original (for some reason I don't completely understand, running body_reader as a coroutine-wrapped function causes a huge performance degradation).

Hamish Forbes

Jun 18, 2018, 6:55:37 AM
to openresty-en
It's probably because coroutines in ngx_lua are a bit weird as they have had to be re-implemented to work in the Nginx environment.

Switching lua-resty-http to a purely functional, non-coroutine approach is (or was) something we were going to do.
The coroutine approach is nice and elegant from a code PoV, but it isn't JIT-able and appears to have other performance issues.
It was kind of assumed that not being able to JIT the cosocket operations would be the real-world bottleneck, but that was never benchmarked or investigated very far.
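
As a hedged illustration of the two styles (heavily simplified, not the actual lua-resty-http code): a coroutine-wrapped reader passes every chunk through coroutine.yield, which LuaJIT cannot compile, while a function-based reader is just a closure called once per chunk and stays on JIT-compilable paths.

-- coroutine-wrapped reader: each chunk passes through coroutine.yield,
-- which is NYI for LuaJIT, so the reads fall back to the interpreter
local function co_body_reader(sock, chunk_size)
    return coroutine.wrap(function()
        repeat
            local chunk = sock:receive(chunk_size)
            if chunk then
                coroutine.yield(chunk)
            end
        until not chunk
    end)
end

-- function-based reader: a plain closure returning one chunk per call
-- (a real reader would also handle the partial data returned on "closed")
local function fn_body_reader(sock, chunk_size)
    return function()
        return sock:receive(chunk_size)
    end
end

-- both are consumed the same way:
--   local reader = fn_body_reader(sock, 8192)
--   repeat
--       local chunk = reader()
--       if chunk then ngx.print(chunk) end
--   until not chunk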

A PR to switch pintsized/lua-resty-http to a function-based body reader would be very much appreciated, as would any benchmarks/flamegraphs that show the coroutine reader as an issue!

Thanks
Hamish

tokers

Jun 18, 2018, 8:25:48 AM
to openresty-en
Hello!

I have uploaded lua-resty-requests (version 0.3) to https://opm.openresty.org/. Now it can be installed via opm :). Have fun with it!
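
For reference, installation would presumably be the usual opm get <account>/<package> form (assuming the package is published under the tokers account):

opm get tokers/lua-resty-requests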