Connect a shared socket once and use it for all users and requests


Hadi Abbasi

Jun 22, 2019, 7:57:28 AM
to openresty-en
Hey friends,

From the tutorials and documents I've followed, in OpenResty, whenever we send data over a socket, to RabbitMQ, to Redis, etc., we have to create a new connector object, connect, send our data, and then close it!
But when the sending rate is very high, all that connecting and handshaking costs time and CPU, so I'm looking for a way to create one shared connector object that is connected once and then used to send data many times!
So I think I could use something like this (Redis example):

local redisLib = require "resty.redis"
ngx.shared.redis = redisLib:new()
ngx.shared.redis:set_timeout(config.socket_timeout)

local okRedis, errRedisCon = ngx.shared.redis:connect(Ip, Port)
if not okRedis then
    ngx.log(ngx.DEBUG, "redis connection error: " .. errRedisCon)
end
ngx.shared.redis:select(Db_no)

When I want to send data:

ngx.shared.redis:set(Key, Value)

The same approach could be used for plain socket connections, RabbitMQ, and so on.
Is this the suggested way, or is there a better way to do it?
It would also be nice to have an error event so the connection could be re-established after a timeout!
Thanks a lot.
Best Regards,
Hadi

Maxim Avramenko

Jun 22, 2019, 9:05:39 AM
to openresty-en
Hi! Have you tried using the connection pool and keepalive?


On Saturday, June 22, 2019 at 14:57:28 UTC+3, Hadi Abbasi wrote:

Hadi Abbasi

Jun 23, 2019, 5:37:53 AM
to openresty-en
Thanks a lot for your attention.
No, I haven't!
Can I use it to get a shared socket connection that doesn't need to be closed and reconnected for the next requests and users?

Thibault Charbonnier

Jun 24, 2019, 2:58:31 PM
to openre...@googlegroups.com
Hi,


On 6/23/19 2:37 AM, Hadi Abbasi wrote:
> Can I use it to get a shared socket connection that doesn't need to be
> closed and reconnected for the next requests and users?

No, you cannot share the same cosocket for several requests (due to
obvious concurrency issues between concurrent requests trying to use the
same connection to Redis). Each Lua coroutine has to call
ngx.socket.tcp() and socket:connect().

That said, if you use socket:setkeepalive(), OpenResty will maintain
connection pools for you, and subsequent socket:connect() calls will
actually reuse an underlying keepalive connection (if available).
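
For illustration, the per-request pattern with lua-resty-redis looks roughly like this (just a sketch; the host, port, key, and pool settings below are placeholders):

local redis = require "resty.redis"

local red = redis:new()
red:set_timeout(1000)  -- 1s

-- connect() transparently reuses a keepalive connection from the
-- pool when one is available, otherwise it opens a new one
local ok, err = red:connect("127.0.0.1", 6379)
if not ok then
    ngx.log(ngx.ERR, "failed to connect to redis: ", err)
    return
end

local ok, err = red:set("some_key", "some_value")
if not ok then
    ngx.log(ngx.ERR, "failed to set key: ", err)
    return
end

-- put the connection back into the pool instead of closing it:
-- 10s max idle time, up to 100 pooled connections
local ok, err = red:set_keepalive(10000, 100)
if not ok then
    ngx.log(ngx.ERR, "failed to set keepalive: ", err)
end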

All of this is already documented, so please have a more careful read of
the documentation:

https://github.com/openresty/lua-nginx-module#ngxsockettcp

https://github.com/openresty/lua-nginx-module#lua_socket_pool_size

For some reference code, have a look at the source code and
recommended usage (Synopsis) of the OpenResty-maintained drivers, e.g.
the lua-resty-redis library you are trying to use:

https://github.com/openresty/lua-resty-redis

Also, I'd strongly recommend against overriding ngx.shared attributes as
in your previous code example, in order to keep readers of your code
(or even yourself) from mixing up an actual shared dictionary with the
I/O interface you are trying to build.
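
In other words, keep the two things clearly separated, roughly like this
(a sketch; my_cache is an assumed dictionary declared with
lua_shared_dict in nginx.conf):

-- a real shared dictionary, declared in nginx.conf with:
--   lua_shared_dict my_cache 10m;
local cache = ngx.shared.my_cache
cache:set("last_request_at", ngx.now())

-- the Redis client stays in a plain local variable, created per request
local redis = require "resty.redis"
local red = redis:new()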

Best,
Thibault

Maxim Avramenko

Jun 24, 2019, 7:49:22 PM
to openresty-en
You can try to initialize redisLib = require "resty.redis" in init_by_lua_file /path/to/init.lua.
I'm not that familiar with Lua, but I think this is like initializing a singleton object in OOP. For example, lua-vips must be created in the init phase. Maybe I'm wrong and don't understand how it works; I just copy/pasted from this example and it works for me:

https://github.com/weserv/images/blob/c42a769e52d2395a65f3e2a3de718042fe2a0f6e/config/nginx/conf.d/imagesweserv.conf#L14
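
Something roughly like this, I think (only a sketch; the worker settings, port, and location are made up):

worker_processes 1;
events { worker_connections 64; }

http {
    init_by_lua_block {
        -- preload the library once when nginx loads its configuration;
        -- require() caches the module, so later calls just return it
        require "resty.redis"
    }

    server {
        listen 8080;

        location /demo {
            content_by_lua_block {
                -- the module is already cached here, but the client
                -- object and the connection are still created per request
                local redis = require "resty.redis"
                ngx.say("loaded resty.redis ", redis._VERSION)
            }
        }
    }
}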

I hope this will help you.

On Saturday, June 22, 2019 at 14:57:28 UTC+3, Hadi Abbasi wrote:

Hadi Abbasi

Jun 30, 2019, 8:10:33 AM
to openresty-en
Thanks a lot, Maxim. I'm sure it will be useful for me.
I think setting keepalive and using the connection pool (as you said in your previous post) is the best way.
Thank you and good luck.
All The Best,
Hadi

Hadi Abbasi

Jun 30, 2019, 8:14:17 AM
to openresty-en
Yeah, thanks a lot, Thibault.
I will follow your suggestions.
Good luck.
Best Regards,
Hadi

Hadi Abbasi

Jul 10, 2019, 3:48:00 AM
to openresty-en
Hey friends,

I have to establish a high-rate local connection pool between an nginx server and a Node.js server for sending a lot of image bytes (the connection is local, 127.0.0.1).
Suppose I want to send the data of 5 files via 5 separate requests to the local Node.js server using ngx.socket.tcp().
The code is this:
local ok, err = sock:connect(Ip, Port)
if not ok then
    ngx.log(ngx.ERR, "error on connection")
    return
end

local bytes, err = sock:send(file_data)
if not bytes and err ~= nil then
    ngx.log(ngx.ERR, "error on sending data: ", err)
end

local ok, err = sock:setkeepalive(1000)
-- sock:settimeout(1000)
-- local ok, err = sock:setkeepalive(1000, 100) -- deprecated
When I send 5 parallel requests at the same time, the request data that the Node.js server receives from this connection pool is all the same, as if I had sent 5 copies of the same file. But when I set sock:setkeepalive(10), I see that the data Node.js receives is correct (5 different payloads).
I don't know what I should do. What timeout must be set? Does the pool size have to be set?
I don't know why setting the setkeepalive timeout to more than 10 milliseconds causes identical request contents on the other, listening server side?!

Hadi Abbasi

Jul 11, 2019, 3:49:43 AM
to openresty-en
I have solved my problem!
It happened because node didn't close the connection, since nginx never required its response.
By closing the socket after the data has been received, my problem is solved.
Also, I needed to save my file in node and do some processing on it, so saving the file from OpenResty and sending its location over the socket is the best way to optimize my connection.
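
For reference, the flow on the OpenResty side now looks roughly like this (just a sketch; the port, the file path, and the one-line reply from node are my own assumptions):

local sock = ngx.socket.tcp()
sock:settimeout(1000)

local ok, err = sock:connect("127.0.0.1", 3000)
if not ok then
    ngx.log(ngx.ERR, "connect failed: ", err)
    return
end

-- send only the saved file's location instead of the raw image bytes
local bytes, err = sock:send("/tmp/uploads/image-123.jpg\n")
if not bytes then
    ngx.log(ngx.ERR, "send failed: ", err)
    sock:close()
    return
end

-- wait for node's one-line acknowledgement, then close the connection
-- explicitly instead of calling setkeepalive with a long timeout
local line, err = sock:receive("*l")
if not line then
    ngx.log(ngx.ERR, "receive failed: ", err)
end

sock:close()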
All The Best,
Hadi