Building a chat system with lua-resty-websocket


Osman Tekin

Aug 13, 2014, 4:12:44 PM
to openre...@googlegroups.com
Hi Yichun!

I'm trying to build a simple chat system with lua-resty-websocket, but I can't get it to work. In particular, I don't understand what I have to do on the server side; I'm really lost.

So far there are: send_text, send_binary, send_ping, send_pong, send_close, send_frame, recv_frame.

I think recv_frame is used to receive server-side sent events, no?

What I need is simply for multiple users to be able to send and at the same time to receive messages sent by other users.

What would be the best way to accomplish this?

Regards, x01dev

Yichun Zhang (agentzh)

Aug 13, 2014, 5:50:41 PM
to openresty-en
Hello!

On Wed, Aug 13, 2014 at 1:12 PM, Osman Tekin wrote:
> I'm trying to build a simple chat system with lua-resty-websocket, but I
> can't get it to work and especially I don't get what I have to do server
> side, I'm really lost.
>

Seems like I need to create a demo chat app atop openresty. It's much
easier for me than answering such general questions over and over
again on the list :)

> So far there are: send_text, send_binary, send_ping, send_pong, send_close,
> send_frame, recv_frame.
>
> I think recv_frame is used to receive server side sent events no?
>

The lua-resty-websocket library does have official documentation:

https://github.com/openresty/lua-resty-websocket#readme

Please read it before asking questions :)

recv_frame is for reading websocket messages (or "frames", in websocket terminology) from the remote end.
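
For a concrete picture, a minimal receive loop in a content_by_lua* handler looks roughly like this (adapted from the README synopsis; the echo reply is just illustrative):

```lua
local server = require "resty.websocket.server"

local wb, err = server:new{ timeout = 5000, max_payload_len = 65535 }
if not wb then
    ngx.log(ngx.ERR, "failed to create websocket server: ", err)
    return ngx.exit(444)
end

while true do
    local data, typ, err = wb:recv_frame()
    if wb.fatal then
        ngx.log(ngx.ERR, "failed to receive frame: ", err)
        return ngx.exit(444)
    end
    if not data then
        wb:send_ping()          -- idle timeout: keep the connection alive
    elseif typ == "close" then
        break
    elseif typ == "ping" then
        wb:send_pong(data)
    elseif typ == "text" then
        wb:send_text("you said: " .. data)  -- echo, just for illustration
    end
end

wb:send_close()
```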

> What I need is simply for multiple users to be able to send and at the same
> time to receive messages sent by other users.
>
> What would be the best way to accomplish this?
>

The simplest (though maybe not the best) way is to use an external shared
data service like Redis, as mentioned in the following thread:

https://groups.google.com/forum/#!topic/openresty-en/McUECt7YgPc

A much more efficient (though much more complex) way is to use a
shared data service supporting message multiplexing within a single
stream-typed connection and to use the upcoming ngx.thread.semaphore
API to coordinate requests' "light threads".

Regards,
-agentzh

Aapo Talvensaari

Aug 14, 2014, 9:16:24 AM
to openre...@googlegroups.com
On Thursday, August 14, 2014 12:50:41 AM UTC+3, agentzh wrote:
The simplest (though maybe not the best) way is to use an external shared
data service like Redis, as mentioned in the following thread:

    https://groups.google.com/forum/#!topic/openresty-en/McUECt7YgPc

A much more efficient (though much more complex) way is to use a
shared data service supporting message multiplexing within a single
stream-typed connection and to use the upcoming ngx.thread.semaphore
API to coordinate requests' "light threads".

Yes, Redis is not the best solution here. I quickly checked what has been happening in this messaging scene, and I found this:

It looks rather interesting, and there are also a few Lua bindings:

I'm not sure though how these work in the context of OpenResty (for example, luajit-nanomsg does its socket connecting inside the C lib; should it rather use ngx.socket.tcp, which would mean the lib has to be reimplemented?).

But it could be a nice option for building a scalable, low-overhead chat system, among others.


x01dev

Aug 14, 2014, 9:50:24 AM
to openre...@googlegroups.com
Hi!

Thank you for the clarifications! :)

I did look at it before, but I couldn't get the examples to work. What I mean is: the "echo" functionality works, and Redis pub/sub does too (I can see the messages in redis-cli in real time), but the messages don't show up in the chat example.

Is there maybe a listen function/command I should be interfacing with?

I'm referring to this code (server-side):

local server = require "resty.websocket.server"
local redis  = require "resty.redis"

local function subscribe(ws)
  local sub = redis:new()
  sub:connect("127.0.0.1", 6379)
  sub:subscribe("chat.messages")
  while true do
    local bytes, err = sub:read_reply()
    if bytes then
      ws:send_text(bytes[3])
    end
  end
end

local ws, err = server:new{ timeout = 30000, max_payload_len = 65535 }

ngx.thread.spawn(subscribe, ws)

local pub = redis:new()
pub:connect("127.0.0.1", 6379)

while true do
  local bytes, typ, err = ws:recv_frame()
  if ws.fatal then
    return
  elseif not bytes then
    ws:send_ping()
  elseif typ == "close" then
    break
  elseif typ == "text" then
    pub:publish("chat.messages", bytes)
  end
end

ws:send_close()
and client-side:

<html>
<head>
<script>
var ws = null;

function connect() {
  if (ws !== null) return log('already connected');
  ws = new WebSocket('ws://127.0.0.1/test/');
  ws.onopen = function () {
    log('connected');
  };
  ws.onerror = function (error) {
    log(error);
  };
  ws.onmessage = function (e) {
    log('recv: ' + e.data);
  };
  ws.onclose = function () {
    log('disconnected');
    ws = null;
  };
  return false;
}

function disconnect() {
  if (ws === null) return log('already disconnected');
  ws.close();
  return false;
}

function send() {
  if (ws === null) return log('please connect first');
  var text = document.getElementById('text').value;
  document.getElementById('text').value = "";
  log('send: ' + text);
  ws.send(text);
  return false;
}

function log(text) {
  var li = document.createElement('li');
  li.appendChild(document.createTextNode(text));
  document.getElementById('log').appendChild(li);
  return false;
}
</script>
</head>
<body>
  <form onsubmit="return send();">
    <button type="button" onclick="return connect();">Connect</button>
    <button type="button" onclick="return disconnect();">Disconnect</button>
    <input id="text" type="text">
    <button type="submit">Send</button>
  </form>
  <ol id="log"></ol>
</body>
</html>

Apart from the lack of error handling (which I'll fix later), are you able to pinpoint any of the issues at a glance?

Alternatively, do you think Socket.IO could easily be ported on top of OpenResty?

Thank you Yichun, keep up the great work! I'll sponsor your work ASAP :)

x01dev

Aug 14, 2014, 9:53:11 AM
to openre...@googlegroups.com
Oh hi Aapo! :) Your templating engine is great! Anyway, I'll look into that too before giving up on troubleshooting the current issue.

x01dev

Aug 14, 2014, 4:14:36 PM
to openre...@googlegroups.com
So, I made a bit of progress: I can receive my own messages while subscribed to Redis pub/sub, but others can't.

ws:send_text(bytes[3])

Should be changed to:

ws:send_text(bytes[1])

And then it shows the messages. I'll post back when I manage (I hope) to make this work for all connected clients.

Yichun Zhang (agentzh)

Aug 14, 2014, 6:56:35 PM
to openresty-en
Hello!

On Thu, Aug 14, 2014 at 6:50 AM, x01dev wrote:
> I'm referring to this code (server-side):
>

I've already listed the obvious mistakes in the code snippet you're
referencing, in the original email thread:

https://groups.google.com/d/msg/openresty-en/McUECt7YgPc/IMig2CHG5lEJ

You should at least fix these mistakes yourself.

BTW, when you're testing multiple websocket clients, you should use
multiple separate browsers (not separate browser tabs!) or even
separate machines to avoid hitting potential connection count limits
enforced by your web browser (to be polite to your web server).

Regards,
-agentzh

Brent Tucker

Aug 14, 2014, 7:28:33 PM
to openre...@googlegroups.com
This link might help. I have not tested it, but it appears to be a simple example of setting up a resty websocket server, and it also includes JavaScript client code.


Brent

Jerome Lafon

Aug 16, 2014, 10:30:56 AM
to openre...@googlegroups.com
Hello,

Can you explain why using Redis PUBSUB is less efficient than using nanomsg, for example?

Jérôme

Yichun Zhang (agentzh)

Aug 16, 2014, 1:03:15 PM
to openresty-en
Hello!

On Sat, Aug 16, 2014 at 7:30 AM, Jerome Lafon wrote:
> Can you explain why using REDIS PUBSUB is less efficient than using nanomsg
> for example?
>

The real goal here is to use a small number of backend connections for
all the potentially very large number of incoming websocket or http
connections.

But I think we can also use redis pub/sub to emulate TCP-connection
level multiplexing by devising our own dispatch mechanism within Lua.
For example, we can have a background "light thread" doing the message
dispatch (to all the downstream requests within the current nginx
worker) from a single (or a very small number of) redis connections.
For synchronization between this background "light thread" and the
request "light threads", we can use polling based on ngx.sleep() or,
more efficiently, the upcoming ngx.thread.semaphore API :)
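
To make that concrete, here is a rough sketch of such a dispatcher. All the names here (subscribers, dispatcher) are made up, not an existing API; in practice this would run per worker (e.g. from a timer started in init_worker_by_lua) so that it outlives any single request, and reconnection handling is omitted:

```lua
local redis = require "resty.redis"

local subscribers = {}  -- per-worker registry: id -> callback

local function dispatcher()
    local red = redis:new()
    red:set_timeout(60000)  -- pub/sub replies may take a while to arrive
    local ok, err = red:connect("127.0.0.1", 6379)
    if not ok then
        ngx.log(ngx.ERR, "redis connect failed: ", err)
        return
    end
    red:subscribe("chat.messages")
    while true do
        local res, err = red:read_reply()
        if res then
            -- for pub/sub messages, res = { "message", channel, payload }
            for _, callback in pairs(subscribers) do
                callback(res[3])
            end
        elseif err ~= "timeout" then
            ngx.log(ngx.ERR, "read_reply failed: ", err)
            return
        end
    end
end

ngx.thread.spawn(dispatcher)
```

Each downstream websocket request would then register a callback (or a per-request queue) in subscribers and remove it on close; until a semaphore API exists, the request side can poll its queue with ngx.sleep().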

Regards,
-agentzh

Aapo Talvensaari

Aug 18, 2014, 10:50:16 AM
to openre...@googlegroups.com
On Saturday, August 16, 2014 5:30:56 PM UTC+3, Jerome Lafon wrote:
Hello,

Can you explain why using REDIS PUBSUB is less efficient than using nanomsg for example?

I probably shouldn't have said that. Redis is a nice key-value/data-structure server, but I think that on the messaging side we do have other options that should be considered as well. But yes, you are correct: I think that in many (probably most) scenarios Redis will do just fine. Even then, it is always good to consider other options: http://www.bravenewgeek.com/dissecting-message-queues/.

I also remember reading about some downsides of Redis Pub/Sub in the Redis in Action book (though that might be fixed in newer versions of Redis). In that book they actually implemented pub/sub-style messaging with Redis _without_ using Redis' Pub/Sub mechanism. You might also consider using UDP at the transport layer. But it really depends on what you are trying to achieve. Are you trying to beat someone else's bot at HFT, or do you just want a nice chat room for your local soccer club?

Vladislav Manchev

Aug 18, 2014, 12:52:41 PM
to openre...@googlegroups.com
Hello!

On Sat, Aug 16, 2014 at 8:03 PM, Yichun Zhang (agentzh) <age...@gmail.com> wrote:
Hello!

On Sat, Aug 16, 2014 at 7:30 AM, Jerome Lafon wrote:
> Can you explain why using REDIS PUBSUB is less efficient than using nanomsg
> for example?
>

The real goal here is to use a small number of backend connections for
all the potentially very large number of incoming websocket or http
connections. 

Exactly, so well put.
 
But I think we can also use redis pub/sub to emulate TCP-connection
level multiplexing by devising our own dispatch mechanism within Lua.
For example, we can have a background "light thread" doing the message
dispatch (to all the downstream requests within the current nginx
worker) from a single (or a very small number of) redis connections.
For synchronization between this background "light thread" and the
request "light threads", we can use polling based on ngx.sleep() or,
more efficiently, the upcoming ngx.thread.semaphore API :)

That's actually the option I went for when conceptualizing a chat service.
I'll need to implement it soon and then I can share some code with the list.

Really interested in the ngx.thread.semaphore API though, so I'd be grateful if you could expand a bit on that in advance.

Regards,
-agentzh


Best,
Vladislav

Yichun Zhang (agentzh)

Aug 18, 2014, 2:55:41 PM
to openresty-en
Hello!

On Mon, Aug 18, 2014 at 9:52 AM, Vladislav Manchev wrote:
>
> That's actually the option I went for when conceptualizing a chat service.
> I'll need to implement it soon and then I can share some code with the list.
>

Terrific!

> Really interested in the ngx.thread.semaphore API though, so I'll be
> grateful if you could expand a bit on that in advance.
>

My current (vague) plan is an API similar to the POSIX (unnamed)
semaphore API, but operating on "light threads" instead of POSIX
threads or OS threads. Basically, the ngx.thread.semaphore class
will have at least the following methods: new(), post(), wait(),
and try_wait(). These semaphore objects will live at the Lua VM
level only (that is, they cannot be shared across multiple worker
processes).
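
To illustrate, a hypothetical usage sketch based only on the method names described above (the module name and signatures are guesses; the API did not exist yet at the time of this thread):

```lua
-- Hypothetical sketch only: "ngx.thread.semaphore" and these
-- signatures are assumptions based on the plan described above.
local semaphore = require "ngx.thread.semaphore"

local sem = semaphore.new(0)   -- counting semaphore, initially 0
local queue = {}

-- background producer light thread
ngx.thread.spawn(function()
    queue[#queue + 1] = "hello"
    sem:post()                 -- wake one waiting light thread
end)

-- request light thread: block until a message is available
sem:wait()
local msg = table.remove(queue, 1)
ngx.say(msg)
```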

Regards,
-agentzh