Re: Publishing to multiple channels at once


Salvatore Sanfilippo

Aug 30, 2012, 11:34:32 AM
to redi...@googlegroups.com
Hello, starting from 2.6 you can use a Redis Lua script.

There is no solution for 2.4, and none is planned, since 2.6 solves the issue.
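
For example, a rough sketch (using the redis-py client here; the channel
names are just placeholders) that passes the target channels as KEYS and
the message as ARGV[1]:

    import redis

    # Publish one message to several channels in a single round trip by
    # running a small Lua script on the server (Redis 2.6+).
    MULTI_PUBLISH = """
    for i = 1, #KEYS do
        redis.call('PUBLISH', KEYS[i], ARGV[1])
    end
    return #KEYS
    """

    r = redis.Redis(host='localhost', port=6379)
    r.eval(MULTI_PUBLISH, 3, 'news.sports', 'news.weather', 'news.traffic',
           'hello world')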

Salvatore

On Thu, Aug 30, 2012 at 5:24 PM, Parham Negahdar <pneg...@gmail.com> wrote:
> Is there a way to publish to multiple channels at once to save on bandwidth?
>



--
Salvatore 'antirez' Sanfilippo
open source developer - VMware
http://invece.org

Beauty is more important in computing than anywhere else in technology
because software is so complicated. Beauty is the ultimate defence
against complexity.
— David Gelernter

Dvir Volk

Aug 30, 2012, 11:35:23 AM
to redi...@googlegroups.com
I'm not sure whether you're talking about sending the same message to many channels at once, or many different messages.

1. You can do it with a pipeline or a transaction, but that saves network round trips, not bandwidth (see the sketch after this list).

2. If you have one message going to many channels, you can write a Lua script (in Redis 2.6) that takes a message and publishes it to each channel.

3. You can have all the clients listen to a single channel, or have them use PSUBSCRIBE to subscribe to a channel _pattern_ rather than a specific channel; then any PUBLISH to a channel matching the pattern will reach them (also covered in the sketch below).
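
A rough sketch of 1 and 3 with redis-py (the channel names and the
pattern are just placeholders):

    import redis

    r = redis.Redis(host='localhost', port=6379)

    # Option 1: a pipeline batches several PUBLISH commands into one round
    # trip. The payload is still sent once per channel, so this saves
    # latency, not bandwidth.
    pipe = r.pipeline(transaction=False)
    for channel in ('news.sports', 'news.weather', 'news.traffic'):
        pipe.publish(channel, 'hello world')
    pipe.execute()

    # Option 3: a subscriber uses PSUBSCRIBE on a pattern, so a single
    # PUBLISH to any matching channel reaches it.
    p = r.pubsub()
    p.psubscribe('news.*')
    for message in p.listen():
        if message['type'] == 'pmessage':
            print(message['channel'], message['data'])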


On Thu, Aug 30, 2012 at 6:24 PM, Parham Negahdar <pneg...@gmail.com> wrote:
Is there a way to publish to multiple channels at once to save on bandwidth? 




--
Dvir Volk
Chief Architect, Everything.me

Parham Negahdar

Aug 30, 2012, 11:46:01 AM
to redi...@googlegroups.com
Related question: what happens when you publish to Redis at, say, 1Gb/s and the subscribers are connected over a 100Mb/s link? What happens to the data that can't be sent out fast enough?

Salvatore Sanfilippo

Aug 30, 2012, 11:45:46 AM
to redi...@googlegroups.com
On Thu, Aug 30, 2012 at 5:35 PM, Dvir Volk <dvi...@gmail.com> wrote:

> I'm not sure whether you're talking about sending the same message to many
> channels at once, or many different messages.

My guess is that the OP wants to send the exact same message to
multiple channels. A 2.6 script seems the best way to do this. Or
maybe a change in the overall design to avoid duplication in some way?

Salvatore

Dvir Volk

Aug 30, 2012, 11:47:25 AM
to redi...@googlegroups.com
Since PUBLISH already replicates the message to all listeners, and does it fast, the right design would be to attack this through the channel naming scheme and have the clients use PSUBSCRIBE.


PN

Aug 30, 2012, 11:48:50 AM
to redi...@googlegroups.com
The reason I want to send to multiple channels is that I want to publish to a channel based on how much data the subscriber's connection can digest. Say I publish data to a single channel on Redis at 1Gb/s, and some subscribers have a full 1Gb/s connection while others have 100Mb/s; how would Redis handle this? I'm fine with the interim data getting lost, but I am not fine with the data stream backing up.

Salvatore Sanfilippo

Aug 30, 2012, 12:01:01 PM
to redi...@googlegroups.com
On Thu, Aug 30, 2012 at 5:46 PM, Parham Negahdar <pneg...@gmail.com> wrote:

> Related question: what happens when you publish to Redis at, say, 1Gb/s and
> the subscribers are connected over a 100Mb/s link? What happens to the
> data that can't be sent out fast enough?

In 2.4 land, what happens is Armageddon (eventually a crash from running
out of memory). In 2.6 land you can configure limits, and a slow reader's
connection is closed when those limits are reached (see redis.conf for
more info).
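
For example, the relevant setting is the pubsub class of
client-output-buffer-limit; it can live in redis.conf or be changed at
runtime with CONFIG SET. A rough redis-py sketch (the numbers are only an
example, roughly the shipped 2.6 defaults):

    import redis

    r = redis.Redis(host='localhost', port=6379)

    # Equivalent to the redis.conf line:
    #   client-output-buffer-limit pubsub 32mb 8mb 60
    # i.e. disconnect a pub/sub client whose output buffer exceeds a 32MB
    # hard limit, or stays above an 8MB soft limit for 60 seconds.
    r.config_set('client-output-buffer-limit', 'pubsub 32mb 8mb 60')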

Cheers,

PN

Aug 30, 2012, 12:16:05 PM
to redi...@googlegroups.com
So does each client have its own buffer/queue? Is there a way to drop the backed-up data instead of disconnecting the client? For example, in 2.4 can we just set a memory limit on Redis with a maxmemory policy that discards the publishing queue once the limit is reached, or does the memory limit apply only to keys and not to subscriber buffers as well?

PN

Aug 30, 2012, 1:03:25 PM
to redi...@googlegroups.com
For example, if I could truncate the queue/buffer of the slow clients, it would be golden for multiple clients of varying speeds.

Dvir Volk

Aug 30, 2012, 1:47:08 PM
to redi...@googlegroups.com

IMHO, if you're thinking of solving this type of problem with Redis, you're looking in the wrong place. To the best of my knowledge, Redis was not designed to be a streaming server. While it can do that, you are better off using servers that were designed for this.

sent from my Sinclair ZX48


PN

Aug 30, 2012, 2:15:46 PM
to redi...@googlegroups.com
Any recommendations?

Dvir Volk

Aug 30, 2012, 2:19:34 PM
to redi...@googlegroups.com
Since this is a Redis group it's off topic, but I'd have a look at ZeroMQ.
It has this stuff configurable: you can limit the buffer size per connection and choose the desired behavior when the limit is hit.
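
For example, a rough pyzmq sketch (the address and limit are just
placeholders): a PUB socket with a send high-water mark drops messages
for slow subscribers instead of buffering them forever.

    import zmq

    ctx = zmq.Context()
    pub = ctx.socket(zmq.PUB)
    # Cap the per-subscriber send queue; once a slow subscriber's queue is
    # full, further messages for it are dropped rather than buffered.
    pub.setsockopt(zmq.SNDHWM, 1000)   # zmq.HWM on libzmq 2.x
    pub.bind('tcp://*:5556')
    pub.send(b'news.sports hello world')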




On Thu, Aug 30, 2012 at 9:15 PM, PN <pneg...@gmail.com> wrote:
Any recommendations?



M. Edward (Ed) Borasky

Aug 30, 2012, 2:49:26 PM
to redi...@googlegroups.com
If you want to talk huge scale, I'm not sure what Twitter does but I'm
guessing it's custom Scala code, or at least it was at one time.
Personally I'm a huge fan of custom code in languages built on a
really tight VM, like the JVM, V8, the Erlang run-time or LuaJIT for
this sort of thing. You end up rewriting a bunch of stuff when your
users outgrow your prototype, so you might as well pick a runtime VM
you can grow with at the beginning.

In the context of Redis, 2.6 (and Luvit) have forced me to look at Lua
as a viable language for constructing large scalable systems. Well
played, all. ;-)
Twitter: http://twitter.com/znmeb; Computational Journalism Publishers
Workbench: http://j.mp/QCsXOr

How the Hell can the lion sleep with all those people singing "A weem
oh way!" at the top of their lungs?