Hi Ariel,
Play is perfect for your use case; in fact, I'd be very interested to hear how you go, so please keep the questions coming.
The utilities to do what you want to do can be found in the play.api.libs.iteratee.Concurrent object. I'll give you sample code to follow.
At 1 million users, one server is probably not going to be enough, so you'll need some sort of messaging system. Here are some options (there are many more):
* Akka remoting
* Plain HTTP with reactive streams from Play nodes to a Play hub (in which case, the Play hub and the Play nodes have the same code for handling client connections, but the Play nodes connect back to the hub and fan out its messages to its clients)
So as for the code, let's say a client comes in requesting to receive updates for game A. First you check if a broadcaster exists for game A; if it doesn't, create it. This might look something like this:
val (enumerator, channel) = Concurrent.broadcast[GameUpdates]
Presumably, when there are no longer any clients subscribed to a game, you'll want to clean up its broadcaster. There's another method with the same name that wraps an enumerator in something that lets you monitor whether clients are connected, and it invokes a callback when there are none, so you can do your clean-up (remove it from the map, unsubscribe from upstream notifications, whatever). So then we can do this:
val (broadcastEnumerator, broadcaster) = Concurrent.broadcast(enumerator, b => /* clean up code here */)
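Putting those two calls together, a per-game registry might look roughly like this. This is just a sketch: GameUpdate, gameId and the broadcasters map are made-up names, and it assumes the Play 2.1 iteratee API.

```scala
import play.api.libs.iteratee.{Concurrent, Enumerator}
import scala.collection.concurrent.TrieMap

case class GameUpdate(text: String) // hypothetical update type

object GameBroadcasters {
  // One (enumerator, channel) pair per game
  private val broadcasters =
    TrieMap.empty[String, (Enumerator[GameUpdate], Concurrent.Channel[GameUpdate])]

  def forGame(gameId: String): (Enumerator[GameUpdate], Concurrent.Channel[GameUpdate]) =
    broadcasters.getOrElseUpdate(gameId, {
      val (enumerator, channel) = Concurrent.broadcast[GameUpdate]
      // Wrap in the monitoring variant so the entry is removed
      // once the last client disconnects
      val (broadcastEnumerator, _) =
        Concurrent.broadcast[GameUpdate](enumerator, _ => broadcasters.remove(gameId))
      (broadcastEnumerator, channel)
    })
}
```

(Note that getOrElseUpdate isn't atomic here, so under heavy contention two broadcasters could briefly be created for the same game; for a first cut that's usually acceptable.)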
Ok, so you've got your broadcastEnumerator. Aside from perhaps mapping it with an enumeratee to transform the messages into something the client can parse (which you should actually do in the code above when creating it; otherwise the mapping will be done once per client), you can now send that enumerator to the client using the websockets documentation here:
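Concretely, serving that enumerator over a websocket might look like this — a sketch assuming Play 2.1's WebSocket.using API; Games, watch and gameEnumerator are illustrative names, and in a real app the enumerator would come from your per-game registry:

```scala
import play.api.libs.iteratee.{Concurrent, Iteratee}
import play.api.mvc.{Controller, WebSocket}

object Games extends Controller {
  // Illustrative: in a real app, look this up in your per-game registry
  val (gameEnumerator, gameChannel) = Concurrent.broadcast[String]

  def watch = WebSocket.using[String] { request =>
    // In: discard anything the client sends (the traffic is one-way)
    // Out: the shared broadcast enumerator for this game
    (Iteratee.ignore[String], gameEnumerator)
  }
}
```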
You might want to check out Concurrent.patchPanel. You could send this to the client instead and patch the game's enumerator into it, which would also let you multiplex multiple enumerators (games) into the one stream, stop one, or whatever. Also, the broadcast enumerator will wait for all iteratees to consume each input before sending the next one. You might not want that, so there's also a Concurrent.buffer enumeratee and a Concurrent.dropInputIfNotReady enumeratee, which can be used to wrap each client's iteratee so that it consumes (and buffers) input immediately; that way a slow client won't hold up everyone else from receiving messages.
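Wiring those together might look like this — again a sketch assuming the Play 2.1 signatures of patchPanel and buffer, with illustrative game enumerators:

```scala
import play.api.libs.iteratee.Concurrent

val (gameA, channelA) = Concurrent.broadcast[String]
val (gameB, channelB) = Concurrent.broadcast[String]

// An enumerator whose upstream can be (re)wired at runtime via the panel
val patched = Concurrent.patchPanel[String] { panel =>
  panel.patchIn(gameA) // start streaming game A to this client
  // call panel.patchIn again later to change what flows to the client
}

// Buffer up to 100 messages per client so one slow consumer
// doesn't hold up the broadcast for everyone else
val out = patched &> Concurrent.buffer(100)
```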
When you receive an update to broadcast, you just look up the channel for that game and call channel.push() with the update.
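That step might look like this, where channels is the hypothetical game-to-channel map from your registry:

```scala
import play.api.libs.iteratee.Concurrent
import scala.collection.concurrent.TrieMap

// Hypothetical registry of per-game channels
val channels = TrieMap.empty[String, Concurrent.Channel[String]]

// On each upstream event, push into that game's channel;
// every subscribed client's enumerator then sees the update
def onUpdate(gameId: String, update: String): Unit =
  channels.get(gameId).foreach(_.push(update))
```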
With this solution, there's no looping on your part (though underneath there will be loops; since there's no blocking, it will generally be very fast, and a lot of the work will be done in parallel).
So, not sure if that's basic enough for you, but feel free to ask more questions. I can tell you one thing, though: this will scale, and you'll be able to implement it with Play using far fewer servers than other frameworks would need, and far less code too.
On Wednesday, 19 December 2012 06:46:45 UTC+11, Ariel Scarpinelli wrote:
Hi all,
I'm evaluating Play 2.1 for an app that has to broadcast data updates to between about 100k and 1m users. The app is for following a live sports match.
My question is: how well could Play handle this kind of load?
I've been looking at the websocket-chat example. It seems to use a for loop to walk through every connected user to broadcast a message, which doesn't seem very performant (just an impression). Is there a better option for broadcasting?
In the case that the volume of users requires running the app on multiple servers, how should I configure them to communicate with each other in order to receive updates and broadcast them to their connected users?
As a side note, since all the traffic is unidirectional, my other option is to "cook" updated JSON files on each new event and upload them to S3, then have clients poll S3. I'm open to suggestions about this option or anything else you think could work.
Thanks!