I have prototyped an application that multiplexes messages, assembles
them into blocks, and then segments the blocks. The application does as
much of this work concurrently as possible.
I have used something I call a 'pulse queue', which is built on a
non-blocking concurrent queue:
http://docs.oracle.com/javase/7/docs/api/java/util/concurrent/ConcurrentLinkedQueue.html
Basically, I create a separate future to handle the processing of each
message, and each future completes its work by adding its result to the
non-blocking queue. The 'pulse' part is that periodically (e.g. each
second) the queue is drained to create a block or segment. There are
other reasons to drain the queue, e.g. when it reaches a certain size,
but it is primarily the pulse that keeps data moving through the system.
I have found this simple to code and effective at handling large
numbers of futures concurrently.
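To make the pattern concrete, here is a minimal sketch of the pulse
queue as I understand it from the description above: one future per
message, each future publishing its result into a ConcurrentLinkedQueue,
and a drain step that would be driven by a scheduler on each pulse. The
class and method names (PulseQueue, submit, drain) and the toUpperCase
"processing" are illustrative stand-ins, not from any library or from
the actual application.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentLinkedQueue;

// Illustrative sketch of the 'pulse queue' pattern (hypothetical names).
public class PulseQueue {
    private final ConcurrentLinkedQueue<String> queue = new ConcurrentLinkedQueue<>();

    // Each message is handled by its own future; the future completes
    // its work by adding its result to the non-blocking queue.
    public CompletableFuture<Void> submit(String message) {
        return CompletableFuture.runAsync(() -> {
            String processed = message.toUpperCase(); // stand-in for real processing
            queue.add(processed);
        });
    }

    // Called on each pulse (e.g. once per second by a scheduler, not
    // shown here): drain whatever has accumulated into one block.
    public List<String> drain() {
        List<String> block = new ArrayList<>();
        String item;
        while ((item = queue.poll()) != null) {
            block.add(item);
        }
        return block;
    }

    public static void main(String[] args) {
        PulseQueue pq = new PulseQueue();
        CompletableFuture
            .allOf(pq.submit("a"), pq.submit("b"), pq.submit("c"))
            .join();                      // wait for the message futures
        List<String> block = pq.drain();  // one pulse: build a block
        System.out.println(block.size()); // prints 3 (element order may vary)
    }
}
```

In the real application the drain would of course be triggered by a
scheduled pulse (and by the size condition) rather than called inline.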
As I am trying to understand Akka Streams and Akka HTTP, I am wondering
whether Streams would be a better (or equivalent) solution, but I do
not understand how streams work under the hood well enough to answer
this myself. For example, in the Streams environment, is it possible to
create a separate future to handle each message, the way the pulse
queue does? My sense is that each stage of the stream is backed by an
Actor under the hood, so that the messages (while handled in a
non-blocking way) would be processed sequentially rather than in
parallel.
Cheers, Eric