Dynamic list of HTTP requests


Ryan Ashcraft

Apr 22, 2015, 12:11:13 PM
to elm-d...@googlegroups.com
Hi all! I'm coming from the JS/React side of things, so many of the concepts are new to me, but having a lot of fun learning.

I'm currently experimenting with creating a simple Hacker News top stories list, just so I can learn more about signals, mailboxes, and ports. I'm currently not clear on how to actually implement it, given the constraints from the HN API.

Hacker News' top stories API (example) returns just a list of IDs. To get more information about a story, you need to make a separate request for each individual item.

What I'd like to do is take the response from the list API and then start making requests for every item, in parallel. As each item response comes back, the view would re-render. This seems to me like a problem that naturally requires a dynamic signal graph. How would one go about designing something like this in Elm today?
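
For reference, here is roughly what those two requests look like with Elm 0.15's Http and Json.Decode modules. This is only a sketch: the Story record, the decoder fields, and the exact Firebase URLs are assumptions, not part of the original question.

import Http
import Json.Decode as Json exposing ((:=))
import Task exposing (Task)

-- A deliberately simplified story record; real HN items have more fields.
type alias Story =
  { title : String
  , url   : String
  }

storyDecoder : Json.Decoder Story
storyDecoder =
  Json.object2 Story
    ("title" := Json.string)
    ("url"   := Json.string)

-- The top stories endpoint returns only a list of item ids.
getTopStoryIds : Task Http.Error (List Int)
getTopStoryIds =
  Http.get (Json.list Json.int)
    "https://hacker-news.firebaseio.com/v0/topstories.json"

-- Each story then needs its own request to the item endpoint.
getStory : Int -> Task Http.Error Story
getStory id =
  Http.get storyDecoder
    ("https://hacker-news.firebaseio.com/v0/item/" ++ toString id ++ ".json")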

Hassan Hayat

Apr 22, 2015, 1:43:52 PM
to elm-d...@googlegroups.com
Nope, there's actually no need for a dynamic signal graph for this. 

Here's the code: https://gist.github.com/TheSeamau5/1dc5597a2e3b7ae5f33e

It'll display each new story as soon as it arrives, progressively and incrementally. (They're just ul's and li's, but I hope the point gets across.)

I'll write a walkthrough of the code a little later on, but read through it and ask if you have any questions.

Dobes Vandermeer

Apr 22, 2015, 2:20:44 PM
to elm-d...@googlegroups.com
Hassan,

Won't that fetch them in series rather than in parallel?

That said, the browser typically has a limit on parallelism, so I suppose with a bit more work you could have a fixed number of "tasks" running to fetch new items, and it still wouldn't require a dynamic graph.





Hassan Hayat

Apr 22, 2015, 2:35:10 PM
to elm-d...@googlegroups.com
Yeah, it does fetch them in series but the view updates progressively as opposed to all at once (which is a win because Hacker News gives you 500 ids).

I guess you could do it with multiple ports but that is such a pain... I mean, you can... but it's not code you'd like to write. Ideally, you'd want a function called "parallel" that's analogous to "sequence". 

parallel : List (Task error value) -> Task error (List value)

and then you can batch them. So, rather than saying 1) get one story, 2) send the story to the new stories mailbox, 3) get the next story, etc., you'd say something like 1) get 20 stories, 2) send the 20 stories to the new stories mailbox, 3) get the next 20 stories, etc.
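
A rough sketch of what that batched, sequential version could look like (illustrative only: getStory is assumed from the gist, and newStoryListMailbox is a hypothetical mailbox that accepts a whole list of stories at once):

import Http
import Signal exposing (send)
import Task exposing (Task, andThen, succeed, sequence)

-- Split a list into chunks of n elements.
chunksOf : Int -> List a -> List (List a)
chunksOf n xs =
  if List.isEmpty xs then
    []
  else
    List.take n xs :: chunksOf n (List.drop n xs)

-- Fetch 20 stories with `sequence`, report the whole batch to the mailbox,
-- then move on to the next 20.
getStoriesInBatches : List Int -> Task Http.Error ()
getStoriesInBatches ids =
  List.foldl
    (\batch previous ->
      previous
        `andThen` \_ -> sequence (List.map getStory batch)
        `andThen` \stories -> send newStoryListMailbox.address stories)
    (succeed ())
    (chunksOf 20 ids)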

John Mayer

Apr 22, 2015, 3:19:54 PM
to elm-d...@googlegroups.com

You could also use spawn. Each thread sends to the same mailbox, and you collect the results in the signal graph.
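
In other words, something like this (a sketch, assuming the getStory task and a newStoryMailbox : Signal.Mailbox (Maybe Story) from the gist):

import Signal
import Task exposing (Task, ThreadID, andThen, sequence, spawn)

-- One spawned thread per story id; every thread reports back to the same
-- mailbox as soon as its request completes.
fetchAll : List Int -> Task x (List ThreadID)
fetchAll ids =
  sequence
    (List.map
      (\id -> spawn (getStory id `andThen` (Signal.send newStoryMailbox.address << Just)))
      ids)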

Hassan Hayat

Apr 22, 2015, 3:40:45 PM
to elm-d...@googlegroups.com, john.p....@gmail.com
Thanks John! That did it.

I updated the gist to show how it works. Now the requests come in blazingly fast!

https://gist.github.com/TheSeamau5/1dc5597a2e3b7ae5f33e

Now I can write that walkthrough properly! :D

Hassan Hayat

Apr 22, 2015, 3:45:36 PM
to elm-d...@googlegroups.com, john.p....@gmail.com
What's nice about this approach is that I could come up with a "parallel" function that's really useful. You just need to pass the address (which makes it slightly different from "sequence"):

parallel : Address a -> List (Task error a) -> Task error (List ThreadID)
parallel address tasks = 
  let
      sendToAddress task = spawn (task `andThen` send address) 
  in
      sequence (List.map sendToAddress tasks)
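
For example, assuming getStory : Int -> Task Http.Error Story from the gist and a hypothetical storyMailbox : Signal.Mailbox Story, kicking off every request at once would look something like:

fetchAllStories : List Int -> Task Http.Error (List ThreadID)
fetchAllStories ids =
  -- storyMailbox.address receives each Story as soon as its request finishes.
  parallel storyMailbox.address (List.map getStory ids)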

Again, thanks John for the tip!

Ryan Ashcraft

Apr 22, 2015, 4:16:25 PM
to elm-d...@googlegroups.com, john.p....@gmail.com
Thanks so much Hassan and John! Hassan, thanks so much for taking the time to write this up! This is incredibly useful for me. I'm going to spend some time later to try to fully understand it all.

Ryan

Dobes Vandermeer

Apr 22, 2015, 4:18:14 PM
to elm-d...@googlegroups.com, john.p....@gmail.com

Maybe you could write a task combinator like this?

inParallelWith t1 t2 = spawn t1 `andThen` \_ -> t2

And use it like this to run actions in parallel?

t1 `inParallelWith` t2 `inParallelWith` t3

The use of "address" in your example seems like just a convenience, in theory you could have attached the "sendTo" action to each task before passing it to parallel.  But that would be more work for the user of parallel.


Hassan Hayat

Apr 22, 2015, 4:38:06 PM
to elm-d...@googlegroups.com, john.p....@gmail.com
Yup, I just realized you can actually do it without passing in the address:

parallel = 
  sequence << List.map spawn

So, yeah, then you need the task to send its result to the address a priori, which is probably a better idea (because it's more flexible this way).
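
For example (a sketch; getStory and a newStoryMailbox of Maybe Story are assumed from the gist), each task carries its own send, and the address-free parallel just spawns them all:

-- The send is baked into the task itself...
fetchAndReport : Int -> Task Http.Error ()
fetchAndReport id =
  getStory id `andThen` (Signal.send newStoryMailbox.address << Just)

-- ...so parallel no longer needs to know about any address.
fetchAll : List Int -> Task x (List ThreadID)
fetchAll ids =
  parallel (List.map fetchAndReport ids)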

John Mayer

Apr 22, 2015, 5:02:33 PM
to elm-d...@googlegroups.com

We could do better by implementing a native MVar primitive, which would allow inter-"thread" communication, enabling the construction of proper async combinators.

Hassan Hayat

Apr 22, 2015, 5:39:02 PM
to elm-d...@googlegroups.com, john.p....@gmail.com
> We could do better by implementing a native MVar primitive, which would allow inter-"thread" communication, enabling the construction of proper async combinators.

That would be very cool indeed. 

I think for the time being, I'd recommend adding something like the little function I came up with to the core library and for a later release figure out how to best get MVars.

Samuel Rødal

Apr 22, 2015, 5:52:12 PM
to elm-d...@googlegroups.com, john.p....@gmail.com
I played a bit with this problem too, trying to batch up 10 requests at a time before rendering. Here's what I ended up with:

foldT : (a -> b -> Task x b) -> b -> List a -> Task x b
foldT f value list =
  case list of
    [] -> succeed value
    x :: xs -> f x value `andThen` (\value' -> foldT f value' xs)


type alias Dispatcher a x =
  { batchSize : Int,
    count     : Int,
    list      : List a,
    process   : List a -> Task x ()
  }


makeDispatcher : (List a -> Task x ()) -> Int -> Dispatcher a x
makeDispatcher process batchSize =
  Dispatcher batchSize 1 [] process


dispatchFlush : Dispatcher a x -> Task x ()
dispatchFlush d =
  d.process (List.reverse d.list)


dispatch : a -> Dispatcher a x -> Task x (Dispatcher a x)
dispatch message d =
  let
    list = message :: d.list
  in
    if d.batchSize == d.count then
      d.process (List.reverse list)
        `andThen` (\_ -> succeed (Dispatcher d.batchSize 1 [] d.process))
    else
      succeed (Dispatcher d.batchSize (d.count + 1) list d.process)


progressivelyDispatch : (List a -> Task x ()) -> Int -> List a -> Task x ()
progressivelyDispatch process batchSize list =
  foldT dispatch (makeDispatcher process batchSize) list
    `andThen` dispatchFlush


progressivelyGetStories : List Int -> Task Error ()
progressivelyGetStories list =
  progressivelyDispatch (processStories newStoryMailbox.address) 10 list


processStories : Signal.Address (Maybe Story) -> List Int -> Task Http.Error ()
processStories address list =
  sequence (List.map getStory list)
    `andThen` (sequence << List.map (send newStoryMailbox.address << Just))
    `andThen` \_ -> succeed ()


As noted above, it's pretty slow since all the requests happen in sequence. We can of course use the parallel approach suggested by John and implemented by Hassan, but then the results all arrive in semi-random order and we're still not batching up the rendering.

To engage in some wishful thinking, if Task instead had the following API:

spawn : Task x a -> Task y (ThreadID x a)

await : ThreadID x a -> Task x a

Then we could implement processStories like this instead:

parallel : List (Task x a) -> Task y (List (ThreadID x a))
parallel = sequence << List.map spawn


processStories : Signal.Address (Maybe Story) -> List Int -> Task Http.Error ()
processStories address list =
  parallel (List.map getStory list)
    `andThen` (sequence << List.map await)
    `andThen` (sequence << List.map (send newStoryMailbox.address << Just))
    `andThen` \_ -> succeed ()

And thus make sure the full batch of stories has finished fetching in parallel before we send them all to the mailbox.

Btw, maybe the foldT function above could also be useful as part of Task.

--
Samuel

Hassan Hayat

Apr 22, 2015, 6:27:07 PM
to elm-d...@googlegroups.com, john.p....@gmail.com
Cool. I think that should be turned into a library. 

For await, yeah, I wish so too. Currently those ThreadIDs are pretty useless (aside from debugging).

I made a PR here: https://github.com/elm-lang/core/pull/224 to add that little function. I think it'll come in handy for the 80% of use cases like the one Ryan described.

Evan Czaplicki

Apr 22, 2015, 6:33:01 PM
to elm-d...@googlegroups.com, john.p....@gmail.com
I'd go with Hassan's approach for now. You have the order of the items from the list of IDs, so even if the results come back in a non-deterministic order, you can show them in order.

The big answer here is that we want to have channels as in "Concurrent Programming in ML". This would let us create a channel, spawn a ton of tasks, and have them all report back to that channel. The task is done once we've heard from all the threads, and order can be preserved. That's the proper implementation of "parallel". When (kill : ThreadID -> Task x ()) exists, we can implement "race" in roughly the same way: create a channel, spawn all the tasks, and when you get the first answer, kill all the remaining threads.
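
Purely to illustrate the shape of that idea (nothing here exists yet: Channel.new, Channel.send, Channel.recv, and the Channel type are all hypothetical names):

-- Hypothetical channel-based parallel: spawn every task, have each one report
-- to the same channel, then wait until we've heard from all of them. Results
-- arrive in completion order; tagging each task with its index would let you
-- restore the original order.
parallel : List (Task x a) -> Task x (List a)
parallel tasks =
  Channel.new
    `andThen` \channel ->
      sequence (List.map (\task -> spawn (task `andThen` Channel.send channel)) tasks)
        `andThen` \_ ->
          sequence (List.repeat (List.length tasks) (Channel.recv channel))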

I left this stuff out of 0.15 so that it could be released in a timely manner, but it's on the roadmap. I don't know if it can all be done without changing the task implementation, but I'd ask that you read "Concurrent Programming in ML" before taking a swing at this. Concurrency is very hard, and if you try to reinvent it yourself it's probably gonna go bad.

Samuel Rødal

Apr 22, 2015, 6:47:29 PM
to elm-d...@googlegroups.com, john.p....@gmail.com
Indeed, it should be possible to do the batching by fetching all the stories asynchronously like Hassan is doing, and then using Signal.foldp or some similar mechanism on the other side of the mailbox. It would make sure to only add a result to the list that gets fed to the view once all the preceding results are also ready.
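
A sketch of that reordering step in 0.15-style code (illustrative only; it assumes each fetched story arrives tagged with its position in the original id list, e.g. a hypothetical taggedStories : Signal (Int, Story)):

import Dict exposing (Dict)
import Signal exposing (Signal)

type alias Model =
  { received : Dict Int Story   -- position in the id list -> story
  , visible  : List Story       -- the ordered prefix that gets fed to the view
  }

-- Store each arriving story under its position, then expose the longest
-- prefix of positions (0, 1, 2, ...) that have all arrived.
collect : (Int, Story) -> Model -> Model
collect (pos, story) model =
  let
    received = Dict.insert pos story model.received

    prefix i acc =
      case Dict.get i received of
        Just s -> prefix (i + 1) (s :: acc)
        Nothing -> List.reverse acc
  in
    { model | received <- received, visible <- prefix 0 [] }

orderedStories : Signal (List Story)
orderedStories =
  Signal.map .visible
    (Signal.foldp collect { received = Dict.empty, visible = [] } taggedStories)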

Hassan Hayat

Apr 22, 2015, 10:14:46 PM
to elm-d...@googlegroups.com
Here, as promised, is a walkthrough of the code that sort of serves as a mini tutorial on tasks and how they relate to the Elm Architecture.

https://gist.github.com/TheSeamau5/1dc5597a2e3b7ae5f33e

I also cleaned up the code so it's more readable and friendlier. 



Marko Dvecko

Mar 13, 2016, 5:04:02 AM
to Elm Discuss, john.p....@gmail.com
Hi. I'm an independent software developer playing with Elm for my internal projects.
One use case I have is sending multiple HTTP requests and receiving the responses in non-deterministic order, but with the option to drop a response if a newer request has already been sent. In the end I end up with ordered responses, minus the ones that no longer interest us. In Rx this can be done with the flatMapLatest operator, but that relies on dynamic graphs.
A real example could be a Flickr application.
The user types the topic they want to see into a search box. As they type, they produce a stream of topics, and every event on that stream sends an HTTP request.
Here is one scenario:
  1. HTTP request for "z"
  2. HTTP request for "za"
  3. HTTP response for "z" will be ignored because we are interested in "za"
  4. HTTP response for "za" will be accepted
  5. HTTP request for "zag"
  6. HTTP request for "zagr"
  7. HTTP response for "zagr" will be accepted
  8. HTTP response for "zag" will be ignored because we are interested in "zagr"

This functionality improves the user experience because the application stays responsive and the user gets the result that actually interests them.
You can check this example done with RxJS for more information: http://www.cs.berkeley.edu/~bodik/cs164/sp13/lectures/temporary-lectures/22-rx-sp13.pdf (see the pictures on page 45).
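
One way to approximate flatMapLatest in current Elm without a dynamic graph is to tag every response with the query that produced it, and have a foldp drop any response that doesn't match the latest query. A sketch in 0.15-style code (searchQueries, responses, and Photo are hypothetical names standing in for the text-field signal, the response mailbox, and the result type):

import Signal exposing (Signal)

type alias State =
  { latestQuery : String
  , photos      : List Photo
  }

type Update
  = NewQuery String
  | NewResponse (String, List Photo)   -- the query this response belongs to, plus its results

step : Update -> State -> State
step update state =
  case update of
    NewQuery query ->
      { state | latestQuery <- query }

    NewResponse (query, photos) ->
      -- Ignore the response unless it belongs to the query we currently care about.
      if query == state.latestQuery then
        { state | photos <- photos }
      else
        state

results : Signal State
results =
  Signal.foldp step
    { latestQuery = "", photos = [] }
    (Signal.merge
      (Signal.map NewQuery searchQueries)
      (Signal.map NewResponse responses.signal))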

Regards,
marko
