Has anyone ever tried building a debounce / batch interceptor into $http?


Eric Eslinger

Aug 15, 2014, 8:37:52 PM
to ang...@googlegroups.com
Right now, my pseudo-ORM for angular does a good job caching objects, so it doesn't have to continually hit /profiles/N for the same person over and over, even if you see that person on many pages. It's great the second or third time you visit a discussion thread, or when there are four people posting many times to a single thread (no need to re-load the profile inside the ng-repeat over and over).

The problem I'm facing is that the simple implementation means that when a user visits a page with a lot of profiles to display (a long reply thread in a forum, for example), angular generates a lot of little hits on /profiles/ID to get all the new profiles loaded. They all get generated at about the same time. This is fine on a fast network, but I've noticed some pretty huge performance issues when there's large end-to-end latency. My api server dishes it up in 100 ms or less, but there's 2000 ms of round-trip latency between the user's location and that api server. Not great. My users are teachers, and schools often have fairly cruddy internet performance.

Ideally, what I want to do is: whenever somebody hits $http.get, wait a few hundred ms, accumulate requests over that time, then fire them all off in a single batch request. My backend server uses hapi, so I can use Bassmaster to handle the batch request with minimal overhead.

That way, when the user first loads that long, many-reply post, the client will submit one batch request for profile IDs 1-99 instead of asking for profile 1, profile 2, and so on. Then when the batch returns, I can manually resolve all the $http promise objects with their relevant content.
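As a rough, framework-free sketch of that accumulate-then-flush idea (the `fetchBatch` stub, the `batchGet`/`flushQueue` names, and the 200 ms window are all placeholders, not anything from an actual library):

```javascript
// Minimal sketch: batchGet(id) queues ids for WINDOW_MS, then resolves
// every queued promise from a single batch fetch.
const WINDOW_MS = 200;

let queue = [];   // { id, resolve, reject } entries awaiting the next flush
let timer = null; // debounce timer handle; null when no flush is pending

// Stand-in for the real batch endpoint; assumed to take an array of ids
// and resolve with results in the same order.
async function fetchBatch(ids) {
  return ids.map(id => ({ id, name: 'profile-' + id }));
}

function batchGet(id) {
  return new Promise((resolve, reject) => {
    queue.push({ id, resolve, reject });
    if (timer === null) {
      timer = setTimeout(flushQueue, WINDOW_MS);
    }
  });
}

async function flushQueue() {
  const pending = queue;
  queue = [];
  timer = null;
  try {
    const results = await fetchBatch(pending.map(p => p.id));
    // Results come back in request order, so index i settles promise i.
    pending.forEach((p, i) => p.resolve(results[i]));
  } catch (err) {
    pending.forEach(p => p.reject(err));
  }
}
```

Every caller still gets an ordinary promise back, so nothing downstream has to know the request was batched.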

I've got some ideas about how to handle this, but some of them are hacky and others are kind of specific to my own implementation (override Model.find behavior instead of doing an $http interceptor). So: has anyone tried doing this in the past? Ideas? Libraries?

e

Sander Elias

Aug 16, 2014, 1:42:33 AM
to ang...@googlegroups.com
Hi Eric,

The idea you have is sound; however, a generic solution to this is a very hard thing to create. The main reason is that it depends on the server side, so a solution that works in your case will seldom work for anyone else.

Also, there are a lot of unknown variables, like the size of your poster base and the size of each profile.
If those are reasonably small, one solution might be to send the entire thing in one go and pre-populate your cache.

You might also include the profiles in the initial response from the server, and again pre-populate your cache.

Another solution is entirely client-side: before rendering your page, loop through your result, compare it to the cache, and
fetch the missing profiles in a single request. (Here I'm assuming your server can provide a suitable answer!) When this
request comes back, stuff the results into the cache again, and then render the part of the page that needs them.
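That client-side approach might look something like the sketch below. The `fetchProfiles` function is a hypothetical stand-in for a server endpoint that can return many profiles in one request:

```javascript
// Sketch: compare the wanted ids against the cache and fetch only the
// missing profiles in a single request, then read everything from cache.
const cache = new Map();

// Stand-in for a server endpoint such as GET /profiles?ids=1,2,3
// (assumed, not a real API).
async function fetchProfiles(ids) {
  return ids.map(id => ({ id, name: 'profile-' + id }));
}

async function ensureProfiles(ids) {
  const missing = ids.filter(id => !cache.has(id));
  if (missing.length > 0) {
    const fetched = await fetchProfiles(missing);
    fetched.forEach(p => cache.set(p.id, p));
  }
  // Duplicates and already-cached ids cost nothing extra here.
  return ids.map(id => cache.get(id));
}
```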

Regards
Sander


Eric Eslinger

Aug 22, 2014, 1:53:53 PM
to ang...@googlegroups.com
For what it's worth (in case future searchers find this thread), here's what I did.


It's pretty simple, and I decided *not* to override anything on $http directly with interceptors, but instead to provide my own service. That way, if I have really important stuff, I can hit $http directly.

Anyway, all it really does is this: on a BatchRequest.get call, it checks whether there's already a debounce timer counting down (timerPromise) and starts one if not. Then it adds the current request to the batch and returns a promise for that request.

When the timer fires, it takes all the currently queued requests, assembles a POST request formatted for Bassmaster (https://github.com/hapijs/bassmaster), and sends it. When the response to that POST arrives, it resolves or rejects the promises accordingly. Bassmaster returns data in order, so request 0 in the POST request ends up at position 0 in the response.

It's worth noting that Bassmaster always returns a 200, even if some of the sub-requests had error codes, so you have to handle error situations manually. You also lose some of the data you normally get from an $http call, but that's not the end of the world. Functionally, it means that instead of passing response.data into my response parser (since $http promises resolve to an object with .data, .headers, and so on), I just pass response to my parser.
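The flush-and-settle step might look roughly like the sketch below. The `flushBatch` name and `postJson` stand-in (for something like `$http.post('/batch', payload)`) are hypothetical, and checking `statusCode` on each item is an assumption about how failed sub-requests show up in the batch response, so adjust for your server:

```javascript
// Sketch: build a Bassmaster-style batch payload from the queued
// requests and settle each queued promise from the ordered response.
async function flushBatch(pending, postJson) {
  const payload = {
    requests: pending.map(p => ({ method: 'get', path: '/profiles/' + p.id }))
  };
  // postJson is a stand-in for the real HTTP POST to the batch endpoint.
  const results = await postJson('/batch', payload);
  pending.forEach((p, i) => {
    const item = results[i];
    // The outer response is 200 even when sub-requests fail, so inspect
    // each item; here a statusCode >= 400 marks a failed sub-request
    // (an assumed error shape).
    if (item && item.statusCode >= 400) {
      p.reject(item);
    } else {
      p.resolve(item); // resolve with the payload itself, not { data: ... }
    }
  });
}
```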

Any feedback is welcome. This seems to be my favorite approach to solving this problem for now, because it doesn't try to predict user behavior or have the server track the state of the client cache. Requests only get sent off if the object isn't already in the cache. I do some clever pre-loading in certain situations (if you are loading a post thread, you're pretty much guaranteed to Need and have Never Loaded all replies to that thread) but only in really constrained situations. The goal is to have as dumb of an API as possible, and handle the cleverness on the client side.

e


--
You received this message because you are subscribed to the Google Groups "AngularJS" group.
