Right now, my pseudo-ORM for Angular does a good job caching objects, so it doesn't have to continually hit /profiles/N for the same person over and over, even if you see that person on many pages. It's great the second or third time you visit a discussion thread, or when there are four people posting many times to a single thread (no need to re-load the profile inside the ng-repeat over and over).
The problem I'm facing is that the simple implementation means that when a user visits a page with a lot of profiles to display (a long reply thread in a forum, for example), Angular generates a lot of little hits on /profiles/ID to get all the new profiles loaded, and they all get generated at about the same time. This is fine on a fast network, but I've noticed some pretty huge performance issues when there's a lot of end-to-end latency. My API server dishes each profile up in 100 ms or less, but there can be 2000 ms of ping between the user's location and that API server. Not great. My users are teachers, and schools often have fairly cruddy internet performance.
Ideally, what I want to do is: whenever a call goes through $http.get, hold it for a few hundred ms, accumulate any other requests that arrive in that window, then fire them all off in a single batch request. My backend server uses HAPI, so I can use Bassmaster to handle the batch request with minimal overhead.
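For reference, a Bassmaster batch request is roughly shaped like this (the exact endpoint path depends on how the plugin is registered; /batch is its default, and the paths here are just my profile routes):

```js
// POST /batch
// One entry per underlying GET; Bassmaster runs them server-side
// and returns an array of results in the same order as the requests.
{
  "requests": [
    { "method": "get", "path": "/profiles/1" },
    { "method": "get", "path": "/profiles/2" },
    { "method": "get", "path": "/profiles/3" }
  ]
}
```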
That way, when the user first loads that long, many-reply post, the client will submit one batch request for profile IDs 1-99 instead of asking for profile 1, profile 2, and so on. Then when the batch returns, I can manually resolve all the $http promise objects with their relevant content.
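Roughly what I'm imagining on the client side is something like this minimal sketch (not working code — profileBatcher, FLUSH_DELAY, and the /batch path are placeholders, and it assumes Bassmaster returns results in request order):

```js
// Queues each requested profile ID, waits a few hundred ms to
// accumulate more, then fires one batch request and resolves each
// caller's promise with its slice of the batch response.
app.factory('profileBatcher', function ($http, $q, $timeout) {
  var queue = [];          // items of the form { id: ..., deferred: ... }
  var flushScheduled = null;
  var FLUSH_DELAY = 300;   // ms to accumulate requests before batching

  function flush() {
    var pending = queue;
    queue = [];
    flushScheduled = null;

    $http.post('/batch', {
      requests: pending.map(function (item) {
        return { method: 'get', path: '/profiles/' + item.id };
      })
    }).then(function (response) {
      // Results come back in the same order the requests were sent.
      pending.forEach(function (item, i) {
        item.deferred.resolve(response.data[i]);
      });
    }, function (err) {
      pending.forEach(function (item) {
        item.deferred.reject(err);
      });
    });
  }

  return {
    get: function (id) {
      var deferred = $q.defer();
      queue.push({ id: id, deferred: deferred });
      if (!flushScheduled) {
        flushScheduled = $timeout(flush, FLUSH_DELAY);
      }
      return deferred.promise;
    }
  };
});
```

Then Model.find (or an $http interceptor) would call profileBatcher.get(id) instead of hitting /profiles/ID directly.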
I've got some ideas about how to handle this, but some of them are hacky and others are kind of specific to my own implementation (override Model.find behavior instead of doing an $http interceptor). So: has anyone tried doing this in the past? Ideas? Libraries?