ANN: express-cache-on-demand for better performance during traffic surges

Tom Boutell

Oct 19, 2014, 12:09:11 PM10/19/14
to nod...@googlegroups.com
We've just released a module that kicks in automatically and provides caching when an Express route is hit by more than one visitor simultaneously:


This is a new approach to caching. Rather than holding on to the data for an arbitrary amount of time, we simply deliver the same data to any visitors who arrive during the period of time we're generating a response for the first user.

After that's done, when a new request arrives, we let the route generate a new response.

So the content is always timely, and yet the server is never overwhelmed.
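
Stripped of the Express specifics, the underlying idea is just work coalescing. A rough conceptual sketch of the pattern (an illustration, not the module's actual implementation):

// Conceptual sketch of request coalescing; not the module's actual code.
// While a response is being generated for a key, later callers for the
// same key are queued and handed the same result when it's ready.
var pending = {};

function cacheOnDemand(key, generate, callback) {
  if (pending[key]) {
    // Someone is already generating this response; wait for it.
    pending[key].push(callback);
    return;
  }
  pending[key] = [callback];
  generate(function(err, result) {
    var waiting = pending[key];
    // Forget the key first, so the next request after this point
    // triggers a fresh generation rather than reusing stale data.
    delete pending[key];
    waiting.forEach(function(cb) {
      cb(err, result);
    });
  });
}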

This middleware is intended for routes that do a lot of work, then deliver a response under 1MB or so. Rendering a web page after fetching lots of related data from different sources is a perfect example.
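
Wiring it into a route like that looks something like this (the require and factory call below are illustrative; see the README for the exact signature):

// Illustrative usage; the exact require/factory call is in the README.
var express = require('express');
var cacheOnDemand = require('express-cache-on-demand')();

var app = express();

// A route that does a lot of work but sends a modest-sized response:
// a good fit for on-demand caching.
app.get('/dashboard', cacheOnDemand, function(req, res) {
  fetchLotsOfRelatedData(function(err, data) {
    if (err) {
      return res.status(500).send('Something went wrong');
    }
    // Everyone who arrived while this page was being generated
    // receives the same rendered response.
    res.render('dashboard', { data: data });
  });
});

// Stand-in for whatever expensive aggregation the route really does.
function fetchLotsOfRelatedData(callback) {
  callback(null, { /* ... */ });
}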

"Should I apply this to my entire site?" No. This middleware is NOT intended - and will NOT work - for responses that call res.sendFile(), or pipe data into "res". Generally speaking, these kinds of responses don't do a lot of work up front, and you did it this way to avoid holding something big in memory. So just don't apply the middleware to routes of that kind.

The middleware is smart enough to automatically skip the cache if a user is logged in (req.user exists), or their session doesn't look empty, or the request is not a GET or HEAD request.
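
In other words, the default "is this request cacheable?" test behaves roughly like this (an illustration of the rule just described, not the module's actual source):

// Rough illustration of the default cacheability rule; not the
// module's actual source code.
function isCacheable(req) {
  // Only idempotent reads are candidates.
  if (req.method !== 'GET' && req.method !== 'HEAD') {
    return false;
  }
  // Logged-in users should get fresh, personalized responses.
  if (req.user) {
    return false;
  }
  // A session holding anything beyond bookkeeping (like session.cookie)
  // suggests personalization, so skip the cache for it too.
  if (req.session && Object.keys(req.session).some(function(key) {
    return key !== 'cookie';
  })) {
    return false;
  }
  return true;
}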

Looking forward to your feedback!

Floby

Oct 20, 2014, 5:39:04 AM10/20/14
to nod...@googlegroups.com
This looks promising. We've got exactly this problem right now: one of our routes fetches a lot of data, does a lot of parsing/processing, and ends up kind of slow. We will definitely try this.
Thank you

Jérémy Lal

Oct 20, 2014, 2:02:17 PM10/20/14
to nod...@googlegroups.com
This looks similar to Varnish's grace mode, isn't that cool?

Jérémy.

Tom Boutell

Oct 21, 2014, 8:00:15 AM10/21/14
to nod...@googlegroups.com
Interesting point about Varnish's grace mode; I figured this idea couldn't be completely unique in the universe.

The main difference is that our hasher function can inspect 'req' to decide whether a particular request is OK to cache; in particular, it can examine req.session. And you can override that logic with your own hasher to make your own "game day decision" about which requests are OK to cache.
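
For instance, something along these lines would let requests that differ only in an analytics query parameter share one cached response (how the hasher is handed to the middleware is shown schematically here; the README has the real signature):

// Illustrative custom hasher: return a cache key when a request may
// share a cached response, or false when it must be generated fresh.
// Passing it as a factory argument is an assumption; check the README.
var cacheOnDemand = require('express-cache-on-demand')(function(req) {
  if (req.method !== 'GET' && req.method !== 'HEAD') {
    return false;
  }
  if (req.user) {
    return false;
  }
  // Our "game day decision": cache by path only, ignoring query strings
  // such as utm_* parameters.
  return req.path;
});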

Tom Boutell

Oct 22, 2014, 11:23:07 AM10/22/14
to nod...@googlegroups.com
On Sunday, October 19, 2014 12:09:11 PM UTC-4, Tom Boutell wrote:
We've just released a module that kicks in automatically and provides caching when an Express route is hit by more than one visitor simultaneously:

This morning we caught a bug in this module: redirects were not handled properly. We've published the fix to npm and added a unit test.
 