New Cache Support on AMS


João Moura

Oct 20, 2014, 5:16:55 PM
to rails-a...@googlegroups.com
Hey people, what's up?

I've just started working on cache support for the 0.10.0 version of AMS.
Before focusing on optimising it, I'd like to know how you feel about a new syntax proposal to replace the old "cached" method.

There are three things I don't like about the cache support in older versions (0.8.0):

class PostSerializer < ActiveModel::Serializer
  cached
  attributes :title, :body

  def cache_key
    [object, scope]
  end
end

1. If you want to define a custom cache_key you need to create a method in your serializer.
2. You can't set an auto-expire in its definition.
3. You can't easily know what your final cache key will be, so you can't easily expire it from custom code.

My new syntax proposal looks like this (there is a gist with it too: https://gist.github.com/joaomdmoura/f85a38fe10c1723f00e1):

class PostSerializer < ActiveModel::Serializer
  cache cache_key: 'my-posts', expires_in: 3.days

  attributes :title, :body
  url :post
end

I changed the method name to "cache". It takes an optional argument, an options hash.
The accepted options would be "cache_key" and "expires_in".
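
To make the shape of the API concrete, here is a rough standalone sketch of what such a class macro needs to do: remember the options hash on the serializer class so the caching layer can read it back later. The module and method names here are illustrative only, not the actual AMS internals.

require 'active_support/core_ext/numeric/time'

# Illustrative sketch only, not the real AMS implementation.
module CacheDSL
  # Class macro: store the options hash for later use by the cache layer.
  def cache(options = {})
    @_cache_options = options
  end

  def _cache_options
    @_cache_options
  end
end

class PostSerializer
  extend CacheDSL
  cache cache_key: 'my-posts', expires_in: 3.days
end

PostSerializer._cache_options # => the captured options hash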

So, any thoughts? 

John Bohn

Oct 21, 2014, 8:58:42 AM
to rails-a...@googlegroups.com
I like the new syntax. The one thing I might change is the name of the `cache_key` key. Having the word cache twice looks a bit funny. I'd propose dropping the `cache_` prefix, like the following:

class PostSerializer < ActiveModel::Serializer
  cache key: 'my-posts', expires_in: 3.days

  attributes :title, :body
  url :post
end

João Moura

Oct 21, 2014, 9:06:00 AM
to John Bohn, rails-a...@googlegroups.com
Great, John! I totally agree, it does make more sense.
I'll update it!

João M. D. Moura



João Moura

Oct 21, 2014, 3:21:21 PM
to rails-a...@googlegroups.com, jjb...@gmail.com
There are some tricky details related to the cache key option.

Keep in mind that a serializer represents a single object, and it calls itself recursively when serializing an array of objects (as in an index action's response).

There are two ways to cache it:

1. Cache the entire serializer output for each action that uses it.
2. Cache every major object in it individually (it may be one or more, as in the case of an array of objects).

Both are valid strategies, but the second one seems best: it allows AMS to reuse a cached object across different responses, and if one cached object is invalidated it doesn't invalidate others that are still valid (a rough sketch of what this could look like follows below).
There are definitely some hidden tradeoffs in there, but it seems the best way to go.
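
For illustration, a minimal sketch of the second strategy, assuming a standard Rails.cache store and the key/expires_in options from the proposed syntax; the helper name and key format are hypothetical, not the actual AMS code:

# Hypothetical helper: cache one serialized object under its own key.
def cached_attributes(serializer, options)
  key = "#{options[:key]}-#{serializer.object.id}"   # prefix from the serializer, suffix from the object id
  Rails.cache.fetch(key, expires_in: options[:expires_in]) do
    serializer.attributes   # only computed on a cache miss
  end
end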

In order to implement it I'm going to need more than one cache key, so I'd like to define a pattern for the cache keys.

Take the example below:

class PostSerializer < ActiveModel::Serializer
  cache key: 'great-post', expires_in: 3.days

  attributes :title, :body
  url :post
end

In this case the cache key option is 'great-post'. It will be the prefix of every cached object's key; the suffix will be the post id.

class PostsController < ApplicationController
  # Calling the show action with id N:
  # the key of the cached object will be "great-post-#{N}"
  def show
    post = Post.find(params[:id])
    render json: post
  end

  # Calling the index action:
  # the key of each cached object (5 in this case) will be "great-post-#{post.id}"
  def index
    posts = Post.limit(5).order('id ASC')
    render json: posts
  end
end
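
One nice side effect of a predictable prefix/suffix pattern is that manual expiration (complaint 3 from the first message) becomes a one-liner; a sketch, assuming the key format shown above and that the entries live in the default Rails cache store:

# Expire the cached entry for a single post; the exact namespace AMS
# ends up using may differ from this plain prefix-id format.
Rails.cache.delete("great-post-#{post.id}")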

Sorry for the long post, but I'd like to hear considerations and opinions before moving forward with this.

Best,

Janko Marohnić

May 19, 2015, 2:18:52 PM
to rails-a...@googlegroups.com, jjb...@gmail.com
Hi João,

I have one question about your work on caching for AMS. It seems to me that, by the time ActiveRecord objects reach the serializer, they have already been fetched from the database. So what is AMS actually caching? As I see from the implementation, there is no HTTP caching involved, so if the database is always hit, what do we need to cache?

From what I understood, you want to cache the actual object serialization. But isn't the cost of serialization negligible compared to the cost of the database fetch? It seems to me that if a user is not using HTTP caching, they gain only a very small performance boost from AMS caching (when the object hasn't changed), but when they do use HTTP caching, the AMS cache isn't used at all.
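
For context, the HTTP caching being referred to here is presumably Rails' conditional GET support, where a matching ETag/Last-Modified short-circuits the whole action with a 304 and the serializer never runs, for example:

class PostsController < ApplicationController
  def show
    post = Post.find(params[:id])
    # stale? sets ETag/Last-Modified from the record and sends
    # 304 Not Modified when the client's copy is still fresh.
    if stale?(post)
      render json: post
    end
  end
end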

João Moura

May 19, 2015, 2:44:32 PM
to Janko Marohnić, rails-a...@googlegroups.com, jjb...@gmail.com
Hey Janko, thank you for your email!

Indeed, there is no direct HTTP caching involved, but I strongly disagree that the performance boost is small :)

From my own experience, serializers tend to be simple and thin at the beginning, but after a while you might end up overriding attributes and relationships, which implies more queries loading ActiveRecord objects and extra runtime code that has a huge impact on performance when dealing with a lot of requests per second. (This implementation caches it all, avoiding the overridden methods too.)
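
A hypothetical example of the kind of serializer growth being described: the overridden attributes below each trigger extra queries on every serialization, and a serializer-level cache hit skips all of that work. The model and attribute names are made up for illustration.

class PostSerializer < ActiveModel::Serializer
  cache key: 'post', expires_in: 3.days

  attributes :title, :body, :comments_count, :author_name

  def comments_count
    object.comments.count    # extra COUNT query per post on a cache miss
  end

  def author_name
    object.author.name       # extra query unless eager-loaded
  end
end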

Of course, HTTP caching will boost performance more than this implementation, but the idea here was to enable developers to easily boost performance and start using caching just by adding one line to their serializers. It was also the groundwork for the Fragment Cache feature, which is even more useful.

Btw, just letting you know, this feature is already in 0.10-rc together with FragmentCache.
We released the 0.10 version at the last RailsConf; you can check my talk from the release day here: https://www.youtube.com/watch?v=PqgQNgWdUB8&index=71&list=PLE7tQUdRKcybf82pLlMnPZjAMMMV5DJsK

*There is a small benchmark in the talk comparing the old versions and the new one (with caching as well).




— João M. D. Moura