Serve old data while fetching new (second round)


Fabricio Brasiliense

Mar 10, 2016, 3:37:49 PM3/10/16
to spray.io User List
Many problems related to the response-time variance of a service could be solved by answering with an already cached value and then updating the cache asynchronously.

The only good result I found while searching was
https://groups.google.com/forum/#!topic/spray-user/U1c48z4A6A0

The problem is that, from my tests, it can trigger multiple async updates at the same time.

Like the author, I would prefer an already proven solution over rolling my own.

Does anyone have a better, production-ready solution?

I usually prefer this async-update style over the classic implementations. Is there any reason why this cache style is so obscure? Am I missing something, like a critical drawback?
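To make the concern concrete: a minimal sketch of the pattern described above (serve the cached value immediately, refresh in the background), with an `AtomicBoolean` guard so that only one refresh is in flight at a time. This is a hypothetical illustration, not the code from the linked thread or gist; the class and parameter names are made up.

```scala
import java.util.concurrent.atomic.{AtomicBoolean, AtomicReference}
import scala.concurrent.{ExecutionContext, Future}

// Serve stale data while fetching new: get() always returns the last
// known value right away, and kicks off at most one background refresh.
// The compareAndSet guard avoids the "multiple async updates" problem.
final class StaleWhileRevalidate[A](initial: A, fetch: () => Future[A])
                                   (implicit ec: ExecutionContext) {
  private val value      = new AtomicReference[A](initial)
  private val refreshing = new AtomicBoolean(false)

  def get(): A = {
    // Start a refresh only if none is already running.
    if (refreshing.compareAndSet(false, true)) {
      fetch().onComplete { result =>
        result.foreach(value.set)   // keep the old value on failure
        refreshing.set(false)       // allow the next refresh
      }
    }
    value.get()                     // possibly stale, but immediate
  }
}
```

Callers never wait on `fetch`; the trade-off is that the first caller after an update still sees the old value, which is exactly the behavior the pattern accepts.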

Thanks,
Sisso

Fabricio Brasiliense

Mar 18, 2016, 1:15:32 PM3/18/16
to spray.io User List
Probably nobody is interested. I took some time to play with the code and found my own solution that solves the problem.

Here is the link in case anyone becomes interested.
https://gist.github.com/sisso/0309e0677da4e60b4044