--
You received this message because you are subscribed to the Google Groups "Redis DB" group.
To post to this group, send email to redi...@googlegroups.com.
To unsubscribe from this group, send email to redis-db+u...@googlegroups.com.
For more options, visit this group at http://groups.google.com/group/redis-db?hl=en.
Cheers,
Salvatore
--
Salvatore 'antirez' Sanfilippo
open source developer - VMware
http://invece.org
"We are what we repeatedly do. Excellence, therefore, is not an act,
but a habit." -- Aristotle
Though it is fairly old, gzip (and the underlying zlib compression
scheme) is actually fairly fast and is available on literally every
platform you could want to use (never mind that the zlib license is
about as liberal as you can get, even more so than BSD). Unless you
have huge numbers of processor cycles to spare, I would recommend
against bzip2; it runs roughly 10-20x slower than gzip, typically with
only a modest reduction in compressed data size.
If your platform has no existing compression libraries, does not
support data compression using zlib/gzip, or has almost no spare
cycles (latency reasons), only then would I recommend using lzf.
Otherwise zlib/gzip generally covers a much broader range of
compression ratio vs. time tradeoffs.
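The tradeoff above is easy to see with Python's stdlib zlib bindings;
this is just a rough sketch (the payload and compression levels are
arbitrary choices for illustration), but the pattern holds: lower
levels trade ratio for speed.

```python
import time
import zlib

# A hypothetical payload: repetitive text compresses well, much like
# article text or rendered templates typically do.
data = b"the quick brown fox jumps over the lazy dog. " * 2000

for level in (1, 6, 9):
    start = time.perf_counter()
    compressed = zlib.compress(data, level)
    elapsed = time.perf_counter() - start
    print(f"level {level}: {len(data)} -> {len(compressed)} bytes "
          f"in {elapsed * 1000:.2f} ms")

# Round-trip check: decompression restores the original bytes exactly.
assert zlib.decompress(compressed) == data
```

Level 1 is the closest zlib gets to lzf-style "fast but shallow"
compression; level 9 spends the extra cycles for a smaller result.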
- Josiah
Quite a few frameworks use local caching to store
pre-rendered/pre-compiled templates, which is typically sufficient in
a lot of cases. If I remember correctly (or believe the rumors),
Reddit pre-renders basically all of its pages and stores them in
Redis.
Storing large chunks of text in Redis isn't necessarily a bad idea.
Heck, in a modestly sized Redis box, you could probably store the
entire back archives of the New York Times (only the article text, not
the markup copied/pasted on every page). Compressing the data would
just push the number of articles you could store even higher.
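A quick back-of-the-envelope sketch of that claim (every number here
is an assumption, not a measurement: ~5 KB of plain text per article
body, and a "modestly sized" box with 16 GB of RAM):

```python
import zlib

# Hypothetical article body: ~5 KB of plain text. Real articles are
# less repetitive than this sample, so real ratios would be worse.
article = ("Lorem ipsum dolor sit amet, consectetur adipiscing elit. " * 90).encode()
ram_bytes = 16 * 1024 ** 3  # assumed 16 GB of usable RAM

raw_size = len(article)
compressed_size = len(zlib.compress(article, 6))

print(f"raw article: {raw_size} bytes, compressed: {compressed_size} bytes")
print(f"articles per 16 GB, uncompressed: {ram_bytes // raw_size:,}")
print(f"articles per 16 GB, compressed:   {ram_bytes // compressed_size:,}")
```

Even ignoring Redis's per-key overhead, millions of uncompressed
article-sized values fit in that much RAM, and compression multiplies
the count further.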
- Josiah