Using Couchbase as replicated cache across data centers.


Vitaly Elyashev

May 4, 2015, 10:07:15 AM5/4/15
to couc...@googlegroups.com
Hi,
We're considering using Couchbase as a replicated cache across data centers (using XDCR).

We don't have huge amounts of data and it won't change very often, but we want to support a lot of data centers - it can be around a hundred.
So a single site should be able to replicate to around 100 different sites and receive updates from those sites as well.
Only a few sites need to replicate to all the others - most of them will replicate to only a couple of sites.

We intended to create one bucket per site to be replicated out, and another bucket to receive site updates. So for 100 sites we would need 200 buckets, each of them either replicating to or replicating from another site.

In the Couchbase docs we found that there are some issues with an instance having more than 10 buckets. Again, we don't need the greatest performance during replication, but we do want to use Couchbase as a local DB and cache that performs well during queries.

So the question is: does Couchbase fit our needs, and what is the best configuration?

Best,
Vitaly

Brian Jones

May 8, 2015, 12:57:10 PM5/8/15
to couc...@googlegroups.com
I've used XDCR quite a bit. With the latest Community version, XDCR seems much more stable, and we have it dialed in for our data: we move many hundreds of MB daily across a few data centers. I remember previous versions of CB would allow you to override the 10-bucket limit, with the side effect of a performance drop. I would suggest creating ONE bucket for XDCR and creating unique keys per web site. That saves your bucket capacity for other things and won't impact performance the way trying to move 100 buckets would.
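One way to sketch the per-site key scheme Brian suggests (the `site::<site_id>::<doc_id>` prefix format here is just an illustration, not anything Couchbase mandates - any unambiguous separator works):

```python
def site_key(site_id: str, doc_id: str) -> str:
    """Build a bucket-unique key by namespacing the document id with its site."""
    return f"site::{site_id}::{doc_id}"

def parse_site_key(key: str) -> tuple[str, str]:
    """Recover (site_id, doc_id) from a namespaced key."""
    # split at most twice so doc ids containing "::" survive intact
    _, site_id, doc_id = key.split("::", 2)
    return site_id, doc_id

key = site_key("nyc-01", "user-42")
print(key)                  # site::nyc-01::user-42
print(parse_site_key(key))  # ('nyc-01', 'user-42')
```

With keys namespaced like this, all sites can share a single XDCR'd bucket without their documents colliding.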

Vitaly Elyashev

May 11, 2015, 9:17:58 AM5/11/15
to couc...@googlegroups.com
Thanks, Brian, for your help.
My problem is that for security reasons I cannot replicate data that is not site-related.
But even if I did, I would still need a hundred sites replicating to my site. Have you tried replicating a number of buckets into the same bucket?
Because if that's not possible, I still need a hundred buckets.

Brian Jones

May 11, 2015, 9:49:57 AM5/11/15
to couc...@googlegroups.com
Hi Vitaly,

I've never replicated that many buckets, but I'm pretty sure you can override the bucket limit through REST API calls. You could test this out inexpensively on a cloud service, for instance AWS or GCE.
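For reference, the bucket-limit override could look roughly like this against the admin REST API of that era (the `/internal/settings` endpoint and `maxBucketCount` parameter are as I recall them; they are undocumented and may differ by version, so verify against your cluster before relying on them):

```shell
# Raise the per-cluster bucket limit via the admin REST API
# (internal setting; credentials and host are placeholders).
curl -X POST -u Administrator:password \
     http://localhost:8091/internal/settings \
     -d maxBucketCount=20
```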

-B