BigTable for 100GB Data


akhil baby

Apr 19, 2021, 11:12:44 PM
to Google Cloud Bigtable Discuss
Hi, 

According to the documentation, Bigtable is not recommended for datasets smaller than 1 TB. 

I am exploring a database option for holding 100 GB of data (about 400 million records). The table is consumed by a web service, so reads need to return in under 1 second. Writes happen once a day, as a batch job that loads the entire 100 GB from BigQuery into Bigtable. 
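
For 100 GB a Dataflow pipeline is the more typical route from BigQuery to Bigtable, but to make the shape of the daily job concrete, here is a minimal sketch using the Python clients. All identifiers (project, instance, table, dataset, the row_key/payload columns, the "cf" family) are placeholder assumptions, not anything from the docs:

    # Sketch of the daily batch load; every name below is hypothetical.
    from google.cloud import bigquery, bigtable

    bq = bigquery.Client()
    bt = bigtable.Client(project="my-project")
    table = bt.instance("my-instance").table("my-table")

    # Pull the source rows out of BigQuery.
    rows = bq.query(
        "SELECT row_key, payload FROM my_dataset.my_table"
    ).result()

    # MutationsBatcher buffers row mutations and writes them in bulk.
    batcher = table.mutations_batcher(flush_count=1000)
    for r in rows:
        bt_row = table.direct_row(r["row_key"])
        bt_row.set_cell("cf", "payload", str(r["payload"]).encode())
        batcher.mutate(bt_row)
    batcher.flush()  # push any remaining buffered mutations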

Can Bigtable be an option for such a small dataset? 

Regards, 
Akhil


Anton Gething

Apr 20, 2021, 12:03:07 PM
to Google Cloud Bigtable Discuss
Hi Akhil

Will it be a consistent 100 GB? That is, do you expect the data size to shrink with garbage collection?
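
For reference, garbage collection is configured per column family. Since your daily job rewrites the full 100 GB, a max-versions rule would keep storage flat by discarding old cell versions. A minimal sketch with the Python admin client, with all names hypothetical:

    # Hypothetical: keep only the newest version per cell so the daily
    # full rewrite replaces data instead of accumulating versions.
    from google.cloud import bigtable
    from google.cloud.bigtable import column_family

    client = bigtable.Client(project="my-project", admin=True)
    table = client.instance("my-instance").table("my-table")
    rule = column_family.MaxVersionsGCRule(1)
    table.column_family("cf", gc_rule=rule).create()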

Another question: what sort of request traffic will you have? Bigtable could be an option for your 100 GB dataset and latency requirement, but it will also depend on your QPS. Bigtable is optimized for high-traffic workloads, so you may see higher latency if your read requests are quite sporadic.
How many nodes will you be using? I assume one. Too many nodes with too little data is also not optimal.
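
On the serving side, a point read by row key is the cheap path, and reusing one long-lived client keeps the gRPC channels warm, which matters precisely when traffic is sporadic. A minimal sketch, with the row key, family, and qualifier as hypothetical placeholders:

    # Hypothetical point read on the web-service path; create the client
    # once at startup and reuse it across requests.
    from google.cloud import bigtable
    from google.cloud.bigtable import row_filters

    client = bigtable.Client(project="my-project")
    table = client.instance("my-instance").table("my-table")

    # Fetch one row, limited to the latest cell per column.
    row = table.read_row(
        b"user#123", filter_=row_filters.CellsColumnLimitFilter(1)
    )
    if row is not None:
        payload = row.cells["cf"][b"payload"][0].value  # cell bytes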

The daily batch write job sounds like it won't be a problem. 

Regards
Anton