Hi Akhil,
Will it be a consistent 100GB? By that I mean, do you expect the data size to shrink with garbage collection?
Another question: what sort of request traffic do you expect? Bigtable could be an option for your 100 GB dataset and performance requirements, but it will also depend on your QPS. Bigtable is optimized for high-traffic workloads, so you may see higher latency if your read requests are sporadic.
How many nodes will you be using? I assume one. Too many nodes with too little data is also not optimal.
The batch write job sounds like it won't be a problem.
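One practical tip for that job: Bigtable rejects a MutateRows request that carries more than 100,000 mutations, so it's worth chunking the rows client-side before sending. A minimal sketch of that chunking (the helper name and batch math are my own, not from any client library):

```python
from typing import Iterable, Iterator, List

# Bigtable's documented per-request limit for MutateRows.
MAX_MUTATIONS_PER_REQUEST = 100_000


def batch_rows(rows: Iterable[object],
               mutations_per_row: int,
               limit: int = MAX_MUTATIONS_PER_REQUEST) -> Iterator[List[object]]:
    """Yield chunks of rows so each chunk stays within the mutation limit.

    Assumes every row carries roughly `mutations_per_row` mutations
    (e.g. one set_cell per column written).
    """
    rows_per_batch = max(1, limit // mutations_per_row)
    batch: List[object] = []
    for row in rows:
        batch.append(row)
        if len(batch) == rows_per_batch:
            yield batch
            batch = []
    if batch:
        yield batch
```

With the google-cloud-bigtable Python client, each yielded chunk would then be sent with `table.mutate_rows(batch)`; the helper above only does the local splitting.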
Regards
Anton