Bucket Storage Upsert


kamaldeen aiyeleso

May 8, 2021, 9:07:43 AM
to google-cloud-storage-discuss
Hi everyone, 

My objective is to create a scheduled Firestore export to a Cloud Storage bucket. This was successful, but not in the manner I expected or wanted.

I observed that every time the scheduled export from Firestore runs, a new folder is created in the storage bucket to hold the backup.

I wanted the folder from the first export to be updated in place. Is this a design limitation, or is there another way to keep an updated copy of my Firestore data in the Cloud Storage bucket under the same folder name?



May 11, 2021, 7:03:32 PM
to google-cloud-storage-discuss

I am not sure what steps you are taking to build this pipeline; however, you may consider assigning a lifecycle management configuration to the destination bucket (i.e. a Delete action with an Age condition) so that older export objects are deleted automatically. I should note that folders/sub-directories are a convenience to help you organize objects in a bucket; as explained in this public documentation, the folder name is considered part of the object name.
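For illustration, a minimal lifecycle configuration of that shape (the 7-day retention window is a hypothetical value, not something from this thread) could be saved as `lifecycle.json` and applied with `gsutil lifecycle set lifecycle.json gs://YOUR_BUCKET`:

```json
{
  "rule": [
    {
      "action": {"type": "Delete"},
      "condition": {"age": 7}
    }
  ]
}
```

With this in place, export objects older than 7 days are deleted automatically, which bounds how many backup folders accumulate rather than guaranteeing exactly one.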

Imran Khan

May 17, 2021, 9:18:14 AM
to google-cloud-storage-discuss

One workaround could be to use Cloud Functions. As soon as Firestore exports the file into the bucket, say into a new folder, a Cloud Function can be triggered to copy that file to another bucket. If a file with the same name already exists in the target bucket, it will be overwritten. Get more details about Cloud Functions here.
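A minimal sketch of that workaround, assuming a Cloud Function with a `google.storage.object.finalize` trigger on the export bucket; the target bucket name `my-firestore-latest`, the `latest/` prefix, and the `fixed_name` helper are all hypothetical choices for illustration:

```python
def fixed_name(object_name: str) -> str:
    """Map a timestamped export path to a stable name so each run overwrites
    the previous copy, e.g.
    '2021-05-08T09:07:43_12345/all_namespaces/output-0'
      -> 'latest/all_namespaces/output-0'
    (hypothetical naming scheme, not from the thread)."""
    _, _, rest = object_name.partition("/")
    return f"latest/{rest}" if rest else f"latest/{object_name}"


def copy_export(event, context):
    """Background Cloud Function: copy each finalized export object into a
    fixed location in another bucket, overwriting any existing object."""
    # Deferred import keeps the pure helper above usable without GCP libs.
    from google.cloud import storage

    client = storage.Client()
    src_bucket = client.bucket(event["bucket"])   # bucket Firestore exported into
    dst_bucket = client.bucket("my-firestore-latest")  # hypothetical target bucket
    blob = src_bucket.blob(event["name"])
    # copy_blob overwrites an object of the same name in the destination bucket.
    src_bucket.copy_blob(blob, dst_bucket, new_name=fixed_name(event["name"]))
```

Because `fixed_name` always produces the same destination path for a given file within the export, every scheduled run replaces the previous copy instead of accumulating new folders.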
