Bucket Storage Upsert

kamaldeen aiyeleso

May 8, 2021, 9:07:43 AM
to google-cloud-storage-discuss
Hi everyone, 

My goal is to create a scheduled Firestore export to a Cloud Storage bucket. This was successful, but not in the manner I expected or wanted.

I observed that every scheduled trigger of the Firestore export creates a new folder in the storage bucket to hold the backup data.

I had wanted the export to update the existing folder created by the first export. Is this a design limitation, or is there another way to keep an updated copy of my Firestore data in the Cloud Storage bucket under the same folder name?
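(For context, this kind of scheduled export is commonly wired up as a Cloud Scheduler job triggering a small Cloud Function that calls the Firestore admin API. The sketch below is only illustrative, assuming the Python firestore_admin_v1 client and placeholder project/bucket names; the new folder per run typically comes from how output_uri_prefix is built.)

    # Minimal sketch of a Pub/Sub-triggered export function (placeholder names).
    from datetime import datetime, timezone
    from google.cloud import firestore_admin_v1

    PROJECT_ID = "my-project"          # placeholder
    BUCKET = "gs://my-backup-bucket"   # placeholder

    def export_firestore(event, context):
        client = firestore_admin_v1.FirestoreAdminClient()
        database = f"projects/{PROJECT_ID}/databases/(default)"
        # A timestamped prefix like this produces a fresh folder on every run;
        # a fixed prefix would reuse the same folder (object-name prefix).
        prefix = f"{BUCKET}/firestore-export-{datetime.now(timezone.utc):%Y%m%d-%H%M%S}"
        client.export_documents(
            request={"name": database, "output_uri_prefix": prefix}
        )
        print(f"Started export to {prefix}")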

Thanks.

knoparast

May 11, 2021, 7:03:32 PM
to google-cloud-storage-discuss
Hi,

I am not sure what steps you are taking to build this pipeline; however, you may consider assigning a lifecycle management configuration to the destination bucket (i.e. a Delete object action with an Age condition), so that older objects are deleted automatically and you effectively keep only the most recent export in your bucket. I should note that folders/sub-directories are just a convenience to help you organize objects in a bucket; as explained in this public documentation, the folder name is considered part of the object name.
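As an illustration only (a minimal sketch assuming the google-cloud-storage Python client; the bucket name and age are placeholders), such a Delete/Age lifecycle rule can be set like this:

    # Add a lifecycle rule that deletes objects older than N days.
    from google.cloud import storage

    client = storage.Client()
    bucket = client.get_bucket("my-backup-bucket")  # placeholder bucket name

    # Delete action with an Age condition: objects older than 7 days are removed.
    bucket.add_lifecycle_delete_rule(age=7)
    bucket.patch()  # persist the updated lifecycle configuration

    for rule in bucket.lifecycle_rules:
        print(rule)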

Imran Khan

May 17, 2021, 9:18:14 AM
to google-cloud-storage-discuss
Hi,

One workaround could be to use Cloud Functions. As soon as Firestore exports a file to the bucket, say in a new folder, a Cloud Function can be triggered to copy that file to another bucket. If the file already exists in the target bucket, it will be overwritten. Get more details about Cloud Functions here.
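A rough sketch of that idea, assuming a 1st-gen Cloud Function with a google.storage.object.finalize trigger and the Python google-cloud-storage client (bucket names and the "latest/" prefix are placeholders):

    # Copy each newly exported object into a second bucket under a fixed name,
    # overwriting any previous copy.
    from google.cloud import storage

    DEST_BUCKET = "my-backup-latest"  # placeholder target bucket

    def copy_to_latest(event, context):
        """Background function triggered by google.storage.object.finalize."""
        client = storage.Client()
        source_bucket = client.bucket(event["bucket"])
        dest_bucket = client.bucket(DEST_BUCKET)
        blob = source_bucket.blob(event["name"])

        # Drop the timestamped export folder from the object name so every run
        # lands under the same "latest/" prefix and overwrites the previous copy.
        new_name = "latest/" + event["name"].split("/", 1)[-1]
        source_bucket.copy_blob(blob, dest_bucket, new_name)
        print(f"Copied {event['name']} to gs://{DEST_BUCKET}/{new_name}")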
