Download an exported backup of PG database


Marcel

Jan 12, 2019, 9:39:29 AM
to Google Cloud SQL discuss
Hi Community,

I am trying to export a PostgreSQL database in Google Cloud SQL (over 5 GB in size) and download it to my PC so I can do some research locally.

I can export it and save it to a storage bucket. But when I then go to the bucket and click the file, it starts downloading terribly slowly, way slower than my internet connection speed. I tried several storage classes / regions. So I thought, whatever, I'll just let it run for a few hours, but after 30 minutes or so the download fails (probably a timeout or something like that).

Any way around this? I need an export of this database locally but I cannot find any way to get this done.

PS: I could connect to the database from localhost and export it with pgAdmin, but the database is on internal networking, so it's not publicly accessible. And this seems like a very roundabout way to do it.

What to do? 

Thanks!

diogoa...@google.com

Jan 18, 2019, 12:11:49 PM
to Google Cloud SQL discuss
Please try other methods to download the file from the bucket, such as gsutil, the REST API, or the code sample provided here.
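
As an illustration, here is a minimal sketch of downloading the exported file with the google-cloud-storage Python client library. It assumes application default credentials are set up, and the project, bucket, and object names below are placeholders for your own export:

    # pip install google-cloud-storage
    from google.cloud import storage

    # Placeholder names; replace with your own project, bucket, and export file.
    client = storage.Client(project="my-project")
    bucket = client.bucket("my-sql-exports")
    blob = bucket.blob("postgres-export.sql.gz")

    # Streams the object from Cloud Storage to a local file.
    blob.download_to_filename("postgres-export.sql.gz")
    print("Download complete")

The client library retries transient failures, which tends to be more reliable for large files than downloading through the browser console.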

A way to find out whether the issue is with the file/bucket or with your connection is this: try downloading the file onto a Compute Engine instance, using gsutil for example. If that works properly, it means the issue is with your local network.

Marcel

Jan 24, 2019, 5:36:19 AM
to Google Cloud SQL discuss
Hi, it seems to be working now. Earlier I just clicked the filename in the console and it would start downloading, but now I did right-click, "Save link as", and it seems to work. It has been downloading for over an hour already. Thanks for your help.

On Friday, January 18, 2019 at 18:11:49 UTC+1, diogoa...@google.com wrote: