Download an exported backup of PG database


Marcel

Jan 12, 2019, 09:39:29
to Google Cloud SQL discuss
Hi Community,

I am trying to export a Postgres database in Google Cloud (over 5 GB) and download it to my PC so I can do some research locally.

I can export it and save it to a storage bucket. But when I then go to the bucket and click the file, it starts downloading terribly slowly, way slower than my internet connection speed. I tried several storage classes / regions. So I thought I would just let it run for a few hours, but after about 30 minutes the download fails (probably a timeout or something like that).
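For reference, the export step itself roughly corresponds to something like this (instance, database, and bucket names below are just placeholders, not my real ones):

    # export the database from Cloud SQL into a bucket as a gzipped SQL dump
    gcloud sql export sql my-instance gs://my-export-bucket/backup.sql.gz --database=mydb

So the export part works; it's getting the resulting file from the bucket to my PC that fails.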

Any way around this? I need an export of this database locally but I cannot find any way to get this done.

PS: I could connect to the database from localhost and export it with pgAdmin, but the database is on internal networking, so it's not publicly accessible. And that seems like a very illogical way to do it anyway.

What to do? 

Thanks!

diogoa...@google.com

Jan 18, 2019, 12:11:49
to Google Cloud SQL discuss
Please try other methods to download the file from the bucket, like using gsutil, REST API, or the code sample provided here.
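For example, with gsutil the download is something like this (the bucket and file names are placeholders):

    # copy the exported file from the bucket to the current directory;
    # gsutil performs resumable downloads for large objects, so an interrupted
    # transfer can be retried without starting over
    gsutil cp gs://your-export-bucket/backup.sql.gz .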

A strategy to find out whether the issue is on the bucket side or with your connection is this: try downloading the file onto a Compute Engine instance, using gsutil for example. If that works properly, it means the issue is your local network.
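A minimal sketch of that test, assuming you already have a VM named test-vm in zone us-central1-a (both placeholders):

    # run the same download from inside a Compute Engine instance;
    # if it is fast there, the bottleneck is between GCS and your PC
    gcloud compute ssh test-vm --zone=us-central1-a \
        --command="gsutil cp gs://your-export-bucket/backup.sql.gz /tmp/"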

Marcel

Jan 24, 2019, 05:36:19
to Google Cloud SQL discuss
Hi, it seems to be working now. Earlier I just clicked the filename in the console and it would start downloading; this time I did right-click, Save link as, and it seems to work. It has been downloading for over an hour already. Thanks for your help.

On Friday, January 18, 2019 at 18:11:49 UTC+1, diogoa...@google.com wrote: