Exporting Multiple Tables from a Postgres Instance


Kevin Marsh

Jan 17, 2018, 5:23:18 PM
to Google Cloud SQL discuss
I'm trying to get a dump of a Postgres database while avoiding several large tables. I've tried several ways of specifying only certain tables, as the docs seem to say you should be able to:
  • gcloud sql instances export db-foo gs://db-backups/$(date +%s)-sql.gz --database foo_production -t leads,issues (only exported leads)
  • gcloud sql instances export db-foo gs://db-backups/$(date +%s)-sql.gz --database foo_production -t leads -t issues (only exported issues)
  • gcloud sql instances export db-foo gs://db-backups/$(date +%s)-sql.gz --database foo_production --table=leads,issues (only exported leads) 
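For comparison, this is the behaviour I'd expect, analogous to how plain pg_dump handles repeated -t flags (the connection details below are placeholders, not something I actually ran against Cloud SQL):

$ pg_dump -h <instance-ip> -U postgres -t leads -t issues foo_production | gzip > backup-sql.gz

With pg_dump, both tables end up in the dump; with gcloud, only one does.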
Is anyone else having this issue?

George (Cloud Platform Support)

Jan 17, 2018, 8:40:20 PM
to Google Cloud SQL discuss
Hello Kevin, 

Your commands don't allow for a space after the comma in leads,issues. You also didn't try running the command with --table=leads -t issues. What is the output of the command when it's run with the gcloud-wide flag --verbosity set to debug?

Kevin Marsh

Jan 19, 2018, 4:40:36 PM
to Google Cloud SQL discuss
George,

That particular incantation doesn't make much sense to me, but here's the result of running it with increased verbosity:

$ gcloud sql instances export va-production gs://va-backups/$(date +%s)2-sql.gz --database vintageaerial_production --table=leads -t issues --verbosity=debug
DEBUG: Running [gcloud.sql.instances.export] with arguments: [--database: "['vintageaerial_production']", --table: "['issues']", --verbosity: "debug", INSTANCE: "va-production", URI: "gs://va-backups/15163977432-sql.gz"]
Exporting Cloud SQL instance...done.
INFO: Display format "value(.)".

It still seems to pick up only the issues table, not both.

George (Cloud Platform Support)

Jan 19, 2018, 11:18:05 PM
to Google Cloud SQL discuss
Hi Kevin, 

Is this conclusion based on the --table: "['issues']" message? You need to verify whether the operation completed successfully by means other than the debug messages of the gcloud command. For that, you may follow the advice on the "Checking the Status of Import and Export Operations" page.
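For example, something along these lines should list the most recent operations for the instance and let you inspect a specific one (the instance name and operation ID below are placeholders):

$ gcloud sql operations list --instance=db-foo --limit=5
$ gcloud sql operations describe <OPERATION_ID>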

James Richardson

Aug 7, 2020, 11:46:34 AM
to Google Cloud SQL discuss
Hi, this thread is a few years old, but this is the exact problem that I am also having. Basically, the "export" process in gcloud sql DOES NOT export multiple tables (as described in the OP).

Does anybody know of a solution, or is it just a bug that has never been fixed? I hope someone can give me more information here, as it is very frustrating.

Please do read the OP for more information. It is the exact same problem that I am having.

Mary (Cloud Platform Support)

Aug 7, 2020, 5:42:35 PM
to Google Cloud SQL discuss
Hello James,

Our Cloud SQL product team has been made aware that this issue is still being experienced. You can follow the public report here[1]. I recommend starring the issue to receive updates and upvoting it by selecting +1.
Any updates will be provided via the public-facing report.


James Richardson

Aug 17, 2020, 9:01:46 AM
to Google Cloud SQL discuss
Hi Mary

I have been waiting for a status update on this bug in the issue tracker, but there has been no activity in the last two weeks. Unfortunately this bug really does affect the product: it is extremely difficult to migrate data when you cannot export multiple tables in a single export. It means I have to run a separate export for each table or export the entire database, and I do not want to export the entire database, just selected tables. For now I'm falling back to one export per table, along the lines of the sketch below.
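Roughly this, reusing the command form from earlier in this thread (the instance, bucket, database, and table names are placeholders for my real ones):

$ for t in leads issues; do
    gcloud sql instances export db-foo \
      "gs://db-backups/$(date +%s)-${t}-sql.gz" \
      --database foo_production --table "${t}"
  done

It works, but it is a poor substitute for a single multi-table export.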

Best regards.

Elliott (Google Cloud Platform Support)

Aug 17, 2020, 2:03:17 PM
to Google Cloud SQL discuss
Hello James,

Thank you for your patience. The issue is in the process of being assigned to an engineer. Please note that there is no guaranteed ETA, but the issue is being prioritized. You may follow the progress here[1].
