cloud_sql_proxy suddenly gives Error 403: The client is not authorized to make this request., notAuthorized


Pierfrancesco Marsiaj

Mar 1, 2017, 7:20:32 PM
to Google Cloud SQL discuss
I have a Cloud SQL instance and a Compute Engine CentOS 7 VM in different projects. I use the SQL proxy (cloud_sql_proxy) to connect to the SQL instance with the mysql CLI client and via php-mysql (WordPress). Access is configured properly: the VM's service account has the SQL administrator role on the SQL instance's project. Everything works fine for 30-40 minutes: I can run mysql CLI commands and WordPress works fine.

Then suddenly I get this on the proxy console:
2017/03/01 23:17:44 couldn't connect to "database-xxxx:europe-west1:xxxx": ensure that the account has access to "database-xxxx:europe-west1:xxxx" (and make sure there's no typo in that name). Error during createEphemeral for database-xxxx:europe-west1:xxxx: googleapi: Error 403: The client is not authorized to make this request., notAuthorized

and the mysql connection is lost. I have tried running the proxy authenticated as the VM's service account (the VM has the SQL API scope enabled) AND with a service account created specifically to run the proxy, authenticated with the -credential_file option and the private JSON key generated for that service account. It works for a while (sometimes 10 minutes, sometimes an hour, maybe more), and then suddenly I get "notAuthorized" errors.
Closing and restarting the proxy doesn't help, and closing and reopening the SQL connection doesn't help either.
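
For reference, this is roughly how I start the proxy (the key path is a placeholder; the instance connection name is the one from the error above):

./cloud_sql_proxy -instances=database-xxxx:europe-west1:xxxx=tcp:3306 -credential_file=/path/to/service-account-key.json

mysql and WordPress then connect through 127.0.0.1:3306.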

This happens at different times of the day: mornings, afternoons, evenings, nights.
Are there stability issues with this infrastructure? It seems totally unreliable.
I'm really reluctant to put my production server here until I understand what's going on.

paynen

Mar 2, 2017, 5:59:51 PM
to Google Cloud SQL discuss
Hey Pierfrancesco,

That seems like a strange error. You'd think that if it were an auth issue it would be all-or-nothing, or else it would occur only after 1 hour (perhaps an OAuth token expiry issue, since access tokens usually expire after 3600 seconds, i.e. 1 hour, at which point a refresh is needed).

However, thankfully it appears that this is a known issue reported on the Cloud SQL proxy GitHub page. Nine days ago, a user "abstrctn" had the same issue (from the sounds of it), and they resolved it by adding the "roles/cloudsql.client" role to the service account. Other users said that adding the service account as a project editor resolved the issue, so that would be worth checking as well.
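
As a rough sketch (the project ID and service account email below are placeholders), granting that role would look something like:

gcloud projects add-iam-policy-binding your-sql-project --member="serviceAccount:your-proxy-sa@your-project.iam.gserviceaccount.com" --role="roles/cloudsql.client"

Since your VM and the Cloud SQL instance live in different projects, the binding should go on the project that owns the Cloud SQL instance.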

Could you see about whether that resolves the issue and let me know? I'll be happy to answer any additional questions you have as well.

Cheers,

Nick
Cloud Platform Community Support

David Newgas

Mar 2, 2017, 6:11:35 PM
to Google Cloud SQL discuss
I also looked into this earlier (https://groups.google.com/d/topic/google-cloud-sql-discuss/5jU0Upr7uqY/discussion). Nick is correct, and we are also working on a fix so that cloudsql.client alone is sufficient. I don't believe our fix is in prod yet though, so follow Nick's advice in the meantime.


Christopher Ivanovich

Jan 22, 2020, 10:21:55 AM
to Google Cloud SQL discuss
Almost 3 years later, and I am running into this issue while trying to set up an ETL job for my organization to import a CSV file from GCS into a Cloud SQL instance. The only way it has worked so far is to make the service account a project-wide Editor, which is obviously not desirable for us. It was also necessary to add the SQL instance's service account to the GCS bucket's permissions list, but that alone is insufficient. The service account being used to launch the import task already has all of the Cloud SQL roles available: Editor, Viewer, Client.

Jad El Houssami

Jan 23, 2020, 3:57:47 PM
to Google Cloud SQL discuss
Hello Christopher, 

I can understand how inconvenient it can be to run into permission issues when trying to set up an ETL job. From my understanding, you have a service account that launches the import task and it already has all the Cloud SQL roles, and you have also given the SQL instance's own service account permission to access the GCS bucket.

From the Cloud SQL documentation, it explains: “To import data from Cloud Storage, the instance's service account needs to have the Bucket Reader ACL permission set in the project.” In other words, the service account needs to have the ‘roles/storage.legacyBucketReader’ role which is equivalent to the ‘Bucket Reader’ ACL permission. Are you able to confirm if you have already granted that role to your Cloud SQL instance's service account?
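
As a rough example (the bucket name and service account email are placeholders; the instance's service account address is shown on the instance's overview page), granting that role could look like:

gsutil iam ch serviceAccount:INSTANCE_SERVICE_ACCOUNT_EMAIL:roles/storage.legacyBucketReader gs://your-import-bucket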

Additionally, if you prefer not to give the import-task service account the project-wide Editor role, it was previously suggested to try granting it the "roles/cloudsql.client" role instead; have you already given that a try?

Christopher Ivanovich

Jan 24, 2020, 9:05:50 AM
to Google Cloud SQL discuss
In the end, the issue had nothing to do with bucket access for the SQL instance's service account; the issue was that the service account launching the import task needed the Cloud SQL Admin role (roles/cloudsql.admin). Apparently Editor, Viewer, and Client are inadequate to run an import job.
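
For anyone else landing here, the import itself is just the standard command (the instance, bucket, database, and table names below are placeholders):

gcloud sql import csv my-instance gs://my-etl-bucket/extract.csv --database=my_database --table=my_table

That command only started succeeding for us once the calling service account had the Cloud SQL Admin role.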

Now I've run into a new issue: gcloud sql import csv for a MySQL database ignores the CSV file's NULL values. mysqlimport requires that empty/missing fields contain a '\N' value; otherwise the import overrides the column's default NULL setting and inserts a zero value or empty string. The BQ CSV extract does not write \N, so I have to programmatically insert \N into our BQ extract CSV files before attempting to upload them.

So, to clarify: I've found that gcloud sql import csv does not interpret \N or missing values as meaning "insert NULL"; it inserts the literal \N string or zero values. The mysqlimport utility, with some finagling, does seem to recognize \N as meaning NULL, but it's finicky and easy to misconfigure.
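
In case it helps anyone, the pre-processing I ended up with is essentially a one-line awk pass that turns empty fields into \N (file names are placeholders, and it assumes the extract has no quoted fields containing commas):

awk 'BEGIN { FS = OFS = "," } { for (i = 1; i <= NF; i++) if ($i == "") $i = "\\N"; print }' bq_extract.csv > bq_extract_nulls.csv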

Jad El Houssami

Jan 27, 2020, 3:44:42 PM
to Google Cloud SQL discuss
This is a currently known limitation of Cloud SQL CSV imports and exports (issue #1, issue #2); it appears to affect NULL markers (\N) and newlines (\n). There is currently no ETA, but I advise you to press '+1' to bring more visibility to the issue and CC yourself to stay in the loop whenever there are new updates.

I’m curious if you have tried importing the CSV file (that you exported from BQ) into Cloud SQL without inserting \N into it?

If you've already tried the above and it did not work, you may also want to consider Dataflow, which can be used to transfer BigQuery tables to Cloud SQL (see pricing).