MySQL importing dump fails with "unique_checks=0 which is non deterministic"

Der Moxli

Jun 26, 2018, 8:57:28 AM
to Google Cloud SQL discuss
Hello everyone,

I am currently trying to import a mysqldump (~60 GB), which includes stored procedures and triggers, into my Google Cloud SQL instance.
I have paid attention to the docs and have set log_bin_trust_function_creators to true using Terraform.
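For reference, that corresponds to a database_flags entry in the instance settings of the Terraform resource; the same flag can also be set directly with gcloud. This is only a sketch of the flag setting, and the instance name is a placeholder:

    # Set the Cloud SQL flag via gcloud (instance name is a placeholder).
    # Note: --database-flags replaces the complete set of flags on the instance.
    gcloud sql instances patch my-instance \
        --database-flags=log_bin_trust_function_creators=on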

I have included the triggers and stored procedures in the dump, but removed the DEFINER clauses with Perl because they are not supported.
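Something along these lines handles the stripping; it is only a sketch (dump.sql is a placeholder and the pattern may need adjusting for a particular dump):

    # Remove DEFINER=`user`@`host` clauses from procedures, triggers and views, in place.
    perl -pi -e 's/\sDEFINER=`[^`]+`@`[^`]+`//g' dump.sql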
I tried using the gcloud sql import command as well as the web console to import the dump from a storage bucket; each time, the following error pops up after a couple of seconds:

severity: "ERROR" 
textPayload: "2018-06-26T10:48:42.727799Z 87406 [Warning] Using unique_checks=0 which is non deterministic!" 

As far as I know, unique_checks is set to 0 during dumping to speed up the import afterwards.
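That setting appears to come from the header mysqldump writes at the top of the file; a quick way to check (sketch only, dump.sql is a placeholder):

    # mysqldump normally emits a header line such as
    #   /*!40014 SET @OLD_UNIQUE_CHECKS=@@UNIQUE_CHECKS, UNIQUE_CHECKS=0 */;
    head -n 40 dump.sql | grep -inE 'unique_checks|foreign_key_checks'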

I didn't use the parameters suggested in this guide, because I only found it after doing my dump: https://cloud.google.com/sql/docs/mysql/import-export/creating-sqldump-csv#std
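For anyone else reading, the dump command in that guide looks roughly like the sketch below; the exact flags in the doc may have changed, and HOST, USER and DATABASE are placeholders:

    # Rough sketch of a Cloud SQL friendly mysqldump invocation; check the linked doc
    # for the currently recommended flags. --routines keeps the stored procedures.
    mysqldump --databases DATABASE -h HOST -u USER -p \
        --hex-blob --single-transaction --set-gtid-purged=OFF \
        --default-character-set=utf8mb4 --routines > dump.sql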

I would like to avoid doing the dump again (60 GB, you know...).

Does anyone know what this error means and how I can fix this? Can it be fixed with a new dump?

Please let me know if there is anything wrong with my question, this is my first time posting here.

Best regards,

Max

George (Cloud Platform Support)

Jun 27, 2018, 7:40:22 PM
to google-cloud...@googlegroups.com
The “safeness” of a statement in MySQL replication refers to whether the statement and its effects can be replicated correctly using statement-based format. If you want statement-based format, you may set the binlog_format system variable used with binary logging to "MIXED". When MIXED is specified, statement-based replication is used, except for cases where only row-based replication is guaranteed to lead to proper results. You may gather more detail from the "7.2.1.3 Determination of Safe and Unsafe Statements in Binary Logging" online documentation page. Performing a new dumping operation with this parameter set as described would be one way to address your issue. It would take considerably more effort to recover the same information from an inadequately performed dump than to start correctly from the beginning.
You may also consider setting the unique_checks system variable at session level. 
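One way to do that without re-dumping might be to rewrite the SET line that mysqldump places in the dump header; this is only a sketch (dump.sql is a placeholder, the header line can differ between mysqldump versions, and the in-place edit rewrites the whole file):

    # The dump header normally contains:
    #   /*!40014 SET @OLD_UNIQUE_CHECKS=@@UNIQUE_CHECKS, UNIQUE_CHECKS=0 */;
    # Changing it keeps unique_checks enabled for the import session.
    sed -i 's/UNIQUE_CHECKS=0/UNIQUE_CHECKS=1/' dump.sql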