
I have a database dump that gives the following error when I try to import it:

ASCII '\0' appeared in the statement, but this is not allowed unless option --binary-mode is enabled and mysql is run in non-interactive mode. Set --binary-mode to 1 if ASCII '\0' is expected.

I'm importing the database through the console with gcloud sql import sql mydb gs://my-path/mydb.sql --database=mydb but I don't see any flag for binary mode in the documentation. Is it possible at all?

Optional - is there a way to set this flag when importing through MySQL Workbench? I haven't seen anything about it there either, but maybe I'm missing some setting. If there is a way to set that flag, I can import my database through MySQL Workbench instead.

Thank you.

  • Kindly share the following information to help troubleshoot this issue: was the dump file exported from Cloud SQL or from an on-prem database? Did you follow any Google documentation to export and import the file? Commented Oct 21, 2020 at 13:00
  • I have the database with a non-cloud service, a different hosting provider. I used the standard mysqldump -u user -p databasename > file.sql, but I found this Google guide on exporting. I will give it a try and will update here if it works. Still, I'm quite interested to find out whether there is a specific setting in Workbench that can enable binary mode. Commented Oct 22, 2020 at 14:35

1 Answer


Depending on where the source database is hosted, on Cloud SQL or in an on-premises environment, the proper flags have to be set during the export so that the dump file is compatible with the target database.

Since you would like to import a file that was exported from an on-premises environment, mysqldump is the suggested way to perform the export.

First, create a dump file as suggested in the documentation. Pay particular attention to the following two points:

  • Do not export customer-created MySQL users; this will cause the import into the new instance to fail. Instead, manually create the MySQL users you need on the new instance.
  • Make sure that you have configured the appropriate flags so that the dump file contains all the details you need, e.g. triggers, stored procedures, etc. (see the example command right after this list).
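For reference, a mysqldump invocation along the following lines should produce a compatible dump file (the bracketed names are placeholders and the exact set of flags depends on your data):

mysqldump -u [USERNAME] -p --databases [DATABASE_NAME] --hex-blob --set-gtid-purged=OFF --single-transaction --triggers --routines > [DATABASE_NAME].sql

The --hex-blob flag is particularly relevant here, since it writes binary column values as hexadecimal literals instead of raw bytes, which avoids the literal ASCII '\0' characters that trigger the error you quoted, and --set-gtid-purged=OFF is generally recommended when the target is Cloud SQL.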

Then, create a Cloud Storage Bucket and upload the dump file to the bucket.
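If you prefer the command line, creating the bucket and uploading the file could look like this (bucket and file names are placeholders):

gsutil mb gs://[BUCKET-NAME]
gsutil cp [DATABASE_NAME].sql gs://[BUCKET-NAME]/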

Before proceeding with the import, grant the Storage Object Admin role to the service account of the target Cloud SQL instance. You may do that with the following command:

gsutil iam ch serviceAccount:[SERVICE-ACCOUNT]:objectAdmin gs://[BUCKET-NAME]

You can find that service account on the Cloud SQL instance's Overview page, or by running the following command:

gcloud sql instances describe [INSTANCE_NAME]

The service account is listed in the serviceAccountEmailAddress field.
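For convenience, you can also extract just that field with a format projection, for example:

gcloud sql instances describe [INSTANCE_NAME] --format='value(serviceAccountEmailAddress)'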

Now you are able to do the import either from the Console, with the gcloud command, or through the REST API.
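With gcloud, the import looks like the command you already tried; just note that the first positional argument should be the Cloud SQL instance name, not the database name (bracketed names are placeholders):

gcloud sql import sql [INSTANCE_NAME] gs://[BUCKET-NAME]/[DATABASE_NAME].sql --database=[DATABASE_NAME]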

More details can be found in the Google documentation:

Best Practices for importing/exporting data
