For a recent project on the Google Cloud Platform, one of the first things that came up was how to easily set up a new database from scratch on the platform from the command line, so that we could automate spinning up new development environments.

To be able to do this, you must install the Google Cloud command line tools and log in to your account from the command line.
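If you have not set this up yet, it looks roughly like the following (this uses the official install script; package managers work too, and the project name is a placeholder):

```shell
# Install the Google Cloud SDK (one of several install options)
curl https://sdk.cloud.google.com | bash

# Authenticate; this opens a browser window for the OAuth flow
gcloud auth login

# Optionally set a default project so later commands can pick it up
gcloud config set project my-dev-project
```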


These are the variables that will be used throughout the script, with reasonable defaults.

GCLOUD_DEV_PROJECT="$(gcloud config get-value project)"

We need a name for a temporary Google Storage bucket to store the dump in, a name for the CloudSQL instance we will create, the Google Cloud project to work in (here we use the current project by calling the gcloud command line tool), and a name for the database to create on the SQL server.
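Putting that together, a complete variable block could look like this (the bucket, instance, and database names below are placeholders; pick your own):

```shell
# The Google Cloud project to work in (defaults to the currently configured one)
GCLOUD_DEV_PROJECT="$(gcloud config get-value project)"

# Temporary Google Storage bucket that will hold the SQL dump
CLOUDSQL_DEV_BUCKET="${GCLOUD_DEV_PROJECT}-sql-import"

# Name of the CloudSQL instance to create
CLOUDSQL_INSTANCE="dev-sql"

# Name of the database to create on that instance
CLOUDSQL_DB_NAME="dev_db"
```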

Bucket to hold SQL database dump

To be able to import a SQL dump into CloudSQL, it needs to be uploaded to a Google Storage bucket. If you already have one for the project, you can reuse it. If not, create a new bucket to hold this data (change the region eu to your own):

gsutil mb -c nearline -l eu \
    gs://$CLOUDSQL_DEV_BUCKET

Then to copy your local database dump (we assume it is called a_mysql_dump.sql) up to Google Storage, run the following command:

gsutil cp a_mysql_dump.sql gs://$CLOUDSQL_DEV_BUCKET/dump.sql

Create SQL instance and database

To create the database server, you use the instances create command from the gcloud command line tools. Here you specify the instance size, what region the instance should be in, and what version of MySQL (or Postgres) the server should run. You can easily change this to match your needs.

gcloud sql instances create $CLOUDSQL_INSTANCE \
    --tier=db-g1-small \
    --region=europe-west3 \
    --database-version=MYSQL_5_7

Then to create the database on the server, you need to run another command:

gcloud sql databases create $CLOUDSQL_DB_NAME \
    --instance $CLOUDSQL_INSTANCE

If you are doing the setup from a script, note that it usually takes 10-15 seconds before the instance accepts incoming SQL connections, so add something like a sleep 20 after this command.
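Instead of a fixed sleep, you can also poll the instance until it reports the RUNNABLE state (a sketch; the --format flag extracts a single field from the describe output):

```shell
# Wait until the CloudSQL instance reports the RUNNABLE state
until [ "$(gcloud sql instances describe "$CLOUDSQL_INSTANCE" \
        --format='value(state)')" = "RUNNABLE" ]; do
    sleep 5
done
```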

You should also set a root password that you can use to log in to the instance:

gcloud sql users set-password root % \
    --instance $CLOUDSQL_INSTANCE --password "AGreatPassword"

You can use this command to set the password for any SQL user; simply change root % to the user (and host) you wish to update.

Give access to service account

You need to give the service account of the CloudSQL server access to the storage bucket where the database dump is stored. You do this by first fetching the service account email of the SQL instance:

# This gives you the email with some extra text
gcloud sql instances describe $CLOUDSQL_INSTANCE | grep serviceAccountEmailAddress

# This will only give you the email (using cut)
MYSQL_SERVICE_ACCOUNT=$(gcloud sql instances describe $CLOUDSQL_INSTANCE | grep serviceAccountEmailAddress | cut -d" " -f 2)

The email will be an autogenerated address ending in .gserviceaccount.com.

To grant access to the bucket, run the following two commands. You need to give write access to the bucket, and read access to the database dump file.

# Write access to the bucket itself
gsutil acl ch -u $MYSQL_SERVICE_ACCOUNT:W gs://$CLOUDSQL_DEV_BUCKET

# Read access to the dump file
gsutil acl ch -u $MYSQL_SERVICE_ACCOUNT:R gs://$CLOUDSQL_DEV_BUCKET/dump.sql

(It will not work without write access to the bucket, believe me, I've tried.)

Import database

After all this setup, we are finally ready to import the dump into the CloudSQL database. This is done by running gcloud sql import sql, pointing it at the file we uploaded to the storage bucket:

yes | gcloud sql import sql $CLOUDSQL_INSTANCE \
    gs://$CLOUDSQL_DEV_BUCKET/dump.sql --database=$CLOUDSQL_DB_NAME

If this results in the incredibly unhelpful ERROR: (gcloud.sql.import.sql) ERROR_RDBMS, it means there is an error in your SQL dump file (or that you did not grant the proper permissions on the Google Storage bucket). Unfortunately, there is no real way to get the actual error from the server; you can try importing the dump into a local MySQL server to see it.
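A quick way to surface the real error is to load the dump into a throwaway local database (this assumes a local MySQL server is running and that the test_import database does not already exist):

```shell
# Create a scratch database and import the dump;
# mysql prints the line number of the failing statement
mysql -u root -p -e "CREATE DATABASE test_import"
mysql -u root -p test_import < a_mysql_dump.sql
```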

And that’s it! You should now have a fresh CloudSQL server, with your database dump loaded. Enjoy!