Uploading data to Google Cloud Platform

The goal of this project was to refresh data every night from a production SQL Server table to a cloud-based MySQL server (see my previous post on preparation).

On the GCP website

I created a minimal MySQL instance (which I think of as a “server”) in a single zone with a Standard machine type (1 vCPU and 3.75 GB of memory).

I set storage to a 100 GB HDD, then added an Authorised Network comprising an IP-address range covering our production servers, along with a friendly name. I disabled auto storage increase, backups, and point-in-time recovery.
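For reference, roughly the same setup can be scripted with the gcloud CLI. This is only a sketch: the instance name, zone, and network range below are placeholders, and flag spellings can vary between gcloud releases.

  # Placeholders throughout: instance name, zone, and network range.
  # --no-backup and --no-enable-bin-log switch off backups and
  # point-in-time recovery respectively.
  gcloud sql instances create nightly-upload \
    --database-version=MYSQL_5_7 \
    --tier=db-n1-standard-1 \
    --zone=europe-west2-a \
    --storage-type=HDD \
    --storage-size=100GB \
    --authorized-networks=203.0.113.0/24 \
    --no-backup \
    --no-enable-bin-log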

I created a database specifically for this project, and a User (with password) restricted to the same ip-address as above.
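In SQL terms, what the console sets up amounts to something like this; the database name, user name, and address pattern are invented for illustration:

  CREATE DATABASE uploads;

  -- The user may only connect from our production range (pattern invented).
  CREATE USER 'uploader'@'203.0.113.%' IDENTIFIED BY '********';
  GRANT ALL PRIVILEGES ON uploads.* TO 'uploader'@'203.0.113.%';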

Finally, because I would be copying data from a SQL Server table into a MySQL table (the two are not 100% compatible), I set the “sql_mode” flag to “ANSI”. Among other things, ANSI mode includes ANSI_QUOTES, so double-quoted names are treated as identifiers rather than string literals.
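The same flag can also be set from the command line; the instance name is the placeholder used earlier:

  # Sets sql_mode=ANSI on the (hypothetical) instance created above.
  gcloud sql instances patch nightly-upload --database-flags=sql_mode=ANSI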

On the Production SQL Server

I downloaded the current MySQL 32-bit ODBC driver and configured a new ODBC data source.
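For what it’s worth, a DSN-less connection string for the MySQL ODBC driver looks roughly like this; the exact driver name depends on the version installed, and the server address, database, and credentials are placeholders:

  DRIVER={MySQL ODBC 8.0 ANSI Driver};SERVER=<instance-ip>;PORT=3306;DATABASE=uploads;UID=uploader;PWD=<password>;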

Then I created a two-step SQL Server Agent job scheduled to run every night.

Step-1 collects and saves the data into a purpose-made table using a stored procedure, replacing the old data each time.
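A minimal sketch of such a procedure might look like this (the procedure, table, and column names are hypothetical):

  CREATE PROCEDURE dbo.RefreshCloudUpload
  AS
  BEGIN
      SET NOCOUNT ON;

      -- Replace the old data wholesale.
      TRUNCATE TABLE dbo.CloudUpload;

      -- Collect tonight's snapshot from production (names invented).
      INSERT INTO dbo.CloudUpload (Id, CustomerName, Amount)
      SELECT Id, CustomerName, Amount
      FROM dbo.ProductionOrders;
  END;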

Step-2 uploads the table from Step-1 into the GCP database using an SSIS package.

To help with tasks like creating, deleting, viewing, and testing tables, I installed the MySQL admin tool “MySQL Workbench” locally. Even so, the SSIS package took some time to perfect. It had to:

  • Connect to the source and target servers.
  • Replace all the target data every time it ran, for robustness.
  • Refer to reserved column names using double-quotes (see the example after this list).
  • Ensure target data-types and sizes exactly matched the source.
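To illustrate the last two points: “order” is a reserved word in MySQL, so under ANSI mode the target table double-quotes it, and each column’s type and size mirrors the source. The table and columns here are hypothetical:

  CREATE TABLE cloud_upload (
      id      INT           NOT NULL,  -- matches the source INT
      "order" VARCHAR(20)   NULL,      -- reserved word, so double-quoted
      amount  DECIMAL(10,2) NULL       -- type and size match the source exactly
  );

  -- And for robustness, the package clears the target before each load:
  TRUNCATE TABLE cloud_upload;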
