Replatform SQL on Amazon RDS

With the Replatform SQL Server on Amazon RDS template, you can replatform the SQL Server databases on an instance to Amazon RDS using native backup and restore. You can also migrate databases that are encrypted with transparent data encryption (TDE). This template migrates the user database items, certificates, logins, and SQL Server Agent jobs that are associated with your SQL Server.

Prerequisites

You must set up the source environment before creating a migration workflow.

Source environment setup

  • Ensure that PowerShell is enabled on the server that contains your SQL Server instance.

  • Install AWS.Tools on the server that contains your SQL Server instance, with the following command.

Install-Module -Name AWS.Tools.Installer
  • Install the dbatools module on your Windows machine, with the following command.

Install-Module dbatools
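
After installing both modules, it can help to confirm that they are available before you create the workflow. The following is a minimal check using standard PowerShell.

# Confirm that the required PowerShell modules are installed on the source server
Get-Module -ListAvailable -Name AWS.Tools.Installer, dbatools | Select-Object Name, Version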

Creating the migration workflow

Note

You can customize the migration workflow once it has been created. For more information, see Migration workflows.

Application

Select the application that you want to migrate. If you do not see the application in the list, you must define it in AWS Application Discovery Service. An application in this context is the unit of migration; it does not refer to applications running on top of your SQL Server.

ServerId

Within the application that you defined in AWS Application Discovery Service, select the serverId of the server that hosts your SQL Server.

Source Environment Configuration

The details that you provide here identify your source SQL Server.

  • TDE - Select this check box if you have transparent data encryption (TDE) enabled on your databases. If you select this option, your certificates are migrated to the target server.

  • Migration Mode - This template offers three distinct migration modes, depending on your use case.

    • Use only Full backup - The template creates only a full backup of your databases and restores it on your target.

    • Use Full backup and Differential backup for Cutover - A full backup of your databases is created and restored on the target. After that, you can mark the databases read-only, and a differential backup and restore migrates the remainder of the data.

    • Use Full backup, Differential backup for pre-cutover and T-Log backup for cutover - A full backup of your databases is created and restored on the target. When you are getting ready for cutover, a differential backup and restore migrates most of the remaining data. Lastly, after you mark your databases read-only, tail-log backups migrate the remainder of the data.

  • Allow Migration Without Direct Connect - This template uploads backup files from your source instance to Amazon S3 using the AWS CLI. The database files are transmitted over HTTPS to Amazon S3. If you are not comfortable with the backup files traveling over the public internet, we recommend using AWS Direct Connect with a public VIF. If you are comfortable with this, select this check box. The migration workflow will not be created unless you select this check box or have the Direct Connect setup mentioned above.

  • Source SQL Server database names - The names of the SQL Server databases that you want to migrate.

  • AWS ADS server ID for your application - See the “ServerId” section above.

  • Source SQL Server instance name - The name of your SQL Server instance.

  • Backup location - As a part of the migration, this template needs to take backups of your SQL Server. The path specified here is where the backup files are stored. Ensure that this is an absolute path and that it has enough space for a full and differential backup of your databases (see the sizing sketch below).
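
Before creating the workflow, you may want to confirm that the backup location has enough free space. The following PowerShell sketch uses the dbatools module installed earlier; the instance name, database names, and drive letter are placeholders for your own values, and the Size property reports size in megabytes.

# Rough sizing check: compare database sizes against free space on the backup drive
Get-DbaDatabase -SqlInstance "MSSQLSERVER01" -Database "SalesDb", "HRDb" | Select-Object Name, Size
# Free space (in bytes) on the drive that holds the backup location
Get-PSDrive -Name D | Select-Object Used, Free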

Running the migration workflow

  • When configuring the Migration Hub Orchestrator plugin, ensure that the username that is provided to connect to your Windows machine has the sysadmin permission on the source SQL Server instance, as in the check below.
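
You can verify this permission before running the workflow by querying role membership on the source instance. This sketch uses the dbatools module installed during setup; the instance name and login are placeholders.

# Returns 1 if the login is a member of the sysadmin fixed server role
Invoke-DbaQuery -SqlInstance "MSSQLSERVER01" -Query "SELECT IS_SRVROLEMEMBER('sysadmin', 'DOMAIN\migration-user') AS IsSysAdmin"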

Create AWS Profile on Source Server

  • Create an IAM policy with the following permissions.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "kms:GenerateDataKey",
        "kms:CreateKey"
      ],
      "Resource": "*"
    }
  ]
}
  • Create an IAM user with the above policy attached.

  • Configure a named profile for the AWS Command Line Interface that uses the preceding IAM user, as shown in the example below. For more information, see Using AWS credentials. The credentials stored in the profile are used to upload your backups to an S3 bucket located in your account. Note the name of this profile and enter it into the step when prompted.
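
For example, you could create the named profile on the source server with the AWS CLI as follows; the profile name is only an example, and the access keys you enter belong to the IAM user created above.

# Create a named AWS CLI profile; you are prompted for the IAM user's access key, secret key, and Region
aws configure --profile migration-backup-upload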

Create your RDS Database

This template does not create your RDS instance for you. 

  • Deploy an Amazon RDS for SQL Server instance with the same version as the source SQL Server.

  • Configure the target Amazon RDS for SQL Server instance with parameter groups that match the source SQL Server settings.

  • Configure the option group for backup/restore and transparent data encryption in Amazon RDS, and attach the following policy to the created IAM role (see the AWS CLI example after this list).

    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Sid": "VisualEditor0",
          "Effect": "Allow",
          "Action": [
            "kms:Decrypt",
            "s3:ListAllMyBuckets",
            "kms:DescribeKey"
          ],
          "Resource": "*"
        },
        {
          "Sid": "VisualEditor1",
          "Effect": "Allow",
          "Action": [
            "s3:ListBucket",
            "s3:GetBucketAcl",
            "s3:GetBucketLocation"
          ],
          "Resource": [ "*" ]
        },
        {
          "Sid": "VisualEditor2",
          "Effect": "Allow",
          "Action": [
            "s3:PutObject",
            "s3:GetObject",
            "s3:AbortMultipartUpload",
            "s3:ListMultipartUploadParts"
          ],
          "Resource": [ "*" ]
        }
      ]
    }
  • The trust policy for this role should be:

    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Effect": "Allow",
          "Principal": {
            "Service": "rds.amazonaws.com"
          },
          "Action": "sts:AssumeRole"
        }
      ]
    }
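
As a hedged example of the option group step, the SQLSERVER_BACKUP_RESTORE option can be added with the AWS CLI as shown below; the option group name and role ARN are placeholders, and the TDE option can be added in the same way.

# Add the native backup/restore option to the target option group (substitute your own option group name and role ARN)
aws rds add-option-to-option-group --option-group-name my-sqlserver-options --options "OptionName=SQLSERVER_BACKUP_RESTORE,OptionSettings=[{Name=IAM_ROLE_ARN,Value=arn:aws:iam::111122223333:role/rds-native-backup-restore}]" --apply-immediately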

Create attached EC2 Instance

  • Deploy an Amazon EC2 instance and create an instance role.

    • Attach the AWSMigrationHubOrchestratorInstanceRolePolicy and AmazonSSMManagedInstanceCore managed policies to this role.

    • Add the following permissions to this role.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::migrationhub-orchestrator-*",
        "arn:aws:s3:::aws-migrationhub-orchestrator-*/*"
      ]
    }
  ]
}
  • Ensure that your Amazon RDS instance can be reached from the created Amazon EC2 instance.

  • This instance is used to connect to your RDS instance and run restore procedures.
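
A quick way to confirm connectivity is a port test run from PowerShell on the EC2 instance; the endpoint below is a placeholder, and 1433 is the default SQL Server port.

# Run on the EC2 instance: verify that the RDS SQL Server endpoint is reachable on port 1433
Test-NetConnection -ComputerName "mydb.abcdefghij.us-east-1.rds.amazonaws.com" -Port 1433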

Create Target SQL Server User 

  • Provide credentials in AWS Secrets Manager for the username and password of the admin user for your RDS instance.

    1. Sign in to the AWS Secrets Manager console at https://console.aws.amazon.com/secretsmanager/.

    2. On the AWS Secrets Manager page, select Store a new secret.

    3. For Secret type, select Other type of secret and enter the following key/value pairs.

      • username - enter your username

      • password - enter your password

    4. Select Next and enter a name for the secret that begins with migrationhub-orchestrator-, for example migrationhub-orchestrator-secretname123.

      • The secret name must begin with the prefix migrationhub-orchestrator- and must be followed only by an alphanumeric value.

    5. Select Next and then, select Store.

    6. Copy the name of this secret and provide it to the workflow step when prompted.
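
If you prefer the AWS CLI, the same secret can be created as in the following sketch; the secret name and credential values are examples, and the name must follow the naming rule above.

# Create the target SQL Server credentials secret (example values)
aws secretsmanager create-secret --name migrationhub-orchestrator-secretname123 --secret-string '{"username":"admin","password":"YourStrongPassword"}'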

FAQ

Q. What does this template do?

A. This template migrates user database items, certificates, SQL Server Agent jobs, and logins from a source SQL Server to a target SQL Server hosted on Amazon RDS.

Q. Do I need to create the target SQL Server?

A. Yes. This template focuses on data migration. You need to set up the target SQL Server before using this template.

Q. What kind of backups do you use for migration?

A. Based on your input, we use only a full backup, a combination of full and differential backups, or a combination of full, differential, and tail-log backups for migration.

Q. When do I need to put my databases in read-only mode?

A. Based on the type of migration selected, there are different points at which to do this:

  • For full backup only migrations, set the databases to read-only before beginning the migration workflow.

  • For full and differential backup migrations, set the databases to read-only when instructed to do so in Step 4.1 of the workflow.

  • For full, differential, and tail-log backup migrations, set the databases to read-only when instructed to do so in Step 4.4 of the workflow.

These different configurations help ensure that all the changes in your SQL Server are captured during migration, creating parity between your source and target.
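
When a workflow step instructs you to mark a database read-only, you can do so on the source instance. The following is a sketch using the dbatools module; the instance and database names are placeholders.

# Set a source database to read-only; ROLLBACK IMMEDIATE disconnects open sessions first
Invoke-DbaQuery -SqlInstance "MSSQLSERVER01" -Query "ALTER DATABASE [SalesDb] SET READ_ONLY WITH ROLLBACK IMMEDIATE"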

Q. What security measures do you take while migrating data?

A. Our goal is to handle data securely at all times. Backup files are transferred over an HTTPS connection to Amazon S3, and then to your EC2 instance, before being restored to RDS. Certificate files are handled specially: the workflow creates a KMS key in your account, which is used to encrypt the certificate before transport and decrypt the certificate in your target environment.

Q. Why do I need to create an EC2 instance?

A. The EC2 instance you create is used to run the restore procedures on your RDS endpoint. It is designed this way so that your RDS endpoint does not need to be exposed to the public internet for the restore procedure.

Q. What are the limitations of this template?

A. This template will not do the following:

  • This template does not migrate System Databases or SQL Server properties.

  • This template can only migrate SQL logins. Windows-level logins are not guaranteed to be migrated.

  • This template expects that you will not initiate a full backup of the database yourself while the workflow is running. If a full backup is taken, it breaks the chain of backups used to restore your databases on the target server.

  • This template can only migrate databases whose “DBO” is set to a SQL user or an AD user. If the database owner is a Windows-level user that is not available in the RDS environment, the database will be inaccessible when restored on RDS. You can check the owners ahead of time as shown below.
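
To check the database owners ahead of time, you can list them on the source instance. This sketch uses the dbatools module; the instance name is a placeholder.

# List database owners on the source; owners that are Windows logins not available in RDS can make the restored database inaccessible
Get-DbaDatabase -SqlInstance "MSSQLSERVER01" | Select-Object Name, Owner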

Q. I ran into an error during a database connection step. What do I do?

A. An error here indicates a problem with connecting to your SQL Server.

  • If this occurs on the source, ensure that the user that was given to the plugin to connect to the machine has sysadmin permissions on your source SQL Server.

  • If this happens on the target, ensure that the EC2 Instance ID provided during workflow creation has connectivity to your RDS Endpoint, and ensure the SQL credentials stored in your Secrets Manager Secret are correct.

Q. I ran into an error during a database validation step. What do I do?

A. An error here indicates an incompatibility between the inputs provided during workflow creation and the target environment. Look at the step logs located inside the S3 bucket shown in the error message to diagnose the issue and re-create the workflow with the appropriate inputs.

Q. I ran into an error while a backup step was running. What do I do?

A. If you run into an error during the backup steps, look at the step logs located inside the S3 bucket shown in the error message. After you diagnose and fix the issue, clean the appropriate backup directory on the source machine before retrying this step.

Q. I ran into an error while a restore step was running. What do I do?

A. If you run into an error during the restore steps, look at the step logs located inside the S3 bucket shown in the error message. If you have taken a full backup after the workflow started, the backup chain is broken and this workflow cannot be recovered. You will have to delete the workflow, wipe your target SQL Server, and re-create the workflow.
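
To confirm whether an out-of-band full backup broke the chain, you can inspect the backup history in msdb on the source instance. This sketch uses the dbatools module; the instance and database names are placeholders.

# List recent backups for a database; type D = full, I = differential, L = log
Invoke-DbaQuery -SqlInstance "MSSQLSERVER01" -Query "SELECT database_name, type, backup_start_date FROM msdb.dbo.backupset WHERE database_name = 'SalesDb' ORDER BY backup_start_date DESC"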