Migrate Db2 for LUW to Amazon EC2 by using log shipping to reduce outage time
Created by Feng Cai (AWS), Ambarish Satarkar (AWS), and Saurabh Sharma (AWS)
Environment: Production | Source: On-premises Db2 for Linux | Target: Db2 on Amazon EC2 |
R Type: Rehost | Workload: IBM | Technologies: Migration; Databases |
AWS services: AWS Direct Connect; Amazon EBS; Amazon EC2; Amazon S3; AWS Site-to-Site VPN |
Summary
When customers migrate their IBM Db2 for LUW (Linux, UNIX, and Windows) workloads to Amazon Web Services (AWS), using Amazon Elastic Compute Cloud (Amazon EC2) with the Bring Your Own License (BYOL) model is the fastest way to move. However, migrating large amounts of data from an on-premises Db2 database into AWS can be a challenge, especially when the outage window is short. Many customers aim for an outage window of less than 30 minutes, which leaves little time for the database cutover itself.
This pattern covers how to accomplish a Db2 migration with a short outage window by using transaction log shipping. This approach applies to Db2 on a little-endian Linux platform.
Prerequisites and limitations
Prerequisites
An active AWS account
A Db2 instance running on an EC2 instance that matches the on-premises file system layout
An Amazon Simple Storage Service (Amazon S3) bucket accessible to the EC2 instance
An AWS Identity and Access Management (IAM) policy and role to make programmatic calls to Amazon S3
Synchronized time zone and system clocks on Amazon EC2 and the on-premises server
The on-premises network connected to AWS through AWS Site-to-Site VPN or AWS Direct Connect
Limitations
The on-premises Db2 instance and the Db2 instance on Amazon EC2 must be on the same platform family.
The on-premises Db2 workload must be logged. To block any unlogged transactions, set
blocknonlogged=yes
in the database configuration.
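For example, the following sketch shows how that parameter can be set with the Db2 command line; the database name SAMPLE is a placeholder.

```shell
# Block transactions against tables that use the NOT LOGGED option,
# so every change is captured in the transaction logs that will be shipped.
# SAMPLE is a placeholder database name; substitute your own.
db2 update db cfg for SAMPLE using blocknonlogged yes
```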
Product versions
Db2 for LUW version 11.5.9 and later
Architecture
Source technology stack
Db2 on Linux x86_64
Target technology stack
Amazon EBS
Amazon EC2
AWS Identity and Access Management (IAM)
Amazon S3
AWS Site-to-Site VPN or Direct Connect
Target architecture
The following diagram shows one Db2 instance running on-premises with a virtual private network (VPN) connection to Db2 on Amazon EC2. The dotted lines represent the VPN tunnel between your data center and the AWS Cloud.
Tools
AWS services
AWS Command Line Interface (AWS CLI) is an open-source tool that helps you interact with AWS services through commands in your command-line shell.
AWS Direct Connect links your internal network to a Direct Connect location over a standard Ethernet fiber-optic cable. With this connection, you can create virtual interfaces directly to public AWS services while bypassing internet service providers in your network path.
Amazon Elastic Block Store (Amazon EBS) provides block-level storage volumes for use with Amazon Elastic Compute Cloud (Amazon EC2) instances.
Amazon Elastic Compute Cloud (Amazon EC2) provides scalable computing capacity in the AWS Cloud. You can launch as many virtual servers as you need and quickly scale them up or down.
AWS Identity and Access Management (IAM) helps you securely manage access to your AWS resources by controlling who is authenticated and authorized to use them.
Amazon Simple Storage Service (Amazon S3) is a cloud-based object storage service that helps you store, protect, and retrieve any amount of data.
AWS Site-to-Site VPN helps you pass traffic between instances that you launch on AWS and your own remote network.
Other tools
db2cli is the Db2 interactive CLI command.
Best practices
On the target database, use gateway endpoints for Amazon S3 to access the database backup image and log files in Amazon S3.
On the source database, use AWS PrivateLink for Amazon S3 to send the database backup image and log files to Amazon S3.
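As an illustration of the first practice, a gateway endpoint for Amazon S3 can be created with the AWS CLI; the VPC ID, route table ID, and Region below are placeholder values.

```shell
# Create a gateway VPC endpoint so the EC2 instance reaches Amazon S3
# over the AWS network instead of the public internet.
# vpc-0abc1234, rtb-0def5678, and us-east-1 are placeholders.
aws ec2 create-vpc-endpoint \
    --vpc-id vpc-0abc1234 \
    --vpc-endpoint-type Gateway \
    --service-name com.amazonaws.us-east-1.s3 \
    --route-table-ids rtb-0def5678
```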
Epics
Task | Description | Skills required |
---|---|---|
Set environment variables. | This pattern uses the following names:
You can change them to fit your environment. | DBA |
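The exact names are environment specific; as an illustration only, such variables might look like the following sketch, in which every name and value is a placeholder.

```shell
# Placeholder values -- adjust each one to match your environment.
export DB2_INSTANCE=db2inst1        # Db2 instance owner
export DB_NAME=SAMPLE               # database to migrate
export S3_BUCKET=db2-backup-bucket  # bucket for backup images and logs
export BACKUP_DIR=/db2/backup       # local backup file system
export ARCHIVE_DIR=/db2/archlogs    # local archive log destination
```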
Task | Description | Skills required |
---|---|---|
Set up the AWS CLI. | To download and install the latest version of the AWS CLI, run the following commands:
| Linux administrator |
Set up a local destination for Db2 archive logs. | To keep the target database on Amazon EC2 in sync with the on-premises source database, the latest transaction logs need to be retrieved from the source. In this setup,
| DBA |
Run an online database backup. | Run an online database backup, and save it to the local backup file system:
| DBA |
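Taken together, the tasks above might be carried out with commands like the following sketch. The directory paths and the SAMPLE database name are assumptions; adjust them to your environment.

```shell
# Download and install the latest AWS CLI v2 on Linux x86_64.
curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
unzip awscliv2.zip
sudo ./aws/install

# Point Db2 archive logging at a local directory so the logs can be
# shipped to Amazon S3 later (the directory is a placeholder).
db2 update db cfg for SAMPLE using LOGARCHMETH1 "DISK:/db2/archlogs/"

# Take an online backup and save it to the local backup file system.
db2 backup db SAMPLE online to /db2/backup compress
```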
Task | Description | Skills required |
---|---|---|
Create an S3 bucket. | Create an S3 bucket for the on-premises server to send the Db2 backup images and log files to. The bucket will also be accessed by the EC2 instance:
| AWS systems administrator |
Create an IAM policy. | The
To create the policy, use the following AWS CLI command:
The JSON output shows the Amazon Resource Name (ARN) for the policy, where
| AWS administrator, AWS systems administrator |
Attach the IAM policy to the IAM role used by the EC2 instance. | In most AWS environments, a running EC2 instance has an IAM role set by your systems administrator. If the IAM role is not set, create one, and then choose Modify IAM role on the Amazon EC2 console to associate it with the EC2 instance that hosts the Db2 database. Attach the IAM policy to the IAM role by using the policy ARN:
After the policy is attached, any EC2 instance associated with the IAM role can access the S3 bucket. | AWS administrator, AWS systems administrator |
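The tasks above might look like the following sketch. The bucket name, policy name, role name, account ID, and Region are all placeholders, and the policy document itself is assumed to be in a local JSON file.

```shell
# Create the bucket (the name and Region are placeholders).
aws s3 mb s3://db2-backup-bucket --region us-east-1

# Create an IAM policy that grants access to the bucket; the policy
# document file name is an assumption.
aws iam create-policy \
    --policy-name db2-s3-access \
    --policy-document file://db2-s3-access.json

# Attach the policy to the IAM role used by the EC2 instance
# (role name and account ID are placeholders).
aws iam attach-role-policy \
    --role-name db2-ec2-role \
    --policy-arn arn:aws:iam::111122223333:policy/db2-s3-access
```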
Task | Description | Skills required |
---|---|---|
Configure the AWS CLI on the on-premises Db2 server. | Configure the AWS CLI with the
| AWS administrator, AWS systems administrator |
Send the backup image to Amazon S3. | Earlier, an online database backup was saved to the
| AWS administrator, Migration engineer |
Send the Db2 archive logs to Amazon S3. | Sync the on-premises Db2 archive logs with the S3 bucket that can be accessed by the target Db2 instance on Amazon EC2:
Run this command periodically by using cron or other scheduling tools. The frequency depends on how often the source database archives transaction log files. | AWS administrator, Migration engineer |
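On the source server, these steps might be sketched as follows; the local directories and bucket prefixes are assumptions.

```shell
# Configure credentials for an IAM identity that can write to the bucket;
# this prompts for the access key, secret key, Region, and output format.
aws configure

# Copy the online backup image to the bucket (paths are placeholders).
aws s3 sync /db2/backup s3://db2-backup-bucket/backup

# Ship the archive logs; schedule this command with cron so new logs
# are synced as the source database archives them.
aws s3 sync /db2/archlogs s3://db2-backup-bucket/archlogs
```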
Task | Description | Skills required |
---|---|---|
Create a PKCS12 keystore. | Db2 uses a Public-Key Cryptography Standards (PKCS) encryption keystore to keep the AWS access key secure. Create a keystore and configure the source Db2 instance to use it:
| DBA |
Create the Db2 storage access alias. | To create the storage access alias
For example, your script might look like the following:
| DBA |
Set the staging area. | By default, Db2 uses We also recommend using
| DBA |
Restore the database from the backup image. | Restore the target database on Amazon EC2 from the backup image in the S3 bucket:
| DBA |
Roll forward the database. | After the restore is complete, the target database is placed in rollforward-pending state. Configure
Start database rollforward:
This command processes all log files that have been transferred to the S3 bucket. Run it periodically based on the frequency of the | DBA |
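On the target instance, the keystore, storage access alias, restore, and rollforward steps might be sketched as follows. The keystore path and password, the alias name db2s3, the bucket name, the Region endpoint, and the SAMPLE database name are all assumptions.

```shell
# Create a PKCS12 keystore and register it with the Db2 instance
# (the path and password are placeholders).
gsk8capicmd_64 -keydb -create -db /home/db2inst1/.keystore/db2ks.p12 \
    -pw "Passw0rd!" -type pkcs12 -stash
db2 update dbm cfg using keystore_location /home/db2inst1/.keystore/db2ks.p12 \
    keystore_type pkcs12

# Catalog a storage access alias that maps to the S3 bucket; the AWS
# credentials are read from environment variables here.
db2 catalog storage access alias db2s3 vendor S3 \
    server s3.us-east-1.amazonaws.com \
    user "$AWS_ACCESS_KEY_ID" password "$AWS_SECRET_ACCESS_KEY" \
    container db2-backup-bucket

# Restore from the backup image in the bucket, then roll forward
# through the archive logs that have been shipped so far.
db2 restore db SAMPLE from DB2REMOTE://db2s3//backup
db2 rollforward db SAMPLE to end of logs
```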
Task | Description | Skills required |
---|---|---|
Bring the target database online. | During the cutover window, do one of the following:
After the last transaction log is synced into Amazon S3, run the
Bring the target database online, and point the application connections to Db2 on Amazon EC2. | DBA |
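The cutover sequence above can be sketched as follows; the database name, directories, and bucket prefix are placeholders carried over from the earlier steps.

```shell
# On the source, after application writes have stopped:
# force-archive the last active transaction log, then ship it.
db2 archive log for db SAMPLE
aws s3 sync /db2/archlogs s3://db2-backup-bucket/archlogs

# On the target: apply the remaining logs, complete the rollforward,
# and bring the database online for application connections.
db2 rollforward db SAMPLE to end of logs and complete
db2 activate db SAMPLE
```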
Troubleshooting
Issue | Solution |
---|---|
If multiple databases have the same instance name and database name on different hosts (DEV, QA, PROD), backups and logs might go to the same subdirectory. | Use different S3 buckets for DEV, QA, and PROD, and add the hostname as a subdirectory prefix to avoid confusion. |
If there are multiple backup images in the same location, you will get the following error when you restore:
| In the
|
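One way to disambiguate between multiple images is to name the intended one explicitly with the TAKEN AT clause. In this sketch, the storage access alias db2s3, the database name, and the timestamp are assumptions; the timestamp comes from the backup file name, which follows the pattern DBNAME.0.instance.DBPART000.YYYYMMDDHHMMSS.001.

```shell
# Restore a specific backup image by its timestamp so Db2 does not
# have to choose among several images in the same location.
db2 restore db SAMPLE from DB2REMOTE://db2s3//backup taken at 20250101120000
```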