Enable log archiving directly to Amazon S3 for an IBM Db2 database
Ambarish Satarkar, Amazon Web Services
Summary
This pattern describes how to use Amazon Simple Storage Service (Amazon S3) as the storage location for archive logs that are generated by IBM Db2, without using a staging area.
You can set the LOGARCHMETH1 database configuration parameter to a DB2REMOTE storage access alias that specifies the primary destination for logs that are archived from the current log path. With this capability, you can archive and retrieve transaction logs to and from Amazon S3 directly, without using a staging area.
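With a cataloged storage access alias (for example, an alias named DB2AWSS3; the alias and path names here are illustrative placeholders), the parameter value takes the following general form:
LOGARCHMETH1 = 'DB2REMOTE://<storage-access-alias>//<path-prefix>/'
The Epics section later in this pattern walks through creating the alias and setting the parameter.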
Prerequisites and limitations
Prerequisites
Limitations
Only Db2 11.5.7 or later allows log archiving directly to Amazon S3 storage.
Some AWS services aren’t available in all AWS Regions. For Region availability, see AWS Services by Region. For specific endpoints, see Service endpoints and quotas, and choose the link for the service.
In all configurations, the following limitations exist for Amazon S3:
AWS Key Management Service (AWS KMS) is not supported.
AWS role-based (AWS Identity and Access Management (IAM)) or token-based (AWS Security Token Service (AWS STS)) credentials are not supported.
Product versions
AWS CLI version 2 or later
IBM Db2 11.5.7 or later
Linux SUSE Linux Enterprise Server (SLES) 11 or later
Red Hat Enterprise Linux (RHEL) 6 or later
Windows Server 2008 R2, 2012 (R2), 2016, or 2019
Architecture
The following diagram shows the components and workflow for this pattern.

The architecture on the AWS Cloud includes the following:
Virtual private cloud (VPC) – A logically isolated section of the AWS Cloud where you launch resources.
Availability Zone – Provides high availability by running the Db2 LUW (Linux, Unix, Windows) workload in an isolated data center within the AWS Region.
Public subnet – Provides RDP (Remote Desktop Protocol) access for administrators and internet connectivity through a NAT gateway.
Private subnet – Hosts the Db2 LUW database. The Db2 LUW instance is configured with the LOGARCHMETH1 parameter, which writes database log archive files directly to an Amazon S3 path through the gateway endpoint.
The following AWS services provide support:
Amazon S3 – Serves as the durable, scalable storage location for Db2 log archive files.
Amazon Elastic File System (Amazon EFS) – Provides a shared, fully managed file system that Db2 can use for database backups and staging. Db2 can also use Amazon EFS as a mount point for log files before they are archived to Amazon S3.
Amazon CloudWatch – Collects and monitors metrics, logs, and events from Db2 and the underlying EC2 instances. You can use CloudWatch to create alarms, dashboards, and automated responses to performance or availability issues.
Automation and scale
This pattern provides a fully automated solution for storing Db2 archive logs in Amazon S3.
You can use the same Amazon S3 bucket to archive the logs of multiple Db2 databases.
Tools
AWS services
Amazon CloudWatch helps you monitor the metrics of your AWS resources and the applications you run on AWS in real time.
AWS Command Line Interface (AWS CLI) is an open source tool that helps you interact with AWS services through commands in your command-line shell.
Amazon Elastic Compute Cloud (Amazon EC2) provides scalable computing capacity in the AWS Cloud. You can launch as many virtual servers as you need and quickly scale them up or down.
Amazon Elastic File System (Amazon EFS) helps you create and configure shared file systems in the AWS Cloud.
AWS IAM Identity Center helps you centrally manage single sign-on (SSO) access to all of your AWS accounts and cloud applications.
Amazon Simple Storage Service (Amazon S3) is a cloud-based object storage service that helps you store, protect, and retrieve any amount of data.
Amazon Virtual Private Cloud (Amazon VPC) helps you launch AWS resources into a virtual network that you’ve defined. This virtual network resembles a traditional network that you’d operate in your own data center, with the benefits of using the scalable infrastructure of AWS.
Other tools
Best practices
Follow the principle of least privilege and grant the minimum permissions required to perform a task. For more information, see Grant least privilege and Security best practices in the IAM documentation.
Epics
Task | Description | Skills required |
---|---|---|
Set up the AWS CLI. | To download and install the AWS CLI, use the following commands:
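The following is a minimal sketch for a 64-bit Linux host; confirm the current installer for your operating system in the AWS CLI documentation before using it:
curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
unzip awscliv2.zip
sudo ./aws/install
aws --version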
| AWS systems administrator, AWS administrator |
Configure the AWS CLI. | To configure the AWS CLI, use the following commands:
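A minimal sketch; the profile values shown are placeholders that you replace with your own credentials and Region:
aws configure
# AWS Access Key ID [None]: <access-key-id>
# AWS Secret Access Key [None]: <secret-access-key>
# Default region name [None]: us-east-1
# Default output format [None]: json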
| AWS systems administrator, AWS administrator |
Create IAM user. | To create an IAM user to use later for the Db2 database connection with Amazon S3, use the following command:
Following is an example of the command:
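A sketch of the command; db2-s3-user is an assumed example name, not a required value:
aws iam create-user --user-name <user-name>
# Example:
aws iam create-user --user-name db2-s3-user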
Warning: This scenario requires IAM users with programmatic access and long-term credentials, which presents a security risk. To mitigate this risk, we recommend that you provide these users with only the permissions they require to perform the task and that you remove these users when they are no longer needed. Access keys can be updated if necessary. For more information, see AWS security credentials and Manage access keys for IAM users in the IAM documentation. | AWS systems administrator |
Create Amazon S3 bucket. | To create an Amazon S3 bucket for storing the database backup, use the following command:
Following is an example command:
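A sketch using placeholder values; bucket names must be globally unique, and db2-archive-logs-example is only an illustration:
aws s3 mb s3://<bucket-name> --region <aws-region>
# Example:
aws s3 mb s3://db2-archive-logs-example --region us-east-1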
| AWS systems administrator |
Authorize the IAM user. | Grant the newly created IAM user the Amazon S3 permissions that it needs to read from and write to the log archive bucket, for example by attaching an Amazon S3 access policy to the user.
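One possible sketch from the AWS CLI attaches the AWS managed AmazonS3FullAccess policy to the example user; in production, prefer a least-privilege policy scoped to the log archive bucket:
aws iam attach-user-policy --user-name db2-s3-user --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess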
| AWS systems administrator, AWS administrator |
Create access key. | To generate an access key to programmatically access Amazon S3 from the DB2 instance, use the following command:
Following is an example of the command:
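A sketch of the command, again using the assumed db2-s3-user name:
aws iam create-access-key --user-name <user-name>
# Example:
aws iam create-access-key --user-name db2-s3-user
# Record the AccessKeyId and SecretAccessKey values from the output; the secret is shown only once.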
Warning: This scenario requires IAM users with programmatic access and long-term credentials, which presents a security risk. To mitigate this risk, we recommend that you provide these users with only the permissions they require to perform the task and that you remove these users when they are no longer needed. Access keys can be updated if necessary. For more information, see AWS security credentials and Manage access keys for IAM users in the IAM documentation. | AWS systems administrator |
Create a PKCS keystore. | To create a PKCS keystore to store the key and create a secret access key to transfer the data to Amazon S3, use the following command:
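A sketch that uses the IBM Global Security Kit (GSKit) command shipped with Db2; the keystore path and password are placeholders, and the command is run as the Db2 instance owner:
gsk8capicmd_64 -keydb -create -db /home/db2inst1/.keystore/db2s3.p12 -pw "<keystore-password>" -type pkcs12 -stash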
| AWS systems administrator |
Configure DB2 to use the keystore. | To configure DB2 to use the keystore through the keystore_location and keystore_type database manager configuration parameters, use the following command:
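A sketch, assuming the keystore path from the previous step:
db2 "update dbm cfg using keystore_type pkcs12 keystore_location /home/db2inst1/.keystore/db2s3.p12"
# Verify the settings:
db2 get dbm cfg | grep -i keystore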
| AWS systems administrator |
Create a DB2 storage access alias. | A storage access alias specifies the Amazon S3 bucket to use. It also provides the connection details, such as the user name (access key ID) and password (secret access key), which are stored in the local keystore in an encrypted format. For more information, see the CATALOG STORAGE ACCESS command in the IBM Db2 documentation. To create a storage access alias, use the following syntax:
Following is an example:
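A sketch of the syntax and an example; the alias name (DB2AWSS3), Region endpoint, and bucket name are placeholders, and the credentials are the access key pair that you created earlier:
db2 "catalog storage access alias <alias-name> vendor S3 server s3.<aws-region>.amazonaws.com user '<access-key-id>' password '<secret-access-key>' container '<bucket-name>'"
# Example:
db2 "catalog storage access alias DB2AWSS3 vendor S3 server s3.us-east-1.amazonaws.com user '<access-key-id>' password '<secret-access-key>' container 'db2-archive-logs-example'"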
| AWS systems administrator |
Task | Description | Skills required |
---|---|---|
Update the LOGARCHMETH1 parameter. | To use the storage access alias that you defined earlier, update the LOGARCHMETH1 database configuration parameter.
To separate the logs from other files, specify a subdirectory (that is, an Amazon S3 bucket prefix). Following is an example:
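A sketch, assuming a database named SAMPLE, the DB2AWSS3 alias created earlier, and a dblogs prefix (all placeholders):
db2 "update db cfg for SAMPLE using LOGARCHMETH1 'DB2REMOTE://DB2AWSS3//dblogs/'"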
You should see the DB20000I message indicating that the UPDATE DATABASE CONFIGURATION command completed successfully. | AWS systems administrator |
Restart DB2. | Restart the DB2 instance after reconfiguring it for log archiving. However, if the database was previously using circular logging (LOGARCHMETH1 set to OFF), it is placed in backup pending state and requires a full database backup before applications can connect to it again. | AWS administrator, AWS systems administrator |
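A minimal restart sequence, run as the Db2 instance owner; SAMPLE and the backup path are placeholders:
db2 terminate
db2stop
db2start
# If the database entered backup pending state, take a full backup:
db2 "backup database SAMPLE to /db2/backup"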
Task | Description | Skills required |
---|---|---|
Check the archive log in Amazon S3. | At this point, your database is fully configured to archive its transaction logs directly to Amazon S3. To confirm the configuration, run transactional activity on the database to consume (and archive) log space. Then, check the archive logs in Amazon S3. | AWS administrator, AWS systems administrator |
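One way to force the current log file to be closed and archived, and then list the archived files, assuming the SAMPLE database and the example bucket and prefix used earlier:
db2 "archive log for database SAMPLE"
aws s3 ls s3://db2-archive-logs-example/dblogs/ --recursive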
Check archive log configuration in db2diag.log. | After you check the archive logs in Amazon S3, look for the archive completion message in the DB2 diagnostic log (db2diag.log).
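A sketch for locating the relevant entries; the exact message text can vary by Db2 release:
db2diag | grep -i "archive for log file"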
This message confirms that the closed DB2 transaction log files are being archived to the (remote) Amazon S3 storage. | AWS systems administrator |
Related resources
AWS service documentation
AWS security credentials (IAM documentation)
Grant least privilege (IAM documentation)
Manage access keys for IAM users (IAM documentation)
Security best practices in IAM (IAM documentation)
IBM resources