Amazon S3 integration
You can transfer files between an Amazon RDS for Oracle DB instance and an Amazon S3 bucket. You can use Amazon S3 integration with Oracle features such as Data Pump. For example, you can download Data Pump files from Amazon S3 to the DB instance host.
The DB instance and the Amazon S3 bucket must be in the same AWS Region.
Prerequisites for Amazon RDS Oracle integration with Amazon S3
To work with Amazon RDS for Oracle integration with Amazon S3, the Amazon RDS DB instance must have access to an Amazon S3 bucket. For this, you create an AWS Identity and Access Management (IAM) policy and an IAM role. The Amazon VPC used by your DB instance doesn't need to provide access to the Amazon S3 endpoints.
To add a role to a DB instance, the status of the DB instance must be `available`.
To create an IAM policy to allow Amazon RDS access to an Amazon S3 bucket
1. Open the IAM Management Console.

2. In the navigation pane, choose Policies.

3. Choose Create policy.

4. On the Visual editor tab, choose Choose a service, and then choose S3.

5. For Actions, choose Expand all, and then choose the bucket permissions and object permissions needed for the IAM policy. Include the appropriate actions in the policy based on the type of access required:

   - `GetObject` – required to transfer files from an Amazon S3 bucket to Amazon RDS.
   - `ListBucket` – required to transfer files from an Amazon S3 bucket to Amazon RDS.
   - `PutObject` – required to transfer files from Amazon RDS to an Amazon S3 bucket.

   Object permissions are permissions for object operations in Amazon S3. They must be granted for the objects in a bucket, not for the bucket itself. For more information about permissions for object operations in Amazon S3, see Permissions for object operations.

6. Choose Resources, and choose Add ARN for bucket.

7. In the Add ARN(s) dialog box, provide the details about your resource, and choose Add. Specify the Amazon S3 bucket to allow access to. For instance, to allow Amazon RDS to access the Amazon S3 bucket named `example-bucket`, set the Amazon Resource Name (ARN) value to `arn:aws:s3:::example-bucket`.

8. If the object resource is listed, choose Add ARN for object.

9. In the Add ARN(s) dialog box, provide the details about your resource. For the Amazon S3 bucket, specify the bucket to allow access to. For the object, you can choose Any to grant permissions to any object in the bucket.

   Note: You can set the Amazon Resource Name (ARN) to a more specific value to allow Amazon RDS to access only specific files or folders in an Amazon S3 bucket. For more information about how to define an access policy for Amazon S3, see Managing access permissions to your Amazon S3 resources.

10. (Optional) Choose Add additional permissions to add another Amazon S3 bucket to the policy, and repeat the previous steps for that bucket.

    Note: You can repeat this to add a corresponding bucket permission statement to your policy for each Amazon S3 bucket that you want Amazon RDS to access. Optionally, you can also grant access to all buckets and objects in Amazon S3.

11. Choose Review policy.

12. For Name, enter a name for your IAM policy, for example `rds-s3-integration-policy`. You use this name when you create an IAM role to associate with your DB instance. You can also add an optional Description value.

13. Choose Create policy.
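The steps above produce a policy document equivalent to the JSON below. The following Python sketch builds it programmatically as an illustration only; the bucket name `example-bucket` and the `s3integration` statement ID are placeholders.

```python
import json

# Illustrative sketch of the policy the console steps above build.
# "example-bucket" and the "s3integration" Sid are placeholder values.
bucket_arn = "arn:aws:s3:::example-bucket"

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "s3integration",
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket", "s3:PutObject"],
            # s3:ListBucket is a bucket-level action evaluated against the
            # bucket ARN; s3:GetObject and s3:PutObject are object-level
            # actions evaluated against object ARNs (bucket/*). Listing both
            # resource forms lets all three actions resolve.
            "Resource": [bucket_arn, bucket_arn + "/*"],
        }
    ],
}

print(json.dumps(policy, indent=4))
```

The bucket-versus-object distinction in the Resource list mirrors step 5: bucket permissions attach to the bucket ARN, object permissions to the objects in the bucket.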
To create an IAM role to allow Amazon RDS access to an Amazon S3 bucket
1. In the navigation pane, choose Roles.

2. Choose Create role.

3. For AWS service, choose RDS.

4. For Select your use case, choose RDS – Add Role to Database.

5. Choose Next: Permissions.

6. For Search under Attach permissions policies, enter the name of the IAM policy that you created, and choose the policy when it appears in the list.

7. Choose Next: Tags, and then choose Next: Review.

8. For Role name, enter a name for your IAM role, for example `rds-s3-integration-role`. You can also add an optional Role description value.

9. Choose Create role.
To associate your IAM role with your DB instance
1. Sign in to the AWS Management Console and open the Amazon RDS console at https://console.aws.amazon.com/rds/.

2. Choose the Oracle DB instance name to display its details.

3. On the Connectivity & security tab, in the Manage IAM roles section, choose the role to add under Add IAM roles to this instance.

4. For Feature, choose S3_INTEGRATION.

5. Choose Add role.
To grant Amazon RDS access to an Amazon S3 bucket
1. Create an AWS Identity and Access Management (IAM) policy that grants Amazon RDS access to an Amazon S3 bucket. Include the appropriate actions in the policy based on the type of access required:

   - `GetObject` – required to transfer files from an Amazon S3 bucket to Amazon RDS.
   - `ListBucket` – required to transfer files from an Amazon S3 bucket to Amazon RDS.
   - `PutObject` – required to transfer files from Amazon RDS to an Amazon S3 bucket.

   The following AWS CLI command creates an IAM policy named `rds-s3-integration-policy` with these options. It grants access to a bucket named `your-s3-bucket-arn`.

   Example

   For Linux, macOS, or Unix:

   ```
   aws iam create-policy \
       --policy-name rds-s3-integration-policy \
       --policy-document '{
           "Version": "2012-10-17",
           "Statement": [
             {
               "Sid": "s3integration",
               "Action": [
                 "s3:GetObject",
                 "s3:ListBucket",
                 "s3:PutObject"
               ],
               "Effect": "Allow",
               "Resource": [
                 "arn:aws:s3:::your-s3-bucket-arn",
                 "arn:aws:s3:::your-s3-bucket-arn/*"
               ]
             }
           ]
         }'
   ```

   For Windows:

   ```
   aws iam create-policy ^
       --policy-name rds-s3-integration-policy ^
       --policy-document '{
           "Version": "2012-10-17",
           "Statement": [
             {
               "Sid": "s3integration",
               "Action": [
                 "s3:GetObject",
                 "s3:ListBucket",
                 "s3:PutObject"
               ],
               "Effect": "Allow",
               "Resource": [
                 "arn:aws:s3:::your-s3-bucket-arn",
                 "arn:aws:s3:::your-s3-bucket-arn/*"
               ]
             }
           ]
         }'
   ```

2. After the policy is created, note the Amazon Resource Name (ARN) of the policy. You need the ARN for a subsequent step.

3. Create an IAM role that Amazon RDS can assume on your behalf to access your Amazon S3 buckets. The following AWS CLI command creates the `rds-s3-integration-role` for this purpose.

   Example

   For Linux, macOS, or Unix:

   ```
   aws iam create-role \
       --role-name rds-s3-integration-role \
       --assume-role-policy-document '{
           "Version": "2012-10-17",
           "Statement": [
             {
               "Effect": "Allow",
               "Principal": {
                  "Service": "rds.amazonaws.com"
                },
               "Action": "sts:AssumeRole"
             }
           ]
         }'
   ```

   For Windows:

   ```
   aws iam create-role ^
       --role-name rds-s3-integration-role ^
       --assume-role-policy-document '{
           "Version": "2012-10-17",
           "Statement": [
             {
               "Effect": "Allow",
               "Principal": {
                  "Service": "rds.amazonaws.com"
                },
               "Action": "sts:AssumeRole"
             }
           ]
         }'
   ```

   For more information, see Creating a role to delegate permissions to an IAM user in the IAM User Guide.

4. After the role is created, note the ARN of the role. You need the ARN for a subsequent step.

5. Attach the policy you created to the role you created. The following AWS CLI command attaches the policy to the role named `rds-s3-integration-role`.

   Example

   For Linux, macOS, or Unix:

   ```
   aws iam attach-role-policy \
       --policy-arn your-policy-arn \
       --role-name rds-s3-integration-role
   ```

   For Windows:

   ```
   aws iam attach-role-policy ^
       --policy-arn your-policy-arn ^
       --role-name rds-s3-integration-role
   ```

   Replace `your-policy-arn` with the policy ARN that you noted in a previous step.

6. Add the role to the Oracle DB instance. The following AWS CLI command adds the role to an Oracle DB instance named `mydbinstance`.

   Example

   For Linux, macOS, or Unix:

   ```
   aws rds add-role-to-db-instance \
       --db-instance-identifier mydbinstance \
       --feature-name S3_INTEGRATION \
       --role-arn your-role-arn
   ```

   For Windows:

   ```
   aws rds add-role-to-db-instance ^
       --db-instance-identifier mydbinstance ^
       --feature-name S3_INTEGRATION ^
       --role-arn your-role-arn
   ```

   Replace `your-role-arn` with the role ARN that you noted in a previous step. `S3_INTEGRATION` must be specified for the `--feature-name` option.
Adding the Amazon S3 integration option
To use Amazon RDS for Oracle integration with Amazon S3, your Amazon RDS for Oracle DB instance must be associated with an option group that includes the `S3_INTEGRATION` option.
To configure an option group for Amazon S3 integration
1. Create a new option group, or identify an existing option group to which you can add the `S3_INTEGRATION` option.

   For information about creating an option group, see Creating an option group.

2. Add the `S3_INTEGRATION` option to the option group.

   For information about adding an option to an option group, see Adding an option to an option group.

3. Create a new Oracle DB instance and associate the option group with it, or modify an Oracle DB instance to associate the option group with it.

   For information about creating a DB instance, see Creating an Amazon RDS DB instance.

   For information about modifying an Oracle DB instance, see Modifying an Amazon RDS DB instance.
To configure an option group for Amazon S3 integration
1. Create a new option group, or identify an existing option group to which you can add the `S3_INTEGRATION` option.

   For information about creating an option group, see Creating an option group.

2. Add the `S3_INTEGRATION` option to the option group. For example, the following AWS CLI command adds the `S3_INTEGRATION` option to an option group named `myoptiongroup`.

   Example

   For Linux, macOS, or Unix:

   ```
   aws rds add-option-to-option-group \
       --option-group-name myoptiongroup \
       --options OptionName=S3_INTEGRATION,OptionVersion=1.0
   ```

   For Windows:

   ```
   aws rds add-option-to-option-group ^
       --option-group-name myoptiongroup ^
       --options OptionName=S3_INTEGRATION,OptionVersion=1.0
   ```

3. Create a new Oracle DB instance and associate the option group with it, or modify an Oracle DB instance to associate the option group with it.

   For information about creating a DB instance, see Creating an Amazon RDS DB instance.

   For information about modifying an Oracle DB instance, see Modifying an Amazon RDS DB instance.
Transferring files between Amazon RDS for Oracle and an Amazon S3 bucket
You can use Amazon RDS procedures to upload files from an Oracle DB instance to an Amazon S3 bucket. You can also use Amazon RDS procedures to download files from an Amazon S3 bucket to an Oracle DB instance.
These procedures upload or download the files in a single directory. You can't include subdirectories in these operations.
Uploading files from an Oracle DB instance to an Amazon S3 bucket
You can upload files from an Oracle DB instance to an Amazon S3 bucket. For example, you can upload Oracle Recovery Manager (RMAN) backup files. The maximum object size in an Amazon S3 bucket is 5 TB. For more information about working with objects, see the Amazon Simple Storage Service Developer Guide. For more information about performing RMAN backups, see Common DBA Recovery Manager (RMAN) tasks for Oracle DB instances.
You upload files using the `rdsadmin.rdsadmin_s3_tasks.upload_to_s3` procedure. This procedure has the following parameters.
| Parameter name | Data type | Default | Required | Description |
|---|---|---|---|---|
| `p_bucket_name` | VARCHAR2 | – | required | The name of the Amazon S3 bucket to upload files to. |
| `p_directory_name` | VARCHAR2 | – | required | The name of the Oracle directory object to upload files from. The directory can be any user-created directory object or the Data Pump directory, such as `DATA_PUMP_DIR`. You can only upload files from the specified directory. You can't upload files in subdirectories of the specified directory. |
| `p_s3_prefix` | VARCHAR2 | – | required | An Amazon S3 file name prefix that files are uploaded to. An empty prefix uploads all files to the top level of the specified Amazon S3 bucket and doesn't add a prefix to the file names. For example, if the prefix is `dbfiles/ora`, files are uploaded to the `dbfiles` folder, and `ora` is added to the beginning of each file name. |
| `p_prefix` | VARCHAR2 | – | required | A file name prefix that file names must match to be uploaded. An empty prefix uploads all files in the specified directory. |
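To make the interplay of these parameters concrete, here is a small Python model of the naming behavior the table describes (an illustration of the documented rules, not AWS code): a file is uploaded only if its name starts with `p_prefix`, and `p_s3_prefix` is prepended to the file name to form the S3 key.

```python
def s3_key_for(file_name: str, p_prefix: str, p_s3_prefix: str):
    """Model of upload_to_s3 naming: return the S3 key for file_name,
    or None if the file doesn't match p_prefix and is skipped."""
    if not file_name.startswith(p_prefix):
        return None  # file name doesn't match p_prefix: not uploaded
    return p_s3_prefix + file_name  # p_s3_prefix is prepended verbatim

# Mirrors the examples in this section:
print(s3_key_for("db01.dmp", "", ""))            # db01.dmp (top level, no prefix added)
print(s3_key_for("db01.dmp", "db", "dbfiles/"))  # dbfiles/db01.dmp
print(s3_key_for("db01.dmp", "", "dbfiles/ora")) # dbfiles/oradb01.dmp
print(s3_key_for("exp.log", "db", ""))           # None (doesn't match p_prefix)
```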
The return value for the `rdsadmin.rdsadmin_s3_tasks.upload_to_s3` procedure is a task ID.
The following example uploads all of the files in the `DATA_PUMP_DIR` directory to the Amazon S3 bucket named `mys3bucket`.

```
SELECT rdsadmin.rdsadmin_s3_tasks.upload_to_s3(
      p_bucket_name    =>  'mys3bucket',
      p_prefix         =>  '',
      p_s3_prefix      =>  '',
      p_directory_name =>  'DATA_PUMP_DIR')
   AS TASK_ID FROM DUAL;
```
The following example uploads all of the files with the prefix `db` in the `DATA_PUMP_DIR` directory to the Amazon S3 bucket named `mys3bucket`.

```
SELECT rdsadmin.rdsadmin_s3_tasks.upload_to_s3(
      p_bucket_name    =>  'mys3bucket',
      p_prefix         =>  'db',
      p_s3_prefix      =>  '',
      p_directory_name =>  'DATA_PUMP_DIR')
   AS TASK_ID FROM DUAL;
```
The following example uploads all of the files in the `DATA_PUMP_DIR` directory to the Amazon S3 bucket named `mys3bucket`. The files are uploaded to a `dbfiles` folder.

```
SELECT rdsadmin.rdsadmin_s3_tasks.upload_to_s3(
      p_bucket_name    =>  'mys3bucket',
      p_prefix         =>  '',
      p_s3_prefix      =>  'dbfiles/',
      p_directory_name =>  'DATA_PUMP_DIR')
   AS TASK_ID FROM DUAL;
```
The following example uploads all of the files in the `DATA_PUMP_DIR` directory to the Amazon S3 bucket named `mys3bucket`. The files are uploaded to a `dbfiles` folder, and `ora` is added to the beginning of each file name.

```
SELECT rdsadmin.rdsadmin_s3_tasks.upload_to_s3(
      p_bucket_name    =>  'mys3bucket',
      p_prefix         =>  '',
      p_s3_prefix      =>  'dbfiles/ora',
      p_directory_name =>  'DATA_PUMP_DIR')
   AS TASK_ID FROM DUAL;
```
In each example, the SELECT statement returns the ID of the task in a VARCHAR2 data type.
You can view the result by displaying the task's output file.
```
SELECT text FROM table(rdsadmin.rds_file_util.read_text_file('BDUMP','dbtask-task-id.log'));
```

Replace `task-id` with the task ID returned by the procedure.
Tasks are executed asynchronously.
Downloading files from an Amazon S3 bucket to an Oracle DB instance
To download files from an Amazon S3 bucket to an Oracle DB instance, use the Amazon RDS procedure `rdsadmin.rdsadmin_s3_tasks.download_from_s3`. This procedure has the following parameters.
| Parameter name | Data type | Default | Required | Description |
|---|---|---|---|---|
| `p_bucket_name` | VARCHAR2 | – | required | The name of the Amazon S3 bucket to download files from. |
| `p_directory_name` | VARCHAR2 | – | required | The name of the Oracle directory object to download files to. The directory can be any user-created directory object or the Data Pump directory, such as `DATA_PUMP_DIR`. |
| `p_s3_prefix` | VARCHAR2 | '' | optional | A file name prefix that file names must match to be downloaded. An empty prefix downloads all of the top-level files in the specified Amazon S3 bucket, but not the files in folders in the bucket. The procedure downloads Amazon S3 objects only from the first-level folder that matches the prefix. Nested directory structures matching the specified prefix are not downloaded. |
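As a rough illustration of the prefix rule described above (a model of the documented behavior, not AWS code), the following Python sketch shows which object keys an empty and a folder-style `p_s3_prefix` would select:

```python
def downloaded_keys(bucket_keys, p_s3_prefix=""):
    """Rough model of download_from_s3 prefix matching: keep keys that start
    with the prefix, skipping anything nested a folder level deeper."""
    selected = []
    for key in bucket_keys:
        if not key.startswith(p_s3_prefix):
            continue  # key doesn't match the prefix
        if "/" in key[len(p_s3_prefix):]:
            continue  # nested below the matched level: not downloaded
        selected.append(key)
    return selected

keys = ["a.dmp", "myfolder/b.dmp", "myfolder/sub/c.dmp"]
print(downloaded_keys(keys))               # ['a.dmp'] (empty prefix: top level only)
print(downloaded_keys(keys, "myfolder/"))  # ['myfolder/b.dmp']
```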
The return value for the `rdsadmin.rdsadmin_s3_tasks.download_from_s3` procedure is a task ID.
The following example downloads all of the files in the Amazon S3 bucket named `mys3bucket` to the `DATA_PUMP_DIR` directory.

```
SELECT rdsadmin.rdsadmin_s3_tasks.download_from_s3(
      p_bucket_name    =>  'mys3bucket',
      p_directory_name =>  'DATA_PUMP_DIR')
   AS TASK_ID FROM DUAL;
```
The following example downloads all of the files with the prefix `db` in the Amazon S3 bucket named `mys3bucket` to the `DATA_PUMP_DIR` directory.

```
SELECT rdsadmin.rdsadmin_s3_tasks.download_from_s3(
      p_bucket_name    =>  'mys3bucket',
      p_s3_prefix      =>  'db',
      p_directory_name =>  'DATA_PUMP_DIR')
   AS TASK_ID FROM DUAL;
```
The following example downloads all of the files in the folder `myfolder/` in the Amazon S3 bucket named `mys3bucket` to the `DATA_PUMP_DIR` directory. Use the `p_s3_prefix` parameter setting to specify the Amazon S3 folder.

```
SELECT rdsadmin.rdsadmin_s3_tasks.download_from_s3(
      p_bucket_name    =>  'mys3bucket',
      p_s3_prefix      =>  'myfolder/',
      p_directory_name =>  'DATA_PUMP_DIR')
   AS TASK_ID FROM DUAL;
```
In each example, the SELECT statement returns the ID of the task in a VARCHAR2 data type.
You can view the result by displaying the task's output file.
```
SELECT text FROM table(rdsadmin.rds_file_util.read_text_file('BDUMP','dbtask-task-id.log'));
```

Replace `task-id` with the task ID returned by the procedure.
Tasks are executed asynchronously.
You can use the `UTL_FILE.FREMOVE` Oracle procedure to remove files from a directory. For more information, see the FREMOVE procedure in the Oracle Database documentation.
Monitoring the status of a file transfer
File transfer tasks publish Amazon RDS events when they start and when they complete. For information about viewing events, see Viewing Amazon RDS events.
You can view the status of an ongoing task in a bdump file. The bdump files are
located in the /rdsdbdata/log/trace
directory. Each bdump file name is
in the following format.
```
dbtask-task-id.log
```

Replace `task-id` with the ID of the task that you want to monitor.
Tasks are executed asynchronously.
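The log file naming convention above can be captured in a small helper (illustrative Python; `1546988886389-2444` is the example task ID used later in this section):

```python
def task_log_name(task_id: str) -> str:
    """Build the bdump file name for a transfer task,
    per the dbtask-task-id.log format described above."""
    return f"dbtask-{task_id}.log"

def read_log_query(task_id: str) -> str:
    """Build the rdsadmin.rds_file_util.read_text_file query for that log."""
    return ("SELECT text FROM table(rdsadmin.rds_file_util.read_text_file("
            f"'BDUMP','{task_log_name(task_id)}'));")

print(task_log_name("1546988886389-2444"))  # dbtask-1546988886389-2444.log
```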
You can use the `rdsadmin.rds_file_util.read_text_file` stored procedure to view the contents of bdump files. For example, the following query returns the contents of the `dbtask-1546988886389-2444.log` bdump file.

```
SELECT text FROM table(rdsadmin.rds_file_util.read_text_file('BDUMP','dbtask-1546988886389-2444.log'));
```
Removing the Amazon S3 integration option
You can remove the Amazon S3 integration option from a DB instance.
To remove the Amazon S3 integration option from a DB instance, do one of the following:
- To remove the Amazon S3 integration option from multiple DB instances, remove the `S3_INTEGRATION` option from the option group to which the DB instances belong. This change affects all DB instances that use the option group. For more information, see Removing an option from an option group.

- To remove the Amazon S3 integration option from a single DB instance, modify the DB instance and specify a different option group that doesn't include the `S3_INTEGRATION` option. You can specify the default (empty) option group or a different custom option group. For more information, see Modifying an Amazon RDS DB instance.