Amazon S3 examples using AWS CLI with Bash script - AWS SDK Code Examples

There are more AWS SDK examples available in the AWS Doc SDK Examples GitHub repo.


Amazon S3 examples using AWS CLI with Bash script

The following code examples show you how to perform actions and implement common scenarios by using the AWS Command Line Interface with Bash script with Amazon S3.

Basics are code examples that show you how to perform the essential operations within a service.

Actions are code excerpts from larger programs and must be run in context. While actions show you how to call individual service functions, you can see actions in context in their related scenarios.

Scenarios are code examples that show you how to accomplish a specific task by calling multiple functions within the same service or combined with other AWS services.

Each example includes a link to the complete source code, where you can find instructions on how to set up and run the code in context.

Basics

The following code example shows how to:

  • Create a bucket and upload a file to it.

  • Download an object from a bucket.

  • Copy an object to a subfolder in a bucket.

  • List the objects in a bucket.

  • Delete the bucket objects and the bucket.

AWS CLI with Bash script
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

###############################################################################
# function s3_getting_started
#
# This function creates, copies, and deletes S3 buckets and objects.
#
# Returns:
#       0 - If successful.
#       1 - If an error occurred.
###############################################################################
function s3_getting_started() {
  {
    if [ "$BUCKET_OPERATIONS_SOURCED" != "True" ]; then
      cd bucket-lifecycle-operations || exit
      source ./bucket_operations.sh
      cd ..
    fi
  }

  echo_repeat "*" 88
  echo "Welcome to the Amazon S3 getting started demo."
  echo_repeat "*" 88

  echo "A unique bucket will be created by appending a Universally Unique Identifier to a bucket name prefix."
  echo -n "Enter a prefix for the S3 bucket that will be used in this demo: "
  get_input
  bucket_name_prefix=$get_input_result

  local bucket_name
  bucket_name=$(generate_random_name "$bucket_name_prefix")

  local region_code
  region_code=$(aws configure get region)

  if create_bucket -b "$bucket_name" -r "$region_code"; then
    echo "Created demo bucket named $bucket_name"
  else
    errecho "The bucket failed to create. This demo will exit."
    return 1
  fi

  local file_name
  while [ -z "$file_name" ]; do
    echo -n "Enter a file you want to upload to your bucket: "
    get_input
    file_name=$get_input_result

    if [ ! -f "$file_name" ]; then
      echo "Could not find file $file_name. Are you sure it exists?"
      file_name=""
    fi
  done

  local key
  key="$(basename "$file_name")"

  local result=0
  if copy_file_to_bucket "$bucket_name" "$file_name" "$key"; then
    echo "Uploaded file $file_name into bucket $bucket_name with key $key."
  else
    result=1
  fi

  local destination_file
  destination_file="$file_name.download"
  if yes_no_input "Would you like to download $key to the file $destination_file? (y/n) "; then
    if download_object_from_bucket "$bucket_name" "$destination_file" "$key"; then
      echo "Downloaded $key in the bucket $bucket_name to the file $destination_file."
    else
      result=1
    fi
  fi

  if yes_no_input "Would you like to copy $key to a new object key in your bucket? (y/n) "; then
    local to_key
    to_key="demo/$key"
    if copy_item_in_bucket "$bucket_name" "$key" "$to_key"; then
      echo "Copied $key in the bucket $bucket_name to the $to_key."
    else
      result=1
    fi
  fi

  local bucket_items
  bucket_items=$(list_items_in_bucket "$bucket_name")

  # shellcheck disable=SC2181
  if [[ $? -ne 0 ]]; then
    result=1
  fi

  echo "Your bucket contains the following items."
  echo -e "Name\t\tSize"
  echo "$bucket_items"

  if yes_no_input "Delete the bucket, $bucket_name, as well as the objects in it? (y/n) "; then
    bucket_items=$(echo "$bucket_items" | cut -f 1)

    if delete_items_in_bucket "$bucket_name" "$bucket_items"; then
      echo "The following items were deleted from the bucket $bucket_name"
      echo "$bucket_items"
    else
      result=1
    fi

    if delete_bucket "$bucket_name"; then
      echo "Deleted the bucket $bucket_name"
    else
      result=1
    fi
  fi

  return $result
}

The Amazon S3 functions used in this scenario.

###############################################################################
# function create-bucket
#
# This function creates the specified bucket in the specified AWS Region, unless
# it already exists.
#
# Parameters:
#       -b bucket_name  -- The name of the bucket to create.
#       -r region_code  -- The code for an AWS Region in which to
#                          create the bucket.
#
# Returns:
#       The URL of the bucket that was created.
#     And:
#       0 - If successful.
#       1 - If it fails.
###############################################################################
function create_bucket() {
  local bucket_name region_code response
  local option OPTARG # Required to use getopts command in a function.

  # bashsupport disable=BP5008
  function usage() {
    echo "function create_bucket"
    echo "Creates an Amazon S3 bucket. You must supply a bucket name:"
    echo "  -b bucket_name    The name of the bucket. It must be globally unique."
    echo "  [-r region_code]    The code for an AWS Region in which the bucket is created."
    echo ""
  }

  # Retrieve the calling parameters.
  while getopts "b:r:h" option; do
    case "${option}" in
      b) bucket_name="${OPTARG}" ;;
      r) region_code="${OPTARG}" ;;
      h)
        usage
        return 0
        ;;
      \?)
        echo "Invalid parameter"
        usage
        return 1
        ;;
    esac
  done

  if [[ -z "$bucket_name" ]]; then
    errecho "ERROR: You must provide a bucket name with the -b parameter."
    usage
    return 1
  fi

  local bucket_config_arg
  # A location constraint for "us-east-1" returns an error.
  if [[ -n "$region_code" ]] && [[ "$region_code" != "us-east-1" ]]; then
    bucket_config_arg="--create-bucket-configuration LocationConstraint=$region_code"
  fi

  iecho "Parameters:\n"
  iecho "    Bucket name:   $bucket_name"
  iecho "    Region code:   $region_code"
  iecho ""

  # If the bucket already exists, we don't want to try to create it.
  if (bucket_exists "$bucket_name"); then
    errecho "ERROR: A bucket with that name already exists. Try again."
    return 1
  fi

  # shellcheck disable=SC2086
  response=$(aws s3api create-bucket \
    --bucket "$bucket_name" \
    $bucket_config_arg)

  # shellcheck disable=SC2181
  if [[ ${?} -ne 0 ]]; then
    errecho "ERROR: AWS reports create-bucket operation failed.\n$response"
    return 1
  fi
}

###############################################################################
# function copy_file_to_bucket
#
# This function creates a file in the specified bucket.
#
# Parameters:
#       $1 - The name of the bucket to copy the file to.
#       $2 - The path and file name of the local file to copy to the bucket.
#       $3 - The key (name) to call the copy of the file in the bucket.
#
# Returns:
#       0 - If successful.
#       1 - If it fails.
###############################################################################
function copy_file_to_bucket() {
  local response bucket_name source_file destination_file_name
  bucket_name=$1
  source_file=$2
  destination_file_name=$3

  response=$(aws s3api put-object \
    --bucket "$bucket_name" \
    --body "$source_file" \
    --key "$destination_file_name")

  # shellcheck disable=SC2181
  if [[ ${?} -ne 0 ]]; then
    errecho "ERROR: AWS reports put-object operation failed.\n$response"
    return 1
  fi
}

###############################################################################
# function download_object_from_bucket
#
# This function downloads an object in a bucket to a file.
#
# Parameters:
#       $1 - The name of the bucket to download the object from.
#       $2 - The path and file name to store the downloaded object.
#       $3 - The key (name) of the object in the bucket.
#
# Returns:
#       0 - If successful.
#       1 - If it fails.
###############################################################################
function download_object_from_bucket() {
  local bucket_name=$1
  local destination_file_name=$2
  local object_name=$3
  local response

  response=$(aws s3api get-object \
    --bucket "$bucket_name" \
    --key "$object_name" \
    "$destination_file_name")

  # shellcheck disable=SC2181
  if [[ ${?} -ne 0 ]]; then
    errecho "ERROR: AWS reports get-object operation failed.\n$response"
    return 1
  fi
}

###############################################################################
# function copy_item_in_bucket
#
# This function creates a copy of the specified file in the same bucket.
#
# Parameters:
#       $1 - The name of the bucket to copy the file from and to.
#       $2 - The key of the source file to copy.
#       $3 - The key of the destination file.
#
# Returns:
#       0 - If successful.
#       1 - If it fails.
###############################################################################
function copy_item_in_bucket() {
  local bucket_name=$1
  local source_key=$2
  local destination_key=$3
  local response

  response=$(aws s3api copy-object \
    --bucket "$bucket_name" \
    --copy-source "$bucket_name/$source_key" \
    --key "$destination_key")

  # shellcheck disable=SC2181
  if [[ $? -ne 0 ]]; then
    errecho "ERROR: AWS reports s3api copy-object operation failed.\n$response"
    return 1
  fi
}

###############################################################################
# function list_items_in_bucket
#
# This function displays a list of the files in the bucket with each file's
# size. The function uses the --query parameter to retrieve only the key and
# size fields from the Contents collection.
#
# Parameters:
#       $1 - The name of the bucket.
#
# Returns:
#       The list of files in text format.
#     And:
#       0 - If successful.
#       1 - If it fails.
###############################################################################
function list_items_in_bucket() {
  local bucket_name=$1
  local response

  response=$(aws s3api list-objects \
    --bucket "$bucket_name" \
    --output text \
    --query 'Contents[].{Key: Key, Size: Size}')

  # shellcheck disable=SC2181
  if [[ ${?} -eq 0 ]]; then
    echo "$response"
  else
    errecho "ERROR: AWS reports s3api list-objects operation failed.\n$response"
    return 1
  fi
}

###############################################################################
# function delete_items_in_bucket
#
# This function deletes the specified list of keys from the specified bucket.
#
# Parameters:
#       $1 - The name of the bucket.
#       $2 - A list of keys in the bucket to delete.
#
# Returns:
#       0 - If successful.
#       1 - If it fails.
###############################################################################
function delete_items_in_bucket() {
  local bucket_name=$1
  local keys=$2
  local response

  # Create the JSON for the items to delete.
  local delete_items
  delete_items="{\"Objects\":["
  for key in $keys; do
    delete_items="$delete_items{\"Key\": \"$key\"},"
  done
  delete_items=${delete_items%?} # Remove the final comma.
  delete_items="$delete_items]}"

  response=$(aws s3api delete-objects \
    --bucket "$bucket_name" \
    --delete "$delete_items")

  # shellcheck disable=SC2181
  if [[ $? -ne 0 ]]; then
    errecho "ERROR: AWS reports s3api delete-objects operation failed.\n$response"
    return 1
  fi
}

###############################################################################
# function delete_bucket
#
# This function deletes the specified bucket.
#
# Parameters:
#       $1 - The name of the bucket.
#
# Returns:
#       0 - If successful.
#       1 - If it fails.
###############################################################################
function delete_bucket() {
  local bucket_name=$1
  local response

  response=$(aws s3api delete-bucket \
    --bucket "$bucket_name")

  # shellcheck disable=SC2181
  if [[ $? -ne 0 ]]; then
    errecho "ERROR: AWS reports s3api delete-bucket failed.\n$response"
    return 1
  fi
}
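As a quick sanity check, you might source the file that contains these functions (together with the bucket_exists, iecho, and errecho helpers they rely on) and run one end-to-end round trip. The bucket name and file below are placeholders, and the sketch assumes AWS credentials and a default Region are configured for the AWS CLI:

# Minimal round trip with the functions above (placeholder names).
bucket="amzn-s3-demo-bucket-$RANDOM"
echo "hello" >hello.txt
create_bucket -b "$bucket" -r "$(aws configure get region)" &&
  copy_file_to_bucket "$bucket" "hello.txt" "hello.txt" &&
  list_items_in_bucket "$bucket" &&
  delete_items_in_bucket "$bucket" "hello.txt" &&
  delete_bucket "$bucket"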

Actions

The following code example shows how to use CopyObject.

AWS CLI with Bash script
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

###############################################################################
# function errecho
#
# This function outputs everything sent to it to STDERR (standard error output).
###############################################################################
function errecho() {
  printf "%s\n" "$*" 1>&2
}

###############################################################################
# function copy_item_in_bucket
#
# This function creates a copy of the specified file in the same bucket.
#
# Parameters:
#       $1 - The name of the bucket to copy the file from and to.
#       $2 - The key of the source file to copy.
#       $3 - The key of the destination file.
#
# Returns:
#       0 - If successful.
#       1 - If it fails.
###############################################################################
function copy_item_in_bucket() {
  local bucket_name=$1
  local source_key=$2
  local destination_key=$3
  local response

  response=$(aws s3api copy-object \
    --bucket "$bucket_name" \
    --copy-source "$bucket_name/$source_key" \
    --key "$destination_key")

  # shellcheck disable=SC2181
  if [[ $? -ne 0 ]]; then
    errecho "ERROR: AWS reports s3api copy-object operation failed.\n$response"
    return 1
  fi
}
  • For API details, see CopyObject in AWS CLI Command Reference.
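For instance, once the functions above are sourced into the shell, copy_item_in_bucket might be called as follows; the bucket and key names are placeholders:

# Copy an existing object to a new key in the same bucket (placeholder names).
copy_item_in_bucket "amzn-s3-demo-bucket" "notes.txt" "backup/notes.txt" &&
  echo "Copy succeeded."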

The following code example shows how to use CreateBucket.

AWS CLI with Bash script
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

###############################################################################
# function iecho
#
# This function enables the script to display the specified text only if
# the global variable $VERBOSE is set to true.
###############################################################################
function iecho() {
  if [[ $VERBOSE == true ]]; then
    echo "$@"
  fi
}

###############################################################################
# function errecho
#
# This function outputs everything sent to it to STDERR (standard error output).
###############################################################################
function errecho() {
  printf "%s\n" "$*" 1>&2
}

###############################################################################
# function create-bucket
#
# This function creates the specified bucket in the specified AWS Region, unless
# it already exists.
#
# Parameters:
#       -b bucket_name  -- The name of the bucket to create.
#       -r region_code  -- The code for an AWS Region in which to
#                          create the bucket.
#
# Returns:
#       The URL of the bucket that was created.
#     And:
#       0 - If successful.
#       1 - If it fails.
###############################################################################
function create_bucket() {
  local bucket_name region_code response
  local option OPTARG # Required to use getopts command in a function.

  # bashsupport disable=BP5008
  function usage() {
    echo "function create_bucket"
    echo "Creates an Amazon S3 bucket. You must supply a bucket name:"
    echo "  -b bucket_name    The name of the bucket. It must be globally unique."
    echo "  [-r region_code]    The code for an AWS Region in which the bucket is created."
    echo ""
  }

  # Retrieve the calling parameters.
  while getopts "b:r:h" option; do
    case "${option}" in
      b) bucket_name="${OPTARG}" ;;
      r) region_code="${OPTARG}" ;;
      h)
        usage
        return 0
        ;;
      \?)
        echo "Invalid parameter"
        usage
        return 1
        ;;
    esac
  done

  if [[ -z "$bucket_name" ]]; then
    errecho "ERROR: You must provide a bucket name with the -b parameter."
    usage
    return 1
  fi

  local bucket_config_arg
  # A location constraint for "us-east-1" returns an error.
  if [[ -n "$region_code" ]] && [[ "$region_code" != "us-east-1" ]]; then
    bucket_config_arg="--create-bucket-configuration LocationConstraint=$region_code"
  fi

  iecho "Parameters:\n"
  iecho "    Bucket name:   $bucket_name"
  iecho "    Region code:   $region_code"
  iecho ""

  # If the bucket already exists, we don't want to try to create it.
  if (bucket_exists "$bucket_name"); then
    errecho "ERROR: A bucket with that name already exists. Try again."
    return 1
  fi

  # shellcheck disable=SC2086
  response=$(aws s3api create-bucket \
    --bucket "$bucket_name" \
    $bucket_config_arg)

  # shellcheck disable=SC2181
  if [[ ${?} -ne 0 ]]; then
    errecho "ERROR: AWS reports create-bucket operation failed.\n$response"
    return 1
  fi
}
  • For API details, see CreateBucket in AWS CLI Command Reference.
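For instance, assuming the bucket_exists helper shown under HeadBucket is also sourced, create_bucket might be invoked like this with a placeholder bucket name:

# Create a bucket in a chosen Region (placeholder name; must be globally unique).
create_bucket -b "amzn-s3-demo-bucket" -r "us-west-2" &&
  echo "Bucket created."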

The following code example shows how to use DeleteBucket.

AWS CLI with Bash script
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

###############################################################################
# function errecho
#
# This function outputs everything sent to it to STDERR (standard error output).
###############################################################################
function errecho() {
  printf "%s\n" "$*" 1>&2
}

###############################################################################
# function delete_bucket
#
# This function deletes the specified bucket.
#
# Parameters:
#       $1 - The name of the bucket.
#
# Returns:
#       0 - If successful.
#       1 - If it fails.
###############################################################################
function delete_bucket() {
  local bucket_name=$1
  local response

  response=$(aws s3api delete-bucket \
    --bucket "$bucket_name")

  # shellcheck disable=SC2181
  if [[ $? -ne 0 ]]; then
    errecho "ERROR: AWS reports s3api delete-bucket failed.\n$response"
    return 1
  fi
}
  • For API details, see DeleteBucket in AWS CLI Command Reference.
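A possible invocation, assuming the functions above are sourced and the bucket is already empty; the bucket name is a placeholder:

# Delete an empty bucket (placeholder name).
delete_bucket "amzn-s3-demo-bucket" && echo "Bucket deleted."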

The following code example shows how to use DeleteObject.

AWS CLI with Bash script
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

###############################################################################
# function errecho
#
# This function outputs everything sent to it to STDERR (standard error output).
###############################################################################
function errecho() {
  printf "%s\n" "$*" 1>&2
}

###############################################################################
# function delete_item_in_bucket
#
# This function deletes the specified file from the specified bucket.
#
# Parameters:
#       $1 - The name of the bucket.
#       $2 - The key (file name) in the bucket to delete.
#
# Returns:
#       0 - If successful.
#       1 - If it fails.
###############################################################################
function delete_item_in_bucket() {
  local bucket_name=$1
  local key=$2
  local response

  response=$(aws s3api delete-object \
    --bucket "$bucket_name" \
    --key "$key")

  # shellcheck disable=SC2181
  if [[ $? -ne 0 ]]; then
    errecho "ERROR: AWS reports s3api delete-object operation failed.\n$response"
    return 1
  fi
}
  • For API details, see DeleteObject in AWS CLI Command Reference.
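A possible invocation with placeholder bucket and key names, assuming the functions above are sourced:

# Delete a single object by key (placeholder names).
delete_item_in_bucket "amzn-s3-demo-bucket" "notes.txt" && echo "Object deleted."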

The following code example shows how to use DeleteObjects.

AWS CLI with Bash script
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

###############################################################################
# function errecho
#
# This function outputs everything sent to it to STDERR (standard error output).
###############################################################################
function errecho() {
  printf "%s\n" "$*" 1>&2
}

###############################################################################
# function delete_items_in_bucket
#
# This function deletes the specified list of keys from the specified bucket.
#
# Parameters:
#       $1 - The name of the bucket.
#       $2 - A list of keys in the bucket to delete.
#
# Returns:
#       0 - If successful.
#       1 - If it fails.
###############################################################################
function delete_items_in_bucket() {
  local bucket_name=$1
  local keys=$2
  local response

  # Create the JSON for the items to delete.
  local delete_items
  delete_items="{\"Objects\":["
  for key in $keys; do
    delete_items="$delete_items{\"Key\": \"$key\"},"
  done
  delete_items=${delete_items%?} # Remove the final comma.
  delete_items="$delete_items]}"

  response=$(aws s3api delete-objects \
    --bucket "$bucket_name" \
    --delete "$delete_items")

  # shellcheck disable=SC2181
  if [[ $? -ne 0 ]]; then
    errecho "ERROR: AWS reports s3api delete-objects operation failed.\n$response"
    return 1
  fi
}
  • For API details, see DeleteObjects in AWS CLI Command Reference.
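Because delete_items_in_bucket builds the JSON payload from a whitespace-separated list of keys, it might be called like this (placeholder names, functions sourced):

# Delete several objects in one request (placeholder names).
delete_items_in_bucket "amzn-s3-demo-bucket" "notes.txt backup/notes.txt" &&
  echo "Objects deleted."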

The following code example shows how to use GetObject.

AWS CLI with Bash script
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

###############################################################################
# function errecho
#
# This function outputs everything sent to it to STDERR (standard error output).
###############################################################################
function errecho() {
  printf "%s\n" "$*" 1>&2
}

###############################################################################
# function download_object_from_bucket
#
# This function downloads an object in a bucket to a file.
#
# Parameters:
#       $1 - The name of the bucket to download the object from.
#       $2 - The path and file name to store the downloaded object.
#       $3 - The key (name) of the object in the bucket.
#
# Returns:
#       0 - If successful.
#       1 - If it fails.
###############################################################################
function download_object_from_bucket() {
  local bucket_name=$1
  local destination_file_name=$2
  local object_name=$3
  local response

  response=$(aws s3api get-object \
    --bucket "$bucket_name" \
    --key "$object_name" \
    "$destination_file_name")

  # shellcheck disable=SC2181
  if [[ ${?} -ne 0 ]]; then
    errecho "ERROR: AWS reports get-object operation failed.\n$response"
    return 1
  fi
}
  • For API details, see GetObject in AWS CLI Command Reference.
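A possible invocation with placeholder names, assuming the functions above are sourced; note that the destination file comes before the object key:

# Download an object to a local file (placeholder names).
download_object_from_bucket "amzn-s3-demo-bucket" "notes-local.txt" "notes.txt" &&
  echo "Download complete."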

The following code example shows how to use HeadBucket.

AWS CLI with Bash script
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

###############################################################################
# function bucket_exists
#
# This function checks to see if the specified bucket already exists.
#
# Parameters:
#       $1 - The name of the bucket to check.
#
# Returns:
#       0 - If the bucket already exists.
#       1 - If the bucket doesn't exist.
###############################################################################
function bucket_exists() {
  local bucket_name
  bucket_name=$1

  # Check whether the bucket already exists.
  # We suppress all output - we're interested only in the return code.
  if aws s3api head-bucket \
    --bucket "$bucket_name" \
    >/dev/null 2>&1; then
    return 0 # 0 in Bash script means true.
  else
    return 1 # 1 in Bash script means false.
  fi
}
  • For API details, see HeadBucket in AWS CLI Command Reference.
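Because bucket_exists only returns an exit code, it drops straight into a conditional; the bucket name below is a placeholder:

# Test for bucket existence (placeholder name).
if bucket_exists "amzn-s3-demo-bucket"; then
  echo "Bucket exists."
else
  echo "Bucket does not exist."
fi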

The following code example shows how to use ListObjectsV2.

AWS CLI with Bash script
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

###############################################################################
# function errecho
#
# This function outputs everything sent to it to STDERR (standard error output).
###############################################################################
function errecho() {
  printf "%s\n" "$*" 1>&2
}

###############################################################################
# function list_items_in_bucket
#
# This function displays a list of the files in the bucket with each file's
# size. The function uses the --query parameter to retrieve only the key and
# size fields from the Contents collection.
#
# Parameters:
#       $1 - The name of the bucket.
#
# Returns:
#       The list of files in text format.
#     And:
#       0 - If successful.
#       1 - If it fails.
###############################################################################
function list_items_in_bucket() {
  local bucket_name=$1
  local response

  response=$(aws s3api list-objects \
    --bucket "$bucket_name" \
    --output text \
    --query 'Contents[].{Key: Key, Size: Size}')

  # shellcheck disable=SC2181
  if [[ ${?} -eq 0 ]]; then
    echo "$response"
  else
    errecho "ERROR: AWS reports s3api list-objects operation failed.\n$response"
    return 1
  fi
}
  • For API details, see ListObjectsV2 in AWS CLI Command Reference.
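Since list_items_in_bucket prints tab-separated key and size values on success, it might be used like this (placeholder bucket name, functions sourced):

# List keys and sizes in a bucket (placeholder name).
if contents=$(list_items_in_bucket "amzn-s3-demo-bucket"); then
  printf 'Key\tSize\n%s\n' "$contents"
fi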

The following code example shows how to use PutObject.

AWS CLI with Bash script
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

###############################################################################
# function errecho
#
# This function outputs everything sent to it to STDERR (standard error output).
###############################################################################
function errecho() {
  printf "%s\n" "$*" 1>&2
}

###############################################################################
# function copy_file_to_bucket
#
# This function creates a file in the specified bucket.
#
# Parameters:
#       $1 - The name of the bucket to copy the file to.
#       $2 - The path and file name of the local file to copy to the bucket.
#       $3 - The key (name) to call the copy of the file in the bucket.
#
# Returns:
#       0 - If successful.
#       1 - If it fails.
###############################################################################
function copy_file_to_bucket() {
  local response bucket_name source_file destination_file_name
  bucket_name=$1
  source_file=$2
  destination_file_name=$3

  response=$(aws s3api put-object \
    --bucket "$bucket_name" \
    --body "$source_file" \
    --key "$destination_file_name")

  # shellcheck disable=SC2181
  if [[ ${?} -ne 0 ]]; then
    errecho "ERROR: AWS reports put-object operation failed.\n$response"
    return 1
  fi
}
  • For API details, see PutObject in AWS CLI Command Reference.
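A possible invocation with placeholder names, assuming the functions above are sourced:

# Upload a local file under a chosen key (placeholder names).
copy_file_to_bucket "amzn-s3-demo-bucket" "./notes.txt" "notes.txt" &&
  echo "Upload complete."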

Scenarios

The following code example shows how to:

  • Create an S3 bucket with a unique name and Region configuration

  • Configure bucket security settings, including blocking public access

  • Enable versioning and default encryption to protect data

  • Upload objects with and without custom metadata

  • Download objects from the bucket to local storage

  • Copy objects within the bucket to organize data in folders

  • List bucket contents and objects with specific prefixes

  • Add tags to the bucket for resource management

  • Clean up all resources, including versioned objects

AWS CLI with Bash script
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the example developer tutorials repository.

#!/bin/bash

# Amazon S3 Getting Started Tutorial Script
# This script demonstrates basic S3 operations including:
# - Creating a bucket
# - Configuring bucket settings
# - Uploading, downloading, and copying objects
# - Deleting objects and buckets

# Latest fixes:
# 1. Fixed folder creation using temporary file
# 2. Corrected versioned object deletion in cleanup
# 3. Improved error handling for cleanup operations

# Set up error handling
set -e
trap 'cleanup_handler $?' EXIT

# Log file setup
LOG_FILE="s3-tutorial-$(date +%Y%m%d-%H%M%S).log"
exec > >(tee -a "$LOG_FILE") 2>&1

# Function to log messages
log() {
  echo "[$(date +"%Y-%m-%d %H:%M:%S")] $1"
}

# Function to handle errors
handle_error() {
  log "ERROR: $1"
  exit 1
}

# Function to check if a bucket exists
bucket_exists() {
  if aws s3api head-bucket --bucket "$1" 2>/dev/null; then
    return 0
  else
    return 1
  fi
}

# Function to delete all versions of objects in a bucket
delete_all_versions() {
  local bucket=$1
  log "Deleting all object versions from bucket $bucket..."

  # Get and delete all versions
  versions=$(aws s3api list-object-versions --bucket "$bucket" --query 'Versions[].{Key:Key,VersionId:VersionId}' --output json 2>/dev/null)
  if [ -n "$versions" ] && [ "$versions" != "null" ]; then
    echo "{\"Objects\": $versions}" | aws s3api delete-objects --bucket "$bucket" --delete file:///dev/stdin >/dev/null 2>&1 || log "Warning: Some versions could not be deleted"
  fi

  # Get and delete all delete markers
  markers=$(aws s3api list-object-versions --bucket "$bucket" --query 'DeleteMarkers[].{Key:Key,VersionId:VersionId}' --output json 2>/dev/null)
  if [ -n "$markers" ] && [ "$markers" != "null" ]; then
    echo "{\"Objects\": $markers}" | aws s3api delete-objects --bucket "$bucket" --delete file:///dev/stdin >/dev/null 2>&1 || log "Warning: Some delete markers could not be deleted"
  fi
}

# Function to handle cleanup on exit
cleanup_handler() {
  local exit_code=$1

  # Only run cleanup if it hasn't been run already
  if [ -z "$CLEANUP_DONE" ]; then
    cleanup
  fi

  exit $exit_code
}

# Function to clean up resources
cleanup() {
  log "Starting cleanup process..."
  CLEANUP_DONE=1

  # List all resources created for confirmation
  log "Resources created:"
  if [ -n "$BUCKET_NAME" ]; then
    log "- S3 Bucket: $BUCKET_NAME"

    # Only try to list objects if the bucket exists
    if bucket_exists "$BUCKET_NAME"; then
      # Check if any objects were created
      OBJECTS=$(aws s3api list-objects-v2 --bucket "$BUCKET_NAME" --query 'Contents[].Key' --output text 2>/dev/null || echo "")
      if [ -n "$OBJECTS" ]; then
        log "- Objects in bucket:"
        echo "$OBJECTS" | tr '\t' '\n' | while read -r obj; do
          log "  - $obj"
        done
      fi

      # Ask for confirmation before cleanup
      read -p "Do you want to proceed with cleanup and delete all resources? (y/n): " confirm
      if [[ $confirm != [yY] && $confirm != [yY][eE][sS] ]]; then
        log "Cleanup aborted by user."
        return
      fi

      # Delete all versions of objects
      delete_all_versions "$BUCKET_NAME"

      # Delete the bucket
      log "Deleting bucket $BUCKET_NAME..."
      aws s3api delete-bucket --bucket "$BUCKET_NAME" || log "Warning: Failed to delete bucket"
    else
      log "Bucket $BUCKET_NAME does not exist, skipping cleanup"
    fi
  fi

  # Clean up local files
  log "Removing local files..."
  rm -f sample-file.txt sample-document.txt downloaded-sample-file.txt empty-file.tmp

  log "Cleanup completed."
}

# Generate a random bucket name
generate_bucket_name() {
  local hex_id
  hex_id=$(openssl rand -hex 6)
  echo "demo-s3-bucket-$hex_id"
}

# Main script execution
main() {
  log "Starting Amazon S3 Getting Started Tutorial"

  # Generate a unique bucket name
  BUCKET_NAME=$(generate_bucket_name)
  log "Generated bucket name: $BUCKET_NAME"

  # Step 1: Create a bucket
  log "Step 1: Creating S3 bucket..."

  # Get the current region or default to us-east-1
  REGION=$(aws configure get region)
  REGION=${REGION:-us-east-1}
  log "Using region: $REGION"

  if [ "$REGION" = "us-east-1" ]; then
    aws s3api create-bucket --bucket "$BUCKET_NAME" || handle_error "Failed to create bucket"
  else
    aws s3api create-bucket \
      --bucket "$BUCKET_NAME" \
      --region "$REGION" \
      --create-bucket-configuration LocationConstraint="$REGION" || handle_error "Failed to create bucket"
  fi
  log "Bucket created successfully"

  # Configure bucket settings
  log "Configuring bucket settings..."

  # Block public access (security best practice)
  log "Blocking public access..."
  aws s3api put-public-access-block \
    --bucket "$BUCKET_NAME" \
    --public-access-block-configuration "BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true" || handle_error "Failed to configure public access block"

  # Enable versioning
  log "Enabling versioning..."
  aws s3api put-bucket-versioning \
    --bucket "$BUCKET_NAME" \
    --versioning-configuration Status=Enabled || handle_error "Failed to enable versioning"

  # Set default encryption
  log "Setting default encryption..."
  aws s3api put-bucket-encryption \
    --bucket "$BUCKET_NAME" \
    --server-side-encryption-configuration '{"Rules": [{"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}]}' || handle_error "Failed to set encryption"

  # Step 2: Upload an object
  log "Step 2: Uploading objects to bucket..."

  # Create a sample file
  echo "This is a sample file for the S3 tutorial." > sample-file.txt

  # Upload the file
  aws s3api put-object \
    --bucket "$BUCKET_NAME" \
    --key "sample-file.txt" \
    --body "sample-file.txt" || handle_error "Failed to upload object"
  log "Object uploaded successfully"

  # Upload with metadata
  echo "This is a document with metadata." > sample-document.txt
  aws s3api put-object \
    --bucket "$BUCKET_NAME" \
    --key "documents/sample-document.txt" \
    --body "sample-document.txt" \
    --content-type "text/plain" \
    --metadata "author=AWSDocumentation,purpose=tutorial" || handle_error "Failed to upload object with metadata"
  log "Object with metadata uploaded successfully"

  # Step 3: Download an object
  log "Step 3: Downloading object from bucket..."
  aws s3api get-object \
    --bucket "$BUCKET_NAME" \
    --key "sample-file.txt" \
    "downloaded-sample-file.txt" || handle_error "Failed to download object"
  log "Object downloaded successfully"

  # Check if an object exists
  log "Checking if object exists..."
  aws s3api head-object \
    --bucket "$BUCKET_NAME" \
    --key "sample-file.txt" || handle_error "Object does not exist"
  log "Object exists"

  # Step 4: Copy object to a folder
  log "Step 4: Copying object to a folder..."

  # Create a folder structure using a temporary empty file
  log "Creating folder structure..."
  touch empty-file.tmp
  aws s3api put-object \
    --bucket "$BUCKET_NAME" \
    --key "favorite-files/" \
    --body empty-file.tmp || handle_error "Failed to create folder"

  # Copy the object
  log "Copying object..."
  aws s3api copy-object \
    --bucket "$BUCKET_NAME" \
    --copy-source "$BUCKET_NAME/sample-file.txt" \
    --key "favorite-files/sample-file.txt" || handle_error "Failed to copy object"
  log "Object copied successfully"

  # List objects in the bucket
  log "Listing all objects in the bucket..."
  aws s3api list-objects-v2 \
    --bucket "$BUCKET_NAME" \
    --query 'Contents[].Key' \
    --output table || handle_error "Failed to list objects"

  # List objects with a specific prefix
  log "Listing objects in the favorite-files folder..."
  aws s3api list-objects-v2 \
    --bucket "$BUCKET_NAME" \
    --prefix "favorite-files/" \
    --query 'Contents[].Key' \
    --output table || handle_error "Failed to list objects with prefix"

  # Add tags to the bucket
  log "Adding tags to the bucket..."
  aws s3api put-bucket-tagging \
    --bucket "$BUCKET_NAME" \
    --tagging 'TagSet=[{Key=Project,Value=S3Tutorial},{Key=Environment,Value=Demo}]' || handle_error "Failed to add tags"
  log "Tags added successfully"

  log "Tutorial completed successfully!"
}

# Execute the main function
main
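To try the tutorial, you might save the script under a name of your choosing (the file name below is a placeholder), make it executable, and run it with AWS credentials and, optionally, a default Region configured for the AWS CLI. On exit, the script prompts before deleting the resources it created:

chmod +x s3-getting-started.sh   # placeholder file name
./s3-getting-started.sh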