There are more AWS SDK examples available in the AWS Doc SDK Examples GitHub repo.

Use PutObject with an AWS SDK or CLI

The following code examples show how to use PutObject.

Action examples are code excerpts from larger programs and must be run in context. You can see this action in context in the following code examples:

.NET
AWS SDK for .NET
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

/// <summary>
/// Shows how to upload a file from the local computer to an Amazon S3
/// bucket.
/// </summary>
/// <param name="client">An initialized Amazon S3 client object.</param>
/// <param name="bucketName">The Amazon S3 bucket to which the object
/// will be uploaded.</param>
/// <param name="objectName">The object to upload.</param>
/// <param name="filePath">The path, including file name, of the object
/// on the local computer to upload.</param>
/// <returns>A boolean value indicating the success or failure of the
/// upload procedure.</returns>
public static async Task<bool> UploadFileAsync(
    IAmazonS3 client,
    string bucketName,
    string objectName,
    string filePath)
{
    var request = new PutObjectRequest
    {
        BucketName = bucketName,
        Key = objectName,
        FilePath = filePath,
    };

    var response = await client.PutObjectAsync(request);
    if (response.HttpStatusCode == System.Net.HttpStatusCode.OK)
    {
        Console.WriteLine($"Successfully uploaded {objectName} to {bucketName}.");
        return true;
    }
    else
    {
        Console.WriteLine($"Could not upload {objectName} to {bucketName}.");
        return false;
    }
}
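A minimal usage sketch, assuming default credentials and Region, placeholder bucket and file names, and a call made from within an async method:

// Create a client and upload a local file.
IAmazonS3 s3Client = new AmazonS3Client();
bool uploaded = await UploadFileAsync(s3Client, "amzn-s3-demo-bucket", "sample.txt", @"C:\temp\sample.txt");
Console.WriteLine(uploaded ? "Upload complete." : "Upload failed.");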

Upload an object with server-side encryption.

using System;
using System.Threading.Tasks;
using Amazon.S3;
using Amazon.S3.Model;

/// <summary>
/// This example shows how to upload an object to an Amazon Simple Storage
/// Service (Amazon S3) bucket with server-side encryption enabled.
/// </summary>
public class ServerSideEncryption
{
    public static async Task Main()
    {
        string bucketName = "amzn-s3-demo-bucket";
        string keyName = "samplefile.txt";

        // If the AWS Region defined for your default user is different
        // from the Region where your Amazon S3 bucket is located,
        // pass the Region name to the Amazon S3 client object's constructor.
        // For example: RegionEndpoint.USWest2.
        IAmazonS3 client = new AmazonS3Client();
        await WritingAnObjectAsync(client, bucketName, keyName);
    }

    /// <summary>
    /// Upload a sample object, including a setting for encryption.
    /// </summary>
    /// <param name="client">The initialized Amazon S3 client object used to
    /// upload a file and apply server-side encryption.</param>
    /// <param name="bucketName">The name of the Amazon S3 bucket where the
    /// encrypted object will reside.</param>
    /// <param name="keyName">The name for the object that you want to
    /// create in the supplied bucket.</param>
    public static async Task WritingAnObjectAsync(IAmazonS3 client, string bucketName, string keyName)
    {
        try
        {
            var putRequest = new PutObjectRequest
            {
                BucketName = bucketName,
                Key = keyName,
                ContentBody = "sample text",
                ServerSideEncryptionMethod = ServerSideEncryptionMethod.AES256,
            };

            var putResponse = await client.PutObjectAsync(putRequest);

            // Determine the encryption state of an object.
            GetObjectMetadataRequest metadataRequest = new GetObjectMetadataRequest
            {
                BucketName = bucketName,
                Key = keyName,
            };

            GetObjectMetadataResponse response = await client.GetObjectMetadataAsync(metadataRequest);
            ServerSideEncryptionMethod objectEncryption = response.ServerSideEncryptionMethod;

            Console.WriteLine($"Encryption method used: {objectEncryption}");
        }
        catch (AmazonS3Exception ex)
        {
            Console.WriteLine($"Error: '{ex.Message}' when writing an object");
        }
    }
}
  • For API details, see PutObject in AWS SDK for .NET API Reference.

Bash
AWS CLI with Bash script
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

###############################################################################
# function errecho
#
# This function outputs everything sent to it to STDERR (standard error output).
###############################################################################
function errecho() {
  printf "%s\n" "$*" 1>&2
}

###############################################################################
# function copy_file_to_bucket
#
# This function creates a file in the specified bucket.
#
# Parameters:
#       $1 - The name of the bucket to copy the file to.
#       $2 - The path and file name of the local file to copy to the bucket.
#       $3 - The key (name) to call the copy of the file in the bucket.
#
# Returns:
#       0 - If successful.
#       1 - If it fails.
###############################################################################
function copy_file_to_bucket() {
  local response bucket_name source_file destination_file_name
  bucket_name=$1
  source_file=$2
  destination_file_name=$3

  response=$(aws s3api put-object \
    --bucket "$bucket_name" \
    --body "$source_file" \
    --key "$destination_file_name")

  # shellcheck disable=SC2181
  if [[ ${?} -ne 0 ]]; then
    errecho "ERROR: AWS reports put-object operation failed.\n$response"
    return 1
  fi
}
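A minimal usage sketch, assuming the AWS CLI is configured with credentials and a default Region; the bucket, file, and key names are placeholders:

# Copy a local file into the bucket under the given key.
if copy_file_to_bucket "amzn-s3-demo-bucket" "./local-file.txt" "my-object-key"; then
  echo "Upload succeeded."
else
  errecho "Upload failed."
fi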
  • For API details, see PutObject in AWS CLI Command Reference.

C++
SDK for C++
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

bool AwsDoc::S3::putObject(const Aws::String &bucketName,
                           const Aws::String &fileName,
                           const Aws::S3::S3ClientConfiguration &clientConfig) {
    Aws::S3::S3Client s3Client(clientConfig);

    Aws::S3::Model::PutObjectRequest request;
    request.SetBucket(bucketName);
    // We are using the name of the file as the key for the object in the bucket.
    // However, this is just a string and can be set according to your retrieval needs.
    request.SetKey(fileName);

    std::shared_ptr<Aws::IOStream> inputData =
            Aws::MakeShared<Aws::FStream>("SampleAllocationTag",
                                          fileName.c_str(),
                                          std::ios_base::in | std::ios_base::binary);

    if (!*inputData) {
        std::cerr << "Error unable to read file " << fileName << std::endl;
        return false;
    }

    request.SetBody(inputData);

    Aws::S3::Model::PutObjectOutcome outcome = s3Client.PutObject(request);

    if (!outcome.IsSuccess()) {
        std::cerr << "Error: putObject: " << outcome.GetError().GetMessage() << std::endl;
    }
    else {
        std::cout << "Added object '" << fileName << "' to bucket '" << bucketName << "'.";
    }

    return outcome.IsSuccess();
}
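A minimal usage sketch, assuming placeholder bucket and file names; the SDK must be initialized before any client is created and shut down afterward:

#include <aws/core/Aws.h>

int main() {
    Aws::SDKOptions options;
    Aws::InitAPI(options);
    {
        Aws::S3::S3ClientConfiguration clientConfig;
        // Optionally set clientConfig.region to the bucket's Region.
        AwsDoc::S3::putObject("amzn-s3-demo-bucket", "local-file.txt", clientConfig);
    }
    Aws::ShutdownAPI(options);
    return 0;
}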
  • For API details, see PutObject in AWS SDK for C++ API Reference.

CLI
AWS CLI

The following example uses the put-object command to upload an object to Amazon S3:

aws s3api put-object --bucket text-content --key dir-1/my_images.tar.bz2 --body my_images.tar.bz2

The following example uploads a video file (the file path is specified using Windows file system syntax):

aws s3api put-object --bucket text-content --key dir-1/big-video-file.mp4 --body e:\media\videos\f-sharp-3-data-services.mp4
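
The put-object command also accepts optional parameters. The following sketch, with placeholder bucket and file names, sets a content type and requests AES256 server-side encryption:

aws s3api put-object --bucket amzn-s3-demo-bucket --key dir-1/report.pdf --body report.pdf --content-type application/pdf --server-side-encryption AES256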

For more information about uploading objects, see Uploading Objects in the Amazon S3 Developer Guide.

  • For API details, see PutObject in AWS CLI Command Reference.

Go
SDK for Go V2
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

Put an object in a bucket by using the low-level API.

import (
	"bytes"
	"context"
	"errors"
	"fmt"
	"io"
	"log"
	"os"
	"time"

	"github.com/aws/aws-sdk-go-v2/aws"
	"github.com/aws/aws-sdk-go-v2/feature/s3/manager"
	"github.com/aws/aws-sdk-go-v2/service/s3"
	"github.com/aws/aws-sdk-go-v2/service/s3/types"
	"github.com/aws/smithy-go"
)

// BucketBasics encapsulates the Amazon Simple Storage Service (Amazon S3) actions
// used in the examples.
// It contains S3Client, an Amazon S3 service client that is used to perform bucket
// and object actions.
type BucketBasics struct {
	S3Client *s3.Client
}

// UploadFile reads from a file and puts the data into an object in a bucket.
func (basics BucketBasics) UploadFile(ctx context.Context, bucketName string, objectKey string, fileName string) error {
	file, err := os.Open(fileName)
	if err != nil {
		log.Printf("Couldn't open file %v to upload. Here's why: %v\n", fileName, err)
	} else {
		defer file.Close()
		_, err = basics.S3Client.PutObject(ctx, &s3.PutObjectInput{
			Bucket: aws.String(bucketName),
			Key:    aws.String(objectKey),
			Body:   file,
		})
		if err != nil {
			var apiErr smithy.APIError
			if errors.As(err, &apiErr) && apiErr.ErrorCode() == "EntityTooLarge" {
				log.Printf("Error while uploading object to %s. The object is too large.\n"+
					"To upload objects larger than 5GB, use the S3 console (160GB max)\n"+
					"or the multipart upload API (5TB max).", bucketName)
			} else {
				log.Printf("Couldn't upload file %v to %v:%v. Here's why: %v\n",
					fileName, bucketName, objectKey, err)
			}
		} else {
			err = s3.NewObjectExistsWaiter(basics.S3Client).Wait(
				ctx, &s3.HeadObjectInput{Bucket: aws.String(bucketName), Key: aws.String(objectKey)}, time.Minute)
			if err != nil {
				log.Printf("Failed attempt to wait for object %s to exist.\n", objectKey)
			}
		}
	}
	return err
}
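A minimal usage sketch, assuming default credentials and Region are configured; the helper name, bucket, key, and file names are placeholders:

// Load the default configuration and upload a local file with BucketBasics.
// Requires "github.com/aws/aws-sdk-go-v2/config" in addition to the imports above.
func runUploadFileExample(ctx context.Context) error {
	cfg, err := config.LoadDefaultConfig(ctx)
	if err != nil {
		return err
	}
	basics := BucketBasics{S3Client: s3.NewFromConfig(cfg)}
	return basics.UploadFile(ctx, "amzn-s3-demo-bucket", "my-object-key", "local-file.txt")
}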

Upload an object to a bucket by using a transfer manager.

import (
	"bytes"
	"context"
	"errors"
	"fmt"
	"log"
	"time"

	"github.com/aws/aws-sdk-go-v2/aws"
	"github.com/aws/aws-sdk-go-v2/feature/s3/manager"
	"github.com/aws/aws-sdk-go-v2/service/s3"
	"github.com/aws/aws-sdk-go-v2/service/s3/types"
	"github.com/aws/smithy-go"
)

// S3Actions wraps S3 service actions.
type S3Actions struct {
	S3Client  *s3.Client
	S3Manager *manager.Uploader
}

// UploadObject uses the S3 upload manager to upload an object to a bucket.
func (actor S3Actions) UploadObject(ctx context.Context, bucket string, key string, contents string) (string, error) {
	var outKey string
	input := &s3.PutObjectInput{
		Bucket:            aws.String(bucket),
		Key:               aws.String(key),
		Body:              bytes.NewReader([]byte(contents)),
		ChecksumAlgorithm: types.ChecksumAlgorithmSha256,
	}
	output, err := actor.S3Manager.Upload(ctx, input)
	if err != nil {
		var noBucket *types.NoSuchBucket
		if errors.As(err, &noBucket) {
			log.Printf("Bucket %s does not exist.\n", bucket)
			err = noBucket
		}
	} else {
		err := s3.NewObjectExistsWaiter(actor.S3Client).Wait(ctx, &s3.HeadObjectInput{
			Bucket: aws.String(bucket),
			Key:    aws.String(key),
		}, time.Minute)
		if err != nil {
			log.Printf("Failed attempt to wait for object %s to exist in %s.\n", key, bucket)
		} else {
			outKey = *output.Key
		}
	}
	return outKey, err
}
  • For API details, see PutObject in AWS SDK for Go API Reference.

Java
SDK for Java 2.x
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

Upload a file to a bucket using an S3Client.

/**
 * Uploads a local file to an AWS S3 bucket asynchronously.
 *
 * @param bucketName the name of the S3 bucket to upload the file to
 * @param key        the key (object name) to use for the uploaded file
 * @param objectPath the local file path of the file to be uploaded
 * @return a {@link CompletableFuture} that completes with the {@link PutObjectResponse} when the upload is
 *         successful, or throws a {@link RuntimeException} if the upload fails
 */
public CompletableFuture<PutObjectResponse> uploadLocalFileAsync(String bucketName, String key, String objectPath) {
    PutObjectRequest objectRequest = PutObjectRequest.builder()
            .bucket(bucketName)
            .key(key)
            .build();

    CompletableFuture<PutObjectResponse> response = getAsyncClient().putObject(objectRequest,
            AsyncRequestBody.fromFile(Paths.get(objectPath)));
    return response.whenComplete((resp, ex) -> {
        if (ex != null) {
            throw new RuntimeException("Failed to upload file", ex);
        }
    });
}

Use an S3TransferManager to upload a file to a bucket. View the complete file and test.

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import software.amazon.awssdk.transfer.s3.S3TransferManager;
import software.amazon.awssdk.transfer.s3.model.CompletedFileUpload;
import software.amazon.awssdk.transfer.s3.model.FileUpload;
import software.amazon.awssdk.transfer.s3.model.UploadFileRequest;
import software.amazon.awssdk.transfer.s3.progress.LoggingTransferListener;
import java.net.URI;
import java.net.URISyntaxException;
import java.net.URL;
import java.nio.file.Paths;
import java.util.UUID;

    public String uploadFile(S3TransferManager transferManager, String bucketName,
                             String key, URI filePathURI) {
        UploadFileRequest uploadFileRequest = UploadFileRequest.builder()
                .putObjectRequest(b -> b.bucket(bucketName).key(key))
                .source(Paths.get(filePathURI))
                .build();

        FileUpload fileUpload = transferManager.uploadFile(uploadFileRequest);

        CompletedFileUpload uploadResult = fileUpload.completionFuture().join();
        return uploadResult.response().eTag();
    }
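A minimal usage sketch, assuming a placeholder bucket name and file path; S3TransferManager.create() uses your default credentials and Region:

S3TransferManager transferManager = S3TransferManager.create();
String eTag = uploadFile(transferManager, "amzn-s3-demo-bucket", "my-object-key",
        Paths.get("local-file.txt").toUri());
System.out.println("Uploaded file with ETag: " + eTag);
transferManager.close();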

Upload an object to a bucket and set tags using an S3Client.

/**
 * Puts tags on an Amazon S3 object.
 *
 * @param s3         An {@link S3Client} object that represents the Amazon S3 client.
 * @param bucketName The name of the Amazon S3 bucket.
 * @param objectKey  The key of the Amazon S3 object.
 * @param objectPath The file path of the object to be uploaded.
 */
public static void putS3ObjectTags(S3Client s3, String bucketName, String objectKey, String objectPath) {
    try {
        Tag tag1 = Tag.builder()
                .key("Tag 1")
                .value("This is tag 1")
                .build();

        Tag tag2 = Tag.builder()
                .key("Tag 2")
                .value("This is tag 2")
                .build();

        List<Tag> tags = new ArrayList<>();
        tags.add(tag1);
        tags.add(tag2);

        Tagging allTags = Tagging.builder()
                .tagSet(tags)
                .build();

        PutObjectRequest putOb = PutObjectRequest.builder()
                .bucket(bucketName)
                .key(objectKey)
                .tagging(allTags)
                .build();

        s3.putObject(putOb, RequestBody.fromBytes(getObjectFile(objectPath)));

    } catch (S3Exception e) {
        System.err.println(e.getMessage());
        System.exit(1);
    }
}

/**
 * Updates the tags associated with an object in an Amazon S3 bucket.
 *
 * @param s3         an instance of the S3Client class, which is used to interact with the Amazon S3 service
 * @param bucketName the name of the S3 bucket containing the object
 * @param objectKey  the key (or name) of the object in the S3 bucket
 * @throws S3Exception if there is an error updating the object's tags
 */
public static void updateObjectTags(S3Client s3, String bucketName, String objectKey) {
    try {
        GetObjectTaggingRequest taggingRequest = GetObjectTaggingRequest.builder()
                .bucket(bucketName)
                .key(objectKey)
                .build();

        GetObjectTaggingResponse getTaggingRes = s3.getObjectTagging(taggingRequest);
        List<Tag> obTags = getTaggingRes.tagSet();
        for (Tag sinTag : obTags) {
            System.out.println("The tag key is: " + sinTag.key());
            System.out.println("The tag value is: " + sinTag.value());
        }

        // Replace the object's tags with two new tags.
        Tag tag3 = Tag.builder()
                .key("Tag 3")
                .value("This is tag 3")
                .build();

        Tag tag4 = Tag.builder()
                .key("Tag 4")
                .value("This is tag 4")
                .build();

        List<Tag> tags = new ArrayList<>();
        tags.add(tag3);
        tags.add(tag4);

        Tagging updatedTags = Tagging.builder()
                .tagSet(tags)
                .build();

        PutObjectTaggingRequest taggingRequest1 = PutObjectTaggingRequest.builder()
                .bucket(bucketName)
                .key(objectKey)
                .tagging(updatedTags)
                .build();

        s3.putObjectTagging(taggingRequest1);
        GetObjectTaggingResponse getTaggingRes2 = s3.getObjectTagging(taggingRequest);
        List<Tag> modTags = getTaggingRes2.tagSet();
        for (Tag sinTag : modTags) {
            System.out.println("The tag key is: " + sinTag.key());
            System.out.println("The tag value is: " + sinTag.value());
        }

    } catch (S3Exception e) {
        System.err.println(e.getMessage());
        System.exit(1);
    }
}

/**
 * Retrieves the contents of a file as a byte array.
 *
 * @param filePath the path of the file to be read
 * @return a byte array containing the contents of the file, or null if an error occurs
 */
private static byte[] getObjectFile(String filePath) {
    FileInputStream fileInputStream = null;
    byte[] bytesArray = null;

    try {
        File file = new File(filePath);
        bytesArray = new byte[(int) file.length()];
        fileInputStream = new FileInputStream(file);
        fileInputStream.read(bytesArray);

    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        if (fileInputStream != null) {
            try {
                fileInputStream.close();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
    return bytesArray;
}

Upload an object to a bucket and set metadata using an S3Client.

import software.amazon.awssdk.core.sync.RequestBody;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;
import software.amazon.awssdk.services.s3.model.S3Exception;
import java.io.File;
import java.util.HashMap;
import java.util.Map;

/**
 * Before running this Java V2 code example, set up your development
 * environment, including your credentials.
 * <p>
 * For more information, see the following documentation topic:
 * <p>
 * https://docs.aws.amazon.com/sdk-for-java/latest/developer-guide/get-started.html
 */
public class PutObjectMetadata {
    public static void main(String[] args) {
        final String USAGE = """

                Usage:
                    <bucketName> <objectKey> <objectPath>

                Where:
                    bucketName - The Amazon S3 bucket to upload an object into.
                    objectKey - The object to upload (for example, book.pdf).
                    objectPath - The path where the file is located (for example, C:/AWS/book2.pdf).
                """;

        if (args.length != 3) {
            System.out.println(USAGE);
            System.exit(1);
        }

        String bucketName = args[0];
        String objectKey = args[1];
        String objectPath = args[2];
        System.out.println("Putting object " + objectKey + " into bucket " + bucketName);
        System.out.println("  in bucket: " + bucketName);
        Region region = Region.US_EAST_1;
        S3Client s3 = S3Client.builder()
                .region(region)
                .build();

        putS3Object(s3, bucketName, objectKey, objectPath);
        s3.close();
    }

    /**
     * Uploads an object to an Amazon S3 bucket with metadata.
     *
     * @param s3         the S3Client object used to interact with the Amazon S3 service
     * @param bucketName the name of the S3 bucket to upload the object to
     * @param objectKey  the name of the object to be uploaded
     * @param objectPath the local file path of the object to be uploaded
     */
    public static void putS3Object(S3Client s3, String bucketName, String objectKey, String objectPath) {
        try {
            Map<String, String> metadata = new HashMap<>();
            metadata.put("author", "Mary Doe");
            metadata.put("version", "1.0.0.0");

            PutObjectRequest putOb = PutObjectRequest.builder()
                    .bucket(bucketName)
                    .key(objectKey)
                    .metadata(metadata)
                    .build();

            s3.putObject(putOb, RequestBody.fromFile(new File(objectPath)));
            System.out.println("Successfully placed " + objectKey + " into bucket " + bucketName);

        } catch (S3Exception e) {
            System.err.println(e.getMessage());
            System.exit(1);
        }
    }
}

Upload an object to a bucket and set an object retention value using an S3Client.

import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.PutObjectRetentionRequest;
import software.amazon.awssdk.services.s3.model.ObjectLockRetention;
import software.amazon.awssdk.services.s3.model.S3Exception;
import java.time.Instant;
import java.time.LocalDate;
import java.time.LocalDateTime;
import java.time.ZoneOffset;

/**
 * Before running this Java V2 code example, set up your development
 * environment, including your credentials.
 * <p>
 * For more information, see the following documentation topic:
 * <p>
 * https://docs.aws.amazon.com/sdk-for-java/latest/developer-guide/get-started.html
 */
public class PutObjectRetention {
    public static void main(String[] args) {
        final String usage = """

                Usage:
                    <key> <bucketName>

                Where:
                    key - The name of the object (for example, book.pdf).
                    bucketName - The Amazon S3 bucket name that contains the object (for example, bucket1).
                """;

        if (args.length != 2) {
            System.out.println(usage);
            System.exit(1);
        }

        String key = args[0];
        String bucketName = args[1];
        Region region = Region.US_EAST_1;
        S3Client s3 = S3Client.builder()
                .region(region)
                .build();

        setRetentionPeriod(s3, key, bucketName);
        s3.close();
    }

    /**
     * Sets the retention period for an object in an Amazon S3 bucket.
     *
     * @param s3     the S3Client object used to interact with the Amazon S3 service
     * @param key    the key (name) of the object in the S3 bucket
     * @param bucket the name of the S3 bucket where the object is stored
     *
     * @throws S3Exception if an error occurs while setting the object retention period
     */
    public static void setRetentionPeriod(S3Client s3, String key, String bucket) {
        try {
            LocalDate localDate = LocalDate.parse("2020-07-17");
            LocalDateTime localDateTime = localDate.atStartOfDay();
            Instant instant = localDateTime.toInstant(ZoneOffset.UTC);

            ObjectLockRetention lockRetention = ObjectLockRetention.builder()
                    .mode("COMPLIANCE")
                    .retainUntilDate(instant)
                    .build();

            PutObjectRetentionRequest retentionRequest = PutObjectRetentionRequest.builder()
                    .bucket(bucket)
                    .key(key)
                    .bypassGovernanceRetention(true)
                    .retention(lockRetention)
                    .build();

            // To set retention on an object, the Amazon S3 bucket must support object
            // locking, otherwise an exception is thrown.
            s3.putObjectRetention(retentionRequest);
            System.out.print("An object retention configuration was successfully placed on the object");

        } catch (S3Exception e) {
            System.err.println(e.awsErrorDetails().errorMessage());
            System.exit(1);
        }
    }
}
  • For API details, see PutObject in AWS SDK for Java 2.x API Reference.

JavaScript
SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

Upload the object.

import { readFile } from "node:fs/promises";

import {
  PutObjectCommand,
  S3Client,
  S3ServiceException,
} from "@aws-sdk/client-s3";

/**
 * Upload a file to an S3 bucket.
 * @param {{ bucketName: string, key: string, filePath: string }}
 */
export const main = async ({ bucketName, key, filePath }) => {
  const client = new S3Client({});
  const command = new PutObjectCommand({
    Bucket: bucketName,
    Key: key,
    Body: await readFile(filePath),
  });

  try {
    const response = await client.send(command);
    console.log(response);
  } catch (caught) {
    if (
      caught instanceof S3ServiceException &&
      caught.name === "EntityTooLarge"
    ) {
      console.error(
        `Error from S3 while uploading object to ${bucketName}. \
The object was too large. To upload objects larger than 5GB, use the S3 console (160GB max) \
or the multipart upload API (5TB max).`,
      );
    } else if (caught instanceof S3ServiceException) {
      console.error(
        `Error from S3 while uploading object to ${bucketName}.  ${caught.name}: ${caught.message}`,
      );
    } else {
      throw caught;
    }
  }
};
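A minimal invocation sketch, assuming placeholder bucket, key, and file path values:

// Call the handler above from an ES module (top-level await is available in ESM).
await main({
  bucketName: "amzn-s3-demo-bucket",
  key: "my-object-key",
  filePath: "./local-file.txt",
});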
  • For API details, see PutObject in AWS SDK for JavaScript API Reference.

Kotlin
SDK for Kotlin
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

suspend fun putS3Object(
    bucketName: String,
    objectKey: String,
    objectPath: String,
) {
    val metadataVal = mutableMapOf<String, String>()
    metadataVal["myVal"] = "test"

    val request =
        PutObjectRequest {
            bucket = bucketName
            key = objectKey
            metadata = metadataVal
            body = File(objectPath).asByteStream()
        }

    S3Client { region = "us-east-1" }.use { s3 ->
        val response = s3.putObject(request)
        println("Tag information is ${response.eTag}")
    }
}
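A minimal usage sketch, assuming placeholder bucket name, key, and file path:

// Call the suspend function from a coroutine; a suspend main works for a quick test.
suspend fun main() {
    putS3Object("amzn-s3-demo-bucket", "my-object-key", "local-file.txt")
}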
  • For API details, see PutObject in AWS SDK for Kotlin API reference.

PHP
SDK for PHP
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

Upload an object to a bucket.

$s3client = new Aws\S3\S3Client(['region' => 'us-west-2']);
$bucketName = 'amzn-s3-demo-bucket';
$fileName = __DIR__ . "/local-file-" . uniqid();

try {
    $s3client->putObject([
        'Bucket' => $bucketName,
        'Key' => $fileName,
        'SourceFile' => __DIR__ . '/testfile.txt'
    ]);
    echo "Uploaded $fileName to $bucketName.\n";
} catch (Exception $exception) {
    echo "Failed to upload $fileName with error: " . $exception->getMessage();
    exit("Please fix error with file upload before continuing.");
}
  • For API details, see PutObject in AWS SDK for PHP API Reference.

PowerShell
Tools for PowerShell

Example 1: This command uploads the single file "local-sample.txt" to Amazon S3, creating an object with key "sample.txt" in the bucket "amzn-s3-demo-bucket".

Write-S3Object -BucketName amzn-s3-demo-bucket -Key "sample.txt" -File .\local-sample.txt

Example 2: This command uploads the single file "sample.txt" to Amazon S3, creating an object with key "sample.txt" in the bucket "amzn-s3-demo-bucket". If the -Key parameter is not supplied, the file name is used as the S3 object key.

Write-S3Object -BucketName amzn-s3-demo-bucket -File .\sample.txt

Example 3: This command uploads the single file "local-sample.txt" to Amazon S3, creating an object with key "prefix/to/sample.txt" in the bucket "amzn-s3-demo-bucket".

Write-S3Object -BucketName amzn-s3-demo-bucket -Key "prefix/to/sample.txt" -File .\local-sample.txt

Example 4: This command uploads all files in the subdirectory "Scripts" to the bucket "amzn-s3-demo-bucket" and applies the common key prefix "SampleScripts" to each object. Each uploaded file has a key of "SampleScripts/filename", where 'filename' varies.

Write-S3Object -BucketName amzn-s3-demo-bucket -Folder .\Scripts -KeyPrefix SampleScripts\

Example 5: This command uploads all *.ps1 files in the local directory "Scripts" to the bucket "amzn-s3-demo-bucket" and applies the common key prefix "SampleScripts" to each object. Each uploaded file has a key of "SampleScripts/filename.ps1", where 'filename' varies.

Write-S3Object -BucketName amzn-s3-demo-bucket -Folder .\Scripts -KeyPrefix SampleScripts\ -SearchPattern *.ps1

Example 6: This command creates a new S3 object containing the specified content string with key 'sample.txt'.

Write-S3Object -BucketName amzn-s3-demo-bucket -Key "sample.txt" -Content "object contents"

Example 7: This command uploads the specified file (the filename is used as the key) and applies the specified tags to the new object.

Write-S3Object -BucketName amzn-s3-demo-bucket -File "sample.txt" -TagSet @{Key="key1";Value="value1"},@{Key="key2";Value="value2"}

Example 8: This command recursively uploads the specified folder and applies the specified tags to all the new objects.

Write-S3Object -BucketName amzn-s3-demo-bucket -Folder . -KeyPrefix "TaggedFiles" -Recurse -TagSet @{Key="key1";Value="value1"},@{Key="key2";Value="value2"}
  • For API details, see PutObject in AWS Tools for PowerShell Cmdlet Reference.

Python
SDK for Python (Boto3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

import logging

from botocore.exceptions import ClientError

logger = logging.getLogger(__name__)


class ObjectWrapper:
    """Encapsulates S3 object actions."""

    def __init__(self, s3_object):
        """
        :param s3_object: A Boto3 Object resource. This is a high-level resource in Boto3
                          that wraps object actions in a class-like structure.
        """
        self.object = s3_object
        self.key = self.object.key

    def put(self, data):
        """
        Upload data to the object.

        :param data: The data to upload. This can either be bytes or a string. When this
                     argument is a string, it is interpreted as a file name, which is
                     opened in read bytes mode.
        """
        put_data = data
        if isinstance(data, str):
            try:
                put_data = open(data, "rb")
            except IOError:
                logger.exception("Expected file name or binary data, got '%s'.", data)
                raise

        try:
            self.object.put(Body=put_data)
            self.object.wait_until_exists()
            logger.info(
                "Put object '%s' to bucket '%s'.",
                self.object.key,
                self.object.bucket_name,
            )
        except ClientError:
            logger.exception(
                "Couldn't put object '%s' to bucket '%s'.",
                self.object.key,
                self.object.bucket_name,
            )
            raise
        finally:
            if getattr(put_data, "close", None):
                put_data.close()
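A minimal usage sketch, assuming default credentials and placeholder bucket, key, and file names:

import boto3

s3 = boto3.resource("s3")
wrapper = ObjectWrapper(s3.Object("amzn-s3-demo-bucket", "my-object-key"))
wrapper.put(b"Hello, Amazon S3!")  # Upload in-memory bytes.
wrapper.put("local-file.txt")  # Or upload a local file by name.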

Upload an object using a conditional request.

import logging

import boto3
from botocore.exceptions import ClientError

logger = logging.getLogger(__name__)


class S3ConditionalRequests:
    """Encapsulates S3 conditional request operations."""

    def __init__(self, s3_client):
        self.s3 = s3_client

    @classmethod
    def from_client(cls):
        """
        Instantiates this class from a Boto3 client.
        """
        s3_client = boto3.client("s3")
        return cls(s3_client)

    def put_object_conditional(self, object_key: str, source_bucket: str, data: bytes):
        """
        Uploads an object to Amazon S3 with a conditional request. Prevents overwrite
        using an IfNoneMatch condition for the object key.

        :param object_key: The key of the object to upload.
        :param source_bucket: The source bucket of the object.
        :param data: The data to upload.
        """
        try:
            self.s3.put_object(
                Bucket=source_bucket, Key=object_key, Body=data, IfNoneMatch="*"
            )
            print(
                f"\tConditional write successful for key {object_key} in bucket {source_bucket}."
            )
        except ClientError as e:
            error_code = e.response["Error"]["Code"]
            if error_code == "PreconditionFailed":
                print("\tConditional write failed: Precondition failed")
            else:
                logger.error(f"Unexpected error: {error_code}")
                raise
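A minimal usage sketch, assuming a placeholder bucket name and key; the write succeeds only if the key does not already exist:

conditional = S3ConditionalRequests.from_client()
conditional.put_object_conditional("my-object-key", "amzn-s3-demo-bucket", b"sample data")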
  • For API details, see PutObject in AWS SDK for Python (Boto3) API Reference.

Ruby
SDK for Ruby
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

Upload a file using a managed uploader (Object.upload_file).

require 'aws-sdk-s3'

# Wraps Amazon S3 object actions.
class ObjectUploadFileWrapper
  attr_reader :object

  # @param object [Aws::S3::Object] An existing Amazon S3 object.
  def initialize(object)
    @object = object
  end

  # Uploads a file to an Amazon S3 object by using a managed uploader.
  #
  # @param file_path [String] The path to the file to upload.
  # @return [Boolean] True when the file is uploaded; otherwise false.
  def upload_file(file_path)
    @object.upload_file(file_path)
    true
  rescue Aws::Errors::ServiceError => e
    puts "Couldn't upload file #{file_path} to #{@object.key}. Here's why: #{e.message}"
    false
  end
end

# Example usage:
def run_demo
  bucket_name = "amzn-s3-demo-bucket"
  object_key = "my-uploaded-file"
  file_path = "object_upload_file.rb"

  wrapper = ObjectUploadFileWrapper.new(Aws::S3::Object.new(bucket_name, object_key))
  return unless wrapper.upload_file(file_path)

  puts "File #{file_path} successfully uploaded to #{bucket_name}:#{object_key}."
end

run_demo if $PROGRAM_NAME == __FILE__

Upload a file using Object.put.

require 'aws-sdk-s3'

# Wraps Amazon S3 object actions.
class ObjectPutWrapper
  attr_reader :object

  # @param object [Aws::S3::Object] An existing Amazon S3 object.
  def initialize(object)
    @object = object
  end

  def put_object(source_file_path)
    File.open(source_file_path, 'rb') do |file|
      @object.put(body: file)
    end
    true
  rescue Aws::Errors::ServiceError => e
    puts "Couldn't put #{source_file_path} to #{object.key}. Here's why: #{e.message}"
    false
  end
end

# Example usage:
def run_demo
  bucket_name = "amzn-s3-demo-bucket"
  object_key = "my-object-key"
  file_path = "my-local-file.txt"

  wrapper = ObjectPutWrapper.new(Aws::S3::Object.new(bucket_name, object_key))
  success = wrapper.put_object(file_path)
  return unless success

  puts "Put file #{file_path} into #{object_key} in #{bucket_name}."
end

run_demo if $PROGRAM_NAME == __FILE__

Upload a file using Object.put and add server-side encryption.

require 'aws-sdk-s3'

# Wraps Amazon S3 object actions.
class ObjectPutSseWrapper
  attr_reader :object

  # @param object [Aws::S3::Object] An existing Amazon S3 object.
  def initialize(object)
    @object = object
  end

  def put_object_encrypted(object_content, encryption)
    @object.put(body: object_content, server_side_encryption: encryption)
    true
  rescue Aws::Errors::ServiceError => e
    puts "Couldn't put your content to #{object.key}. Here's why: #{e.message}"
    false
  end
end

# Example usage:
def run_demo
  bucket_name = "amzn-s3-demo-bucket"
  object_key = "my-encrypted-content"
  object_content = "This is my super-secret content."
  encryption = "AES256"

  wrapper = ObjectPutSseWrapper.new(Aws::S3::Object.new(bucket_name, object_key))
  return unless wrapper.put_object_encrypted(object_content, encryption)

  puts "Put your content into #{bucket_name}:#{object_key} and encrypted it with #{encryption}."
end

run_demo if $PROGRAM_NAME == __FILE__
  • For API details, see PutObject in AWS SDK for Ruby API Reference.

Rust
SDK for Rust
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

pub async fn upload_object(
    client: &aws_sdk_s3::Client,
    bucket_name: &str,
    file_name: &str,
    key: &str,
) -> Result<aws_sdk_s3::operation::put_object::PutObjectOutput, S3ExampleError> {
    let body = aws_sdk_s3::primitives::ByteStream::from_path(std::path::Path::new(file_name)).await;
    client
        .put_object()
        .bucket(bucket_name)
        .key(key)
        .body(body.unwrap())
        .send()
        .await
        .map_err(S3ExampleError::from)
}
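A minimal usage sketch, assuming placeholder bucket, file, and key names, credentials and a Region available from the environment, and a surrounding async context:

// Load the shared configuration, create a client, and upload a local file.
let sdk_config = aws_config::load_from_env().await;
let client = aws_sdk_s3::Client::new(&sdk_config);
let output = upload_object(&client, "amzn-s3-demo-bucket", "local-file.txt", "my-object-key").await?;
println!("Uploaded object; ETag: {:?}", output.e_tag());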
  • For API details, see PutObject in AWS SDK for Rust API reference.

SAP ABAP
SDK for SAP ABAP
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

"Get contents of file from application server." DATA lv_body TYPE xstring. OPEN DATASET iv_file_name FOR INPUT IN BINARY MODE. READ DATASET iv_file_name INTO lv_body. CLOSE DATASET iv_file_name. "Upload/put an object to an S3 bucket." TRY. lo_s3->putobject( iv_bucket = iv_bucket_name iv_key = iv_file_name iv_body = lv_body ). MESSAGE 'Object uploaded to S3 bucket.' TYPE 'I'. CATCH /aws1/cx_s3_nosuchbucket. MESSAGE 'Bucket does not exist.' TYPE 'E'. ENDTRY.
  • For API details, see PutObject in AWS SDK for SAP ABAP API reference.

Swift
SDK for Swift
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

import AWSS3
import Foundation
import Smithy

    public func uploadFile(bucket: String, key: String, file: String) async throws {
        let fileUrl = URL(fileURLWithPath: file)
        do {
            let fileData = try Data(contentsOf: fileUrl)
            let dataStream = ByteStream.data(fileData)

            let input = PutObjectInput(
                body: dataStream,
                bucket: bucket,
                key: key
            )

            _ = try await client.putObject(input: input)
        } catch {
            print("ERROR: ", dump(error, name: "Putting an object."))
            throw error
        }
    }

Create an object from data held in memory.

import AWSS3
import Foundation
import Smithy

    public func createFile(bucket: String, key: String, withData data: Data) async throws {
        let dataStream = ByteStream.data(data)

        let input = PutObjectInput(
            body: dataStream,
            bucket: bucket,
            key: key
        )

        do {
            _ = try await client.putObject(input: input)
        } catch {
            print("ERROR: ", dump(error, name: "Putting an object."))
            throw error
        }
    }
  • For API details, see PutObject in AWS SDK for Swift API reference.