AWS Tools for Windows PowerShell
Command Reference

Synopsis

Uploads one or more files from the local file system to an S3 bucket.

Syntax

UploadSingleFile (Default)

Write-S3Object
-BucketName <String>
-Key <String>
-File <String>
-CannedACLName <S3CannedACL>
-PublicReadOnly <SwitchParameter>
-PublicReadWrite <SwitchParameter>
-ContentType <String>
-StorageClass <S3StorageClass>
-StandardStorage <SwitchParameter>
-ReducedRedundancyStorage <SwitchParameter>
-ServerSideEncryption <ServerSideEncryptionMethod>
-ServerSideEncryptionKeyManagementServiceKeyId <String>
-ServerSideEncryptionCustomerMethod <ServerSideEncryptionCustomerMethod>
-ServerSideEncryptionCustomerProvidedKey <String>
-ServerSideEncryptionCustomerProvidedKeyMD5 <String>
-Metadata <Hashtable>
-HeaderCollection <Hashtable>
-TagSet <Tag[]>
-ChecksumAlgorithm <ChecksumAlgorithm>
-RequestPayer <RequestPayer>
-ConcurrentServiceRequest <Int32>
-CalculateContentMD5Header <Boolean>
-PartSize <FileSize>
-IfNoneMatch <String>
-Force <SwitchParameter>
-ClientConfig <AmazonS3Config>
-UseAccelerateEndpoint <SwitchParameter>
-UseDualstackEndpoint <SwitchParameter>
-ForcePathStyleAddressing <Boolean>

UploadFromContent

Write-S3Object
-BucketName <String>
-Key <String>
-Content <String>
-CannedACLName <S3CannedACL>
-PublicReadOnly <SwitchParameter>
-PublicReadWrite <SwitchParameter>
-ContentType <String>
-StorageClass <S3StorageClass>
-StandardStorage <SwitchParameter>
-ReducedRedundancyStorage <SwitchParameter>
-ServerSideEncryption <ServerSideEncryptionMethod>
-ServerSideEncryptionKeyManagementServiceKeyId <String>
-ServerSideEncryptionCustomerMethod <ServerSideEncryptionCustomerMethod>
-ServerSideEncryptionCustomerProvidedKey <String>
-ServerSideEncryptionCustomerProvidedKeyMD5 <String>
-Metadata <Hashtable>
-HeaderCollection <Hashtable>
-TagSet <Tag[]>
-ChecksumAlgorithm <ChecksumAlgorithm>
-RequestPayer <RequestPayer>
-ConcurrentServiceRequest <Int32>
-CalculateContentMD5Header <Boolean>
-PartSize <FileSize>
-IfNoneMatch <String>
-Force <SwitchParameter>
-ClientConfig <AmazonS3Config>
-UseAccelerateEndpoint <SwitchParameter>
-UseDualstackEndpoint <SwitchParameter>
-ForcePathStyleAddressing <Boolean>

UploadFromStream

Write-S3Object
-BucketName <String>
-Key <String>
-Stream <Object>
-CannedACLName <S3CannedACL>
-PublicReadOnly <SwitchParameter>
-PublicReadWrite <SwitchParameter>
-ContentType <String>
-StorageClass <S3StorageClass>
-StandardStorage <SwitchParameter>
-ReducedRedundancyStorage <SwitchParameter>
-ServerSideEncryption <ServerSideEncryptionMethod>
-ServerSideEncryptionKeyManagementServiceKeyId <String>
-ServerSideEncryptionCustomerMethod <ServerSideEncryptionCustomerMethod>
-ServerSideEncryptionCustomerProvidedKey <String>
-ServerSideEncryptionCustomerProvidedKeyMD5 <String>
-Metadata <Hashtable>
-HeaderCollection <Hashtable>
-TagSet <Tag[]>
-ChecksumAlgorithm <ChecksumAlgorithm>
-RequestPayer <RequestPayer>
-ConcurrentServiceRequest <Int32>
-CalculateContentMD5Header <Boolean>
-PartSize <FileSize>
-IfNoneMatch <String>
-Force <SwitchParameter>
-ClientConfig <AmazonS3Config>
-UseAccelerateEndpoint <SwitchParameter>
-UseDualstackEndpoint <SwitchParameter>
-ForcePathStyleAddressing <Boolean>

UploadFolder

Write-S3Object
-BucketName <String>
-KeyPrefix <String>
-Folder <String>
-Recurse <SwitchParameter>
-SearchPattern <String>
-CannedACLName <S3CannedACL>
-PublicReadOnly <SwitchParameter>
-PublicReadWrite <SwitchParameter>
-ContentType <String>
-StorageClass <S3StorageClass>
-StandardStorage <SwitchParameter>
-ReducedRedundancyStorage <SwitchParameter>
-ServerSideEncryption <ServerSideEncryptionMethod>
-ServerSideEncryptionKeyManagementServiceKeyId <String>
-ServerSideEncryptionCustomerMethod <ServerSideEncryptionCustomerMethod>
-ServerSideEncryptionCustomerProvidedKey <String>
-ServerSideEncryptionCustomerProvidedKeyMD5 <String>
-Metadata <Hashtable>
-HeaderCollection <Hashtable>
-TagSet <Tag[]>
-RequestPayer <RequestPayer>
-ConcurrentServiceRequest <Int32>
-CalculateContentMD5Header <Boolean>
-PartSize <FileSize>
-IfNoneMatch <String>
-Force <SwitchParameter>
-ClientConfig <AmazonS3Config>
-UseAccelerateEndpoint <SwitchParameter>
-UseDualstackEndpoint <SwitchParameter>
-ForcePathStyleAddressing <Boolean>

Description

Uploads a local file, text content, or a folder hierarchy of files to Amazon S3, placing the content into the specified bucket using the specified key (single object) or key prefix (multiple objects). When uploading large files, the Write-S3Object cmdlet uses multipart upload to fulfill the request. If a multipart upload is interrupted, the cmdlet attempts to abort it. Under certain circumstances (network outage, power failure, and so on), the cmdlet cannot abort the multipart upload. In that case, to stop being charged for the storage of the uploaded parts, you should manually invoke the Remove-S3MultipartUploads cmdlet to abort the incomplete multipart uploads.
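For illustration (bucket, key, and file names below are placeholders), a large file is uploaded with exactly the same syntax as a small one; the cmdlet switches to a multipart upload on its own:
# A multi-gigabyte archive is transferred as a multipart upload behind the scenes.
Write-S3Object -BucketName amzn-s3-demo-bucket -Key "backups/archive.zip" -File .\archive.zip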

Parameters

-BucketName <String>
The name of the bucket that will hold the uploaded content.
Directory buckets - When you use this operation with a directory bucket, you must use virtual-hosted-style requests in the format Bucket_name.s3express-az_id.region.amazonaws.com. Path-style requests are not supported. Directory bucket names must be unique in the chosen Availability Zone. Bucket names must follow the format bucket_base_name--az-id--x-s3 (for example, DOC-EXAMPLE-BUCKET--usw2-az1--x-s3). For information about bucket naming restrictions, see Directory bucket naming rules in the Amazon S3 User Guide.
Access points - When you use this action with an access point, you must provide the alias of the access point in place of the bucket name or specify the access point ARN. When using the access point ARN, you must direct requests to the access point hostname. The access point hostname takes the form AccessPointName-AccountId.s3-accesspoint.Region.amazonaws.com. When using this action with an access point through the Amazon Web Services SDKs, you provide the access point ARN in place of the bucket name. For more information about access point ARNs, see Using access points in the Amazon S3 User Guide. Access points and Object Lambda access points are not supported by directory buckets.
S3 on Outposts - When you use this action with Amazon S3 on Outposts, you must direct requests to the S3 on Outposts hostname. The S3 on Outposts hostname takes the form AccessPointName-AccountId.outpostID.s3-outposts.Region.amazonaws.com. When you use this action with S3 on Outposts through the Amazon Web Services SDKs, you provide the Outposts access point ARN in place of the bucket name. For more information about S3 on Outposts ARNs, see What is S3 on Outposts? in the Amazon S3 User Guide.
Required?True
Position?1
Accept pipeline input?True (ByValue, ByPropertyName)
-CalculateContentMD5Header <Boolean>
This property determines whether the Content-MD5 header should be calculated for upload.
Required?False
Position?Named
Accept pipeline input?True (ByPropertyName)
-CannedACLName <S3CannedACL>
Specifies the canned ACL (a predefined set of access grants, such as private or public-read) to apply to the uploaded object(s).
Required?False
Position?Named
Accept pipeline input?True (ByPropertyName)
-ChecksumAlgorithm <ChecksumAlgorithm>
Indicates the algorithm you want Amazon S3 to use to create the checksum for the object. For more information, see Checking object integrity in the Amazon S3 User Guide.
Required?False
Position?Named
Accept pipeline input?True (ByPropertyName)
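For example (names are placeholders), a SHA-256 checksum can be requested at upload time so the object's integrity can be verified later:
Write-S3Object -BucketName amzn-s3-demo-bucket -Key "data.bin" -File .\data.bin -ChecksumAlgorithm SHA256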
-ClientConfig <AmazonS3Config>
An AmazonS3Config object supplying advanced configuration (for example, a custom service endpoint, proxy, or timeout settings) for the Amazon S3 client used by this cmdlet.
Required?False
Position?Named
Accept pipeline input?True (ByPropertyName)
-ConcurrentServiceRequest <Int32>
This property determines how many active threads will be used to upload the file. This property is only applicable if the file being uploaded is larger than 16 MB, in which case TransferUtility is used to upload multiple parts in parallel. The default value is 10.
Required?False
Position?Named
Accept pipeline input?True (ByPropertyName)
Aliases: ConcurrentServiceRequests
-Content <String>
Specifies text content that will be used to set the content of the object in S3. Use a here-string to specify multiple lines of text.
Required?True
Position?Named
Accept pipeline input?True (ByPropertyName)
Aliases: Text
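For example (bucket and key are placeholders), a here-string supplies multi-line content:
# Build the object body from a multi-line here-string.
$content = @"
First line of the object.
Second line of the object.
"@
Write-S3Object -BucketName amzn-s3-demo-bucket -Key "notes.txt" -Content $content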
-ContentType <String>
The content type (MIME type) to assign to the uploaded object(s), for example 'text/plain'.
Required?False
Position?Named
Accept pipeline input?True (ByPropertyName)
-File <String>
The full path to the local file to be uploaded.
Required?True
Position?3
Accept pipeline input?True (ByPropertyName)
-Folder <String>
The full path to a local folder; all content in the folder will be uploaded to the specified bucket and key prefix. Sub-folders in the folder will only be uploaded if the Recurse switch is specified.
Required?True
Position?3
Accept pipeline input?True (ByPropertyName)
Aliases: Directory
-Force <SwitchParameter>
This parameter overrides confirmation prompts to force the cmdlet to continue its operation. This parameter should always be used with caution.
Required?False
Position?Named
Accept pipeline input?True (ByPropertyName)
-ForcePathStyleAddressing <Boolean>
S3 requests can be performed using one of two URI styles: Virtual or Path. When using Virtual style, the bucket is included as part of the hostname. When using Path style the bucket is included as part of the URI path. The default value is $true when the EndpointUrl parameter is specified, $false otherwise.
Required?False
Position?Named
Accept pipeline input?True (ByPropertyName)
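As a sketch (the endpoint URL is a hypothetical placeholder), path-style addressing is typically paired with a custom endpoint:
# -EndpointUrl already defaults ForcePathStyleAddressing to $true; it is shown explicitly here for clarity.
Write-S3Object -BucketName amzn-s3-demo-bucket -Key "sample.txt" -File .\sample.txt -EndpointUrl "https://s3.example.internal" -ForcePathStyleAddressing $true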
-HeaderCollection <Hashtable>
Response headers to set on the object.
Required?False
Position?Named
Accept pipeline input?True (ByPropertyName)
Aliases: Headers
-IfNoneMatch <String>
Uploads the object only if an object with the same key name does not already exist in the specified bucket; otherwise, Amazon S3 returns a 412 Precondition Failed error. If a conflicting operation occurs during the upload, S3 returns a 409 ConditionalRequestConflict response. On a 409 failure you should re-initiate the multipart upload with CreateMultipartUpload and re-upload each part. Expects the '*' (asterisk) character. For more information about conditional requests, see RFC 7232, or Conditional requests in the Amazon S3 User Guide.
Required?False
Position?Named
Accept pipeline input?True (ByPropertyName)
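For example (names are placeholders), the following attempts the upload only when no object with that key already exists:
# Fails with a 412 Precondition Failed error if "reports/daily.csv" is already present in the bucket.
Write-S3Object -BucketName amzn-s3-demo-bucket -Key "reports/daily.csv" -File .\daily.csv -IfNoneMatch "*"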
-Key <String>
The key that will be used to identify the object in S3. If the -File parameter is specified, -Key is optional and the object key can be inferred from the filename value supplied to the -File parameter.
Required?True (UploadFromContent, UploadFromStream)
Position?2
Accept pipeline input?True (ByPropertyName)
-KeyPrefix <String>
The common key prefix that will be used for the objects uploaded to S3. Use this parameter when uploading multiple objects. Each object's final key will be of the form 'keyprefix/filename'. To indicate that all content should be uploaded to the root of the bucket, specify a KeyPrefix of '\' or '/'.
Required?True
Position?2
Accept pipeline input?True (ByPropertyName)
Aliases: Prefix
-Metadata <Hashtable>
Metadata headers to set on the object.
Required?False
Position?Named
Accept pipeline input?True (ByPropertyName)
-PartSize <FileSize>
This property determines the part size of the upload. The uploaded file will be divided into parts of the specified size and each part will be uploaded to Amazon S3 individually. The part size can be between 5 MB and 5 GB. You can specify this value in one of two ways:
  • The part size in bytes. For example, 6291456.
  • The part size with a size suffix. You can use bytes, KB, MB or GB. For example, 6291456bytes, 15.12MB, "15.12 MB".
Required?False
Position?Named
Accept pipeline input?True (ByPropertyName)
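A tuning sketch (names and sizes are illustrative) that combines PartSize with ConcurrentServiceRequest for a large upload:
# Upload in 100 MB parts with up to 16 part uploads running in parallel.
Write-S3Object -BucketName amzn-s3-demo-bucket -Key "backups/large.vhd" -File .\large.vhd -PartSize "100 MB" -ConcurrentServiceRequest 16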
-PublicReadOnly <SwitchParameter>
If set, applies an ACL making the S3 object(s) public with read-only permissions
Required?False
Position?Named
Accept pipeline input?True (ByPropertyName)
-PublicReadWrite <SwitchParameter>
If set, applies an ACL making the S3 object(s) public with read-write permissions
Required?False
Position?Named
Accept pipeline input?True (ByPropertyName)
-Recurse <SwitchParameter>
If set, all sub-folders beneath the folder specified for the Folder parameter will also be uploaded, and the folder structure will be mirrored in S3. Off by default.
Required?False
Position?4
Accept pipeline input?True (ByPropertyName)
-ReducedRedundancyStorage <SwitchParameter>
Specifies that S3 should use the REDUCED_REDUNDANCY storage class for the object. This provides a reduced (99.99%) durability guarantee at a lower cost as compared to the STANDARD storage class. Use this storage class for non-mission-critical data or for data that does not require the higher level of durability that S3 provides with the STANDARD storage class. This parameter is deprecated; use the StorageClass parameter instead.
Required?False
Position?Named
Accept pipeline input?True (ByPropertyName)
-RequestPayer <RequestPayer>
Confirms that the requester knows that they will be charged for the request. Bucket owners need not specify this parameter in their requests.
Required?False
Position?Named
Accept pipeline input?True (ByPropertyName)
-SearchPattern <String>
The search pattern used to determine which files in the directory are uploaded.
Required?False
Position?5
Accept pipeline input?True (ByPropertyName)
Aliases: Pattern
-ServerSideEncryption <ServerSideEncryptionMethod>
The server-side encryption algorithm used when storing this object in Amazon S3. Allowable values: None, AES256, aws:kms.
Required?False
Position?Named
Accept pipeline input?True (ByPropertyName)
-ServerSideEncryptionCustomerMethod <ServerSideEncryptionCustomerMethod>
Specifies the server-side encryption algorithm to be used with the customer provided key. Allowable values: None or AES256.
Required?False
Position?Named
Accept pipeline input?True (ByPropertyName)
-ServerSideEncryptionCustomerProvidedKey <String>
Specifies the base64-encoded encryption key for Amazon S3 to use to encrypt the object.
Required?False
Position?Named
Accept pipeline input?True (ByPropertyName)
-ServerSideEncryptionCustomerProvidedKeyMD5 <String>
Specifies the base64-encoded MD5 of the encryption key for Amazon S3 to use to decrypt the object. This field is optional; the SDK will calculate the MD5 if it is not set.
Required?False
Position?Named
Accept pipeline input?True (ByPropertyName)
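A sketch of an upload using a customer-provided key (names are placeholders); the same base64-encoded key must be supplied again when the object is later downloaded:
# Generate a random 256-bit customer key and base64-encode it.
$keyBytes = New-Object byte[] 32
[System.Security.Cryptography.RandomNumberGenerator]::Create().GetBytes($keyBytes)
$customerKey = [Convert]::ToBase64String($keyBytes)
# Upload with SSE-C; the key MD5 is calculated by the SDK because it is not supplied.
Write-S3Object -BucketName amzn-s3-demo-bucket -Key "private/secret.txt" -File .\secret.txt -ServerSideEncryptionCustomerMethod AES256 -ServerSideEncryptionCustomerProvidedKey $customerKey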
-ServerSideEncryptionKeyManagementServiceKeyId <String>
The id of the AWS Key Management Service key that Amazon S3 should use to encrypt and decrypt the object. If a key id is not specified, the default key will be used for encryption and decryption. If x-amz-server-side-encryption has a valid value of aws:kms, this header specifies the ID of the Amazon Web Services Key Management Service (Amazon Web Services KMS) symmetric encryption customer managed key that was used for the object. If you specify x-amz-server-side-encryption:aws:kms, but do not provide x-amz-server-side-encryption-aws-kms-key-id, Amazon S3 uses the Amazon Web Services managed key to protect the data. If the KMS key does not exist in the same account issuing the command, you must use the full ARN and not just the ID.
Required?False
Position?Named
Accept pipeline input?True (ByPropertyName)
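For instance (the key ID below is a placeholder), a KMS-encrypted upload might look like the following; omitting the key ID falls back to the AWS managed key:
Write-S3Object -BucketName amzn-s3-demo-bucket -Key "confidential/report.pdf" -File .\report.pdf -ServerSideEncryption "aws:kms" -ServerSideEncryptionKeyManagementServiceKeyId "1234abcd-12ab-34cd-56ef-1234567890ab"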
-StandardStorage <SwitchParameter>
Specifies the STANDARD storage class, which is the default storage class for S3 objects. Provides a 99.999999999% durability guarantee. This parameter is deprecated; use the StorageClass parameter instead.
Required?False
Position?Named
Accept pipeline input?True (ByPropertyName)
-StorageClass <S3StorageClass>
Specifies the storage class for the object. Please refer to Storage Classes for information on S3 storage classes.
Required?False
Position?Named
Accept pipeline input?True (ByPropertyName)
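For example (names are placeholders), infrequently accessed content could be written directly to the STANDARD_IA storage class:
Write-S3Object -BucketName amzn-s3-demo-bucket -Key "archive/2023-logs.zip" -File .\2023-logs.zip -StorageClass STANDARD_IA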
-Stream <Object>
The stream to be uploaded. The cmdlet accepts a parameter of type string, string[], System.IO.FileInfo or System.IO.Stream.
Required?True
Position?Named
Accept pipeline input?True (ByPropertyName)
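A minimal sketch (names are placeholders) of uploading from an in-memory stream:
# Wrap a byte array in a MemoryStream and upload it as an object.
$bytes = [System.Text.Encoding]::UTF8.GetBytes("streamed object content")
$stream = [System.IO.MemoryStream]::new($bytes)
Write-S3Object -BucketName amzn-s3-demo-bucket -Key "streamed.txt" -Stream $stream
$stream.Dispose()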
-TagSet <Tag[]>
One or more tags to apply to the object.
Required?False
Position?Named
Accept pipeline input?True (ByPropertyName)
-UseAccelerateEndpoint <SwitchParameter>
Enables S3 accelerate by sending requests to the accelerate endpoint instead of the regular region endpoint. To use this feature, the bucket name must be DNS compliant and must not contain periods (.).
Required?False
Position?Named
Accept pipeline input?True (ByPropertyName)
-UseDualstackEndpoint <SwitchParameter>
Configures the request to Amazon S3 to use the dualstack endpoint for a region. S3 supports dualstack endpoints which return both IPv6 and IPv4 values. The dualstack mode of Amazon S3 cannot be used with accelerate mode.
Required?False
Position?Named
Accept pipeline input?True (ByPropertyName)

Common Credential and Region Parameters

-AccessKey <String>
The AWS access key for the user account. This can be a temporary access key if the corresponding session token is supplied to the -SessionToken parameter.
Required?False
Position?Named
Accept pipeline input?True (ByPropertyName)
Aliases: AK
-Credential <AWSCredentials>
An AWSCredentials object instance containing access and secret key information, and optionally a token for session-based credentials.
Required?False
Position?Named
Accept pipeline input?True (ByValue, ByPropertyName)
-EndpointUrl <String>
The endpoint to make the call against. Note: This parameter is primarily for internal AWS use and is not required, and should not be specified, for normal usage. The cmdlets normally determine which endpoint to call based on the region specified to the -Region parameter or set as default in the shell (via Set-DefaultAWSRegion). Only specify this parameter if you must direct the call to a specific custom endpoint.
Required?False
Position?Named
Accept pipeline input?True (ByPropertyName)
-NetworkCredential <PSCredential>
Used with SAML-based authentication when ProfileName references a SAML role profile. Contains the network credentials to be supplied during authentication with the configured identity provider's endpoint. This parameter is not required if the user's default network identity can or should be used during authentication.
Required?False
Position?Named
Accept pipeline input?True (ByValue, ByPropertyName)
-ProfileLocation <String>
Used to specify the name and location of the ini-format credential file (shared with the AWS CLI and other AWS SDKs). If this optional parameter is omitted, this cmdlet will first search the encrypted credential file used by the AWS SDK for .NET and AWS Toolkit for Visual Studio; if the profile is not found, the cmdlet will then search the ini-format credential file at the default location: (user's home directory)\.aws\credentials. If this parameter is specified, this cmdlet will only search the ini-format credential file at the given location. As the current folder can vary in a shell or during script execution, it is advised that you specify a fully qualified path instead of a relative path.
Required?False
Position?Named
Accept pipeline input?True (ByPropertyName)
Aliases: AWSProfilesLocation, ProfilesLocation
-ProfileName <String>
The user-defined name of an AWS credentials or SAML-based role profile containing credential information. The profile is expected to be found in the secure credential file shared with the AWS SDK for .NET and AWS Toolkit for Visual Studio. You can also specify the name of a profile stored in the .ini-format credential file used with the AWS CLI and other AWS SDKs.
Required?False
Position?Named
Accept pipeline input?True (ByPropertyName)
Aliases: StoredCredentials, AWSProfileName
-Region <Object>
The system name of an AWS region or an AWSRegion instance. This governs the endpoint that will be used when calling service operations. Note that the AWS resources referenced in a call are usually region-specific.
Required?False
Position?Named
Accept pipeline input?True (ByPropertyName)
Aliases: RegionToCall
-SecretKey <String>
The AWS secret key for the user account. This can be a temporary secret key if the corresponding session token is supplied to the -SessionToken parameter.
Required?False
Position?Named
Accept pipeline input?True (ByPropertyName)
Aliases: SK, SecretAccessKey
-SessionToken <String>
The session token if the access and secret keys are temporary session-based credentials.
Required?False
Position?Named
Accept pipeline input?True (ByPropertyName)
Aliases: ST

Outputs

Examples

Example 1

Write-S3Object -BucketName amzn-s3-demo-bucket -Key "sample.txt" -File .\local-sample.txt
This command uploads the single file "local-sample.txt" to Amazon S3, creating an object with key "sample.txt" in the bucket "amzn-s3-demo-bucket".

Example 2

Write-S3Object -BucketName amzn-s3-demo-bucket -File .\sample.txt
This command uploads the single file "sample.txt" to Amazon S3, creating an object with key "sample.txt" in the bucket "amzn-s3-demo-bucket". If the -Key parameter is not supplied, the filename is used as the S3 object key.

Example 3

Write-S3Object -BucketName amzn-s3-demo-bucket -Key "prefix/to/sample.txt" -File .\local-sample.txt
This command uploads the single file "local-sample.txt" to Amazon S3, creating an object with key "prefix/to/sample.txt" in the bucket "amzn-s3-demo-bucket".

Example 4

Write-S3Object -BucketName amzn-s3-demo-bucket -Folder .\Scripts -KeyPrefix SampleScripts\
This command uploads all files in the subdirectory "Scripts" to the bucket "amzn-s3-demo-bucket" and applies the common key prefix "SampleScripts" to each object. Each uploaded file will have a key of "SampleScripts/filename", where 'filename' varies.

Example 5

Write-S3Object -BucketName amzn-s3-demo-bucket -Folder .\Scripts -KeyPrefix SampleScripts\ -SearchPattern *.ps1
This command uploads all *.ps1 files in the local directory "Scripts" to the bucket "amzn-s3-demo-bucket" and applies the common key prefix "SampleScripts" to each object. Each uploaded file will have a key of "SampleScripts/filename.ps1", where 'filename' varies.

Example 6

Write-S3Object -BucketName amzn-s3-demo-bucket -Key "sample.txt" -Content "object contents"
This command creates a new S3 object containing the specified content string with key 'sample.txt'.

Example 7

Write-S3Object -BucketName amzn-s3-demo-bucket -File "sample.txt" -TagSet @{Key="key1";Value="value1"},@{Key="key2";Value="value2"}
This command uploads the specified file (the filename is used as the key) and applies the specified tags to the new object.

Example 8

Write-S3Object -BucketName amzn-s3-demo-bucket -Folder . -KeyPrefix "TaggedFiles" -Recurse -TagSet @{Key="key1";Value="value1"},@{Key="key2";Value="value2"}
This command recursively uploads the specified folder and applies the specified tags to all the new objects.

Supported Version

AWS Tools for PowerShell: 2.x.y.z