AWS Tools for Windows PowerShell
Command Reference

AWS services or capabilities described in AWS Documentation may vary by region/location. See Getting Started with Amazon AWS for the specific differences applicable to the China (Beijing) Region.

Synopsis

Uploads one or more files from the local file system to an S3 bucket.

Syntax

UploadSingleFile (Default)

Write-S3Object
-BucketName <String>
-Key <String>
-File <String>
-CannedACLName <S3CannedACL>
-PublicReadOnly <SwitchParameter>
-PublicReadWrite <SwitchParameter>
-ContentType <String>
-StorageClass <S3StorageClass>
-StandardStorage <SwitchParameter>
-ReducedRedundancyStorage <SwitchParameter>
-ServerSideEncryption <ServerSideEncryptionMethod>
-ServerSideEncryptionKeyManagementServiceKeyId <String>
-ServerSideEncryptionCustomerMethod <ServerSideEncryptionCustomerMethod>
-ServerSideEncryptionCustomerProvidedKey <String>
-ServerSideEncryptionCustomerProvidedKeyMD5 <String>
-Metadata <Hashtable>
-HeaderCollection <Hashtable>
-UseAccelerateEndpoint <SwitchParameter>
-UseDualstackEndpoint <SwitchParameter>
-TagSet <Tag[]>
-ConcurrentServiceRequest <Nullable<Int32>>
-Force <SwitchParameter>

UploadFromContent

Write-S3Object
-BucketName <String>
-Key <String>
-Content <String>
-CannedACLName <S3CannedACL>
-PublicReadOnly <SwitchParameter>
-PublicReadWrite <SwitchParameter>
-ContentType <String>
-StorageClass <S3StorageClass>
-StandardStorage <SwitchParameter>
-ReducedRedundancyStorage <SwitchParameter>
-ServerSideEncryption <ServerSideEncryptionMethod>
-ServerSideEncryptionKeyManagementServiceKeyId <String>
-ServerSideEncryptionCustomerMethod <ServerSideEncryptionCustomerMethod>
-ServerSideEncryptionCustomerProvidedKey <String>
-ServerSideEncryptionCustomerProvidedKeyMD5 <String>
-Metadata <Hashtable>
-HeaderCollection <Hashtable>
-UseAccelerateEndpoint <SwitchParameter>
-UseDualstackEndpoint <SwitchParameter>
-TagSet <Tag[]>
-ConcurrentServiceRequest <Nullable<Int32>>
-Force <SwitchParameter>

UploadFromStream

Write-S3Object
-BucketName <String>
-Key <String>
-Stream <Stream>
-CannedACLName <S3CannedACL>
-PublicReadOnly <SwitchParameter>
-PublicReadWrite <SwitchParameter>
-ContentType <String>
-StorageClass <S3StorageClass>
-StandardStorage <SwitchParameter>
-ReducedRedundancyStorage <SwitchParameter>
-ServerSideEncryption <ServerSideEncryptionMethod>
-ServerSideEncryptionKeyManagementServiceKeyId <String>
-ServerSideEncryptionCustomerMethod <ServerSideEncryptionCustomerMethod>
-ServerSideEncryptionCustomerProvidedKey <String>
-ServerSideEncryptionCustomerProvidedKeyMD5 <String>
-Metadata <Hashtable>
-HeaderCollection <Hashtable>
-UseAccelerateEndpoint <SwitchParameter>
-UseDualstackEndpoint <SwitchParameter>
-TagSet <Tag[]>
-ConcurrentServiceRequest <Nullable<Int32>>
-Force <SwitchParameter>

UploadFolder

Write-S3Object
-BucketName <String>
-KeyPrefix <String>
-Folder <String>
-Recurse <SwitchParameter>
-SearchPattern <String>
-CannedACLName <S3CannedACL>
-PublicReadOnly <SwitchParameter>
-PublicReadWrite <SwitchParameter>
-ContentType <String>
-StorageClass <S3StorageClass>
-StandardStorage <SwitchParameter>
-ReducedRedundancyStorage <SwitchParameter>
-ServerSideEncryption <ServerSideEncryptionMethod>
-ServerSideEncryptionKeyManagementServiceKeyId <String>
-ServerSideEncryptionCustomerMethod <ServerSideEncryptionCustomerMethod>
-ServerSideEncryptionCustomerProvidedKey <String>
-ServerSideEncryptionCustomerProvidedKeyMD5 <String>
-Metadata <Hashtable>
-HeaderCollection <Hashtable>
-UseAccelerateEndpoint <SwitchParameter>
-UseDualstackEndpoint <SwitchParameter>
-TagSet <Tag[]>
-ConcurrentServiceRequest <Nullable<Int32>>
-Force <SwitchParameter>

Description

Uploads a local file, text content, or a folder hierarchy of files to Amazon S3, placing the content into the specified bucket using the specified key (single object) or key prefix (multiple objects). When uploading large files, the Write-S3Object cmdlet uses multipart upload to fulfill the request. If a multipart upload is interrupted, the cmdlet attempts to abort it; under certain circumstances (network outage, power failure, and so on) it may not be able to do so. In that case, to stop being charged for the storage of the uploaded parts, manually invoke the Remove-S3MultipartUploads cmdlet to abort the incomplete multipart uploads.
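A hedged sketch of that cleanup step follows; the -InitiatedDate parameter name and date handling are assumptions based on the Remove-S3MultipartUploads documentation and should be verified against your module version with Get-Help.

# Abort multipart uploads in the bucket that were initiated more than a day ago
# and are still incomplete. Parameter names are an assumption; verify before use.
Remove-S3MultipartUploads -BucketName test-files -InitiatedDate ([DateTime]::UtcNow.AddDays(-1))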

Parameters

-BucketName <String>
The name of the bucket that will hold the uploaded content.
Required?True
Position?1
Accept pipeline input?True (ByValue, ByPropertyName)
-CannedACLName <S3CannedACL>
Specifies the canned ACL (predefined set of grants) to apply to the uploaded object(s).
Required?False
Position?Named
Accept pipeline input?False
-ConcurrentServiceRequest <Nullable<Int32>>
Determines how many active threads are used to upload the file. This setting applies only when the file being uploaded is larger than 16 MB, in which case TransferUtility is used to upload multiple parts in parallel. The default value is 10.
Required?False
Position?Named
Accept pipeline input?False
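For illustration, the following (with a hypothetical file and key) raises the number of parallel part uploads for a large file:

# Upload a large archive with up to 20 concurrent part uploads; the setting
# only takes effect for files larger than 16 MB.
Write-S3Object -BucketName test-files -Key "backups/archive.zip" -File .\archive.zip -ConcurrentServiceRequest 20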
-Content <String>
Specifies text content that will be used to set the content of the object in S3. Use a here-string to specify multiple lines of text.
Required?True
Position?Named
Accept pipeline input?False
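For example, a here-string can supply multi-line text directly (bucket, key, and content are illustrative):

# Create the object from in-memory text; no local file is required.
$text = @"
first line
second line
"@
Write-S3Object -BucketName test-files -Key "notes/readme.txt" -Content $text -ContentType "text/plain"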
-ContentType <String>
Specifies the MIME type of the content being uploaded.
Required?False
Position?Named
Accept pipeline input?False
-File <String>
The full path to the local file to be uploaded.
Required?True
Position?3
Accept pipeline input?False
-Folder <String>
The full path to a local folder; all content in the folder will be uploaded to the specified bucket and key. Sub-folders in the folder will only be uploaded if the Recurse switch is specified.
Required?True
Position?3
Accept pipeline input?False
-Force <SwitchParameter>
This parameter overrides confirmation prompts to force the cmdlet to continue its operation. This parameter should always be used with caution.
Required?False
Position?Named
Accept pipeline input?False
-HeaderCollection <Hashtable>
Response headers to set on the object.
Required?False
Position?Named
Accept pipeline input?False
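A hedged sketch of supplying response headers via a hashtable; which headers Amazon S3 stores and returns on download (for example Cache-Control and Content-Disposition) should be confirmed in the Amazon S3 documentation:

# Headers S3 will return when the object is later downloaded.
$headers = @{
    "Cache-Control"       = "max-age=86400"
    "Content-Disposition" = "attachment; filename=report.pdf"
}
Write-S3Object -BucketName test-files -Key "reports/report.pdf" -File .\report.pdf -HeaderCollection $headers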
-Key <String>
The key that will be used to identify the object in S3. If the -File parameter is specified, -Key is optional and the object key can be inferred from the filename value supplied to the -File parameter.
Required?True
Position?2
Accept pipeline input?True (ByPropertyName)
-KeyPrefix <String>
The common key prefix that will be used for the objects uploaded to S3. Use this parameter when uploading multiple objects. Each object's final key will be of the form 'keyprefix/filename'. To indicate that all content should be uploaded to the root of the bucket, specify a KeyPrefix of '\' or '/'.
Required?True
Position?2
Accept pipeline input?True (ByPropertyName)
-Metadata <Hashtable>
Metadata headers to set on the object.
Required?False
Position?Named
Accept pipeline input?False
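For example (key names are illustrative; S3 stores user metadata under the x-amz-meta- prefix, which the underlying SDK is expected to add if omitted):

# Attach custom metadata to the uploaded object.
$meta = @{ "author" = "build-server"; "build-id" = "1234" }
Write-S3Object -BucketName test-files -Key "artifacts/app.zip" -File .\app.zip -Metadata $meta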
-PublicReadOnly <SwitchParameter>
If set, applies an ACL making the S3 object(s) public with read-only permissions
Required?False
Position?Named
Accept pipeline input?False
-PublicReadWrite <SwitchParameter>
If set, applies an ACL making the S3 object(s) public with read-write permissions
Required?False
Position?Named
Accept pipeline input?False
-Recurse <SwitchParameter>
If set, all sub-folders beneath the folder specified by the -Folder parameter are also uploaded, mirroring the folder structure in S3. Defaults to off (false).
Required?False
Position?4
Accept pipeline input?False
-ReducedRedundancyStorage <SwitchParameter>
Specifies that S3 should use the REDUCED_REDUNDANCY storage class for the object. This provides a reduced (99.99%) durability guarantee at a lower cost compared to the STANDARD storage class. Use this storage class for non-mission-critical data or for data that does not require the higher level of durability that S3 provides with the STANDARD storage class. This parameter is deprecated; please use the StorageClass parameter instead.
Required?False
Position?Named
Accept pipeline input?False
-SearchPattern <String>
The search pattern used to determine which files in the directory are uploaded.
Required?False
Position?5
Accept pipeline input?False
-ServerSideEncryption <ServerSideEncryptionMethod>
Specifies the encryption used on the server to store the content. Allowable values: None, AES256, aws:kms.
Required?False
Position?Named
Accept pipeline input?False
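For example, to request SSE-S3 or SSE-KMS encryption at upload time (bucket, key, and KMS key ID are placeholders):

# SSE-S3: Amazon S3 manages the encryption keys.
Write-S3Object -BucketName test-files -Key "secure/data.bin" -File .\data.bin -ServerSideEncryption AES256

# SSE-KMS: encrypt with a specific AWS KMS key (placeholder key ID shown).
Write-S3Object -BucketName test-files -Key "secure/data.bin" -File .\data.bin `
    -ServerSideEncryption "aws:kms" -ServerSideEncryptionKeyManagementServiceKeyId "1234abcd-12ab-34cd-56ef-1234567890ab"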
-ServerSideEncryptionCustomerMethod <ServerSideEncryptionCustomerMethod>
Specifies the server-side encryption algorithm to be used with the customer provided key. Allowable values: None or AES256.
Required?False
Position?Named
Accept pipeline input?False
-ServerSideEncryptionCustomerProvidedKey <String>
Specifies the base64-encoded encryption key for Amazon S3 to use to encrypt the object.
Required?False
Position?Named
Accept pipeline input?False
-ServerSideEncryptionCustomerProvidedKeyMD5 <String>
Specifies the base64-encoded MD5 of the encryption key for Amazon S3 to use to decrypt the object. This field is optional; the SDK will calculate the MD5 if it is not set.
Required?False
Position?Named
Accept pipeline input?False
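A minimal sketch of an SSE-C upload, assuming a locally generated 256-bit key; S3 does not store the key, so the caller must retain it to read the object back:

# Generate a random 256-bit key and base64-encode it for SSE-C.
$keyBytes = New-Object byte[] 32
[System.Security.Cryptography.RandomNumberGenerator]::Create().GetBytes($keyBytes)
$keyB64 = [Convert]::ToBase64String($keyBytes)

# Upload using the customer-provided key; the key MD5 is calculated by the SDK if omitted.
Write-S3Object -BucketName test-files -Key "secure/customer.bin" -File .\customer.bin `
    -ServerSideEncryptionCustomerMethod AES256 -ServerSideEncryptionCustomerProvidedKey $keyB64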
-ServerSideEncryptionKeyManagementServiceKeyId <String>
Specifies the AWS KMS key for Amazon S3 to use to encrypt the object.
Required?False
Position?Named
Accept pipeline input?False
-StandardStorage <SwitchParameter>
Specifies the STANDARD storage class, which is the default storage class for S3 objects. Provides a 99.999999999% durability guarantee. This parameter is deprecated. Please use the StorageClass parameter instead.
Required?False
Position?Named
Accept pipeline input?False
-StorageClass <S3StorageClass>
Specifies the storage class for the object. Please refer to Storage Classes for information on S3 storage classes.
Required?False
Position?Named
Accept pipeline input?False
-Stream <Stream>
The stream to be uploaded.
Required?True
Position?Named
Accept pipeline input?False
-TagSet <Tag[]>
One or more tags to apply to the object.
Required?False
Position?Named
Accept pipeline input?False
-UseAccelerateEndpoint <SwitchParameter>
Enables S3 accelerate by sending requests to the accelerate endpoint instead of the regular region endpoint. To use this feature, the bucket name must be DNS compliant and must not contain periods (.).
Required?False
Position?Named
Accept pipeline input?False
-UseDualstackEndpoint <SwitchParameter>
Configures the request to Amazon S3 to use the dualstack endpoint for a region. S3 supports dualstack endpoints which return both IPv6 and IPv4 values. The dualstack mode of Amazon S3 cannot be used with accelerate mode.
Required?False
Position?Named
Accept pipeline input?False

Common Credential and Region Parameters

-AccessKey <String>
The AWS access key for the user account. This can be a temporary access key if the corresponding session token is supplied to the -SessionToken parameter.
Required? False
Position? Named
Accept pipeline input? False
-Credential <AWSCredentials>
An AWSCredentials object instance containing access and secret key information, and optionally a token for session-based credentials.
Required? False
Position? Named
Accept pipeline input? False
-ProfileLocation <String>

Used to specify the name and location of the ini-format credential file (shared with the AWS CLI and other AWS SDKs)

If this optional parameter is omitted this cmdlet will search the encrypted credential file used by the AWS SDK for .NET and AWS Toolkit for Visual Studio first. If the profile is not found then the cmdlet will search in the ini-format credential file at the default location: (user's home directory)\.aws\credentials. Note that the encrypted credential file is not supported on all platforms. It will be skipped when searching for profiles on Windows Nano Server, Mac, and Linux platforms.

If this parameter is specified then this cmdlet will only search the ini-format credential file at the location given.

Because the current folder can vary in a shell or during script execution, it is advised that you specify a fully qualified path instead of a relative path.

Required? False
Position? Named
Accept pipeline input? False
-ProfileName <String>
The user-defined name of an AWS credentials or SAML-based role profile containing credential information. The profile is expected to be found in the secure credential file shared with the AWS SDK for .NET and AWS Toolkit for Visual Studio. You can also specify the name of a profile stored in the .ini-format credential file used with the AWS CLI and other AWS SDKs.
Required? False
Position? Named
Accept pipeline input? False
-NetworkCredential <PSCredential>
Used with SAML-based authentication when ProfileName references a SAML role profile. Contains the network credentials to be supplied during authentication with the configured identity provider's endpoint. This parameter is not required if the user's default network identity can or should be used during authentication.
Required? False
Position? Named
Accept pipeline input? False
-SecretKey <String>
The AWS secret key for the user account. This can be a temporary secret key if the corresponding session token is supplied to the -SessionToken parameter.
Required? False
Position? Named
Accept pipeline input? False
-SessionToken <String>
The session token if the access and secret keys are temporary session-based credentials.
Required? False
Position? Named
Accept pipeline input? False
-Region <String>
The system name of the AWS region in which the operation should be invoked. For example, us-east-1, eu-west-1 etc.
Required? False
Position? Named
Accept pipeline input? False
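For example, a named credential profile and a region can be selected for a single call (the profile name is illustrative):

# Use the 'myDevProfile' credentials and the eu-west-1 region for this call only.
Write-S3Object -BucketName test-files -Key "sample.txt" -File .\sample.txt -ProfileName myDevProfile -Region eu-west-1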
-EndpointUrl <String>

The endpoint to make the call against.

Note: This parameter is primarily for internal AWS use and is not required/should not be specified for normal usage. The cmdlets normally determine which endpoint to call based on the region specified to the -Region parameter or set as default in the shell (via Set-DefaultAWSRegion). Only specify this parameter if you must direct the call to a specific custom endpoint.

Required? False
Position? Named
Accept pipeline input? False

Inputs

You can pipe a String object to this cmdlet for the BucketName parameter.

Outputs

Examples

Example 1

PS C:\>Write-S3Object -BucketName test-files -Key "sample.txt" -File .\local-sample.txt
This command uploads the single file "local-sample.txt" to Amazon S3, creating an object with key "sample.txt" in bucket "test-files".

Example 2

PS C:\>Write-S3Object -BucketName test-files -File .\sample.txt
This command uploads the single file "sample.txt" to Amazon S3, creating an object with key "sample.txt" in bucket "test-files". If the -Key parameter is not supplied, the filename is used as the S3 object key.

Example 3

PS C:\>Write-S3Object -BucketName test-files -Key "prefix/to/sample.txt" -File .\local-sample.txt
This command uploads the single file "local-sample.txt" to Amazon S3, creating an object with key "prefix/to/sample.txt" in bucket "test-files".

Example 4

PS C:\>Write-S3Object -BucketName test-files -Folder .\Scripts -KeyPrefix SampleScripts\
This command uploads all files in the subdirectory "Scripts" to the bucket "test-files" and applies the common key prefix "SampleScripts" to each object. Each uploaded file will have a key of "SampleScripts/filename" where 'filename' varies.

Example 5

PS C:\>Write-S3Object -BucketName test-files -Folder .\Scripts -KeyPrefix SampleScripts\ -SearchPattern *.ps1
This command uploads all *.ps1 files in the local directory "Scripts" to bucket "test-files" and applies the common key prefix "SampleScripts" to each object. Each uploaded file will have a key of "SampleScripts/filename.ps1" where 'filename' varies.

Example 6

PS C:\>Write-S3Object -BucketName test-files -Key "sample.txt" -Content "object contents"
This command creates a new S3 object containing the specified content string with key 'sample.txt'.

Example 7

PS C:\>Write-S3Object -BucketName test-files -File "sample.txt" -TagSet @{Key="key1";Value="value1"},@{Key="key2";Value="value2"}
This command uploads the specified file (the filename is used as the key) and applies the specified tags to the new object.

Example 8

PS C:\>Write-S3Object -BucketName test-files -Folder . -KeyPrefix "TaggedFiles" -Recurse -TagSet @{Key="key1";Value="value1"},@{Key="key2";Value="value2"}
This command recursively uploads the specified folder and applies the specified tags to all the new objects.

Supported Version

AWS Tools for PowerShell: 2.x.y.z