AWS Tools for Windows PowerShell
Command Reference

AWS services or capabilities described in AWS Documentation may vary by region/location. See Getting Started with Amazon AWS for the specific differences applicable to the China (Beijing) Region.

Synopsis

Calls the Amazon Kinesis PutRecord API operation.

Syntax

FromBlob (Default)

Write-KINRecord
-StreamName <String>
-Data <Byte[]>
-ExplicitHashKey <String>
-PartitionKey <String>
-SequenceNumberForOrdering <String>
-StreamARN <String>
-Select <String>
-PassThru <SwitchParameter>
-Force <SwitchParameter>
-ClientConfig <AmazonKinesisConfig>

FromText

Write-KINRecord
-StreamName <String>
-Text <String>
-ExplicitHashKey <String>
-PartitionKey <String>
-SequenceNumberForOrdering <String>
-StreamARN <String>
-Select <String>
-PassThru <SwitchParameter>
-Force <SwitchParameter>
-ClientConfig <AmazonKinesisConfig>

FromFile

Write-KINRecord
-StreamName <String>
-FilePath <String>
-ExplicitHashKey <String>
-PartitionKey <String>
-SequenceNumberForOrdering <String>
-StreamARN <String>
-Select <String>
-PassThru <SwitchParameter>
-Force <SwitchParameter>
-ClientConfig <AmazonKinesisConfig>

Description

Writes a single data record into an Amazon Kinesis data stream. Call PutRecord to send data into the stream for real-time ingestion and subsequent processing, one record at a time. Each shard can support writes up to 1,000 records per second, up to a maximum data write total of 1 MiB per second.

When invoking this API, you must use either the StreamARN or the StreamName parameter, or both. It is recommended that you use the StreamARN input parameter when you invoke this API. You must specify the name of the stream that captures, stores, and transports the data; a partition key; and the data blob itself. The data blob can be any type of data; for example, a segment from a log file, geographic/location data, website clickstream data, and so on.

The partition key is used by Kinesis Data Streams to distribute data across shards. Kinesis Data Streams segregates the data records that belong to a stream into multiple shards, using the partition key associated with each data record to determine the shard to which a given data record belongs. Partition keys are Unicode strings, with a maximum length limit of 256 characters for each key. An MD5 hash function is used to map partition keys to 128-bit integer values and to map associated data records to shards using the hash key ranges of the shards. You can override hashing the partition key to determine the shard by explicitly specifying a hash value using the ExplicitHashKey parameter. For more information, see Adding Data to a Stream in the Amazon Kinesis Data Streams Developer Guide.

PutRecord returns the shard ID of where the data record was placed and the sequence number that was assigned to the data record. Sequence numbers increase over time and are specific to a shard within a stream, not across all shards within a stream. To guarantee strictly increasing ordering, write serially to a shard and use the SequenceNumberForOrdering parameter. For more information, see Adding Data to a Stream in the Amazon Kinesis Data Streams Developer Guide.

After you write a record to a stream, you cannot modify that record or its order within the stream. If a PutRecord request cannot be processed because of insufficient provisioned throughput on the shard involved in the request, PutRecord throws ProvisionedThroughputExceededException. By default, data records are accessible for 24 hours from the time that they are added to a stream. You can use IncreaseStreamRetentionPeriod or DecreaseStreamRetentionPeriod to modify this retention period.
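The following sketch illustrates the call flow described above; the stream name, partition key, and payload are placeholder values. The cmdlet's default output (the full PutRecordResponse) is captured so the shard ID and sequence number can be inspected.

$response = Write-KINRecord -StreamName "mystream" -PartitionKey "device-42" -Text "sensor reading 17.3"
# The response reports the shard the record landed on and the sequence number it was assigned.
"ShardId: $($response.ShardId)"
"SequenceNumber: $($response.SequenceNumber)"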

Parameters

-ClientConfig <AmazonKinesisConfig>
The client configuration object (Amazon.Kinesis.AmazonKinesisConfig) to use for this call. It can be used to override client-side settings such as the service endpoint, timeouts, and retry behavior.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
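A minimal sketch of supplying a custom client configuration, assuming the AWS SDK Kinesis types are loadable in the session; the timeout and retry values shown are arbitrary examples, not defaults.

$config = New-Object Amazon.Kinesis.AmazonKinesisConfig
$config.Timeout = [TimeSpan]::FromSeconds(10)   # client-side request timeout (example value)
$config.MaxErrorRetry = 2                       # number of SDK retries (example value)
Write-KINRecord -StreamName "mystream" -PartitionKey "Key1" -Text "payload" -ClientConfig $config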
-Data <Byte[]>
The data blob to put into the record, which is base64-encoded when the blob is serialized. When the data blob (the payload before base64-encoding) is added to the partition key size, the total size must not exceed the maximum record size (1 MiB). The cmdlet will automatically convert the supplied parameter of type string, string[], System.IO.FileInfo or System.IO.Stream to byte[] before supplying it to the service.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases: Blob, Record_Data
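Because of that automatic conversion you can also pass a raw byte array directly. The sketch below (stream and key names are placeholders) encodes a string explicitly before sending it:

# Encode the payload as UTF-8 bytes and write it as the record data.
$bytes = [System.Text.Encoding]::UTF8.GetBytes("sensor reading 17.3")
Write-KINRecord -StreamName "mystream" -PartitionKey "device-42" -Data $bytes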
-ExplicitHashKey <String>
The hash value used to explicitly determine the shard the data record is assigned to by overriding the partition key hash.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
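As an illustrative sketch (names are placeholders), the call below bypasses the MD5 hash of the partition key and routes the record to whichever shard's hash key range contains the explicit value 0:

Write-KINRecord -StreamName "mystream" -PartitionKey "audit" -Text "audit event" -ExplicitHashKey "0"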
-FilePath <String>
This parameter is obsolete and will be removed in a future version. Use 'Record_Data' instead. The fully qualified path of a file containing the data to send, which is base64-encoded when the blob is serialized. When the data blob (the payload before base64-encoding) is added to the partition key size, the total size must not exceed the maximum record size (1 MiB).
Required? True
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases: Record_FilePath
-Force <SwitchParameter>
This parameter overrides confirmation prompts to force the cmdlet to continue its operation. This parameter should always be used with caution.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
-PartitionKey <String>
Determines which shard in the stream the data record is assigned to. Partition keys are Unicode strings with a maximum length limit of 256 characters for each key. Amazon Kinesis Data Streams uses the partition key as input to a hash function that maps the partition key and associated data to a specific shard. Specifically, an MD5 hash function is used to map partition keys to 128-bit integer values and to map associated data records to shards. As a result of this hashing mechanism, all data records with the same partition key map to the same shard within the stream.
Required? True
Position? Named
Accept pipeline input? True (ByPropertyName)
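To see the hashing behavior described above, the sketch below (illustrative stream and key names) writes two records with the same partition key and compares the shard IDs in the responses; under the documented MD5 mapping they should be identical:

$r1 = Write-KINRecord -StreamName "mystream" -PartitionKey "customer-7" -Text "order created"
$r2 = Write-KINRecord -StreamName "mystream" -PartitionKey "customer-7" -Text "order shipped"
# Both records should report the same ShardId.
"Same shard: $($r1.ShardId -eq $r2.ShardId)"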
-PassThru <SwitchParameter>
Changes the cmdlet behavior to return the value passed to the StreamName parameter. The -PassThru parameter is deprecated; use -Select '^StreamName' instead. This parameter will be removed in a future version.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
-Select <String>
Use the -Select parameter to control the cmdlet output. The default value is '*'. Specifying -Select '*' will result in the cmdlet returning the whole service response (Amazon.Kinesis.Model.PutRecordResponse). Specifying the name of a property of type Amazon.Kinesis.Model.PutRecordResponse will result in that property being returned. Specifying -Select '^ParameterName' will result in the cmdlet returning the selected cmdlet parameter value.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
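For example, a minimal sketch (stream and key names assumed) that returns only the sequence number assigned to the new record rather than the full response object:

$sequenceNumber = Write-KINRecord -StreamName "mystream" -PartitionKey "Key1" -Text "payload" -Select 'SequenceNumber'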
-SequenceNumberForOrdering <String>
Guarantees strictly increasing sequence numbers, for puts from the same client and to the same partition key. Usage: set the SequenceNumberForOrdering of record n to the sequence number of record n-1 (as returned in the result when putting record n-1). If this parameter is not set, records are coarsely ordered based on arrival time.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
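A hedged sketch of that usage pattern, with placeholder names: each call passes the sequence number returned by the previous one, so the three records receive strictly increasing sequence numbers on the shard.

$first  = Write-KINRecord -StreamName "mystream" -PartitionKey "Key1" -Text "record 1"
$second = Write-KINRecord -StreamName "mystream" -PartitionKey "Key1" -Text "record 2" -SequenceNumberForOrdering $first.SequenceNumber
$third  = Write-KINRecord -StreamName "mystream" -PartitionKey "Key1" -Text "record 3" -SequenceNumberForOrdering $second.SequenceNumber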
-StreamARN <String>
The ARN of the stream.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
-StreamName <String>
The name of the stream to put the data record into.
Required? False
Position? 1
Accept pipeline input? True (ByValue, ByPropertyName)
-Text <String>
This parameter is obsolete and will be removed in a future version. Use 'Record_Data' instead. Text string containing the data to send, which is base64-encoded when the blob is serialized. When the data blob (the payload before base64-encoding) is added to the partition key size, the total size must not exceed the maximum record size (1 MiB).
Required? True
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases: Record_Text

Common Credential and Region Parameters

-AccessKey <String>
The AWS access key for the user account. This can be a temporary access key if the corresponding session token is supplied to the -SessionToken parameter.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases: AK
-Credential <AWSCredentials>
An AWSCredentials object instance containing access and secret key information, and optionally a token for session-based credentials.
Required? False
Position? Named
Accept pipeline input? True (ByValue, ByPropertyName)
-EndpointUrl <String>
The endpoint to make the call against. Note: This parameter is primarily for internal AWS use and is not required/should not be specified for normal usage. The cmdlets normally determine which endpoint to call based on the region specified to the -Region parameter or set as default in the shell (via Set-DefaultAWSRegion). Only specify this parameter if you must direct the call to a specific custom endpoint.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
-NetworkCredential <PSCredential>
Used with SAML-based authentication when ProfileName references a SAML role profile. Contains the network credentials to be supplied during authentication with the configured identity provider's endpoint. This parameter is not required if the user's default network identity can or should be used during authentication.
Required? False
Position? Named
Accept pipeline input? True (ByValue, ByPropertyName)
-ProfileLocation <String>
Used to specify the name and location of the ini-format credential file (shared with the AWS CLI and other AWS SDKs). If this optional parameter is omitted, this cmdlet will first search the encrypted credential file used by the AWS SDK for .NET and AWS Toolkit for Visual Studio. If the profile is not found, the cmdlet will then search the ini-format credential file at the default location: (user's home directory)\.aws\credentials. If this parameter is specified, this cmdlet will only search the ini-format credential file at the location given. As the current folder can vary in a shell or during script execution, it is advised that you specify a fully qualified path instead of a relative path.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases: AWSProfilesLocation, ProfilesLocation
-ProfileName <String>
The user-defined name of an AWS credentials or SAML-based role profile containing credential information. The profile is expected to be found in the secure credential file shared with the AWS SDK for .NET and AWS Toolkit for Visual Studio. You can also specify the name of a profile stored in the .ini-format credential file used with the AWS CLI and other AWS SDKs.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases: StoredCredentials, AWSProfileName
-Region <Object>
The system name of an AWS region or an AWSRegion instance. This governs the endpoint that will be used when calling service operations. Note that the AWS resources referenced in a call are usually region-specific.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases: RegionToCall
-SecretKey <String>
The AWS secret key for the user account. This can be a temporary secret key if the corresponding session token is supplied to the -SessionToken parameter.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases: SK, SecretAccessKey
-SessionToken <String>
The session token if the access and secret keys are temporary session-based credentials.
Required? False
Position? Named
Accept pipeline input? True (ByPropertyName)
Aliases: ST
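As an illustration of combining these common parameters, the sketch below sends a record using a named credential profile and an explicit region; the profile name, region, and stream values are assumptions, not defaults:

Write-KINRecord -StreamName "mystream" -PartitionKey "Key1" -Text "payload" -ProfileName "my-dev-profile" -Region us-west-2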

Outputs

This cmdlet returns an Amazon.Kinesis.Model.PutRecordResponse object containing multiple properties. The object can also be referenced from properties attached to the cmdlet entry in the $AWSHistory stack.
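If you need the response after the call has completed, a minimal sketch (assuming response history recording is enabled in the session) retrieves it from the $AWSHistory stack:

Write-KINRecord -StreamName "mystream" -PartitionKey "Key1" -Text "payload"
# The most recent service response is kept on the $AWSHistory stack.
$last = $AWSHistory.LastServiceResponse
"ShardId: $($last.ShardId)  SequenceNumber: $($last.SequenceNumber)"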

Examples

Example 1

Write-KINRecord -Text "test data from string" -StreamName "mystream" -PartitionKey "Key1"
Writes a record containing the string supplied to the -Text parameter.

Example 2

Write-KINRecord -FilePath "C:\TestData.txt" -StreamName "mystream" -PartitionKey "Key2"
Writes a record containing the data contained in the specified file. The file is treated as a sequence of bytes, so if it contains text, it should have been written with any necessary encoding before being used with this cmdlet.

Supported Version

AWS Tools for PowerShell: 2.x.y.z