AWS SDK Version 2 for .NET
API Reference


The BatchWriteItem operation puts or deletes multiple items in one or more tables. A single call to BatchWriteItem can write up to 16 MB of data, which can comprise as many as 25 put or delete requests. Individual items to be written can be as large as 400 KB.

BatchWriteItem cannot update items. To update items, use the UpdateItem API.

The individual PutItem and DeleteItem operations specified in BatchWriteItem are atomic; however, BatchWriteItem as a whole is not. If any requested operations fail because the table's provisioned throughput is exceeded or an internal processing failure occurs, the failed operations are returned in the UnprocessedItems response parameter. You can investigate and optionally resend the requests. Typically, you would call BatchWriteItem in a loop: each iteration checks for unprocessed items and submits a new BatchWriteItem request with those unprocessed items until all items have been processed.

Note that if none of the items can be processed due to insufficient provisioned throughput on all of the tables in the request, then BatchWriteItem will return a ProvisionedThroughputExceededException.

If DynamoDB returns any unprocessed items, you should retry the batch operation on those items. However, we strongly recommend that you use an exponential backoff algorithm. If you retry the batch operation immediately, the underlying read or write requests can still fail due to throttling on the individual tables. If you delay the batch operation using exponential backoff, the individual requests in the batch are much more likely to succeed.

For more information, see Batch Operations and Error Handling in the Amazon DynamoDB Developer Guide.
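
The example at the end of this topic shows the basic retry loop; the following minimal sketch adds the recommended exponential backoff when resubmitting UnprocessedItems. The class name, maxRetries, and baseDelayMs values are illustrative and not part of the SDK.

// Minimal sketch: resubmit UnprocessedItems with exponential backoff.
// Assumes an already-configured client and a prepared table-to-writes map;
// maxRetries and baseDelayMs are illustrative values, not SDK settings.
using System;
using System.Collections.Generic;
using System.Threading;
using Amazon.DynamoDBv2;
using Amazon.DynamoDBv2.Model;

public static class BatchWriteWithBackoff
{
    public static void Write(AmazonDynamoDBClient client,
                             Dictionary<string, List<WriteRequest>> requestItems,
                             int maxRetries,
                             int baseDelayMs)
    {
        BatchWriteItemRequest request = new BatchWriteItemRequest { RequestItems = requestItems };

        for (int attempt = 0; attempt <= maxRetries; attempt++)
        {
            BatchWriteItemResponse response = client.BatchWriteItem(request);

            // Nothing left to retry: all puts and deletes were accepted.
            if (response.UnprocessedItems.Count == 0)
                return;

            // Wait baseDelayMs, 2x, 4x, ... before resending only the leftovers.
            Thread.Sleep(baseDelayMs * (1 << attempt));
            request.RequestItems = response.UnprocessedItems;
        }

        throw new InvalidOperationException("Unprocessed items remained after all retry attempts.");
    }
}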

With BatchWriteItem, you can efficiently write or delete large amounts of data, such as from Amazon Elastic MapReduce (EMR), or copy data from another database into DynamoDB. To improve performance with these large-scale operations, BatchWriteItem does not behave in the same way as individual PutItem and DeleteItem calls would. For example, you cannot specify conditions on individual put and delete requests, and BatchWriteItem does not return deleted items in the response.

If you use a programming language that supports concurrency, you can use threads to write items in parallel. Your application must include the necessary logic to manage the threads. With languages that don't support threading, you must update or delete the specified items one at a time. In both situations, BatchWriteItem provides an alternative where the API performs the specified put and delete operations in parallel, giving you the power of the thread pool approach without having to introduce complexity into your application.

Parallel processing reduces latency, but each specified put and delete request consumes the same number of write capacity units whether it is processed in parallel or not. For example, a batch of ten puts of items under 1 KB each consumes ten write capacity units, just as ten individual PutItem calls would. Delete operations on nonexistent items consume one write capacity unit.

If one or more of the following is true, DynamoDB rejects the entire batch write operation:

* One or more tables specified in the BatchWriteItem request does not exist.
* Primary key attributes specified on an item in the request do not match those in the corresponding table's primary key schema.
* You try to perform multiple operations on the same item in the same BatchWriteItem request. For example, you cannot put and delete the same item in the same BatchWriteItem request.
* There are more than 25 requests in the batch.
* Any individual item in a batch exceeds 400 KB.
* The total request size exceeds 16 MB.
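
To stay within these limits, callers commonly split a large set of write requests into batches of at most 25 before calling BatchWriteItem. The helper below is a minimal sketch of that splitting step for a single table; the class and method names are illustrative, not part of the SDK.

// Minimal sketch: split the write requests for one table into batches of
// at most 25, the per-request limit for BatchWriteItem.
using System;
using System.Collections.Generic;
using Amazon.DynamoDBv2.Model;

public static class BatchChunker
{
    private const int MaxBatchSize = 25;

    // Returns one RequestItems dictionary per BatchWriteItem call.
    public static List<Dictionary<string, List<WriteRequest>>> Split(
        string tableName, List<WriteRequest> writes)
    {
        List<Dictionary<string, List<WriteRequest>>> batches =
            new List<Dictionary<string, List<WriteRequest>>>();

        for (int i = 0; i < writes.Count; i += MaxBatchSize)
        {
            int count = Math.Min(MaxBatchSize, writes.Count - i);

            batches.Add(new Dictionary<string, List<WriteRequest>>
            {
                { tableName, writes.GetRange(i, count) }
            });
        }

        return batches;
    }
}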

Namespace: Amazon.DynamoDBv2
Assembly: AWSSDK.dll
Version: (assembly version)

Syntax

C#
public virtual BatchWriteItemResponse BatchWriteItem(
         BatchWriteItemRequest request
)

Parameters

request
Type: Amazon.DynamoDBv2.Model.BatchWriteItemRequest

Container for the necessary parameters to execute the BatchWriteItem service method.

Return Value
Type: Amazon.DynamoDBv2.Model.BatchWriteItemResponse
The response from the BatchWriteItem service method, as returned by DynamoDB.

Exceptions

InternalServerErrorException
    An error occurred on the server side.

ItemCollectionSizeLimitExceededException
    An item collection is too large. This exception is only returned for tables that have one or more local secondary indexes.

ProvisionedThroughputExceededException
    Your request rate is too high. The AWS SDKs for DynamoDB automatically retry requests that receive this exception. Your request is eventually successful, unless your retry queue is too large to finish. Reduce the frequency of requests and use exponential backoff. For more information, go to Error Retries and Exponential Backoff in the Amazon DynamoDB Developer Guide.

ResourceNotFoundException
    The operation tried to access a nonexistent table or index. The resource might not be specified correctly, or its status might not be ACTIVE.
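
As a rough illustration of handling these conditions, the sketch below wraps a call in catch blocks for each documented exception. The client and request are assumed to be configured elsewhere, and the Console calls stand in for the application's own logging and retry policy.

// Minimal sketch: handling the documented BatchWriteItem exceptions.
using System;
using Amazon.DynamoDBv2;
using Amazon.DynamoDBv2.Model;

public static class BatchWriteErrorHandling
{
    public static void TryBatchWrite(AmazonDynamoDBClient client, BatchWriteItemRequest request)
    {
        try
        {
            BatchWriteItemResponse response = client.BatchWriteItem(request);
            Console.WriteLine("Unprocessed item sets: " + response.UnprocessedItems.Count);
        }
        catch (ProvisionedThroughputExceededException)
        {
            // Request rate is too high; slow down and retry with exponential backoff.
            Console.WriteLine("Provisioned throughput exceeded.");
        }
        catch (ResourceNotFoundException ex)
        {
            // A referenced table or index does not exist or is not ACTIVE.
            Console.WriteLine("Resource not found: " + ex.Message);
        }
        catch (ItemCollectionSizeLimitExceededException)
        {
            // Returned only for tables that have local secondary indexes.
            Console.WriteLine("Item collection size limit exceeded.");
        }
        catch (InternalServerErrorException)
        {
            // Server-side error; the batch can be retried.
            Console.WriteLine("Internal server error.");
        }
    }
}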

Examples

The following examples show how to batch write items to two tables.

This example constructs the batch-write collection for the first table in the request. The collection includes two Put operations and one Delete operation.

BatchWrite sample - First table


// Namespaces used by these samples
using System.Collections.Generic;
using Amazon.DynamoDBv2;
using Amazon.DynamoDBv2.Model;

// Create items to put into first table
Dictionary<string, AttributeValue> item1 = new Dictionary<string, AttributeValue>();
item1["Author"] = new AttributeValue { S = "Mark Twain" };
item1["Title"] = new AttributeValue { S = "A Connecticut Yankee in King Arthur's Court" };
item1["Pages"] = new AttributeValue { N = "575" };
Dictionary<string, AttributeValue> item2 = new Dictionary<string, AttributeValue>();
item2["Author"] = new AttributeValue { S = "Booker Taliaferro Washington" };
item2["Title"] = new AttributeValue { S = "My Larger Education" };
item2["Pages"] = new AttributeValue { N = "313" };
item2["Year"] = new AttributeValue { N = "1911" };

// Create key for item to delete from first table
//  Hash-key of the target item is string value "Mark Twain"
//  Range-key of the target item is string value "Tom Sawyer, Detective"
Dictionary<string, AttributeValue> keyToDelete1 = new Dictionary<string, AttributeValue>
{
    { "Author", new AttributeValue { S = "Mark Twain" } },
    { "Title", new AttributeValue { S = "Tom Sawyer, Detective" } }
};

// Construct write-request for first table
List<WriteRequest> sampleTableItems = new List<WriteRequest>();
sampleTableItems.Add(new WriteRequest
{
    PutRequest = new PutRequest { Item = item1 }
});
sampleTableItems.Add(new WriteRequest
{
    PutRequest = new PutRequest { Item = item2 }
});
sampleTableItems.Add(new WriteRequest
{
    DeleteRequest = new DeleteRequest { Key = keyToDelete1 }
});

                

This example constructs the batch-write collection for the second table in the request. The collection includes one Delete operation.

BatchWrite sample - Second table


// Create key for item to delete from second table
//  Hash-key of the target item is string value "Francis Scott Key Fitzgerald"
Dictionary<string, AttributeValue> keyToDelete2 = new Dictionary<string, AttributeValue>
{
    { "Author", new AttributeValue { S = "Francis Scott Key Fitzgerald" } },
};

// Construct write-request for second table
List<WriteRequest> authorsTableItems = new List<WriteRequest>();
authorsTableItems.Add(new WriteRequest
{
    DeleteRequest = new DeleteRequest { Key = keyToDelete2 }
});

                

This example constructs the BatchWriteItem request from the two collections created earlier, issues the call, and, if any items are not processed, resubmits the remaining items.

BatchWrite sample - Service calls


// Create a client
AmazonDynamoDBClient client = new AmazonDynamoDBClient();

// Construct table-keys mapping
Dictionary<string, List<WriteRequest>> requestItems = new Dictionary<string, List<WriteRequest>>();
requestItems["SampleTable"] = sampleTableItems;
requestItems["AuthorsTable"] = authorsTableItems;

BatchWriteItemRequest request = new BatchWriteItemRequest { RequestItems = requestItems };
BatchWriteItemResponse response;
do
{
    // Issue request and retrieve items
    response = client.BatchWriteItem(request);

    // Some items may not have been processed!
    //  Set RequestItems to the response's UnprocessedItems and reissue request
    request.RequestItems = response.UnprocessedItems;

} while (response.UnprocessedItems.Count > 0);
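
Note that, as written, the loop above resubmits unprocessed items immediately. In production code you would typically add an exponential backoff delay between iterations, as recommended earlier in this topic, so that throttled requests have time to succeed.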

                

Version Information

.NET Framework:
Supported in: 4.5, 4.0, 3.5