Name | Description
---|---
AwsVpcConfiguration | This structure specifies the VPC subnets and security groups for the task, and whether a public IP address is to be used. This structure is relevant only for ECS tasks that use the `awsvpc` network mode.
BatchArrayProperties | The array properties for the submitted job, such as the size of the array. The array size can be between 2 and 10,000. If you specify array properties for a job, it becomes an array job. This parameter is used only if the target is a Batch job.
BatchContainerOverrides | The overrides that are sent to a container.
BatchEnvironmentVariable | The environment variables to send to the container. You can add new environment variables, which are added to the container at launch, or you can override the existing environment variables from the Docker image or the task definition. Environment variables cannot start with "`Batch`"; this naming convention is reserved for variables that Batch sets.
BatchJobDependency | An object that represents a Batch job dependency.
BatchResourceRequirement | The type and amount of a resource to assign to a container. The supported resources include `GPU`, `MEMORY`, and `VCPU`.
BatchRetryStrategy | The retry strategy that's associated with a job. For more information, see Automated job retries in the Batch User Guide.
CapacityProviderStrategyItem | The details of a capacity provider strategy. To learn more, see CapacityProviderStrategyItem in the Amazon ECS API Reference.
CloudwatchLogsLogDestination | The Amazon CloudWatch Logs logging configuration settings for the pipe.
CloudwatchLogsLogDestinationParameters | The Amazon CloudWatch Logs logging configuration settings for the pipe.
ConflictException | An action you attempted resulted in an exception.
CreatePipeRequest | Container for the parameters to the CreatePipe operation. Create a pipe. Amazon EventBridge Pipes connects event sources to targets and reduces the need for specialized knowledge and integration code.
CreatePipeResponse | This is the response object from the CreatePipe operation.
DeadLetterConfig | A `DeadLetterConfig` object that contains information about a dead-letter queue configuration.
DeletePipeRequest | Container for the parameters to the DeletePipe operation. Delete an existing pipe. For more information about pipes, see Amazon EventBridge Pipes in the Amazon EventBridge User Guide.
DeletePipeResponse | This is the response object from the DeletePipe operation.
DescribePipeRequest | Container for the parameters to the DescribePipe operation. Get the information about an existing pipe. For more information about pipes, see Amazon EventBridge Pipes in the Amazon EventBridge User Guide.
DescribePipeResponse | This is the response object from the DescribePipe operation.
DimensionMapping | Maps source data to a dimension in the target Timestream for LiveAnalytics table. For more information, see Amazon Timestream for LiveAnalytics concepts.
EcsContainerOverride | The overrides that are sent to a container. An empty container override can be passed in; an example of an empty container override is `{"containerOverrides": [ ] }`. If a non-empty container override is specified, the `name` parameter must be included.
EcsEnvironmentFile | A list of files containing the environment variables to pass to a container. You can specify up to ten environment files. The file must have a `.env` file extension. If there are environment variables specified using the `environment` parameter in a container definition, they take precedence over the variables contained within an environment file. This parameter is only supported for tasks hosted on Fargate using the following platform versions: Linux platform version `1.4.0` or later, Windows platform version `1.0.0` or later.
EcsEnvironmentVariable | The environment variables to send to the container. You can add new environment variables, which are added to the container at launch, or you can override the existing environment variables from the Docker image or the task definition. You must also specify a container name.
EcsEphemeralStorage | The amount of ephemeral storage to allocate for the task. This parameter is used to expand the total amount of ephemeral storage available, beyond the default amount, for tasks hosted on Fargate. For more information, see Fargate task storage in the Amazon ECS User Guide for Fargate. This parameter is only supported for tasks hosted on Fargate using Linux platform version `1.4.0` or later.
EcsInferenceAcceleratorOverride | Details on an Elastic Inference accelerator task override. This parameter is used to override the Elastic Inference accelerator specified in the task definition. For more information, see Working with Amazon Elastic Inference on Amazon ECS in the Amazon Elastic Container Service Developer Guide.
EcsResourceRequirement | The type and amount of a resource to assign to a container. The supported resource types are GPUs and Elastic Inference accelerators. For more information, see Working with GPUs on Amazon ECS or Working with Amazon Elastic Inference on Amazon ECS in the Amazon Elastic Container Service Developer Guide.
EcsTaskOverride | The overrides that are associated with a task.
Filter | Filter events using an event pattern. For more information, see Events and Event Patterns in the Amazon EventBridge User Guide.
FilterCriteria | The collection of event patterns used to filter events. To remove a filter, specify a `FilterCriteria` object with an empty array of `Filter` objects. For more information, see Events and Event Patterns in the Amazon EventBridge User Guide.
FirehoseLogDestination | The Amazon Data Firehose logging configuration settings for the pipe.
FirehoseLogDestinationParameters | The Amazon Data Firehose logging configuration settings for the pipe.
InternalException | This exception occurs due to unexpected causes.
ListPipesRequest | Container for the parameters to the ListPipes operation. Get the pipes associated with this account. For more information about pipes, see Amazon EventBridge Pipes in the Amazon EventBridge User Guide.
ListPipesResponse | This is the response object from the ListPipes operation.
ListTagsForResourceRequest | Container for the parameters to the ListTagsForResource operation. Displays the tags associated with a pipe.
ListTagsForResourceResponse | This is the response object from the ListTagsForResource operation.
MQBrokerAccessCredentials | The Secrets Manager secret that stores your broker credentials.
MSKAccessCredentials | The Secrets Manager secret that stores your stream credentials.
MultiMeasureAttributeMapping | A mapping of a source event data field to a measure in a Timestream for LiveAnalytics record.
MultiMeasureMapping | Maps multiple measures from the source event to the same Timestream for LiveAnalytics record. For more information, see Amazon Timestream for LiveAnalytics concepts.
NetworkConfiguration | This structure specifies the network configuration for an Amazon ECS task.
NotFoundException | An entity that you specified does not exist.
Pipe | An object that represents a pipe. Amazon EventBridge Pipes connects event sources to targets and reduces the need for specialized knowledge and integration code.
PipeEnrichmentHttpParameters | These are custom parameters to be used when the target is an API Gateway REST API or EventBridge ApiDestination. In the latter case, these are merged with any InvocationParameters specified on the Connection, with any values from the Connection taking precedence.
PipeEnrichmentParameters | The parameters required to set up enrichment on your pipe.
PipeLogConfiguration | The logging configuration settings for the pipe.
PipeLogConfigurationParameters | Specifies the logging configuration settings for the pipe. When you call `UpdatePipe`, EventBridge updates the fields in the `PipeLogConfigurationParameters` object atomically as one and overrides existing values. This is by design. If you don't specify an optional field in any of the Amazon Web Services service parameters objects (`CloudwatchLogsLogDestinationParameters`, `FirehoseLogDestinationParameters`, or `S3LogDestinationParameters`), EventBridge sets that field to its system-default value during the update. For example, suppose when you created the pipe you specified a Firehose stream log destination. You then update the pipe to add an Amazon S3 log destination. In addition to specifying the `S3LogDestinationParameters` for the new log destination, you must also specify the fields in the `FirehoseLogDestinationParameters` object in order to retain the Firehose stream log destination. For more information on generating pipe log records, see Log EventBridge Pipes in the Amazon EventBridge User Guide.
PipeSourceActiveMQBrokerParameters | The parameters for using an Active MQ broker as a source.
PipeSourceDynamoDBStreamParameters | The parameters for using a DynamoDB stream as a source.
PipeSourceKinesisStreamParameters | The parameters for using a Kinesis stream as a source.
PipeSourceManagedStreamingKafkaParameters | The parameters for using an MSK stream as a source.
PipeSourceParameters | The parameters required to set up a source for your pipe.
PipeSourceRabbitMQBrokerParameters | The parameters for using a Rabbit MQ broker as a source.
PipeSourceSelfManagedKafkaParameters | The parameters for using a self-managed Apache Kafka stream as a source. A self-managed cluster refers to any Apache Kafka cluster not hosted by Amazon Web Services. This includes both clusters you manage yourself and those hosted by a third-party provider, such as Confluent Cloud, CloudKarafka, or Redpanda. For more information, see Apache Kafka streams as a source in the Amazon EventBridge User Guide.
PipeSourceSqsQueueParameters | The parameters for using an Amazon SQS queue as a source.
PipesPaginatorFactory | Paginators for the Pipes service.
PipeTargetBatchJobParameters | The parameters for using a Batch job as a target.
PipeTargetCloudWatchLogsParameters | The parameters for using a CloudWatch Logs log stream as a target.
PipeTargetEcsTaskParameters | The parameters for using an Amazon ECS task as a target.
PipeTargetEventBridgeEventBusParameters | The parameters for using an EventBridge event bus as a target.
PipeTargetHttpParameters | These are custom parameters to be used when the target is an API Gateway REST API or EventBridge ApiDestination.
PipeTargetKinesisStreamParameters | The parameters for using a Kinesis stream as a target.
PipeTargetLambdaFunctionParameters | The parameters for using a Lambda function as a target.
PipeTargetParameters | The parameters required to set up a target for your pipe. For more information about pipe target parameters, including how to use dynamic path parameters, see Target parameters in the Amazon EventBridge User Guide.
PipeTargetRedshiftDataParameters | These are custom parameters to be used when the target is an Amazon Redshift cluster to invoke the Amazon Redshift Data API BatchExecuteStatement operation.
PipeTargetSageMakerPipelineParameters | The parameters for using a SageMaker pipeline as a target.
PipeTargetSqsQueueParameters | The parameters for using an Amazon SQS queue as a target.
PipeTargetStateMachineParameters | The parameters for using a Step Functions state machine as a target.
PipeTargetTimestreamParameters | The parameters for using a Timestream for LiveAnalytics table as a target.
PlacementConstraint | An object representing a constraint on task placement. To learn more, see Task Placement Constraints in the Amazon Elastic Container Service Developer Guide.
PlacementStrategy | The task placement strategy for a task or service. To learn more, see Task Placement Strategies in the Amazon Elastic Container Service Developer Guide.
S3LogDestination | The Amazon S3 logging configuration settings for the pipe.
S3LogDestinationParameters | The Amazon S3 logging configuration settings for the pipe.
SageMakerPipelineParameter | Name/value pair of a parameter to start execution of a SageMaker Model Building Pipeline.
SelfManagedKafkaAccessConfigurationCredentials | The Secrets Manager secret that stores your stream credentials.
SelfManagedKafkaAccessConfigurationVpc | This structure specifies the VPC subnets and security groups for the stream, and whether a public IP address is to be used.
ServiceQuotaExceededException | A quota has been exceeded.
SingleMeasureMapping | Maps a single source data field to a single record in the specified Timestream for LiveAnalytics table. For more information, see Amazon Timestream for LiveAnalytics concepts.
StartPipeRequest | Container for the parameters to the StartPipe operation. Start an existing pipe.
StartPipeResponse | This is the response object from the StartPipe operation.
StopPipeRequest | Container for the parameters to the StopPipe operation. Stop an existing pipe.
StopPipeResponse | This is the response object from the StopPipe operation.
Tag | A key-value pair associated with an Amazon Web Services resource. In EventBridge, rules and event buses support tagging.
TagResourceRequest | Container for the parameters to the TagResource operation. Assigns one or more tags (key-value pairs) to the specified pipe. Tags can help you organize and categorize your resources. You can also use them to scope user permissions by granting a user permission to access or change only resources with certain tag values. Tags don't have any semantic meaning to Amazon Web Services and are interpreted strictly as strings of characters. You can use the `TagResource` action with a pipe that already has tags. If you specify a new tag key, this tag is appended to the list of tags associated with the pipe. If you specify a tag key that is already associated with the pipe, the new tag value that you specify replaces the previous value for that tag. You can associate as many as 50 tags with a pipe.
TagResourceResponse | This is the response object from the TagResource operation.
ThrottlingException | An action was throttled.
UntagResourceRequest | Container for the parameters to the UntagResource operation. Removes one or more tags from the specified pipes.
UntagResourceResponse | This is the response object from the UntagResource operation.
UpdatePipeRequest | Container for the parameters to the UpdatePipe operation. Update an existing pipe. When you call `UpdatePipe`, EventBridge only updates the fields that you have specified in the request; the rest remain unchanged. The exception to this is if you modify any Amazon Web Services service-specific fields in the `SourceParameters`, `EnrichmentParameters`, or `TargetParameters` objects, such as `DynamoDBStreamParameters` or `EventBridgeEventBusParameters`. EventBridge updates the fields in these objects atomically as one and overrides existing values. This is by design, and means that if you don't specify an optional field in one of these `Parameters` objects, EventBridge sets that field to its system-default value during the update. For more information about pipes, see Amazon EventBridge Pipes in the Amazon EventBridge User Guide.
UpdatePipeResponse | This is the response object from the UpdatePipe operation.
|
UpdatePipeSourceActiveMQBrokerParameters |
The parameters for using an Active MQ broker as a source. |
|
UpdatePipeSourceDynamoDBStreamParameters |
The parameters for using a DynamoDB stream as a source. |
|
UpdatePipeSourceKinesisStreamParameters |
The parameters for using a Kinesis stream as a source. |
|
UpdatePipeSourceManagedStreamingKafkaParameters |
The parameters for using an MSK stream as a source. |
|
UpdatePipeSourceParameters |
The parameters required to set up a source for your pipe. |
|
UpdatePipeSourceRabbitMQBrokerParameters |
The parameters for using a Rabbit MQ broker as a source. |
|
UpdatePipeSourceSelfManagedKafkaParameters |
The parameters for using a self-managed Apache Kafka stream as a source. A self managed cluster refers to any Apache Kafka cluster not hosted by Amazon Web Services. This includes both clusters you manage yourself, as well as those hosted by a third-party provider, such as Confluent Cloud, CloudKarafka, or Redpanda. For more information, see Apache Kafka streams as a source in the Amazon EventBridge User Guide. |
|
UpdatePipeSourceSqsQueueParameters |
The parameters for using a Amazon SQS stream as a source. |
|
ValidationException |
Indicates that an error has occurred while performing a validate operation. |
|
ValidationExceptionField |
Indicates that an error has occurred while performing a validate operation. |
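To show how the request structures described above fit together, here is a minimal sketch of assembling `CreatePipe` parameters, including a `FilterCriteria` value whose `Pattern` is an event pattern serialized as a JSON string. The sketch uses Python; the pipe name, role, and queue ARNs are placeholders, and the pattern is illustrative only.

```python
import json

# Illustrative event pattern: match SQS messages whose body has status=ok.
event_pattern = {"body": {"status": ["ok"]}}

create_pipe_params = {
    "Name": "demo-pipe",                                         # placeholder
    "RoleArn": "arn:aws:iam::111122223333:role/demo-pipe-role",  # placeholder
    "Source": "arn:aws:sqs:us-east-1:111122223333:source-queue", # placeholder
    "Target": "arn:aws:sqs:us-east-1:111122223333:target-queue", # placeholder
    "SourceParameters": {
        # Each Filter holds the event pattern as a JSON *string*.
        "FilterCriteria": {"Filters": [{"Pattern": json.dumps(event_pattern)}]},
    },
    "Tags": {"team": "data"},  # a pipe supports at most 50 tags
}

# With AWS credentials configured, the call itself would be roughly:
#   import boto3
#   pipes = boto3.client("pipes")
#   response = pipes.create_pipe(**create_pipe_params)
```

Note the double encoding: the request body is JSON, and the `Pattern` field inside it is itself a JSON document stored as a string, so it must be serialized separately.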

Name | Description
---|---
IListPipesPaginator | Paginator for the ListPipes operation.
IPipesPaginatorFactory | Paginators for the Pipes service.
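The paginator interfaces above wrap the service's next-token contract for `ListPipes`: each page returns up to a page of pipes plus a `NextToken` to pass into the next call, until the token is absent. A minimal sketch of that loop, using a stand-in function with fake data instead of a real SDK client:

```python
def list_pipes_page(next_token=None, page_size=2):
    """Stand-in for the real ListPipes call; returns one page plus a token."""
    pipes = ["pipe-a", "pipe-b", "pipe-c"]  # fake data for the sketch
    start = int(next_token or 0)
    page = pipes[start:start + page_size]
    more = start + page_size < len(pipes)
    return {"Pipes": page, "NextToken": str(start + page_size) if more else None}

def all_pipes():
    """Drain every page, exactly as a paginator would."""
    token, results = None, []
    while True:
        page = list_pipes_page(token)
        results.extend(page["Pipes"])
        token = page.get("NextToken")
        if token is None:
            return results
```

A generated paginator (such as `IListPipesPaginator`, or `get_paginator("list_pipes")` in boto3) hides this loop behind an iterable, but the underlying token handshake is the same.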