ManagedKafkaEventSource

class aws_cdk.aws_lambda_event_sources.ManagedKafkaEventSource(*, cluster_arn, topic, consumer_group_id=None, filters=None, on_failure=None, secret=None, starting_position, batch_size=None, enabled=None, max_batching_window=None)

Bases: StreamEventSource

Use an MSK cluster as a streaming source for AWS Lambda.


Example:

import aws_cdk.aws_lambda as lambda_
from aws_cdk.aws_secretsmanager import Secret
from aws_cdk.aws_lambda_event_sources import ManagedKafkaEventSource

# my_function: lambda.Function


# Your MSK cluster arn
cluster_arn = "arn:aws:kafka:us-east-1:0123456789019:cluster/SalesCluster/abcd1234-abcd-cafe-abab-9876543210ab-4"

# The Kafka topic you want to subscribe to
topic = "some-cool-topic"

# The secret that allows access to your MSK cluster
# You still have to make sure that it is associated with your cluster as described in the documentation
secret = Secret(self, "Secret", secret_name="AmazonMSK_KafkaSecret")
my_function.add_event_source(ManagedKafkaEventSource(
    cluster_arn=cluster_arn,
    topic=topic,
    secret=secret,
    batch_size=100,  # default
    starting_position=lambda_.StartingPosition.TRIM_HORIZON
))
Parameters:
  • cluster_arn (str) – The ARN of the MSK cluster to consume from.

  • topic (str) – The Kafka topic to subscribe to.

  • consumer_group_id (Optional[str]) – The identifier for the Kafka consumer group to join. The consumer group ID must be unique among all your Kafka event sources. After creating a Kafka event source mapping with the consumer group ID specified, you cannot update this value. The value must have a length between 1 and 200 and match the pattern ‘[a-zA-Z0-9-/:_+=.@-]’. Default: - none

  • filters (Optional[Sequence[Mapping[str, Any]]]) – Add filter criteria to the event source (see the sketch after this parameter list). Default: - none

  • on_failure (Optional[IEventSourceDlq]) – Add an on-failure destination for this Kafka event source; SNS, SQS, and S3 are supported (see the sketch after this parameter list). Default: - discarded records are ignored

  • secret (Optional[ISecret]) – The secret with the Kafka credentials; see https://docs.aws.amazon.com/msk/latest/developerguide/msk-password.html for details. This field is required if your Kafka brokers are accessed over the Internet. Default: none

  • starting_position (StartingPosition) – Where to begin consuming the stream.

  • batch_size (Union[int, float, None]) – The largest number of records that AWS Lambda will retrieve from your event source at the time of invoking your function. Your function receives an event with all the retrieved records. Valid range: minimum value of 1; maximum value of 1000 for DynamoEventSource, or 10000 for KinesisEventSource, ManagedKafkaEventSource, and SelfManagedKafkaEventSource. Default: 100

  • enabled (Optional[bool]) – If the stream event source mapping should be enabled. Default: true

  • max_batching_window (Optional[Duration]) – The maximum amount of time to gather records before invoking the function. Maximum of Duration.minutes(5). Default: - Duration.seconds(0) for Kinesis, DynamoDB, and SQS event sources, Duration.millis(500) for MSK, self-managed Kafka, and Amazon MQ.
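For illustration, the optional settings above can be combined with the basic example. The following is a minimal sketch, assuming a recent aws-cdk-lib version that provides S3OnFailureDestination, a hypothetical failure bucket, and a hypothetical "status" field inside the record value; the FilterCriteria and FilterRule helpers come from aws_cdk.aws_lambda:

import aws_cdk.aws_lambda as lambda_
import aws_cdk.aws_s3 as s3
from aws_cdk import Duration
from aws_cdk.aws_lambda_event_sources import ManagedKafkaEventSource, S3OnFailureDestination

# my_function: lambda.Function
# cluster_arn, topic and secret as in the example above

# Hypothetical bucket that receives records from failed invocations
failure_bucket = s3.Bucket(self, "KafkaFailureBucket")

my_function.add_event_source(ManagedKafkaEventSource(
    cluster_arn=cluster_arn,
    topic=topic,
    secret=secret,
    consumer_group_id="my-consumer-group",  # cannot be changed after creation
    starting_position=lambda_.StartingPosition.LATEST,
    max_batching_window=Duration.seconds(10),
    # Only invoke the function for records whose JSON value has status == "ready"
    filters=[
        lambda_.FilterCriteria.filter({
            "value": {"status": lambda_.FilterRule.is_equal("ready")}
        })
    ],
    # Send records from failed invocations to the bucket instead of discarding them
    on_failure=S3OnFailureDestination(failure_bucket)
))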

Methods

bind(target)

Called by lambda.addEventSource to allow the event source to bind to this function.

Parameters:

target (IFunction) –

Return type:

None
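You normally do not call bind yourself; it is invoked for you when the event source is added to a function. A minimal sketch reusing the names from the example above:

source = ManagedKafkaEventSource(
    cluster_arn=cluster_arn,
    topic=topic,
    secret=secret,
    starting_position=lambda_.StartingPosition.TRIM_HORIZON
)
# add_event_source calls source.bind(my_function) internally
my_function.add_event_source(source)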

Attributes

event_source_mapping_arn

The ARN for this EventSourceMapping.

event_source_mapping_id

The identifier for this EventSourceMapping.
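Both attributes refer to the EventSourceMapping created when the source is bound, so read them only after add_event_source has been called. A minimal sketch, assuming the source and function from the examples above and CfnOutput from aws_cdk:

from aws_cdk import CfnOutput

my_function.add_event_source(source)

# Expose the identifiers of the mapping created by the binding
CfnOutput(self, "KafkaMappingId", value=source.event_source_mapping_id)
CfnOutput(self, "KafkaMappingArn", value=source.event_source_mapping_arn)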