CfnTransformer
- class aws_cdk.aws_logs.CfnTransformer(scope, id, *, log_group_identifier, transformer_config)
Bases:
CfnResource
Creates or updates a log transformer for a single log group.
You use log transformers to transform log events into a different format, making them easier for you to process and analyze. You can also transform logs from different sources into standardized formats that contain relevant, source-specific information.
After you have created a transformer, CloudWatch Logs performs the transformations at the time of log ingestion. You can then refer to the transformed versions of the logs during operations such as querying with CloudWatch Logs Insights or creating metric filters or subscription filters.
You can also use a transformer to copy metadata from metadata keys into the log events themselves. This metadata can include log group name, log stream name, account ID and Region.
A transformer for a log group is a series of processors, where each processor applies one type of transformation to the log events ingested into this log group. The processors work one after another, in the order that you list them, like a pipeline. For more information about the available processors to use in a transformer, see Processors that you can use .
Having log events in standardized format enables visibility across your applications for your log analysis, reporting, and alarming needs. CloudWatch Logs provides transformation for common log types with out-of-the-box transformation templates for major AWS log sources such as VPC flow logs, Lambda, and Amazon RDS. You can use pre-built transformation templates or create custom transformation policies.
You can create transformers only for the log groups in the Standard log class.
You can also set up a transformer at the account level. For more information, see PutAccountPolicy . If there is both a log-group level transformer created with
PutTransformer
and an account-level transformer that could apply to the same log group, the log group uses only the log-group level transformer. It ignores the account-level transformer.- See:
http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-logs-transformer.html
- CloudformationResource:
AWS::Logs::Transformer
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_logs as logs

cfn_transformer = logs.CfnTransformer(self, "MyCfnTransformer",
    log_group_identifier="logGroupIdentifier",
    transformer_config=[logs.CfnTransformer.ProcessorProperty(
        add_keys=logs.CfnTransformer.AddKeysProperty(
            entries=[logs.CfnTransformer.AddKeyEntryProperty(
                key="key",
                value="value",
                # the properties below are optional
                overwrite_if_exists=False
            )]
        ),
        copy_value=logs.CfnTransformer.CopyValueProperty(
            entries=[logs.CfnTransformer.CopyValueEntryProperty(
                source="source",
                target="target",
                # the properties below are optional
                overwrite_if_exists=False
            )]
        ),
        csv=logs.CfnTransformer.CsvProperty(
            columns=["columns"],
            delimiter="delimiter",
            quote_character="quoteCharacter",
            source="source"
        ),
        date_time_converter=logs.CfnTransformer.DateTimeConverterProperty(
            match_patterns=["matchPatterns"],
            source="source",
            target="target",
            # the properties below are optional
            locale="locale",
            source_timezone="sourceTimezone",
            target_format="targetFormat",
            target_timezone="targetTimezone"
        ),
        delete_keys=logs.CfnTransformer.DeleteKeysProperty(
            with_keys=["withKeys"]
        ),
        grok=logs.CfnTransformer.GrokProperty(
            match="match",
            # the properties below are optional
            source="source"
        ),
        list_to_map=logs.CfnTransformer.ListToMapProperty(
            key="key",
            source="source",
            # the properties below are optional
            flatten=False,
            flattened_element="flattenedElement",
            target="target",
            value_key="valueKey"
        ),
        lower_case_string=logs.CfnTransformer.LowerCaseStringProperty(
            with_keys=["withKeys"]
        ),
        move_keys=logs.CfnTransformer.MoveKeysProperty(
            entries=[logs.CfnTransformer.MoveKeyEntryProperty(
                source="source",
                target="target",
                # the properties below are optional
                overwrite_if_exists=False
            )]
        ),
        parse_cloudfront=logs.CfnTransformer.ParseCloudfrontProperty(
            source="source"
        ),
        parse_json=logs.CfnTransformer.ParseJSONProperty(
            destination="destination",
            source="source"
        ),
        parse_key_value=logs.CfnTransformer.ParseKeyValueProperty(
            destination="destination",
            field_delimiter="fieldDelimiter",
            key_prefix="keyPrefix",
            key_value_delimiter="keyValueDelimiter",
            non_match_value="nonMatchValue",
            overwrite_if_exists=False,
            source="source"
        ),
        parse_postgres=logs.CfnTransformer.ParsePostgresProperty(
            source="source"
        ),
        parse_route53=logs.CfnTransformer.ParseRoute53Property(
            source="source"
        ),
        parse_vpc=logs.CfnTransformer.ParseVPCProperty(
            source="source"
        ),
        parse_waf=logs.CfnTransformer.ParseWAFProperty(
            source="source"
        ),
        rename_keys=logs.CfnTransformer.RenameKeysProperty(
            entries=[logs.CfnTransformer.RenameKeyEntryProperty(
                key="key",
                rename_to="renameTo",
                # the properties below are optional
                overwrite_if_exists=False
            )]
        ),
        split_string=logs.CfnTransformer.SplitStringProperty(
            entries=[logs.CfnTransformer.SplitStringEntryProperty(
                delimiter="delimiter",
                source="source"
            )]
        ),
        substitute_string=logs.CfnTransformer.SubstituteStringProperty(
            entries=[logs.CfnTransformer.SubstituteStringEntryProperty(
                from_="from",
                source="source",
                to="to"
            )]
        ),
        trim_string=logs.CfnTransformer.TrimStringProperty(
            with_keys=["withKeys"]
        ),
        type_converter=logs.CfnTransformer.TypeConverterProperty(
            entries=[logs.CfnTransformer.TypeConverterEntryProperty(
                key="key",
                type="type"
            )]
        ),
        upper_case_string=logs.CfnTransformer.UpperCaseStringProperty(
            with_keys=["withKeys"]
        )
    )]
)
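A more realistic sketch of a two-step transformer is shown below. The log group name, processor choice, and key values are illustrative assumptions rather than part of the generated example, and the snippet assumes it runs inside a Stack or Construct scope (self):
from aws_cdk import aws_logs as logs

# Parse JSON log events first, then stamp every event with an "environment" key.
app_transformer = logs.CfnTransformer(self, "AppLogTransformer",
    log_group_identifier="my-application-log-group",
    transformer_config=[
        logs.CfnTransformer.ProcessorProperty(
            parse_json=logs.CfnTransformer.ParseJSONProperty()
        ),
        logs.CfnTransformer.ProcessorProperty(
            add_keys=logs.CfnTransformer.AddKeysProperty(
                entries=[logs.CfnTransformer.AddKeyEntryProperty(
                    key="environment",
                    value="production"
                )]
            )
        )
    ]
)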
- Parameters:
scope (
Construct
) – Scope in which this resource is defined.id (
str
) – Construct identifier for this resource (unique in its scope).log_group_identifier (
str
) – Specify either the name or ARN of the log group to create the transformer for.transformer_config (
Union
[IResolvable
,Sequence
[Union
[IResolvable
,ProcessorProperty
,Dict
[str
,Any
]]]]) – This structure is an array that contains the configuration of this log transformer. A log transformer is an array of processors, where each processor applies one type of transformation to the log events that are ingested.
Methods
- add_deletion_override(path)
Syntactic sugar for
addOverride(path, undefined)
.- Parameters:
path (
str
) – The path of the value to delete.- Return type:
None
- add_dependency(target)
Indicates that this resource depends on another resource and cannot be provisioned unless the other resource has been successfully provisioned.
This can be used for resources across stacks (or nested stack) boundaries and the dependency will automatically be transferred to the relevant scope.
- Parameters:
target (
CfnResource
) –- Return type:
None
- add_depends_on(target)
(deprecated) Indicates that this resource depends on another resource and cannot be provisioned unless the other resource has been successfully provisioned.
- Parameters:
target (
CfnResource
) –- Deprecated:
use addDependency
- Stability:
deprecated
- Return type:
None
- add_metadata(key, value)
Add a value to the CloudFormation Resource Metadata.
- Parameters:
key (
str
) –value (
Any
) –
- See:
https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/metadata-section-structure.html
Note that this is a different set of metadata from CDK node metadata; this metadata ends up in the stack template under the resource, whereas CDK node metadata ends up in the Cloud Assembly.
- Return type:
None
- add_override(path, value)
Adds an override to the synthesized CloudFormation resource.
To add a property override, either use
addPropertyOverride
or prefixpath
with “Properties.” (i.e.Properties.TopicName
).If the override is nested, separate each nested level using a dot (.) in the path parameter. If there is an array as part of the nesting, specify the index in the path.
To include a literal
.
in the property name, prefix with a\
. In most programming languages you will need to write this as"\\."
because the\
itself will need to be escaped. For example:
cfn_resource.add_override("Properties.GlobalSecondaryIndexes.0.Projection.NonKeyAttributes", ["myattribute"])
cfn_resource.add_override("Properties.GlobalSecondaryIndexes.1.ProjectionType", "INCLUDE")
would add the overrides Example:
"Properties": { "GlobalSecondaryIndexes": [ { "Projection": { "NonKeyAttributes": [ "myattribute" ] ... } ... }, { "ProjectionType": "INCLUDE" ... }, ] ... }
The
value
argument toaddOverride
will not be processed or translated in any way. Pass raw JSON values in here with the correct capitalization for CloudFormation. If you pass CDK classes or structs, they will be rendered with lowercased key names, and CloudFormation will reject the template.- Parameters:
path (
str
) –The path of the property, you can use dot notation to override values in complex types. Any intermediate keys will be created as needed.
value (
Any
) –The value. Could be primitive or complex.
- Return type:
None
- add_property_deletion_override(property_path)
Adds an override that deletes the value of a property from the resource definition.
- Parameters:
property_path (
str
) – The path to the property.- Return type:
None
- add_property_override(property_path, value)
Adds an override to a resource property.
Syntactic sugar for
addOverride("Properties.<...>", value)
.- Parameters:
property_path (
str
) – The path of the property.value (
Any
) – The value.
- Return type:
None
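As a small, hypothetical usage sketch (the property name LogGroupIdentifier matches the underlying AWS::Logs::Transformer resource; the value is a placeholder):
# Override the raw LogGroupIdentifier property on the synthesized resource.
cfn_transformer.add_property_override("LogGroupIdentifier", "my-other-log-group")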
- apply_removal_policy(policy=None, *, apply_to_update_replace_policy=None, default=None)
Sets the deletion policy of the resource based on the removal policy specified.
The Removal Policy controls what happens to this resource when it stops being managed by CloudFormation, either because you’ve removed it from the CDK application or because you’ve made a change that requires the resource to be replaced.
The resource can be deleted (
RemovalPolicy.DESTROY
), or left in your AWS account for data recovery and cleanup later (RemovalPolicy.RETAIN
). In some cases, a snapshot can be taken of the resource prior to deletion (RemovalPolicy.SNAPSHOT
). A list of resources that support this policy can be found in the following link:- Parameters:
policy (
Optional
[RemovalPolicy
]) –apply_to_update_replace_policy (
Optional
[bool
]) – Apply the same deletion policy to the resource’s “UpdateReplacePolicy”. Default: truedefault (
Optional
[RemovalPolicy
]) – The default policy to apply in case the removal policy is not defined. Default: - Default value is resource specific. To determine the default value for a resource, please consult that specific resource’s documentation.
- See:
- Return type:
None
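For example, a minimal sketch that keeps the transformer resource in the account when it is removed from the CDK app:
from aws_cdk import RemovalPolicy

# Retain the AWS::Logs::Transformer resource instead of deleting it.
cfn_transformer.apply_removal_policy(RemovalPolicy.RETAIN)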
- get_att(attribute_name, type_hint=None)
Returns a token for a runtime attribute of this resource.
Ideally, use generated attribute accessors (e.g.
resource.arn
), but this can be used for future compatibility in case there is no generated attribute.- Parameters:
attribute_name (
str
) – The name of the attribute.type_hint (
Optional
[ResolutionTypeHint
]) –
- Return type:
Reference
- get_metadata(key)
Retrieve a value from the CloudFormation Resource Metadata.
- Parameters:
key (
str
- See:
https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/metadata-section-structure.html
Note that this is a different set of metadata from CDK node metadata; this metadata ends up in the stack template under the resource, whereas CDK node metadata ends up in the Cloud Assembly.
- Return type:
Any
- inspect(inspector)
Examines the CloudFormation resource and discloses attributes.
- Parameters:
inspector (
TreeInspector
) – tree inspector to collect and process attributes.- Return type:
None
- obtain_dependencies()
Retrieves an array of resources this resource depends on.
This assembles dependencies on resources across stacks (including nested stacks) automatically.
- Return type:
List
[Union
[Stack
,CfnResource
]]
- obtain_resource_dependencies()
Get a shallow copy of dependencies between this resource and other resources in the same stack.
- Return type:
List
[CfnResource
]
- override_logical_id(new_logical_id)
Overrides the auto-generated logical ID with a specific ID.
- Parameters:
new_logical_id (
str
) – The new logical ID to use for this stack element.- Return type:
None
- remove_dependency(target)
Indicates that this resource no longer depends on another resource.
This can be used for resources across stacks (including nested stacks) and the dependency will automatically be removed from the relevant scope.
- Parameters:
target (
CfnResource
) –- Return type:
None
- replace_dependency(target, new_target)
Replaces one dependency with another.
- Parameters:
target (
CfnResource
) – The dependency to replace.new_target (
CfnResource
) – The new dependency to add.
- Return type:
None
- to_string()
Returns a string representation of this construct.
- Return type:
str
- Returns:
a string representation of this resource
Attributes
- CFN_RESOURCE_TYPE_NAME = 'AWS::Logs::Transformer'
- cfn_options
Options for this resource, such as condition, update policy etc.
- cfn_resource_type
AWS resource type.
- creation_stack
return:
the stack trace of the point where this Resource was created from, sourced from the +metadata+ entry typed +aws:cdk:logicalId+, and with the bottom-most node +internal+ entries filtered.
- log_group_identifier
Specify either the name or ARN of the log group to create the transformer for.
- logical_id
The logical ID for this CloudFormation stack element.
The logical ID of the element is calculated from the path of the resource node in the construct tree.
To override this value, use
overrideLogicalId(newLogicalId)
.- Returns:
the logical ID as a stringified token. This value will only get resolved during synthesis.
- node
The tree node.
- ref
Return a string that will be resolved to a CloudFormation
{ Ref }
for this element.If, by any chance, the intrinsic reference of a resource is not a string, you could coerce it to an IResolvable through
Lazy.any({ produce: resource.ref })
.
- stack
The stack in which this element is defined.
CfnElements must be defined within a stack scope (directly or indirectly).
- transformer_config
This structure is an array that contains the configuration of this log transformer.
Static Methods
- classmethod is_cfn_element(x)
Returns
true
if a construct is a stack element (i.e. part of the synthesized cloudformation template).Uses duck-typing instead of
instanceof
to allow stack elements from different versions of this library to be included in the same stack.- Parameters:
x (
Any
) –- Return type:
bool
- Returns:
The construct as a stack element or undefined if it is not a stack element.
- classmethod is_cfn_resource(x)
Check whether the given object is a CfnResource.
- Parameters:
x (
Any
) –- Return type:
bool
- classmethod is_construct(x)
Checks if
x
is a construct.Use this method instead of
instanceof
to properly detectConstruct
instances, even when the construct library is symlinked.Explanation: in JavaScript, multiple copies of the
constructs
library on disk are seen as independent, completely different libraries. As a consequence, the classConstruct
in each copy of theconstructs
library is seen as a different class, and an instance of one class will not test asinstanceof
the other class.npm install
will not create installations like this, but users may manually symlink construct libraries together or use a monorepo tool: in those cases, multiple copies of theconstructs
library can be accidentally installed, andinstanceof
will behave unpredictably. It is safest to avoid usinginstanceof
, and using this type-testing method instead.- Parameters:
x (
Any
) – Any object.- Return type:
bool
- Returns:
true if
x
is an object created from a class which extendsConstruct
.
AddKeyEntryProperty
- class CfnTransformer.AddKeyEntryProperty(*, key, value, overwrite_if_exists=None)
Bases:
object
This object defines one key that will be added with the addKeys processor.
- Parameters:
key (
str
) – The key of the new entry to be added to the log event.value (
str
) – The value of the new entry to be added to the log event.overwrite_if_exists (
Union
[bool
,IResolvable
,None
]) – Specifies whether to overwrite the value if the key already exists in the log event. If you omit this, the default isfalse
.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_logs as logs

add_key_entry_property = logs.CfnTransformer.AddKeyEntryProperty(
    key="key",
    value="value",
    # the properties below are optional
    overwrite_if_exists=False
)
Attributes
- key
The key of the new entry to be added to the log event.
- overwrite_if_exists
Specifies whether to overwrite the value if the key already exists in the log event.
If you omit this, the default is
false
.
- value
The value of the new entry to be added to the log event.
AddKeysProperty
- class CfnTransformer.AddKeysProperty(*, entries)
Bases:
object
This processor adds new key-value pairs to the log event.
For more information about this processor including examples, see addKeys in the CloudWatch Logs User Guide .
- Parameters:
entries (
Union
[IResolvable
,Sequence
[Union
[IResolvable
,AddKeyEntryProperty
,Dict
[str
,Any
]]]]) – An array of objects, where each object contains the information about one key to add to the log event.- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_logs as logs

add_keys_property = logs.CfnTransformer.AddKeysProperty(
    entries=[logs.CfnTransformer.AddKeyEntryProperty(
        key="key",
        value="value",
        # the properties below are optional
        overwrite_if_exists=False
    )]
)
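A more concrete sketch (the key and value are illustrative assumptions): add a static team key to every ingested event, leaving any existing value in place because overwrite_if_exists defaults to false.
from aws_cdk import aws_logs as logs

add_team_key = logs.CfnTransformer.AddKeysProperty(
    entries=[logs.CfnTransformer.AddKeyEntryProperty(
        key="team",
        value="platform"
    )]
)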
Attributes
- entries
An array of objects, where each object contains the information about one key to add to the log event.
CopyValueEntryProperty
- class CfnTransformer.CopyValueEntryProperty(*, source, target, overwrite_if_exists=None)
Bases:
object
This object defines one value to be copied with the copyValue processor.
- Parameters:
source (
str
) – The key to copy.target (
str
) – The key of the field to copy the value to.overwrite_if_exists (
Union
[bool
,IResolvable
,None
]) – Specifies whether to overwrite the value if the destination key already exists. If you omit this, the default isfalse
.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_logs as logs

copy_value_entry_property = logs.CfnTransformer.CopyValueEntryProperty(
    source="source",
    target="target",
    # the properties below are optional
    overwrite_if_exists=False
)
Attributes
- overwrite_if_exists
Specifies whether to overwrite the value if the destination key already exists.
If you omit this, the default is
false
.
- source
The key to copy.
- target
The key of the field to copy the value to.
CopyValueProperty
- class CfnTransformer.CopyValueProperty(*, entries)
Bases:
object
This processor copies values within a log event.
You can also use this processor to add metadata to log events by copying the values of the following metadata keys into the log events:
@logGroupName
,@logGroupStream
,@accountId
,@regionName
.For more information about this processor including examples, see copyValue in the CloudWatch Logs User Guide .
- Parameters:
entries (
Union
[IResolvable
,Sequence
[Union
[IResolvable
,CopyValueEntryProperty
,Dict
[str
,Any
]]]]) – An array ofCopyValueEntry
objects, where each object contains the information about one field value to copy.- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_logs as logs

copy_value_property = logs.CfnTransformer.CopyValueProperty(
    entries=[logs.CfnTransformer.CopyValueEntryProperty(
        source="source",
        target="target",
        # the properties below are optional
        overwrite_if_exists=False
    )]
)
Attributes
- entries
An array of
CopyValueEntry
objects, where each object contains the information about one field value to copy.
CsvProperty
- class CfnTransformer.CsvProperty(*, columns=None, delimiter=None, quote_character=None, source=None)
Bases:
object
The
CSV
processor parses comma-separated values (CSV) from the log events into columns.For more information about this processor including examples, see csv in the CloudWatch Logs User Guide .
- Parameters:
columns (
Optional
[Sequence
[str
]]) – An array of names to use for the columns in the transformed log event. If you omit this, default column names ([column_1, column_2 ...]
) are used.delimiter (
Optional
[str
]) – The character used to separate each column in the original comma-separated value log event. If you omit this, the processor looks for the comma,
character as the delimiter.quote_character (
Optional
[str
]) – The character used as a text qualifier for a single column of data. If you omit this, the double quotation mark"
character is used.source (
Optional
[str
]) – The path to the field in the log event that has the comma separated values to be parsed. If you omit this value, the whole log message is processed.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_logs as logs

csv_property = logs.CfnTransformer.CsvProperty(
    columns=["columns"],
    delimiter="delimiter",
    quote_character="quoteCharacter",
    source="source"
)
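As a hedged illustration (the field names and delimiter are assumptions), the sketch below parses a semicolon-delimited payload field into three named columns:
from aws_cdk import aws_logs as logs

csv_processor = logs.CfnTransformer.CsvProperty(
    columns=["order_id", "customer_id", "amount"],
    delimiter=";",
    quote_character='"',
    source="payload"
)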
Attributes
- columns
An array of names to use for the columns in the transformed log event.
If you omit this, default column names (
[column_1, column_2 ...]
) are used.
- delimiter
The character used to separate each column in the original comma-separated value log event.
If you omit this, the processor looks for the comma
,
character as the delimiter.
- quote_character
The character used as a text qualifier for a single column of data.
If you omit this, the double quotation mark
"
character is used.
- source
The path to the field in the log event that has the comma separated values to be parsed.
If you omit this value, the whole log message is processed.
DateTimeConverterProperty
- class CfnTransformer.DateTimeConverterProperty(*, match_patterns, source, target, locale=None, source_timezone=None, target_format=None, target_timezone=None)
Bases:
object
This processor converts a datetime string into a format that you specify.
For more information about this processor including examples, see datetimeConverter in the CloudWatch Logs User Guide .
- Parameters:
match_patterns (
Sequence
[str
]) – A list of patterns to match against thesource
field.source (
str
) – The key to apply the date conversion to.target (
str
) – The JSON field to store the result in.locale (
Optional
[str
]) – The locale of the source field. If you omit this, the default oflocale.ROOT
is used.source_timezone (
Optional
[str
]) – The time zone of the source field. If you omit this, the default used is the UTC zone.target_format (
Optional
[str
]) – The datetime format to use for the converted data in the target field. If you omit this, the default of yyyy-MM-dd'T'HH:mm:ss.SSS'Z'
is used.target_timezone (
Optional
[str
]) – The time zone of the target field. If you omit this, the default used is the UTC zone.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_logs as logs

date_time_converter_property = logs.CfnTransformer.DateTimeConverterProperty(
    match_patterns=["matchPatterns"],
    source="source",
    target="target",
    # the properties below are optional
    locale="locale",
    source_timezone="sourceTimezone",
    target_format="targetFormat",
    target_timezone="targetTimezone"
)
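A hedged sketch of a typical use (the match pattern, field names, and time zones are assumptions): normalize an Apache-style timestamp field to ISO 8601 UTC.
from aws_cdk import aws_logs as logs

dt_converter = logs.CfnTransformer.DateTimeConverterProperty(
    match_patterns=["dd/MMM/yyyy:HH:mm:ss"],
    source="timestamp",
    target="iso_timestamp",
    source_timezone="America/New_York",
    target_timezone="UTC",
    target_format="yyyy-MM-dd'T'HH:mm:ss'Z'"
)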
Attributes
- locale
The locale of the source field.
If you omit this, the default of
locale.ROOT
is used.
- match_patterns
A list of patterns to match against the
source
field.
- source
The key to apply the date conversion to.
- source_timezone
The time zone of the source field.
If you omit this, the default used is the UTC zone.
- target
The JSON field to store the result in.
- target_format
The datetime format to use for the converted data in the target field.
If you omit this, the default of
yyyy-MM-dd'T'HH:mm:ss.SSS'Z'
is used.
- target_timezone
The time zone of the target field.
If you omit this, the default used is the UTC zone.
DeleteKeysProperty
- class CfnTransformer.DeleteKeysProperty(*, with_keys)
Bases:
object
This processor deletes entries from a log event. These entries are key-value pairs.
For more information about this processor including examples, see deleteKeys in the CloudWatch Logs User Guide .
- Parameters:
with_keys (
Sequence
[str
]) – The list of keys to delete.- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_logs as logs

delete_keys_property = logs.CfnTransformer.DeleteKeysProperty(
    with_keys=["withKeys"]
)
Attributes
- with_keys
The list of keys to delete.
GrokProperty
- class CfnTransformer.GrokProperty(*, match, source=None)
Bases:
object
This processor uses pattern matching to parse and structure unstructured data.
This processor can also extract fields from log messages.
For more information about this processor including examples, see grok in the CloudWatch Logs User Guide .
- Parameters:
match (
str
) – The grok pattern to match against the log event. For a list of supported grok patterns, see Supported grok patterns .source (
Optional
[str
]) – The path to the field in the log event that you want to parse. If you omit this value, the whole log message is parsed.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_logs as logs

grok_property = logs.CfnTransformer.GrokProperty(
    match="match",
    # the properties below are optional
    source="source"
)
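A hedged sketch (the pattern names and field are assumptions; check them against the supported grok patterns list): extract a client IP, HTTP method, and status code from a plain-text message field.
from aws_cdk import aws_logs as logs

grok_processor = logs.CfnTransformer.GrokProperty(
    match="%{IP:client_ip} %{WORD:http_method} %{NUMBER:status_code}",
    source="message"
)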
Attributes
- match
The grok pattern to match against the log event.
For a list of supported grok patterns, see Supported grok patterns .
- source
The path to the field in the log event that you want to parse.
If you omit this value, the whole log message is parsed.
ListToMapProperty
- class CfnTransformer.ListToMapProperty(*, key, source, flatten=None, flattened_element=None, target=None, value_key=None)
Bases:
object
This processor takes a list of objects that contain key fields, and converts them into a map of target keys.
For more information about this processor including examples, see listToMap in the CloudWatch Logs User Guide .
- Parameters:
key (
str
) – The key of the field to be extracted as keys in the generated map.source (
str
) – The key in the log event that has a list of objects that will be converted to a map.flatten (
Union
[bool
,IResolvable
,None
]) – A Boolean value to indicate whether the list will be flattened into single items. Specifytrue
to flatten the list. The default isfalse
flattened_element (
Optional
[str
]) – If you setflatten
totrue
, useflattenedElement
to specify which element,first
orlast
, to keep. You must specify this parameter ifflatten
istrue
target (
Optional
[str
]) – The key of the field that will hold the generated map.value_key (
Optional
[str
]) – If this is specified, the values that you specify in this parameter will be extracted from thesource
objects and put into the values of the generated map. Otherwise, original objects in the source list will be put into the values of the generated map.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_logs as logs

list_to_map_property = logs.CfnTransformer.ListToMapProperty(
    key="key",
    source="source",
    # the properties below are optional
    flatten=False,
    flattened_element="flattenedElement",
    target="target",
    value_key="valueKey"
)
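A hedged sketch (the field names are assumptions): convert a list of name/value objects under a headers field into a map keyed by name, keeping the last occurrence when the list is flattened.
from aws_cdk import aws_logs as logs

headers_to_map = logs.CfnTransformer.ListToMapProperty(
    key="name",
    source="headers",
    value_key="value",
    target="header_map",
    flatten=True,
    flattened_element="last"
)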
Attributes
- flatten
A Boolean value to indicate whether the list will be flattened into single items.
Specify
true
to flatten the list. The default isfalse
- flattened_element
If you set
flatten
totrue
, useflattenedElement
to specify which element,first
orlast
, to keep.You must specify this parameter if
flatten
istrue
- key
The key of the field to be extracted as keys in the generated map.
- source
The key in the log event that has a list of objects that will be converted to a map.
- target
The key of the field that will hold the generated map.
- value_key
If this is specified, the values that you specify in this parameter will be extracted from the
source
objects and put into the values of the generated map.Otherwise, original objects in the source list will be put into the values of the generated map.
LowerCaseStringProperty
- class CfnTransformer.LowerCaseStringProperty(*, with_keys)
Bases:
object
This processor converts a string to lowercase.
For more information about this processor including examples, see lowerCaseString in the CloudWatch Logs User Guide .
- Parameters:
with_keys (
Sequence
[str
]) – The array containing the keys of the fields to convert to lowercase.- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_logs as logs

lower_case_string_property = logs.CfnTransformer.LowerCaseStringProperty(
    with_keys=["withKeys"]
)
Attributes
- with_keys
The array containing the keys of the fields to convert to lowercase.
MoveKeyEntryProperty
- class CfnTransformer.MoveKeyEntryProperty(*, source, target, overwrite_if_exists=None)
Bases:
object
This object defines one key that will be moved with the moveKeys processor.
- Parameters:
source (
str
) – The key to move.target (
str
) – The key to move to.overwrite_if_exists (
Union
[bool
,IResolvable
,None
]) – Specifies whether to overwrite the value if the destination key already exists. If you omit this, the default isfalse
.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_logs as logs

move_key_entry_property = logs.CfnTransformer.MoveKeyEntryProperty(
    source="source",
    target="target",
    # the properties below are optional
    overwrite_if_exists=False
)
Attributes
- overwrite_if_exists
Specifies whether to overwrite the value if the destination key already exists.
If you omit this, the default is
false
.
- source
The key to move.
- target
The key to move to.
MoveKeysProperty
- class CfnTransformer.MoveKeysProperty(*, entries)
Bases:
object
This processor moves a key from one field to another. The original key is deleted.
For more information about this processor including examples, see moveKeys in the CloudWatch Logs User Guide .
- Parameters:
entries (
Union
[IResolvable
,Sequence
[Union
[IResolvable
,MoveKeyEntryProperty
,Dict
[str
,Any
]]]]) – An array of objects, where each object contains the information about one key to move.- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_logs as logs

move_keys_property = logs.CfnTransformer.MoveKeysProperty(
    entries=[logs.CfnTransformer.MoveKeyEntryProperty(
        source="source",
        target="target",
        # the properties below are optional
        overwrite_if_exists=False
    )]
)
Attributes
- entries
An array of objects, where each object contains the information about one key to move.
ParseCloudfrontProperty
- class CfnTransformer.ParseCloudfrontProperty(*, source=None)
Bases:
object
This processor parses CloudFront vended logs, extracts fields, and converts them into JSON format.
Encoded field values are decoded. Values that are integers and doubles are treated as such. For more information about this processor including examples, see parseCloudfront
For more information about CloudFront log format, see Configure and use standard logs (access logs) .
If you use this processor, it must be the first processor in your transformer.
- Parameters:
source (
Optional
[str
]) – Omit this parameter and the whole log message will be processed by this processor. No other value than@message
is allowed forsource
.- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_logs as logs

parse_cloudfront_property = logs.CfnTransformer.ParseCloudfrontProperty(
    source="source"
)
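Because this processor must come first, a hedged configuration sketch (the deleted key is an assumption) places it at the head of the list that would be passed as transformer_config, followed by a mutate processor:
from aws_cdk import aws_logs as logs

cloudfront_config = [
    logs.CfnTransformer.ProcessorProperty(
        parse_cloudfront=logs.CfnTransformer.ParseCloudfrontProperty()
    ),
    logs.CfnTransformer.ProcessorProperty(
        delete_keys=logs.CfnTransformer.DeleteKeysProperty(
            with_keys=["c-port"]
        )
    )
]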
Attributes
- source
Omit this parameter and the whole log message will be processed by this processor.
No other value than
@message
is allowed forsource
.
ParseJSONProperty
- class CfnTransformer.ParseJSONProperty(*, destination=None, source=None)
Bases:
object
This processor parses log events that are in JSON format.
It can extract JSON key-value pairs and place them under a destination that you specify.
Additionally, because you must have at least one parse-type processor in a transformer, you can use
ParseJSON
as that processor for JSON-format logs, so that you can also apply other processors, such as mutate processors, to these logs.For more information about this processor including examples, see parseJSON in the CloudWatch Logs User Guide .
- Parameters:
destination (
Optional
[str
]) – The location to put the parsed key value pair into. If you omit this parameter, it is placed under the root node.source (
Optional
[str
]) – Path to the field in the log event that will be parsed. Use dot notation to access child fields. For example,store.book
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_logs as logs

parse_jSONProperty = logs.CfnTransformer.ParseJSONProperty(
    destination="destination",
    source="source"
)
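A hedged sketch (the field names are assumptions): parse the nested JSON carried in a message field and place the extracted keys under a parsed node.
from aws_cdk import aws_logs as logs

parse_json = logs.CfnTransformer.ParseJSONProperty(
    source="message",
    destination="parsed"
)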
Attributes
- destination
The location to put the parsed key value pair into.
If you omit this parameter, it is placed under the root node.
- source
Path to the field in the log event that will be parsed.
Use dot notation to access child fields. For example,
store.book
ParseKeyValueProperty
- class CfnTransformer.ParseKeyValueProperty(*, destination=None, field_delimiter=None, key_prefix=None, key_value_delimiter=None, non_match_value=None, overwrite_if_exists=None, source=None)
Bases:
object
This processor parses a specified field in the original log event into key-value pairs.
For more information about this processor including examples, see parseKeyValue in the CloudWatch Logs User Guide .
- Parameters:
destination (
Optional
[str
]) – The destination field to put the extracted key-value pairs into.field_delimiter (
Optional
[str
]) – The field delimiter string that is used between key-value pairs in the original log events. If you omit this, the ampersand&
character is used.key_prefix (
Optional
[str
]) – If you want to add a prefix to all transformed keys, specify it here.key_value_delimiter (
Optional
[str
]) – The delimiter string to use between the key and value in each pair in the transformed log event. If you omit this, the equal=
character is used.non_match_value (
Optional
[str
]) – A value to insert into the value field in the result, when a key-value pair is not successfully split.overwrite_if_exists (
Union
[bool
,IResolvable
,None
]) – Specifies whether to overwrite the value if the destination key already exists. If you omit this, the default isfalse
.source (
Optional
[str
]) – Path to the field in the log event that will be parsed. Use dot notation to access child fields. For example,store.book
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_logs as logs

parse_key_value_property = logs.CfnTransformer.ParseKeyValueProperty(
    destination="destination",
    field_delimiter="fieldDelimiter",
    key_prefix="keyPrefix",
    key_value_delimiter="keyValueDelimiter",
    non_match_value="nonMatchValue",
    overwrite_if_exists=False,
    source="source"
)
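A hedged sketch (the delimiters, prefix, and field names are assumptions): split a query-string-like params field such as user=alice&role=admin into prefixed keys under kv.
from aws_cdk import aws_logs as logs

parse_kv = logs.CfnTransformer.ParseKeyValueProperty(
    source="params",
    destination="kv",
    field_delimiter="&",
    key_value_delimiter="=",
    key_prefix="qs_",
    non_match_value="unmatched"
)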
Attributes
- destination
The destination field to put the extracted key-value pairs into.
- field_delimiter
The field delimiter string that is used between key-value pairs in the original log events.
If you omit this, the ampersand
&
character is used.
- key_prefix
If you want to add a prefix to all transformed keys, specify it here.
- key_value_delimiter
The delimiter string to use between the key and value in each pair in the transformed log event.
If you omit this, the equal
=
character is used.
- non_match_value
A value to insert into the value field in the result, when a key-value pair is not successfully split.
- overwrite_if_exists
Specifies whether to overwrite the value if the destination key already exists.
If you omit this, the default is
false
.
- source
Path to the field in the log event that will be parsed.
Use dot notation to access child fields. For example,
store.book
ParsePostgresProperty
- class CfnTransformer.ParsePostgresProperty(*, source=None)
Bases:
object
Use this processor to parse RDS for PostgreSQL vended logs, extract fields, and convert them into a JSON format.
This processor always processes the entire log event message. For more information about this processor including examples, see parsePostGres .
For more information about RDS for PostgreSQL log format, see RDS for PostgreSQL database log files .
If you use this processor, it must be the first processor in your transformer.
- Parameters:
source (
Optional
[str
]) – Omit this parameter and the whole log message will be processed by this processor. No other value than@message
is allowed forsource
.- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_logs as logs

parse_postgres_property = logs.CfnTransformer.ParsePostgresProperty(
    source="source"
)
Attributes
- source
Omit this parameter and the whole log message will be processed by this processor.
No other value than
@message
is allowed forsource
.
ParseRoute53Property
- class CfnTransformer.ParseRoute53Property(*, source=None)
Bases:
object
Use this processor to parse Route 53 vended logs, extract fields, and convert them into a JSON format.
This processor always processes the entire log event message. For more information about this processor including examples, see parseRoute53 .
If you use this processor, it must be the first processor in your transformer.
- Parameters:
source (
Optional
[str
]) – Omit this parameter and the whole log message will be processed by this processor. No other value than@message
is allowed forsource
.- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_logs as logs

parse_route53_property = logs.CfnTransformer.ParseRoute53Property(
    source="source"
)
Attributes
- source
Omit this parameter and the whole log message will be processed by this processor.
No other value than
@message
is allowed forsource
.
ParseVPCProperty
- class CfnTransformer.ParseVPCProperty(*, source=None)
Bases:
object
Use this processor to parse Amazon VPC vended logs, extract fields, and convert them into a JSON format.
This processor always processes the entire log event message.
This processor doesn’t support custom log formats, such as NAT gateway logs. For more information about custom log formats in Amazon VPC, see parseVPC . For more information about this processor including examples, see parseVPC .
If you use this processor, it must be the first processor in your transformer.
- Parameters:
source (
Optional
[str
]) – Omit this parameter and the whole log message will be processed by this processor. No other value than@message
is allowed forsource
.- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_logs as logs

parse_vPCProperty = logs.CfnTransformer.ParseVPCProperty(
    source="source"
)
Attributes
- source
Omit this parameter and the whole log message will be processed by this processor.
No other value than
@message
is allowed forsource
.
ParseWAFProperty
- class CfnTransformer.ParseWAFProperty(*, source=None)
Bases:
object
Use this processor to parse AWS WAF vended logs, extract fields, and convert them into a JSON format.
This processor always processes the entire log event message. For more information about this processor including examples, see parseWAF .
For more information about AWS WAF log format, see Log examples for web ACL traffic .
If you use this processor, it must be the first processor in your transformer.
- Parameters:
source (
Optional
[str
]) – Omit this parameter and the whole log message will be processed by this processor. No other value than@message
is allowed forsource
.- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_logs as logs

parse_wAFProperty = logs.CfnTransformer.ParseWAFProperty(
    source="source"
)
Attributes
- source
Omit this parameter and the whole log message will be processed by this processor.
No other value than
@message
is allowed forsource
.
ProcessorProperty
- class CfnTransformer.ProcessorProperty(*, add_keys=None, copy_value=None, csv=None, date_time_converter=None, delete_keys=None, grok=None, list_to_map=None, lower_case_string=None, move_keys=None, parse_cloudfront=None, parse_json=None, parse_key_value=None, parse_postgres=None, parse_route53=None, parse_vpc=None, parse_waf=None, rename_keys=None, split_string=None, substitute_string=None, trim_string=None, type_converter=None, upper_case_string=None)
Bases:
object
This structure contains the information about one processor in a log transformer.
- Parameters:
add_keys (
Union
[IResolvable
,AddKeysProperty
,Dict
[str
,Any
],None
]) –Use this parameter to include the addKeys processor in your transformer.
copy_value (
Union
[IResolvable
,CopyValueProperty
,Dict
[str
,Any
],None
]) –Use this parameter to include the copyValue processor in your transformer.
csv (
Union
[IResolvable
,CsvProperty
,Dict
[str
,Any
],None
]) –Use this parameter to include the CSV processor in your transformer.
date_time_converter (
Union
[IResolvable
,DateTimeConverterProperty
,Dict
[str
,Any
],None
]) –Use this parameter to include the datetimeConverter processor in your transformer.
delete_keys (
Union
[IResolvable
,DeleteKeysProperty
,Dict
[str
,Any
],None
]) –Use this parameter to include the deleteKeys processor in your transformer.
grok (
Union
[IResolvable
,GrokProperty
,Dict
[str
,Any
],None
]) –Use this parameter to include the grok processor in your transformer.
list_to_map (
Union
[IResolvable
,ListToMapProperty
,Dict
[str
,Any
],None
]) –Use this parameter to include the listToMap processor in your transformer.
lower_case_string (
Union
[IResolvable
,LowerCaseStringProperty
,Dict
[str
,Any
],None
]) –Use this parameter to include the lowerCaseString processor in your transformer.
move_keys (
Union
[IResolvable
,MoveKeysProperty
,Dict
[str
,Any
],None
]) –Use this parameter to include the moveKeys processor in your transformer.
parse_cloudfront (
Union
[IResolvable
,ParseCloudfrontProperty
,Dict
[str
,Any
],None
]) –Use this parameter to include the parseCloudfront processor in your transformer. If you use this processor, it must be the first processor in your transformer.
parse_json (
Union
[IResolvable
,ParseJSONProperty
,Dict
[str
,Any
],None
]) –Use this parameter to include the parseJSON processor in your transformer.
parse_key_value (
Union
[IResolvable
,ParseKeyValueProperty
,Dict
[str
,Any
],None
]) –Use this parameter to include the parseKeyValue processor in your transformer.
parse_postgres (
Union
[IResolvable
,ParsePostgresProperty
,Dict
[str
,Any
],None
]) –Use this parameter to include the parsePostGres processor in your transformer. If you use this processor, it must be the first processor in your transformer.
parse_route53 (
Union
[IResolvable
,ParseRoute53Property
,Dict
[str
,Any
],None
]) –Use this parameter to include the parseRoute53 processor in your transformer. If you use this processor, it must be the first processor in your transformer.
parse_vpc (
Union
[IResolvable
,ParseVPCProperty
,Dict
[str
,Any
],None
]) –Use this parameter to include the parseVPC processor in your transformer. If you use this processor, it must be the first processor in your transformer.
parse_waf (
Union
[IResolvable
,ParseWAFProperty
,Dict
[str
,Any
],None
]) –Use this parameter to include the parseWAF processor in your transformer. If you use this processor, it must be the first processor in your transformer.
rename_keys (
Union
[IResolvable
,RenameKeysProperty
,Dict
[str
,Any
],None
]) – Use this parameter to include the renameKeys processor in your transformer.split_string (
Union
[IResolvable
,SplitStringProperty
,Dict
[str
,Any
],None
]) – Use this parameter to include the splitString processor in your transformer.substitute_string (
Union
[IResolvable
,SubstituteStringProperty
,Dict
[str
,Any
],None
]) – Use this parameter to include the substituteString processor in your transformer.trim_string (
Union
[IResolvable
,TrimStringProperty
,Dict
[str
,Any
],None
]) – Use this parameter to include the trimString processor in your transformer.type_converter (
Union
[IResolvable
,TypeConverterProperty
,Dict
[str
,Any
],None
]) – Use this parameter to include the typeConverter processor in your transformer.upper_case_string (
Union
[IResolvable
,UpperCaseStringProperty
,Dict
[str
,Any
],None
]) – Use this parameter to include the upperCaseString processor in your transformer.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_logs as logs

processor_property = logs.CfnTransformer.ProcessorProperty(
    add_keys=logs.CfnTransformer.AddKeysProperty(
        entries=[logs.CfnTransformer.AddKeyEntryProperty(
            key="key",
            value="value",
            # the properties below are optional
            overwrite_if_exists=False
        )]
    ),
    copy_value=logs.CfnTransformer.CopyValueProperty(
        entries=[logs.CfnTransformer.CopyValueEntryProperty(
            source="source",
            target="target",
            # the properties below are optional
            overwrite_if_exists=False
        )]
    ),
    csv=logs.CfnTransformer.CsvProperty(
        columns=["columns"],
        delimiter="delimiter",
        quote_character="quoteCharacter",
        source="source"
    ),
    date_time_converter=logs.CfnTransformer.DateTimeConverterProperty(
        match_patterns=["matchPatterns"],
        source="source",
        target="target",
        # the properties below are optional
        locale="locale",
        source_timezone="sourceTimezone",
        target_format="targetFormat",
        target_timezone="targetTimezone"
    ),
    delete_keys=logs.CfnTransformer.DeleteKeysProperty(
        with_keys=["withKeys"]
    ),
    grok=logs.CfnTransformer.GrokProperty(
        match="match",
        # the properties below are optional
        source="source"
    ),
    list_to_map=logs.CfnTransformer.ListToMapProperty(
        key="key",
        source="source",
        # the properties below are optional
        flatten=False,
        flattened_element="flattenedElement",
        target="target",
        value_key="valueKey"
    ),
    lower_case_string=logs.CfnTransformer.LowerCaseStringProperty(
        with_keys=["withKeys"]
    ),
    move_keys=logs.CfnTransformer.MoveKeysProperty(
        entries=[logs.CfnTransformer.MoveKeyEntryProperty(
            source="source",
            target="target",
            # the properties below are optional
            overwrite_if_exists=False
        )]
    ),
    parse_cloudfront=logs.CfnTransformer.ParseCloudfrontProperty(
        source="source"
    ),
    parse_json=logs.CfnTransformer.ParseJSONProperty(
        destination="destination",
        source="source"
    ),
    parse_key_value=logs.CfnTransformer.ParseKeyValueProperty(
        destination="destination",
        field_delimiter="fieldDelimiter",
        key_prefix="keyPrefix",
        key_value_delimiter="keyValueDelimiter",
        non_match_value="nonMatchValue",
        overwrite_if_exists=False,
        source="source"
    ),
    parse_postgres=logs.CfnTransformer.ParsePostgresProperty(
        source="source"
    ),
    parse_route53=logs.CfnTransformer.ParseRoute53Property(
        source="source"
    ),
    parse_vpc=logs.CfnTransformer.ParseVPCProperty(
        source="source"
    ),
    parse_waf=logs.CfnTransformer.ParseWAFProperty(
        source="source"
    ),
    rename_keys=logs.CfnTransformer.RenameKeysProperty(
        entries=[logs.CfnTransformer.RenameKeyEntryProperty(
            key="key",
            rename_to="renameTo",
            # the properties below are optional
            overwrite_if_exists=False
        )]
    ),
    split_string=logs.CfnTransformer.SplitStringProperty(
        entries=[logs.CfnTransformer.SplitStringEntryProperty(
            delimiter="delimiter",
            source="source"
        )]
    ),
    substitute_string=logs.CfnTransformer.SubstituteStringProperty(
        entries=[logs.CfnTransformer.SubstituteStringEntryProperty(
            from_="from",
            source="source",
            to="to"
        )]
    ),
    trim_string=logs.CfnTransformer.TrimStringProperty(
        with_keys=["withKeys"]
    ),
    type_converter=logs.CfnTransformer.TypeConverterProperty(
        entries=[logs.CfnTransformer.TypeConverterEntryProperty(
            key="key",
            type="type"
        )]
    ),
    upper_case_string=logs.CfnTransformer.UpperCaseStringProperty(
        with_keys=["withKeys"]
    )
)
Attributes
- add_keys
Use this parameter to include the addKeys processor in your transformer.
- See:
https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/CloudWatch-Logs-Transformation.html#CloudWatch-Logs-Transformation-addKeys
- copy_value
Use this parameter to include the copyValue processor in your transformer.
- See:
https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/CloudWatch-Logs-Transformation-Processors.html#CloudWatch-Logs-Transformation-copyValue
- csv
Use this parameter to include the CSV processor in your transformer.
- See:
https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/CloudWatch-Logs-Transformation.html#CloudWatch-Logs-Transformation-CSV
- date_time_converter
Use this parameter to include the datetimeConverter processor in your transformer.
- See:
https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/CloudWatch-Logs-Transformation-Processors.html#CloudWatch-Logs-Transformation-datetimeConverter
- delete_keys
Use this parameter to include the deleteKeys processor in your transformer.
- See:
https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/CloudWatch-Logs-Transformation.html#CloudWatch-Logs-Transformation-deleteKeys
- grok
Use this parameter to include the grok processor in your transformer.
- See:
https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/CloudWatch-Logs-Transformation-Processors.html#CloudWatch-Logs-Transformation-grok
- list_to_map
Use this parameter to include the listToMap processor in your transformer.
- See:
https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/CloudWatch-Logs-Transformation.html#CloudWatch-Logs-Transformation-listToMap
- lower_case_string
Use this parameter to include the lowerCaseString processor in your transformer.
- See:
https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/CloudWatch-Logs-Transformation-Processors.html#CloudWatch-Logs-Transformation-lowerCaseString
- move_keys
Use this parameter to include the moveKeys processor in your transformer.
- See:
https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/CloudWatch-Logs-Transformation-Processors.html#CloudWatch-Logs-Transformation-moveKeys
- parse_cloudfront
Use this parameter to include the parseCloudfront processor in your transformer.
If you use this processor, it must be the first processor in your transformer.
- See:
https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/CloudWatch-Logs-Transformation-Processors.html#CloudWatch-Logs-Transformation-parseCloudfront
- parse_json
Use this parameter to include the parseJSON processor in your transformer.
- See:
https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/CloudWatch-Logs-Transformation-Processors.html#CloudWatch-Logs-Transformation-parseJSON
- parse_key_value
Use this parameter to include the parseKeyValue processor in your transformer.
- See:
https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/CloudWatch-Logs-Transformation-Processors.html#CloudWatch-Logs-Transformation-parseKeyValue
- parse_postgres
Use this parameter to include the parsePostGres processor in your transformer.
If you use this processor, it must be the first processor in your transformer.
- See:
https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/CloudWatch-Logs-Transformation.html#CloudWatch-Logs-Transformation-parsePostGres
- parse_route53
Use this parameter to include the parseRoute53 processor in your transformer.
If you use this processor, it must be the first processor in your transformer.
- See:
https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/CloudWatch-Logs-Transformation-Processors.html#CloudWatch-Logs-Transformation-parseRoute53
- parse_vpc
Use this parameter to include the parseVPC processor in your transformer.
If you use this processor, it must be the first processor in your transformer.
- See:
https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/CloudWatch-Logs-Transformation-Processors.html#CloudWatch-Logs-Transformation-parseVPC
- parse_waf
Use this parameter to include the parseWAF processor in your transformer.
If you use this processor, it must be the first processor in your transformer.
- See:
https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/CloudWatch-Logs-Transformation.html#CloudWatch-Logs-Transformation-parseWAF
- rename_keys
Use this parameter to include the renameKeys processor in your transformer.
- See:
https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/CloudWatch-Logs-Transformation.html#CloudWatch-Logs-Transformation-renameKeys
- split_string
Use this parameter to include the splitString processor in your transformer.
- See:
https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/CloudWatch-Logs-Transformation-Processors.html#CloudWatch-Logs-Transformation-splitString
- substitute_string
Use this parameter to include the substituteString processor in your transformer.
- See:
https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/CloudWatch-Logs-Transformation-Processors.html#CloudWatch-Logs-Transformation-substituteString
- trim_string
Use this parameter to include the trimString processor in your transformer.
- See:
https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/CloudWatch-Logs-Transformation-Processors.html#CloudWatch-Logs-Transformation-trimString
- type_converter
Use this parameter to include the typeConverter processor in your transformer.
- See:
https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/CloudWatch-Logs-Transformation-Processors.html#CloudWatch-Logs-Transformation-typeConverter
- upper_case_string
Use this parameter to include the upperCaseString processor in your transformer.
RenameKeyEntryProperty
- class CfnTransformer.RenameKeyEntryProperty(*, key, rename_to, overwrite_if_exists=None)
Bases:
object
This object defines one key that will be renamed with the renameKeys processor.
- Parameters:
key (
str
) – The key to rename.rename_to (
str
) – The string to use for the new key name.overwrite_if_exists (
Union
[bool
,IResolvable
,None
]) – Specifies whether to overwrite the existing value if the destination key already exists. The default isfalse
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_logs as logs

rename_key_entry_property = logs.CfnTransformer.RenameKeyEntryProperty(
    key="key",
    rename_to="renameTo",
    # the properties below are optional
    overwrite_if_exists=False
)
Attributes
- key
The key to rename.
- overwrite_if_exists
Specifies whether to overwrite the existing value if the destination key already exists.
The default is
false
- rename_to
The string to use for the new key name.
RenameKeysProperty
- class CfnTransformer.RenameKeysProperty(*, entries)
Bases:
object
Use this processor to rename keys in a log event.
For more information about this processor including examples, see renameKeys in the CloudWatch Logs User Guide .
- Parameters:
entries (Union[IResolvable, Sequence[Union[IResolvable, RenameKeyEntryProperty, Dict[str, Any]]]]) – An array of RenameKeyEntry objects, where each object contains the information about a single key to rename.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_logs as logs

rename_keys_property = logs.CfnTransformer.RenameKeysProperty(
    entries=[logs.CfnTransformer.RenameKeyEntryProperty(
        key="key",
        rename_to="renameTo",

        # the properties below are optional
        overwrite_if_exists=False
    )]
)
Attributes
- entries
An array of RenameKeyEntry objects, where each object contains the information about a single key to rename.
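As an additional sketch (the key names "msg" and "message" are hypothetical placeholders), a renameKeys step embedded in a processor could look like this:
from aws_cdk import aws_logs as logs

# Sketch with placeholder key names: rename "msg" to "message" without
# overwriting an existing "message" key.
rename_step = logs.CfnTransformer.ProcessorProperty(
    rename_keys=logs.CfnTransformer.RenameKeysProperty(
        entries=[logs.CfnTransformer.RenameKeyEntryProperty(
            key="msg",
            rename_to="message",
            overwrite_if_exists=False
        )]
    )
)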
SplitStringEntryProperty
- class CfnTransformer.SplitStringEntryProperty(*, delimiter, source)
Bases:
object
This object defines one log field that will be split with the splitString processor.
- Parameters:
delimiter (str) – The separator characters to split the string entry on.
source (str) – The key of the field to split.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_logs as logs

split_string_entry_property = logs.CfnTransformer.SplitStringEntryProperty(
    delimiter="delimiter",
    source="source"
)
Attributes
- delimiter
The separator characters to split the string entry on.
- source
The key of the field to split.
SplitStringProperty
- class CfnTransformer.SplitStringProperty(*, entries)
Bases:
object
Use this processor to split a field into an array of strings using a delimiting character.
For more information about this processor including examples, see splitString in the CloudWatch Logs User Guide.
- Parameters:
entries (Union[IResolvable, Sequence[Union[IResolvable, SplitStringEntryProperty, Dict[str, Any]]]]) – An array of SplitStringEntry objects, where each object contains the information about one field to split.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_logs as logs

split_string_property = logs.CfnTransformer.SplitStringProperty(
    entries=[logs.CfnTransformer.SplitStringEntryProperty(
        delimiter="delimiter",
        source="source"
    )]
)
Attributes
- entries
An array of SplitStringEntry objects, where each object contains the information about one field to split.
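For instance, a hedged sketch (the "tags" key and the comma delimiter are assumptions) of a splitString step:
from aws_cdk import aws_logs as logs

# Sketch with a placeholder field: split a comma-separated "tags" field into
# an array of strings.
split_step = logs.CfnTransformer.ProcessorProperty(
    split_string=logs.CfnTransformer.SplitStringProperty(
        entries=[logs.CfnTransformer.SplitStringEntryProperty(
            delimiter=",",
            source="tags"
        )]
    )
)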
SubstituteStringEntryProperty
- class CfnTransformer.SubstituteStringEntryProperty(*, from_, source, to)
Bases:
object
This object defines one log field key that will be replaced using the substituteString processor.
- Parameters:
from_ (str) – The regular expression string to be replaced. Special regex characters such as [ and ] must be escaped using \\ when using double quotes and with \ when using single quotes. For more information, see Class Pattern on the Oracle web site.
source (str) – The key to modify.
to (str) – The string to be substituted for each match of from.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_logs as logs

substitute_string_entry_property = logs.CfnTransformer.SubstituteStringEntryProperty(
    from_="from",
    source="source",
    to="to"
)
Attributes
- from_
The regular expression string to be replaced.
Special regex characters such as [ and ] must be escaped using \\ when using double quotes and with \ when using single quotes. For more information, see Class Pattern on the Oracle web site.
- source
The key to modify.
- to
The string to be substituted for each match of from.
SubstituteStringProperty
- class CfnTransformer.SubstituteStringProperty(*, entries)
Bases:
object
This processor matches a key’s value against a regular expression and replaces all matches with a replacement string.
For more information about this processor including examples, see substituteString in the CloudWatch Logs User Guide.
- Parameters:
entries (Union[IResolvable, Sequence[Union[IResolvable, SubstituteStringEntryProperty, Dict[str, Any]]]]) – An array of objects, where each object contains the information about one key to match and replace.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_logs as logs

substitute_string_property = logs.CfnTransformer.SubstituteStringProperty(
    entries=[logs.CfnTransformer.SubstituteStringEntryProperty(
        from_="from",
        source="source",
        to="to"
    )]
)
Attributes
- entries
An array of objects, where each object contains the information about one key to match and replace.
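As a hedged illustration (the "level" key and the regular expression are assumptions), the following sketch shows a substituteString step that removes literal brackets from a field. Note that the entry parameter is spelled from_ in Python because from is a reserved word, and the [ and ] characters are escaped with \\ inside a double-quoted string.
from aws_cdk import aws_logs as logs

# Sketch with placeholder key and pattern: delete literal "[" and "]"
# characters from the "level" field.
substitute_step = logs.CfnTransformer.ProcessorProperty(
    substitute_string=logs.CfnTransformer.SubstituteStringProperty(
        entries=[logs.CfnTransformer.SubstituteStringEntryProperty(
            from_="\\[|\\]",
            source="level",
            to=""
        )]
    )
)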
TrimStringProperty
- class CfnTransformer.TrimStringProperty(*, with_keys)
Bases:
object
Use this processor to remove leading and trailing whitespace.
For more information about this processor including examples, see trimString in the CloudWatch Logs User Guide.
- Parameters:
with_keys (Sequence[str]) – The array containing the keys of the fields to trim.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_logs as logs

trim_string_property = logs.CfnTransformer.TrimStringProperty(
    with_keys=["withKeys"]
)
Attributes
- with_keys
The array containing the keys of the fields to trim.
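For example, a minimal sketch (the field names are hypothetical) of a trimString step that strips leading and trailing whitespace from two fields:
from aws_cdk import aws_logs as logs

# Sketch with placeholder field names.
trim_step = logs.CfnTransformer.ProcessorProperty(
    trim_string=logs.CfnTransformer.TrimStringProperty(
        with_keys=["user_name", "request_path"]
    )
)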
TypeConverterEntryProperty
- class CfnTransformer.TypeConverterEntryProperty(*, key, type)
Bases:
object
This object defines one value type that will be converted using the typeConverter processor.
- Parameters:
key (str) – The key with the value that is to be converted to a different type.
type (str) – The type to convert the field value to. Valid values are integer, double, string, and boolean.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_logs as logs

type_converter_entry_property = logs.CfnTransformer.TypeConverterEntryProperty(
    key="key",
    type="type"
)
Attributes
- key
The key with the value that is to be converted to a different type.
- type
The type to convert the field value to.
Valid values are integer, double, string, and boolean.
TypeConverterProperty
- class CfnTransformer.TypeConverterProperty(*, entries)
Bases:
object
Use this processor to convert a value type associated with the specified key to the specified type.
It’s a casting processor that changes the types of the specified fields. Values can be converted into one of the following data types: integer, double, string, and boolean.
For more information about this processor including examples, see typeConverter in the CloudWatch Logs User Guide.
- Parameters:
entries (Union[IResolvable, Sequence[Union[IResolvable, TypeConverterEntryProperty, Dict[str, Any]]]]) – An array of TypeConverterEntry objects, where each object contains the information about one field to change the type of.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_logs as logs

type_converter_property = logs.CfnTransformer.TypeConverterProperty(
    entries=[logs.CfnTransformer.TypeConverterEntryProperty(
        key="key",
        type="type"
    )]
)
Attributes
- entries
An array of TypeConverterEntry objects, where each object contains the information about one field to change the type of.
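As an illustration (the key names are hypothetical), a typeConverter step that casts a status code to an integer and a latency value to a double:
from aws_cdk import aws_logs as logs

# Sketch with placeholder keys: cast "status" to integer and "latency" to double.
type_converter_step = logs.CfnTransformer.ProcessorProperty(
    type_converter=logs.CfnTransformer.TypeConverterProperty(
        entries=[
            logs.CfnTransformer.TypeConverterEntryProperty(key="status", type="integer"),
            logs.CfnTransformer.TypeConverterEntryProperty(key="latency", type="double")
        ]
    )
)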
UpperCaseStringProperty
- class CfnTransformer.UpperCaseStringProperty(*, with_keys)
Bases:
object
This processor converts a string field to uppercase.
For more information about this processor including examples, see upperCaseString in the CloudWatch Logs User Guide.
- Parameters:
with_keys (Sequence[str]) – The array containing the keys of the fields to convert to uppercase.
- See:
- ExampleMetadata:
fixture=_generated
Example:
# The code below shows an example of how to instantiate this type.
# The values are placeholders you should change.
from aws_cdk import aws_logs as logs

upper_case_string_property = logs.CfnTransformer.UpperCaseStringProperty(
    with_keys=["withKeys"]
)
Attributes
- with_keys
The array containing the keys of the fields to convert to uppercase.
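For example, a minimal sketch (the field name is a placeholder) of an upperCaseString step:
from aws_cdk import aws_logs as logs

# Sketch with a placeholder field name: convert the "region" field to uppercase.
upper_case_step = logs.CfnTransformer.ProcessorProperty(
    upper_case_string=logs.CfnTransformer.UpperCaseStringProperty(
        with_keys=["region"]
    )
)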