Logger

Logger utility

Usage Documentation

Logger

CLASS DESCRIPTION
Logger

Creates and sets up a logger that formats log statements as JSON.

FUNCTION DESCRIPTION
log_uncaught_exception_hook

Callback function for sys.excepthook to use Logger to log uncaught exceptions

set_package_logger

Set an additional stream handler, formatter, and log level for aws_lambda_powertools package logger.

Logger

Logger(
    service: str | None = None,
    level: str | int | None = None,
    child: bool = False,
    sampling_rate: float | None = None,
    stream: IO[str] | None = None,
    logger_formatter: PowertoolsFormatter | None = None,
    logger_handler: Handler | None = None,
    log_uncaught_exceptions: bool = False,
    json_serializer: Callable[[dict], str] | None = None,
    json_deserializer: (
        Callable[[dict | str | bool | int | float], str]
        | None
    ) = None,
    json_default: Callable[[Any], Any] | None = None,
    datefmt: str | None = None,
    use_datetime_directive: bool = False,
    log_record_order: list[str] | None = None,
    utc: bool = False,
    use_rfc3339: bool = False,
    serialize_stacktrace: bool = True,
    **kwargs
)

Creates and sets up a logger that formats log statements as JSON.

Includes the service name and any additional key=value pairs in log statements. Both the service name and the log level can also be set explicitly via environment variables.

Environment variables

POWERTOOLS_SERVICE_NAME : str
    service name
POWERTOOLS_LOG_LEVEL : str
    logging level (e.g. INFO, DEBUG)
POWERTOOLS_LOGGER_SAMPLE_RATE : float
    sampling rate ranging from 0 to 1, 1 being 100% sampling

PARAMETER DESCRIPTION
service

service name to be appended in logs, by default "service_undefined"

TYPE: str DEFAULT: None

level

The level to set. Can be a string representing the level name ('DEBUG', 'INFO', 'WARNING', 'ERROR', 'CRITICAL') or an integer representing the level value (10 for 'DEBUG', 20 for 'INFO', 30 for 'WARNING', 40 for 'ERROR', 50 for 'CRITICAL'). By default "INFO".

TYPE: str | int | None DEFAULT: None

child

create a child Logger named <service>.<caller_file_name>, False by default

TYPE: bool DEFAULT: False

sampling_rate

sample rate for debug calls within execution context, defaults to 0.0

TYPE: float | None DEFAULT: None

stream

valid output for a logging stream, by default sys.stdout

TYPE: IO[str] | None DEFAULT: None

logger_formatter

custom logging formatter that implements PowertoolsFormatter

TYPE: PowertoolsFormatter | None DEFAULT: None

logger_handler

custom logging handler e.g. logging.FileHandler("file.log")

TYPE: Handler | None DEFAULT: None

log_uncaught_exceptions

logs uncaught exception using sys.excepthook

See: https://docs.python.org/3/library/sys.html#sys.excepthook

TYPE: bool DEFAULT: False
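
A minimal sketch of enabling this hook (the ValueError is purely illustrative); an exception that is never caught is then emitted as a JSON error record via sys.excepthook:

>>> from aws_lambda_powertools import Logger
>>>
>>> logger = Logger(service="payment", log_uncaught_exceptions=True)
>>>
>>> raise ValueError("unhandled")  # logged as a JSON error record before the interpreter exits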

Parameters propagated to LambdaPowertoolsFormatter

json_serializer : Callable, optional
    function to serialize obj to a JSON formatted str, by default json.dumps
json_deserializer : Callable, optional
    function to deserialize str, bytes, or bytearray containing a JSON document to a Python obj, by default json.loads
json_default : Callable, optional
    function to coerce unserializable values, by default str(). Only used when no custom formatter is set (see the json_default example at the end of the Example section below)
utc : bool, optional
    set logging timestamp to UTC, by default False to continue to use local time as per stdlib
log_record_order : list, optional
    set order of log keys when logging, by default ["level", "location", "message", "timestamp"]

Example

Set up structured logging in JSON for Lambda functions with an explicit service name

>>> from aws_lambda_powertools import Logger
>>> logger = Logger(service="payment")
>>>
>>> def handler(event, context):
        logger.info("Hello")

Set up structured logging in JSON for Lambda functions using env vars

$ export POWERTOOLS_SERVICE_NAME="payment"
$ export POWERTOOLS_LOGGER_SAMPLE_RATE=0.01 # 1% debug sampling
>>> from aws_lambda_powertools import Logger
>>> logger = Logger()
>>>
>>> def handler(event, context):
        logger.info("Hello")

Append payment_id to a previously configured logger

>>> from aws_lambda_powertools import Logger
>>> logger = Logger(service="payment")
>>>
>>> def handler(event, context):
        logger.append_keys(payment_id=event["payment_id"])
        logger.info("Hello")

Create child Logger using logging inheritance via child param

>>> # app.py
>>> import another_file
>>> from aws_lambda_powertools import Logger
>>> logger = Logger(service="payment")
>>>
>>> # another_file.py
>>> from aws_lambda_powertools import Logger
>>> logger = Logger(service="payment", child=True)

Logging in UTC timezone

>>> # app.py
>>> import logging
>>> from aws_lambda_powertools import Logger
>>>
>>> logger = Logger(service="payment", utc=True)

Set message as the first key in log statements

>>> # app.py
>>> import logging
>>> from aws_lambda_powertools import Logger
>>>
>>> logger = Logger(service="payment", log_record_order=["message"])

Logging to a file instead of standard output for testing

>>> # app.py
>>> import logging
>>> from aws_lambda_powertools import Logger
>>>
>>> logger = Logger(service="payment", logger_handler=logging.FileHandler("log.json"))
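
Coerce values json.dumps cannot serialize, such as datetime objects, via json_default (a hedged sketch; using str as the coercion function is an illustrative choice)

>>> from datetime import datetime
>>> from aws_lambda_powertools import Logger
>>> logger = Logger(service="payment", json_default=str)
>>>
>>> def handler(event, context):
        logger.info({"started_at": datetime.now()})
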
RAISES DESCRIPTION
InvalidLoggerSamplingRateError

When the provided sampling rate is not a float

METHOD DESCRIPTION
append_context_keys

Context manager to temporarily add logging keys.

clear_state

Removes all custom keys that were appended to the Logger.

get_correlation_id

Gets the correlation_id in the logging json

inject_lambda_context

Decorator to capture Lambda contextual info and inject into logger

set_correlation_id

Sets the correlation_id in the logging json

structure_logs

Sets logging formatting to JSON.

ATTRIBUTE DESCRIPTION
handlers

List of registered logging handlers

TYPE: list[Handler]

registered_formatter

Convenience property to access the first logger formatter

TYPE: BasePowertoolsFormatter

registered_handler

Convenience property to access the first logger handler

TYPE: Handler

Source code in aws_lambda_powertools/logging/logger.py
def __init__(
    self,
    service: str | None = None,
    level: str | int | None = None,
    child: bool = False,
    sampling_rate: float | None = None,
    stream: IO[str] | None = None,
    logger_formatter: PowertoolsFormatter | None = None,
    logger_handler: logging.Handler | None = None,
    log_uncaught_exceptions: bool = False,
    json_serializer: Callable[[dict], str] | None = None,
    json_deserializer: Callable[[dict | str | bool | int | float], str] | None = None,
    json_default: Callable[[Any], Any] | None = None,
    datefmt: str | None = None,
    use_datetime_directive: bool = False,
    log_record_order: list[str] | None = None,
    utc: bool = False,
    use_rfc3339: bool = False,
    serialize_stacktrace: bool = True,
    **kwargs,
) -> None:
    self.service = resolve_env_var_choice(
        choice=service,
        env=os.getenv(constants.SERVICE_NAME_ENV, "service_undefined"),
    )
    self.sampling_rate = resolve_env_var_choice(
        choice=sampling_rate,
        env=os.getenv(constants.LOGGER_LOG_SAMPLING_RATE),
    )
    self._default_log_keys: dict[str, Any] = {"service": self.service, "sampling_rate": self.sampling_rate}
    self.child = child
    self.logger_formatter = logger_formatter
    self._stream = stream or sys.stdout

    self.log_uncaught_exceptions = log_uncaught_exceptions

    self._is_deduplication_disabled = resolve_truthy_env_var_choice(
        env=os.getenv(constants.LOGGER_LOG_DEDUPLICATION_ENV, "false"),
    )
    self._logger = self._get_logger()
    self.logger_handler = logger_handler or self._get_handler()

    # NOTE: This is primarily to improve UX, so IDEs can autocomplete LambdaPowertoolsFormatter options
    # previously, we masked all of them as kwargs thus limiting feature discovery
    formatter_options = {
        "json_serializer": json_serializer,
        "json_deserializer": json_deserializer,
        "json_default": json_default,
        "datefmt": datefmt,
        "use_datetime_directive": use_datetime_directive,
        "log_record_order": log_record_order,
        "utc": utc,
        "use_rfc3339": use_rfc3339,
        "serialize_stacktrace": serialize_stacktrace,
    }

    self._init_logger(formatter_options=formatter_options, log_level=level, **kwargs)

    if self.log_uncaught_exceptions:
        logger.debug("Replacing exception hook")
        sys.excepthook = functools.partial(log_uncaught_exception_hook, logger=self)

handlers property

handlers: list[Handler]

List of registered logging handlers

Notes

Looking for the first configured handler? Use registered_handler property instead.

registered_formatter property

registered_formatter: BasePowertoolsFormatter

Convenience property to access the first logger formatter

registered_handler property

registered_handler: Handler

Convenience property to access the first logger handler
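
A hedged sketch of using these convenience properties (assuming the default stream handler and formatter are in place):

from aws_lambda_powertools import Logger

logger = Logger(service="payment")
print(type(logger.registered_handler))    # first registered handler, e.g. logging.StreamHandler
print(type(logger.registered_formatter))  # formatter attached to that handler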

append_context_keys

append_context_keys(
    **additional_keys: Any,
) -> Generator[None, None, None]

Context manager to temporarily add logging keys.

PARAMETER DESCRIPTION
**additional_keys

Key-value pairs to include in the log context during the lifespan of the context manager.

TYPE: Any DEFAULT: {}

Example

Logging with contextual keys

logger = Logger(service="example_service")
with logger.append_context_keys(user_id="123", operation="process"):
    logger.info("Log with context")
logger.info("Log without context")
Source code in aws_lambda_powertools/logging/logger.py
@contextmanager
def append_context_keys(self, **additional_keys: Any) -> Generator[None, None, None]:
    """
    Context manager to temporarily add logging keys.

    Parameters
    -----------
    **additional_keys: Any
        Key-value pairs to include in the log context during the lifespan of the context manager.

    Example
    --------
    **Logging with contextual keys**

        logger = Logger(service="example_service")
        with logger.append_context_keys(user_id="123", operation="process"):
            logger.info("Log with context")
        logger.info("Log without context")
    """
    with self.registered_formatter.append_context_keys(**additional_keys):
        yield

clear_state

clear_state() -> None

Removes all custom keys that were appended to the Logger.
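
A minimal sketch of resetting custom keys between invocations (the handler and the order_id key are illustrative):

from aws_lambda_powertools import Logger

logger = Logger(service="payment")

def handler(event, context):
    logger.append_keys(order_id=event.get("order_id"))
    logger.info("Processing order")  # includes order_id
    logger.clear_state()
    logger.info("Order processed")   # back to default keys only (service, sampling_rate)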

Source code in aws_lambda_powertools/logging/logger.py
def clear_state(self) -> None:
    """Removes all custom keys that were appended to the Logger."""
    # Clear all custom keys from the formatter
    self.registered_formatter.clear_state()

    # Reset to default keys
    self.structure_logs(**self._default_log_keys)

get_correlation_id

get_correlation_id() -> str | None

Gets the correlation_id in the logging json

RETURNS DESCRIPTION
(str, optional)

Value for the correlation id

Source code in aws_lambda_powertools/logging/logger.py
def get_correlation_id(self) -> str | None:
    """Gets the correlation_id in the logging json

    Returns
    -------
    str, optional
        Value for the correlation id
    """
    if isinstance(self.registered_formatter, LambdaPowertoolsFormatter):
        return self.registered_formatter.log_format.get("correlation_id")
    return None

inject_lambda_context

inject_lambda_context(
    lambda_handler: AnyCallableT,
    log_event: bool | None = None,
    correlation_id_path: str | None = None,
    clear_state: bool | None = False,
) -> AnyCallableT
inject_lambda_context(
    lambda_handler: None = None,
    log_event: bool | None = None,
    correlation_id_path: str | None = None,
    clear_state: bool | None = False,
) -> Callable[[AnyCallableT], AnyCallableT]
inject_lambda_context(
    lambda_handler: AnyCallableT | None = None,
    log_event: bool | None = None,
    correlation_id_path: str | None = None,
    clear_state: bool | None = False,
) -> Any

Decorator to capture Lambda contextual info and inject into logger

PARAMETER DESCRIPTION
clear_state

Instructs logger to remove any custom keys previously added

TYPE: bool DEFAULT: False

lambda_handler

Method to inject the lambda context

TYPE: Callable DEFAULT: None

log_event

Instructs logger to log Lambda Event, by default False

TYPE: bool DEFAULT: None

correlation_id_path

Optional JMESPath for the correlation_id

TYPE: str | None DEFAULT: None

Environment variables

POWERTOOLS_LOGGER_LOG_EVENT : str
    instruct logger to log Lambda Event (e.g. "true", "True", "TRUE")

Example

Captures Lambda contextual runtime info (e.g. memory, arn, req_id)

from aws_lambda_powertools import Logger

logger = Logger(service="payment")

@logger.inject_lambda_context
def handler(event, context):
    logger.info("Hello")

Captures Lambda contextual runtime info and logs incoming request

from aws_lambda_powertools import Logger

logger = Logger(service="payment")

@logger.inject_lambda_context(log_event=True)
def handler(event, context):
    logger.info("Hello")
RETURNS DESCRIPTION
decorate

Decorated lambda handler

TYPE: Callable

Source code in aws_lambda_powertools/logging/logger.py
def inject_lambda_context(
    self,
    lambda_handler: AnyCallableT | None = None,
    log_event: bool | None = None,
    correlation_id_path: str | None = None,
    clear_state: bool | None = False,
) -> Any:
    """Decorator to capture Lambda contextual info and inject into logger

    Parameters
    ----------
    clear_state : bool, optional
        Instructs logger to remove any custom keys previously added
    lambda_handler : Callable
        Method to inject the lambda context
    log_event : bool, optional
        Instructs logger to log Lambda Event, by default False
    correlation_id_path: str, optional
        Optional JMESPath for the correlation_id

    Environment variables
    ---------------------
    POWERTOOLS_LOGGER_LOG_EVENT : str
        instruct logger to log Lambda Event (e.g. `"true", "True", "TRUE"`)

    Example
    -------
    **Captures Lambda contextual runtime info (e.g memory, arn, req_id)**

        from aws_lambda_powertools import Logger

        logger = Logger(service="payment")

        @logger.inject_lambda_context
        def handler(event, context):
            logger.info("Hello")

    **Captures Lambda contextual runtime info and logs incoming request**

        from aws_lambda_powertools import Logger

        logger = Logger(service="payment")

        @logger.inject_lambda_context(log_event=True)
        def handler(event, context):
            logger.info("Hello")

    Returns
    -------
    decorate : Callable
        Decorated lambda handler
    """

    # If handler is None we've been called with parameters
    # Return a partial function with args filled
    if lambda_handler is None:
        logger.debug("Decorator called with parameters")
        return functools.partial(
            self.inject_lambda_context,
            log_event=log_event,
            correlation_id_path=correlation_id_path,
            clear_state=clear_state,
        )

    log_event = resolve_truthy_env_var_choice(
        env=os.getenv(constants.LOGGER_LOG_EVENT_ENV, "false"),
        choice=log_event,
    )

    @functools.wraps(lambda_handler)
    def decorate(event, context, *args, **kwargs):
        lambda_context = build_lambda_context_model(context)
        cold_start = _is_cold_start()

        if clear_state:
            self.structure_logs(cold_start=cold_start, **lambda_context.__dict__)
        else:
            self.append_keys(cold_start=cold_start, **lambda_context.__dict__)

        if correlation_id_path:
            self.set_correlation_id(
                jmespath_utils.query(envelope=correlation_id_path, data=event),
            )

        if log_event:
            logger.debug("Event received")
            self.info(extract_event_from_common_models(event))

        return lambda_handler(event, context, *args, **kwargs)

    return decorate

set_correlation_id

set_correlation_id(value: str | None) -> None

Sets the correlation_id in the logging json

PARAMETER DESCRIPTION
value

Value for the correlation id. None will remove the correlation_id

TYPE: str
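
A minimal sketch of setting and later removing the correlation id (the requestId field is illustrative):

from aws_lambda_powertools import Logger

logger = Logger(service="payment")

def handler(event, context):
    logger.set_correlation_id(event.get("requestId"))
    logger.info("Collecting payment")  # log record now includes correlation_id
    logger.set_correlation_id(None)    # removes it again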

Source code in aws_lambda_powertools/logging/logger.py
def set_correlation_id(self, value: str | None) -> None:
    """Sets the correlation_id in the logging json

    Parameters
    ----------
    value : str, optional
        Value for the correlation id. None will remove the correlation_id
    """
    self.append_keys(correlation_id=value)

structure_logs

structure_logs(
    append: bool = False,
    formatter_options: dict | None = None,
    **keys
) -> None

Sets logging formatting to JSON.

Optionally, it can append keyword arguments to an existing logger, so it is available across future log statements.

Last keyword argument and value wins if duplicated.

PARAMETER DESCRIPTION
append

append keys provided to logger formatter, by default False

TYPE: bool DEFAULT: False

formatter_options

LambdaPowertoolsFormatter options to be propagated, by default {}

TYPE: dict DEFAULT: None
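
A hedged sketch of both modes (append_keys is the preferred way to add keys; the request_id values are illustrative):

from aws_lambda_powertools import Logger

logger = Logger(service="payment")

# append=True keeps existing custom keys and adds new ones
logger.structure_logs(append=True, request_id="abc-123")

# append=False (default) discards previously added custom keys before adding new ones
logger.structure_logs(request_id="def-456")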

Source code in aws_lambda_powertools/logging/logger.py
def structure_logs(self, append: bool = False, formatter_options: dict | None = None, **keys) -> None:
    """Sets logging formatting to JSON.

    Optionally, it can append keyword arguments
    to an existing logger, so it is available across future log statements.

    Last keyword argument and value wins if duplicated.

    Parameters
    ----------
    append : bool, optional
        append keys provided to logger formatter, by default False
    formatter_options : dict, optional
        LambdaPowertoolsFormatter options to be propagated, by default {}
    """
    formatter_options = formatter_options or {}

    # There are 3 operational modes for this method
    ## 1. Register a Powertools for AWS Lambda (Python) Formatter for the first time
    ## 2. Append new keys to the current logger formatter; deprecated in favour of append_keys
    ## 3. Add new keys and discard existing to the registered formatter

    # Mode 1
    log_keys = {**self._default_log_keys, **keys}
    is_logger_preconfigured = getattr(self._logger, LOGGER_ATTRIBUTE_PRECONFIGURED, False)
    if not is_logger_preconfigured:
        formatter = self.logger_formatter or LambdaPowertoolsFormatter(**formatter_options, **log_keys)
        self.registered_handler.setFormatter(formatter)

        # when using a custom Powertools for AWS Lambda (Python) Formatter
        # standard and custom keys that are not Powertools for AWS Lambda (Python) Formatter parameters
        # should be appended and custom keys that might happen to be Powertools for AWS Lambda (Python)
        # Formatter parameters should be discarded this prevents adding them as custom keys, for example,
        # `json_default=<callable>` see https://github.com/aws-powertools/powertools-lambda-python/issues/1263
        custom_keys = {k: v for k, v in log_keys.items() if k not in RESERVED_FORMATTER_CUSTOM_KEYS}
        return self.registered_formatter.append_keys(**custom_keys)

    # Mode 2 (legacy)
    if append:
        # Maintenance: Add deprecation warning for major version
        return self.append_keys(**keys)

    # Mode 3
    self.registered_formatter.clear_state()
    self.registered_formatter.thread_safe_clear_keys()
    self.registered_formatter.append_keys(**log_keys)

log_uncaught_exception_hook

log_uncaught_exception_hook(
    exc_type, exc_value, exc_traceback, logger: Logger
) -> None

Callback function for sys.excepthook to use Logger to log uncaught exceptions

Source code in aws_lambda_powertools/logging/logger.py
def log_uncaught_exception_hook(exc_type, exc_value, exc_traceback, logger: Logger) -> None:
    """Callback function for sys.excepthook to use Logger to log uncaught exceptions"""
    logger.exception(exc_value, exc_info=(exc_type, exc_value, exc_traceback))  # pragma: no cover

set_package_logger

set_package_logger(
    level: str | int = logging.DEBUG,
    stream: IO[str] | None = None,
    formatter: Formatter | None = None,
) -> None

Set an additional stream handler, formatter, and log level for aws_lambda_powertools package logger.

The package log is suppressed by default (NullHandler); this should only be used for debugging. This is separate from the application Logger class utility.

Example

Enables debug logging for Powertools for AWS Lambda (Python) package

>>> from aws_lambda_powertools.logging.logger import set_package_logger
>>> set_package_logger()
PARAMETER DESCRIPTION
level

log level, DEBUG by default

TYPE: str | int DEFAULT: DEBUG

stream

log stream, stdout by default

TYPE: IO[str] | None DEFAULT: None

formatter

log formatter, "%(asctime)s %(name)s [%(levelname)s] %(message)s" by default

TYPE: Formatter | None DEFAULT: None

Source code in aws_lambda_powertools/logging/logger.py
def set_package_logger(
    level: str | int = logging.DEBUG,
    stream: IO[str] | None = None,
    formatter: logging.Formatter | None = None,
) -> None:
    """Set an additional stream handler, formatter, and log level for aws_lambda_powertools package logger.

    **Package log by default is suppressed (NullHandler), this should only used for debugging.
    This is separate from application Logger class utility**

    Example
    -------
    **Enables debug logging for Powertools for AWS Lambda (Python) package**

        >>> from aws_lambda_powertools.logging.logger import set_package_logger
        >>> set_package_logger()

    Parameters
    ----------
    level: str, int
        log level, DEBUG by default
    stream: sys.stdout
        log stream, stdout by default
    formatter: logging.Formatter
        log formatter, "%(asctime)s %(name)s [%(levelname)s] %(message)s" by default
    """
    if formatter is None:
        formatter = logging.Formatter("%(asctime)s %(name)s [%(levelname)s] %(message)s")

    if stream is None:
        stream = sys.stdout

    logger = logging.getLogger("aws_lambda_powertools")
    logger.setLevel(level)
    handler = logging.StreamHandler(stream)
    handler.setFormatter(formatter)
    logger.addHandler(handler)