diff --git a/docs/core/logger.md b/docs/core/logger.md
index bc42225edcf..90353b060c4 100644
--- a/docs/core/logger.md
+++ b/docs/core/logger.md
@@ -16,50 +16,30 @@ Logger provides an opinionated logger with output structured as JSON.
Logger requires two settings:
-Setting | Description | Environment variable | Constructor parameter
-------------------------------------------------- | ------------------------------------------------- | ------------------------------------------------- | -------------------------------------------------
-**Logging level** | Sets how verbose Logger should be (INFO, by default) | `LOG_LEVEL` | `level`
-**Service** | Sets **service** key that will be present across all log statements | `POWERTOOLS_SERVICE_NAME` | `service`
-
-???+ example
- **AWS Serverless Application Model (SAM)**
-
-=== "template.yaml"
-
- ```yaml hl_lines="9 10"
- Resources:
- HelloWorldFunction:
- Type: AWS::Serverless::Function
- Properties:
- Runtime: python3.8
- Environment:
- Variables:
- LOG_LEVEL: INFO
- POWERTOOLS_SERVICE_NAME: example
- ```
-=== "app.py"
+| Setting | Description | Environment variable | Constructor parameter |
+| ----------------- | ------------------------------------------------------------------- | ------------------------- | --------------------- |
+| **Logging level** | Sets how verbose Logger should be (INFO, by default) | `LOG_LEVEL` | `level` |
+| **Service** | Sets **service** key that will be present across all log statements | `POWERTOOLS_SERVICE_NAME` | `service` |
- ```python hl_lines="2 4"
- from aws_lambda_powertools import Logger
- logger = Logger() # Sets service via env var
- # OR logger = Logger(service="example")
- ```
+```yaml hl_lines="12-13" title="AWS Serverless Application Model (SAM) example"
+--8<-- "examples/logger/sam/template.yaml"
+```
### Standard structured keys
Your Logger will include the following keys to your structured logging:
-Key | Example | Note
-------------------------------------------------- | ------------------------------------------------- | ---------------------------------------------------------------------------------
-**level**: `str` | `INFO` | Logging level
-**location**: `str` | `collect.handler:1` | Source code location where statement was executed
-**message**: `Any` | `Collecting payment` | Unserializable JSON values are casted as `str`
-**timestamp**: `str` | `2021-05-03 10:20:19,650+0200` | Timestamp with milliseconds, by default uses local timezone
-**service**: `str` | `payment` | Service name defined, by default `service_undefined`
-**xray_trace_id**: `str` | `1-5759e988-bd862e3fe1be46a994272793` | When [tracing is enabled](https://docs.aws.amazon.com/lambda/latest/dg/services-xray.html){target="_blank"}, it shows X-Ray Trace ID
-**sampling_rate**: `float` | `0.1` | When enabled, it shows sampling rate in percentage e.g. 10%
-**exception_name**: `str` | `ValueError` | When `logger.exception` is used and there is an exception
-**exception**: `str` | `Traceback (most recent call last)..` | When `logger.exception` is used and there is an exception
+| Key | Example | Note |
+| -------------------------- | ------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------ |
+| **level**: `str` | `INFO` | Logging level |
+| **location**: `str` | `collect.handler:1` | Source code location where statement was executed |
+| **message**: `Any`         | `Collecting payment`                  | Unserializable JSON values are cast as `str`                                                                                           |
+| **timestamp**: `str` | `2021-05-03 10:20:19,650+0200` | Timestamp with milliseconds, by default uses local timezone |
+| **service**: `str` | `payment` | Service name defined, by default `service_undefined` |
+| **xray_trace_id**: `str` | `1-5759e988-bd862e3fe1be46a994272793` | When [tracing is enabled](https://docs.aws.amazon.com/lambda/latest/dg/services-xray.html){target="_blank"}, it shows X-Ray Trace ID |
+| **sampling_rate**: `float` | `0.1` | When enabled, it shows sampling rate in percentage e.g. 10% |
+| **exception_name**: `str` | `ValueError` | When `logger.exception` is used and there is an exception |
+| **exception**: `str` | `Traceback (most recent call last)..` | When `logger.exception` is used and there is an exception |
### Capturing Lambda context info
@@ -67,83 +47,38 @@ You can enrich your structured logs with key Lambda context information via `inj
=== "collect.py"
- ```python hl_lines="5"
- from aws_lambda_powertools import Logger
-
- logger = Logger(service="payment")
-
- @logger.inject_lambda_context
- def handler(event, context):
- logger.info("Collecting payment")
-
- # You can log entire objects too
- logger.info({
- "operation": "collect_payment",
- "charge_id": event['charge_id']
- })
- ...
+ ```python hl_lines="7"
+ --8<-- "examples/logger/src/inject_lambda_context.py"
```
=== "Example CloudWatch Logs excerpt"
- ```json hl_lines="7-11 16-19"
- {
- "level": "INFO",
- "location": "collect.handler:7",
- "message": "Collecting payment",
- "timestamp": "2021-05-03 11:47:12,494+0200",
- "service": "payment",
- "cold_start": true,
- "lambda_function_name": "test",
- "lambda_function_memory_size": 128,
- "lambda_function_arn": "arn:aws:lambda:eu-west-1:12345678910:function:test",
- "lambda_request_id": "52fdfc07-2182-154f-163f-5f0f9a621d72"
- },
- {
- "level": "INFO",
- "location": "collect.handler:10",
- "message": {
- "operation": "collect_payment",
- "charge_id": "ch_AZFlk2345C0"
- },
- "timestamp": "2021-05-03 11:47:12,494+0200",
- "service": "payment",
- "cold_start": true,
- "lambda_function_name": "test",
- "lambda_function_memory_size": 128,
- "lambda_function_arn": "arn:aws:lambda:eu-west-1:12345678910:function:test",
- "lambda_request_id": "52fdfc07-2182-154f-163f-5f0f9a621d72"
- }
+ ```json hl_lines="8-12 17-20"
+ --8<-- "examples/logger/src/inject_lambda_context_output.json"
```
When used, this will include the following keys:
-Key | Example
-------------------------------------------------- | ---------------------------------------------------------------------------------
-**cold_start**: `bool` | `false`
-**function_name** `str` | `example-powertools-HelloWorldFunction-1P1Z6B39FLU73`
-**function_memory_size**: `int` | `128`
-**function_arn**: `str` | `arn:aws:lambda:eu-west-1:012345678910:function:example-powertools-HelloWorldFunction-1P1Z6B39FLU73`
-**function_request_id**: `str` | `899856cb-83d1-40d7-8611-9e78f15f32f4`
+| Key | Example |
+| ------------------------------- | ---------------------------------------------------------------------------------------------------- |
+| **cold_start**: `bool` | `false` |
+| **function_name**: `str`        | `example-powertools-HelloWorldFunction-1P1Z6B39FLU73`                                                 |
+| **function_memory_size**: `int` | `128` |
+| **function_arn**: `str` | `arn:aws:lambda:eu-west-1:012345678910:function:example-powertools-HelloWorldFunction-1P1Z6B39FLU73` |
+| **function_request_id**: `str` | `899856cb-83d1-40d7-8611-9e78f15f32f4` |
-#### Logging incoming event
+### Logging incoming event
When debugging in non-production environments, you can instruct Logger to log the incoming event with `log_event` param or via `POWERTOOLS_LOGGER_LOG_EVENT` env var.
???+ warning
This is disabled by default to prevent sensitive info being logged
-```python hl_lines="5" title="Logging incoming event"
-from aws_lambda_powertools import Logger
-
-logger = Logger(service="payment")
-
-@logger.inject_lambda_context(log_event=True)
-def handler(event, context):
- ...
+```python hl_lines="7" title="Logging incoming event"
+--8<-- "examples/logger/src/log_incoming_event.py"
```
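+
+For reference, enabling it looks roughly like this minimal sketch (handler and service names are illustrative):
+
+```python
+from aws_lambda_powertools import Logger
+
+logger = Logger(service="payment")
+
+
+@logger.inject_lambda_context(log_event=True)
+def handler(event, context):
+    ...
+```
+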
-#### Setting a Correlation ID
+### Setting a Correlation ID
You can set a Correlation ID using `correlation_id_path` param by passing a [JMESPath expression](https://jmespath.org/tutorial.html){target="_blank"}.
@@ -152,87 +87,63 @@ You can set a Correlation ID using `correlation_id_path` param by passing a [JME
=== "collect.py"
- ```python hl_lines="5"
- from aws_lambda_powertools import Logger
-
- logger = Logger(service="payment")
-
- @logger.inject_lambda_context(correlation_id_path="headers.my_request_id_header")
- def handler(event, context):
- logger.debug(f"Correlation ID => {logger.get_correlation_id()}")
- logger.info("Collecting payment")
+ ```python hl_lines="7"
+ --8<-- "examples/logger/src/set_correlation_id.py"
```
=== "Example Event"
```json hl_lines="3"
- {
- "headers": {
- "my_request_id_header": "correlation_id_value"
- }
- }
+ --8<-- "examples/logger/src/set_correlation_id_event.json"
```
=== "Example CloudWatch Logs excerpt"
```json hl_lines="12"
- {
- "level": "INFO",
- "location": "collect.handler:7",
- "message": "Collecting payment",
- "timestamp": "2021-05-03 11:47:12,494+0200",
- "service": "payment",
- "cold_start": true,
- "lambda_function_name": "test",
- "lambda_function_memory_size": 128,
- "lambda_function_arn": "arn:aws:lambda:eu-west-1:12345678910:function:test",
- "lambda_request_id": "52fdfc07-2182-154f-163f-5f0f9a621d72",
- "correlation_id": "correlation_id_value"
- }
+ --8<-- "examples/logger/src/set_correlation_id_output.json"
```
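+
+Roughly, this pattern looks like the following minimal sketch, assuming the correlation ID lives in a `my_request_id_header` header:
+
+```python
+from aws_lambda_powertools import Logger
+
+logger = Logger(service="payment")
+
+
+@logger.inject_lambda_context(correlation_id_path="headers.my_request_id_header")
+def handler(event, context):
+    logger.debug(f"Correlation ID => {logger.get_correlation_id()}")
+    logger.info("Collecting payment")
+```
+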
-We provide [built-in JMESPath expressions](#built-in-correlation-id-expressions) for known event sources, where either a request ID or X-Ray Trace ID are present.
+#### set_correlation_id method
+
+You can also use the `set_correlation_id` method to inject it anywhere else in your code. The example below uses the [Event Source Data Classes utility](../utilities/data_classes.md) to easily access event properties.
=== "collect.py"
- ```python hl_lines="2 6"
- from aws_lambda_powertools import Logger
- from aws_lambda_powertools.logging import correlation_paths
+ ```python hl_lines="11"
+ --8<-- "examples/logger/src/set_correlation_id_method.py"
+ ```
+=== "Example Event"
- logger = Logger(service="payment")
+ ```json hl_lines="3"
+ --8<-- "examples/logger/src/set_correlation_id_method_event.json"
+ ```
- @logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST)
- def handler(event, context):
- logger.debug(f"Correlation ID => {logger.get_correlation_id()}")
- logger.info("Collecting payment")
+=== "Example CloudWatch Logs excerpt"
+
+ ```json hl_lines="7"
+ --8<-- "examples/logger/src/set_correlation_id_method_output.json"
+ ```
+
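+A minimal sketch of this pattern, assuming an API Gateway proxy event (the bundled example may use a different event source):
+
+```python
+from aws_lambda_powertools import Logger
+from aws_lambda_powertools.utilities.data_classes import APIGatewayProxyEvent
+
+logger = Logger(service="payment")
+
+
+def handler(event, context):
+    event = APIGatewayProxyEvent(event)
+    logger.set_correlation_id(event.request_context.request_id)
+    logger.info("Collecting payment")
+```
+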
+#### Known correlation IDs
+
+To ease routine tasks like extracting correlation ID from popular event sources, we provide [built-in JMESPath expressions](#built-in-correlation-id-expressions).
+
+=== "collect.py"
+
+ ```python hl_lines="2 8"
+ --8<-- "examples/logger/src/set_correlation_id_jmespath.py"
```
=== "Example Event"
```json hl_lines="3"
- {
- "requestContext": {
- "requestId": "correlation_id_value"
- }
- }
+ --8<-- "examples/logger/src/set_correlation_id_jmespath_event.json"
```
=== "Example CloudWatch Logs excerpt"
```json hl_lines="12"
- {
- "level": "INFO",
- "location": "collect.handler:8",
- "message": "Collecting payment",
- "timestamp": "2021-05-03 11:47:12,494+0200",
- "service": "payment",
- "cold_start": true,
- "lambda_function_name": "test",
- "lambda_function_memory_size": 128,
- "lambda_function_arn": "arn:aws:lambda:eu-west-1:12345678910:function:test",
- "lambda_request_id": "52fdfc07-2182-154f-163f-5f0f9a621d72",
- "correlation_id": "correlation_id_value"
- }
+ --8<-- "examples/logger/src/set_correlation_id_jmespath_output.json"
```
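+
+A minimal sketch using the built-in expression for API Gateway REST APIs:
+
+```python
+from aws_lambda_powertools import Logger
+from aws_lambda_powertools.logging import correlation_paths
+
+logger = Logger(service="payment")
+
+
+@logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST)
+def handler(event, context):
+    logger.info("Collecting payment")
+```
+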
### Appending additional keys
@@ -254,30 +165,13 @@ You can append your own keys to your existing Logger via `append_keys(**addition
=== "collect.py"
- ```python hl_lines="9"
- from aws_lambda_powertools import Logger
-
- logger = Logger(service="payment")
-
- def handler(event, context):
- order_id = event.get("order_id")
-
- # this will ensure order_id key always has the latest value before logging
- logger.append_keys(order_id=order_id)
-
- logger.info("Collecting payment")
+ ```python hl_lines="12"
+ --8<-- "examples/logger/src/append_keys.py"
```
=== "Example CloudWatch Logs excerpt"
```json hl_lines="7"
- {
- "level": "INFO",
- "location": "collect.handler:11",
- "message": "Collecting payment",
- "timestamp": "2021-05-03 11:47:12,494+0200",
- "service": "payment",
- "order_id": "order_id_value"
- }
+ --8<-- "examples/logger/src/append_keys_output.json"
```
???+ tip "Tip: Logger will automatically reject any key with a None value"
@@ -296,103 +190,13 @@ It accepts any dictionary, and all keyword arguments will be added as part of th
=== "extra_parameter.py"
- ```python hl_lines="6"
- from aws_lambda_powertools import Logger
-
- logger = Logger(service="payment")
-
- fields = { "request_id": "1123" }
- logger.info("Collecting payment", extra=fields)
- ```
-=== "Example CloudWatch Logs excerpt"
-
- ```json hl_lines="7"
- {
- "level": "INFO",
- "location": "collect.handler:6",
- "message": "Collecting payment",
- "timestamp": "2021-05-03 11:47:12,494+0200",
- "service": "payment",
- "request_id": "1123"
- }
- ```
-
-#### set_correlation_id method
-
-You can set a correlation_id to your existing Logger via `set_correlation_id(value)` method by passing any string value.
-
-=== "collect.py"
-
- ```python hl_lines="6"
- from aws_lambda_powertools import Logger
-
- logger = Logger(service="payment")
-
- def handler(event, context):
- logger.set_correlation_id(event["requestContext"]["requestId"])
- logger.info("Collecting payment")
- ```
-
-=== "Example Event"
-
- ```json hl_lines="3"
- {
- "requestContext": {
- "requestId": "correlation_id_value"
- }
- }
- ```
-
-=== "Example CloudWatch Logs excerpt"
-
- ```json hl_lines="7"
- {
- "level": "INFO",
- "location": "collect.handler:7",
- "message": "Collecting payment",
- "timestamp": "2021-05-03 11:47:12,494+0200",
- "service": "payment",
- "correlation_id": "correlation_id_value"
- }
- ```
-
-Alternatively, you can combine [Data Classes utility](../utilities/data_classes.md) with Logger to use dot notation object:
-
-=== "collect.py"
-
- ```python hl_lines="2 7-8"
- from aws_lambda_powertools import Logger
- from aws_lambda_powertools.utilities.data_classes import APIGatewayProxyEvent
-
- logger = Logger(service="payment")
-
- def handler(event, context):
- event = APIGatewayProxyEvent(event)
- logger.set_correlation_id(event.request_context.request_id)
- logger.info("Collecting payment")
- ```
-=== "Example Event"
-
- ```json hl_lines="3"
- {
- "requestContext": {
- "requestId": "correlation_id_value"
- }
- }
+ ```python hl_lines="9"
+ --8<-- "examples/logger/src/append_keys_extra.py"
```
-
=== "Example CloudWatch Logs excerpt"
```json hl_lines="7"
- {
- "timestamp": "2020-05-24 18:17:33,774",
- "level": "INFO",
- "location": "collect.handler:9",
- "service": "payment",
- "sampling_rate": 0.0,
- "correlation_id": "correlation_id_value",
- "message": "Collecting payment"
- }
+ --8<-- "examples/logger/src/append_keys_extra_output.json"
```
### Removing additional keys
@@ -401,37 +205,14 @@ You can remove any additional key from Logger state using `remove_keys`.
=== "collect.py"
- ```python hl_lines="9"
- from aws_lambda_powertools import Logger
-
- logger = Logger(service="payment")
-
- def handler(event, context):
- logger.append_keys(sample_key="value")
- logger.info("Collecting payment")
-
- logger.remove_keys(["sample_key"])
- logger.info("Collecting payment without sample key")
+ ```python hl_lines="11"
+ --8<-- "examples/logger/src/remove_keys.py"
```
=== "Example CloudWatch Logs excerpt"
```json hl_lines="7"
- {
- "level": "INFO",
- "location": "collect.handler:7",
- "message": "Collecting payment",
- "timestamp": "2021-05-03 11:47:12,494+0200",
- "service": "payment",
- "sample_key": "value"
- },
- {
- "level": "INFO",
- "location": "collect.handler:10",
- "message": "Collecting payment without sample key",
- "timestamp": "2021-05-03 11:47:12,494+0200",
- "service": "payment"
- }
+ --8<-- "examples/logger/src/remove_keys_output.json"
```
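+
+A minimal sketch of appending and later removing a key:
+
+```python
+from aws_lambda_powertools import Logger
+
+logger = Logger(service="payment")
+
+
+def handler(event, context):
+    logger.append_keys(sample_key="value")
+    logger.info("Collecting payment")
+
+    logger.remove_keys(["sample_key"])
+    logger.info("Collecting payment without sample key")
+```
+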
#### Clearing all state
@@ -450,54 +231,20 @@ Logger is commonly initialized in the global scope. Due to [Lambda Execution Con
=== "collect.py"
- ```python hl_lines="5 8"
- from aws_lambda_powertools import Logger
-
- logger = Logger(service="payment")
-
- @logger.inject_lambda_context(clear_state=True)
- def handler(event, context):
- if event.get("special_key"):
- # Should only be available in the first request log
- # as the second request doesn't contain `special_key`
- logger.append_keys(debugging_key="value")
-
- logger.info("Collecting payment")
+ ```python hl_lines="7 10"
+ --8<-- "examples/logger/src/clear_state.py"
```
=== "#1 request"
```json hl_lines="7"
- {
- "level": "INFO",
- "location": "collect.handler:10",
- "message": "Collecting payment",
- "timestamp": "2021-05-03 11:47:12,494+0200",
- "service": "payment",
- "special_key": "debug_key",
- "cold_start": true,
- "lambda_function_name": "test",
- "lambda_function_memory_size": 128,
- "lambda_function_arn": "arn:aws:lambda:eu-west-1:12345678910:function:test",
- "lambda_request_id": "52fdfc07-2182-154f-163f-5f0f9a621d72"
- }
+ --8<-- "examples/logger/src/clear_state_event_one.json"
```
=== "#2 request"
```json hl_lines="7"
- {
- "level": "INFO",
- "location": "collect.handler:10",
- "message": "Collecting payment",
- "timestamp": "2021-05-03 11:47:12,494+0200",
- "service": "payment",
- "cold_start": false,
- "lambda_function_name": "test",
- "lambda_function_memory_size": 128,
- "lambda_function_arn": "arn:aws:lambda:eu-west-1:12345678910:function:test",
- "lambda_request_id": "52fdfc07-2182-154f-163f-5f0f9a621d72"
- }
+ --8<-- "examples/logger/src/clear_state_event_two.json"
```
### Logging exceptions
@@ -509,29 +256,14 @@ Use `logger.exception` method to log contextual information about exceptions. Lo
=== "collect.py"
- ```python hl_lines="8"
- from aws_lambda_powertools import Logger
-
- logger = Logger(service="payment")
-
- try:
- raise ValueError("something went wrong")
- except Exception:
- logger.exception("Received an exception")
+ ```python hl_lines="15"
+ --8<-- "examples/logger/src/logging_exceptions.py"
```
=== "Example CloudWatch Logs excerpt"
```json hl_lines="7-8"
- {
- "level": "ERROR",
- "location": "collect.handler:5",
- "message": "Received an exception",
- "timestamp": "2021-05-03 11:47:12,494+0200",
- "service": "payment",
- "exception_name": "ValueError",
- "exception": "Traceback (most recent call last):\n File \"\", line 2, in \nValueError: something went wrong"
- }
+ --8<-- "examples/logger/src/logging_exceptions_output.json"
```
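+
+A minimal sketch of this pattern:
+
+```python
+from aws_lambda_powertools import Logger
+
+logger = Logger(service="payment")
+
+try:
+    raise ValueError("something went wrong")
+except Exception:
+    logger.exception("Received an exception")
+```
+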
## Advanced
@@ -543,52 +275,54 @@ You can use any of the following built-in JMESPath expressions as part of [injec
???+ note "Note: Any object key named with `-` must be escaped"
For example, **`request.headers."x-amzn-trace-id"`**.
-Name | Expression | Description
-------------------------------------------------- | ------------------------------------------------- | ---------------------------------------------------------------------------------
-**API_GATEWAY_REST** | `"requestContext.requestId"` | API Gateway REST API request ID
-**API_GATEWAY_HTTP** | `"requestContext.requestId"` | API Gateway HTTP API request ID
-**APPSYNC_RESOLVER** | `'request.headers."x-amzn-trace-id"'` | AppSync X-Ray Trace ID
-**APPLICATION_LOAD_BALANCER** | `'headers."x-amzn-trace-id"'` | ALB X-Ray Trace ID
-**EVENT_BRIDGE** | `"id"` | EventBridge Event ID
+| Name | Expression | Description |
+| ----------------------------- | ------------------------------------- | ------------------------------- |
+| **API_GATEWAY_REST** | `"requestContext.requestId"` | API Gateway REST API request ID |
+| **API_GATEWAY_HTTP** | `"requestContext.requestId"` | API Gateway HTTP API request ID |
+| **APPSYNC_RESOLVER** | `'request.headers."x-amzn-trace-id"'` | AppSync X-Ray Trace ID |
+| **APPLICATION_LOAD_BALANCER** | `'headers."x-amzn-trace-id"'` | ALB X-Ray Trace ID |
+| **EVENT_BRIDGE** | `"id"` | EventBridge Event ID |
### Reusing Logger across your code
-Logger supports inheritance via `child` parameter. This allows you to create multiple Loggers across your code base, and propagate changes such as new keys to all Loggers.
+Similar to [Tracer](./tracer.md#reusing-tracer-across-your-code), a new instance that uses the same `service` name - env var or explicit parameter - will reuse a previous Logger instance, just like `logging.getLogger("logger_name")` would in the standard library if called with the same logger name.
+
+Notice in the CloudWatch Logs output how `payment_id` appeared as expected when logging in `collect.py`.
=== "collect.py"
- ```python hl_lines="1 7"
- import shared # Creates a child logger named "payment.shared"
- from aws_lambda_powertools import Logger
+ ```python hl_lines="1 9 11 12"
+ --8<-- "examples/logger/src/logger_reuse.py"
+ ```
- logger = Logger() # POWERTOOLS_SERVICE_NAME: "payment"
+=== "payment.py"
- def handler(event, context):
- shared.inject_payment_id(event)
- ...
+ ```python hl_lines="3 7"
+ --8<-- "examples/logger/src/logger_reuse_payment.py"
```
-=== "shared.py"
+=== "Example CloudWatch Logs excerpt"
- ```python hl_lines="6"
- from aws_lambda_powertools import Logger
+ ```json hl_lines="12"
+ --8<-- "examples/logger/src/logger_reuse_output.json"
+ ```
- logger = Logger(child=True) # POWERTOOLS_SERVICE_NAME: "payment"
+???+ note "Note: About Child Loggers"
+    Coming from the standard library, you might be used to using `logging.getLogger(__name__)`. This creates a new instance of a Logger with a different name.
- def inject_payment_id(event):
- logger.structure_logs(append=True, payment_id=event.get("payment_id"))
- ```
+    In Powertools, you can have the same effect by using the `child=True` parameter: `Logger(child=True)`. This creates a new Logger instance named after the `{service}.{filename}` convention. All state changes will be propagated bi-directionally between Child and Parent.
-In this example, `Logger` will create a parent logger named `payment` and a child logger named `payment.shared`. Changes in either parent or child logger will be propagated bi-directionally.
+    For that reason, there could be side effects depending on the order in which the Child Logger is instantiated, because Child Loggers don't have a handler.
-???+ info "Info: Child loggers will be named after the following convention `{service}.{filename}`"
- If you forget to use `child` param but the `service` name is the same of the parent, we will return the existing parent `Logger` instead.
+ For example, if you instantiated a Child Logger and immediately used `logger.append_keys/remove_keys/set_correlation_id` to update logging state, this might fail if the Parent Logger wasn't instantiated.
+
+    In this scenario, you can either ensure any calls manipulating state happen only after a Parent Logger is instantiated (example above), or refrain from using the `child=True` parameter altogether.
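+
+    A minimal sketch of the `child=True` pattern described here (module name is illustrative):
+
+    ```python
+    # my_module.py; the handler module has already created Logger(service="payment")
+    from aws_lambda_powertools import Logger
+
+    logger = Logger(service="payment", child=True)  # registered as "payment.my_module"
+    ```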
### Sampling debug logs
Use sampling when you want to dynamically change your log level to **DEBUG** based on a **percentage of your concurrent/cold start invocations**.
-You can use values ranging from `0.0` to `1` (100%) when setting `POWERTOOLS_LOGGER_SAMPLE_RATE` env var or `sample_rate` parameter in Logger.
+You can use values ranging from `0.0` to `1` (100%) when setting the `POWERTOOLS_LOGGER_SAMPLE_RATE` env var or the `sample_rate` parameter in Logger.
???+ tip "Tip: When is this useful?"
Let's imagine a sudden spike increase in concurrency triggered a transient issue downstream. When looking into the logs you might not have enough information, and while you can adjust log levels it might not happen again.
@@ -602,46 +336,14 @@ Sampling decision happens at the Logger initialization. This means sampling may
=== "collect.py"
- ```python hl_lines="4 7"
- from aws_lambda_powertools import Logger
-
- # Sample 10% of debug logs e.g. 0.1
- logger = Logger(service="payment", sample_rate=0.1)
-
- def handler(event, context):
- logger.debug("Verifying whether order_id is present")
- logger.info("Collecting payment")
+ ```python hl_lines="6 10"
+    --8<-- "examples/logger/src/sampling_debug_logs.py"
```
=== "Example CloudWatch Logs excerpt"
- ```json hl_lines="2 4 12 15 25"
- {
- "level": "DEBUG",
- "location": "collect.handler:7",
- "message": "Verifying whether order_id is present",
- "timestamp": "2021-05-03 11:47:12,494+0200",
- "service": "payment",
- "cold_start": true,
- "lambda_function_name": "test",
- "lambda_function_memory_size": 128,
- "lambda_function_arn": "arn:aws:lambda:eu-west-1:12345678910:function:test",
- "lambda_request_id": "52fdfc07-2182-154f-163f-5f0f9a621d72",
- "sampling_rate": 0.1
- },
- {
- "level": "INFO",
- "location": "collect.handler:7",
- "message": "Collecting payment",
- "timestamp": "2021-05-03 11:47:12,494+0200",
- "service": "payment",
- "cold_start": true,
- "lambda_function_name": "test",
- "lambda_function_memory_size": 128,
- "lambda_function_arn": "arn:aws:lambda:eu-west-1:12345678910:function:test",
- "lambda_request_id": "52fdfc07-2182-154f-163f-5f0f9a621d72",
- "sampling_rate": 0.1
- }
+ ```json hl_lines="3 5 13 16 25"
+ --8<-- "examples/logger/src/sampling_debug_logs_output.json"
```
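+
+A minimal sketch of enabling sampling via the constructor:
+
+```python
+from aws_lambda_powertools import Logger
+
+# Sample 10% of debug logs e.g. 0.1
+logger = Logger(service="payment", sample_rate=0.1)
+
+
+def handler(event, context):
+    logger.debug("Verifying whether order_id is present")
+    logger.info("Collecting payment")
+```
+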
### LambdaPowertoolsFormatter
@@ -650,23 +352,19 @@ Logger propagates a few formatting configurations to the built-in `LambdaPowerto
If you prefer configuring it separately, or you'd want to bring this JSON Formatter to another application, these are the supported settings:
-Parameter | Description | Default
-------------------------------------------------- | ------------------------------------------------- | -------------------------------------------------
-**`json_serializer`** | function to serialize `obj` to a JSON formatted `str` | `json.dumps`
-**`json_deserializer`** | function to deserialize `str`, `bytes`, `bytearray` containing a JSON document to a Python obj | `json.loads`
-**`json_default`** | function to coerce unserializable values, when no custom serializer/deserializer is set | `str`
-**`datefmt`** | string directives (strftime) to format log timestamp | `%Y-%m-%d %H:%M:%S,%F%z`, where `%F` is a custom ms directive
-**`use_datetime_directive`** | format the `datefmt` timestamps using `datetime`, not `time` (also supports the custom `%F` directive for milliseconds) | `False`
-**`utc`** | set logging timestamp to UTC | `False`
-**`log_record_order`** | set order of log keys when logging | `["level", "location", "message", "timestamp"]`
-**`kwargs`** | key-value to be included in log messages | `None`
-
-```python hl_lines="2 4-5" title="Pre-configuring Lambda Powertools Formatter"
-from aws_lambda_powertools import Logger
-from aws_lambda_powertools.logging.formatter import LambdaPowertoolsFormatter
-
-formatter = LambdaPowertoolsFormatter(utc=True, log_record_order=["message"])
-logger = Logger(service="example", logger_formatter=formatter)
+| Parameter | Description | Default |
+| ---------------------------- | ------------------------------------------------------------------------------------------------------------------------ | ------------------------------------------------------------- |
+| **`json_serializer`** | function to serialize `obj` to a JSON formatted `str` | `json.dumps` |
+| **`json_deserializer`** | function to deserialize `str`, `bytes`, `bytearray` containing a JSON document to a Python obj | `json.loads` |
+| **`json_default`** | function to coerce unserializable values, when no custom serializer/deserializer is set | `str` |
+| **`datefmt`** | string directives (strftime) to format log timestamp | `%Y-%m-%d %H:%M:%S,%F%z`, where `%F` is a custom ms directive |
+| **`use_datetime_directive`** | format the `datefmt` timestamps using `datetime`, not `time` (also supports the custom `%F` directive for milliseconds) | `False` |
+| **`utc`** | set logging timestamp to UTC | `False` |
+| **`log_record_order`** | set order of log keys when logging | `["level", "location", "message", "timestamp"]` |
+| **`kwargs`** | key-value to be included in log messages | `None` |
+
+```python hl_lines="2 7-8" title="Pre-configuring Lambda Powertools Formatter"
+--8<-- "examples/logger/src/powertools_formatter_setup.py"
```
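+
+A minimal sketch of pre-configuring the formatter (parameter choices are illustrative):
+
+```python
+from aws_lambda_powertools import Logger
+from aws_lambda_powertools.logging.formatter import LambdaPowertoolsFormatter
+
+formatter = LambdaPowertoolsFormatter(utc=True, log_record_order=["message"])
+logger = Logger(service="example", logger_formatter=formatter)
+```
+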
### Migrating from other Loggers
@@ -681,6 +379,8 @@ For Logger, the `service` is the logging key customers can use to search log ope
#### Inheriting Loggers
+??? tip "Tip: Prefer [Logger Reuse feature](#reusing-logger-across-your-code) over inheritance unless strictly necessary, [see caveats.](#reusing-logger-across-your-code)"
+
> Python Logging hierarchy happens via the dot notation: `service`, `service.child`, `service.child_2`
For inheritance, Logger uses a `child=True` parameter along with `service` being the same value across Loggers.
@@ -692,38 +392,34 @@ For child Loggers, we introspect the name of your module where `Logger(child=Tru
=== "incorrect_logger_inheritance.py"
- ```python hl_lines="4 10"
- import my_module
- from aws_lambda_powertools import Logger
-
- logger = Logger(service="payment")
- ...
+ ```python hl_lines="1 9"
+ --8<-- "examples/logger/src/logging_inheritance_bad.py"
+ ```
- # my_module.py
- from aws_lambda_powertools import Logger
+=== "my_other_module.py"
- logger = Logger(child=True)
+ ```python hl_lines="1 9"
+ --8<-- "examples/logger/src/logging_inheritance_module.py"
```
-=== "correct_logger_inheritance.py"
+In this case, Logger will register a Logger named `payment`, and a Logger named `service_undefined`. The latter isn't inheriting from the parent, and will have no handler, resulting in no message being logged to standard output.
- ```python hl_lines="4 10"
- import my_module
- from aws_lambda_powertools import Logger
+???+ tip
+    This can be fixed by either ensuring both have the `service` value as `payment`, or simply using the environment variable `POWERTOOLS_SERVICE_NAME` to ensure the service value is the same across all Loggers when not explicitly set.
- logger = Logger(service="payment")
- ...
+Do this instead:
- # my_module.py
- from aws_lambda_powertools import Logger
+=== "correct_logger_inheritance.py"
- logger = Logger(service="payment", child=True)
+ ```python hl_lines="1 9"
+ --8<-- "examples/logger/src/logging_inheritance_good.py"
```
-In this case, Logger will register a Logger named `payment`, and a Logger named `service_undefined`. The latter isn't inheriting from the parent, and will have no handler, resulting in no message being logged to standard output.
+=== "my_other_module.py"
-???+ tip
- This can be fixed by either ensuring both has the `service` value as `payment`, or simply use the environment variable `POWERTOOLS_SERVICE_NAME` to ensure service value will be the same across all Loggers when not explicitly set.
+ ```python hl_lines="1 9"
+ --8<-- "examples/logger/src/logging_inheritance_module.py"
+ ```
#### Overriding Log records
@@ -737,124 +433,70 @@ You might want to continue to use the same date formatting style, or override `l
Logger allows you to either change the format or suppress the following keys altogether at the initialization: `location`, `timestamp`, `level`, `xray_trace_id`.
=== "lambda_handler.py"
- ```python hl_lines="7 10"
- from aws_lambda_powertools import Logger
-
- date_format = "%m/%d/%Y %I:%M:%S %p"
- location_format = "[%(funcName)s] %(module)s"
-
- # override location and timestamp format
- logger = Logger(service="payment", location=location_format, datefmt=date_format)
- # suppress the location key with a None value
- logger_two = Logger(service="payment", location=None)
-
- logger.info("Collecting payment")
+ ```python hl_lines="7 10"
+ --8<-- "examples/logger/src/overriding_log_records.py"
```
+
=== "Example CloudWatch Logs excerpt"
```json hl_lines="3 5"
- {
- "level": "INFO",
- "location": "[] lambda_handler",
- "message": "Collecting payment",
- "timestamp": "02/09/2021 09:25:17 AM",
- "service": "payment"
- }
+ --8<-- "examples/logger/src/overriding_log_records_output.json"
```
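+
+A minimal sketch of overriding and suppressing these keys:
+
+```python
+from aws_lambda_powertools import Logger
+
+date_format = "%m/%d/%Y %I:%M:%S %p"
+location_format = "[%(funcName)s] %(module)s"
+
+# override location and timestamp format
+logger = Logger(service="payment", location=location_format, datefmt=date_format)
+
+# suppress the location key with a None value
+logger_two = Logger(service="payment", location=None)
+
+logger.info("Collecting payment")
+```
+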
#### Reordering log keys position
You can change the order of [standard Logger keys](#standard-structured-keys) or any keys that will be appended later at runtime via the `log_record_order` parameter.
-=== "lambda_handler.py"
-
- ```python hl_lines="4 7"
- from aws_lambda_powertools import Logger
-
- # make message as the first key
- logger = Logger(service="payment", log_record_order=["message"])
-
- # make request_id that will be added later as the first key
- # Logger(service="payment", log_record_order=["request_id"])
+=== "app.py"
- # Default key sorting order when omit
- # Logger(service="payment", log_record_order=["level","location","message","timestamp"])
+ ```python hl_lines="5 8"
+ --8<-- "examples/logger/src/reordering_log_keys.py"
```
=== "Example CloudWatch Logs excerpt"
- ```json hl_lines="3 5"
- {
- "message": "hello world",
- "level": "INFO",
- "location": "[]:6",
- "timestamp": "2021-02-09 09:36:12,280",
- "service": "service_undefined",
- "sampling_rate": 0.0
- }
+ ```json hl_lines="3 10"
+ --8<-- "examples/logger/src/reordering_log_keys_output.json"
```
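+
+A minimal sketch of reordering keys:
+
+```python
+from aws_lambda_powertools import Logger
+
+# make message the first key
+logger = Logger(service="payment", log_record_order=["message"])
+
+# make request_id, appended later at runtime, the first key
+# logger = Logger(service="payment", log_record_order=["request_id"])
+
+logger.info("hello world")
+```
+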
#### Setting timestamp to UTC
-By default, this Logger and standard logging library emits records using local time timestamp. You can override this behaviour via `utc` parameter:
-
-```python hl_lines="6" title="Setting UTC timestamp by default"
-from aws_lambda_powertools import Logger
+By default, this Logger and the standard logging library emit records using the local time timestamp. You can override this behavior via the `utc` parameter:
-logger = Logger(service="payment")
-logger.info("Local time")
-
-logger_in_utc = Logger(service="payment", utc=True)
-logger_in_utc.info("GMT time zone")
-```
-
-#### Custom function for unserializable values
+=== "app.py"
-By default, Logger uses `str` to handle values non-serializable by JSON. You can override this behaviour via `json_default` parameter by passing a Callable:
+ ```python hl_lines="6"
+ --8<-- "examples/logger/src/setting_utc_timestamp.py"
+ ```
-=== "collect.py"
+=== "Example CloudWatch Logs excerpt"
- ```python hl_lines="3-4 9 12"
- from aws_lambda_powertools import Logger
+ ```json hl_lines="6 13"
+ --8<-- "examples/logger/src/setting_utc_timestamp_output.json"
+ ```
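+
+A minimal sketch comparing a local-time and a UTC Logger:
+
+```python
+from aws_lambda_powertools import Logger
+
+logger = Logger(service="payment")
+logger.info("Local time")
+
+logger_in_utc = Logger(service="payment", utc=True)
+logger_in_utc.info("GMT time zone")
+```
+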
- def custom_json_default(value):
- return f""
+#### Custom function for unserializable values
- class Unserializable:
- pass
+By default, Logger uses `str` to handle values non-serializable by JSON. You can override this behavior via the `json_default` parameter by passing a Callable:
- logger = Logger(service="payment", json_default=custom_json_default)
+=== "app.py"
- def handler(event, context):
- logger.info(Unserializable())
+ ```python hl_lines="6 17"
+ --8<-- "examples/logger/src/unserializable_values.py"
```
+
=== "Example CloudWatch Logs excerpt"
- ```json hl_lines="4"
- {
- "level": "INFO",
- "location": "collect.handler:8",
- "message": """",
- "timestamp": "2021-05-03 15:17:23,632+0200",
- "service": "payment"
- }
+ ```json hl_lines="4-6"
+ --8<-- "examples/logger/src/unserializable_values_output.json"
```
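+
+A minimal sketch of a custom `json_default` callable (the placeholder string is illustrative):
+
+```python
+from aws_lambda_powertools import Logger
+
+
+def custom_json_default(value):
+    return f"<non-serializable: {type(value).__name__}>"
+
+
+class Unserializable:
+    ...
+
+
+logger = Logger(service="payment", json_default=custom_json_default)
+logger.info(Unserializable())
+```
+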
#### Bring your own handler
-By default, Logger uses StreamHandler and logs to standard output. You can override this behaviour via `logger_handler` parameter:
-
-```python hl_lines="3-4 9 12" title="Configure Logger to output to a file"
-import logging
-from pathlib import Path
-
-from aws_lambda_powertools import Logger
-
-log_file = Path("/tmp/log.json")
-log_file_handler = logging.FileHandler(filename=log_file)
-logger = Logger(service="payment", logger_handler=log_file_handler)
+By default, Logger uses `StreamHandler` and logs to standard output. You can override this behavior via the `logger_handler` parameter:
-logger.info("Collecting payment")
+```python hl_lines="7-8 10" title="Configure Logger to output to a file"
+--8<-- "examples/logger/src/bring_your_own_handler.py"
```
#### Bring your own formatter
@@ -868,30 +510,13 @@ For these, you can override the `serialize` method from [LambdaPowertoolsFormatt
=== "custom_formatter.py"
- ```python hl_lines="6-7 12"
- from aws_lambda_powertools import Logger
- from aws_lambda_powertools.logging.formatter import LambdaPowertoolsFormatter
-
- from typing import Dict
-
- class CustomFormatter(LambdaPowertoolsFormatter):
- def serialize(self, log: Dict) -> str:
- """Serialize final structured log dict to JSON str"""
- log["event"] = log.pop("message") # rename message key to event
- return self.json_serializer(log) # use configured json serializer
-
- logger = Logger(service="example", logger_formatter=CustomFormatter())
- logger.info("hello")
+ ```python hl_lines="2 5-6 12"
+ --8<-- "examples/logger/src/bring_your_own_formatter.py"
```
=== "Example CloudWatch Logs excerpt"
- ```json hl_lines="5"
- {
- "level": "INFO",
- "location": ":16",
- "timestamp": "2021-12-30 13:41:53,413+0100",
- "event": "hello"
- }
+ ```json hl_lines="6"
+ --8<-- "examples/logger/src/bring_your_own_formatter_output.json"
```
The `log` argument is the final log record containing [our standard keys](#standard-structured-keys), optionally [Lambda context keys](#capturing-lambda-context-info), and any custom key you might have added via [append_keys](#append_keys-method) or the [extra parameter](#extra-parameter).
@@ -903,83 +528,24 @@ For exceptional cases where you want to completely replace our formatter logic,
=== "collect.py"
- ```python hl_lines="5 7 9-10 13 17 21 24 35"
- import logging
- from typing import Iterable, List, Optional
-
- from aws_lambda_powertools import Logger
- from aws_lambda_powertools.logging.formatter import BasePowertoolsFormatter
-
- class CustomFormatter(BasePowertoolsFormatter):
- def __init__(self, log_record_order: Optional[List[str]], *args, **kwargs):
- self.log_record_order = log_record_order or ["level", "location", "message", "timestamp"]
- self.log_format = dict.fromkeys(self.log_record_order)
- super().__init__(*args, **kwargs)
-
- def append_keys(self, **additional_keys):
- # also used by `inject_lambda_context` decorator
- self.log_format.update(additional_keys)
-
- def remove_keys(self, keys: Iterable[str]):
- for key in keys:
- self.log_format.pop(key, None)
-
- def clear_state(self):
- self.log_format = dict.fromkeys(self.log_record_order)
-
- def format(self, record: logging.LogRecord) -> str: # noqa: A003
- """Format logging record as structured JSON str"""
- return json.dumps(
- {
- "event": super().format(record),
- "timestamp": self.formatTime(record),
- "my_default_key": "test",
- **self.log_format,
- }
- )
-
- logger = Logger(service="payment", logger_formatter=CustomFormatter())
-
- @logger.inject_lambda_context
- def handler(event, context):
- logger.info("Collecting payment")
+ ```python hl_lines="6 9 11-12 15 19 23 26 38"
+ --8<-- "examples/logger/src/bring_your_own_formatter_from_scratch.py"
```
+
=== "Example CloudWatch Logs excerpt"
```json hl_lines="2-4"
- {
- "event": "Collecting payment",
- "timestamp": "2021-05-03 11:47:12,494",
- "my_default_key": "test",
- "cold_start": true,
- "lambda_function_name": "test",
- "lambda_function_memory_size": 128,
- "lambda_function_arn": "arn:aws:lambda:eu-west-1:12345678910:function:test",
- "lambda_request_id": "52fdfc07-2182-154f-163f-5f0f9a621d72"
- }
+ --8<-- "examples/logger/src/bring_your_own_formatter_from_scratch_output.json"
```
#### Bring your own JSON serializer
By default, Logger uses `json.dumps` and `json.loads` as serializer and deserializer respectively. There could be scenarios where you are making use of alternative JSON libraries like [orjson](https://github.com/ijl/orjson){target="_blank"}.
-As parameters don't always translate well between them, you can pass any callable that receives a `Dict` and return a `str`:
-
-```python hl_lines="1 5-6 9-10" title="Using Rust orjson library as serializer"
-import orjson
+As parameters don't always translate well between them, you can pass any callable that receives a `dict` and returns a `str`:
-from aws_lambda_powertools import Logger
-
-custom_serializer = orjson.dumps
-custom_deserializer = orjson.loads
-
-logger = Logger(service="payment",
- json_serializer=custom_serializer,
- json_deserializer=custom_deserializer
-)
-
-# when using parameters, you can pass a partial
-# custom_serializer=functools.partial(orjson.dumps, option=orjson.OPT_SERIALIZE_NUMPY)
+```python hl_lines="1 3 7-8 13" title="Using Rust orjson library as serializer"
+--8<-- "examples/logger/src/bring_your_own_json_serializer.py"
```
## Testing your code
@@ -994,48 +560,13 @@ This is a Pytest sample that provides the minimum information necessary for Logg
Note that dataclasses are available in Python 3.7+ only.
```python
- from dataclasses import dataclass
-
- import pytest
-
- @pytest.fixture
- def lambda_context():
- @dataclass
- class LambdaContext:
- function_name: str = "test"
- memory_limit_in_mb: int = 128
- invoked_function_arn: str = "arn:aws:lambda:eu-west-1:809313241:function:test"
- aws_request_id: str = "52fdfc07-2182-154f-163f-5f0f9a621d72"
-
- return LambdaContext()
-
- def test_lambda_handler(lambda_context):
- test_event = {'test': 'event'}
- your_lambda_handler(test_event, lambda_context) # this will now have a Context object populated
+ --8<-- "examples/logger/src/fake_lambda_context_for_logger.py"
```
-=== "fake_lambda_context_for_logger_py36.py"
-
- ```python
- from collections import namedtuple
-
- import pytest
- @pytest.fixture
- def lambda_context():
- lambda_context = {
- "function_name": "test",
- "memory_limit_in_mb": 128,
- "invoked_function_arn": "arn:aws:lambda:eu-west-1:809313241:function:test",
- "aws_request_id": "52fdfc07-2182-154f-163f-5f0f9a621d72",
- }
+=== "fake_lambda_context_for_logger_module.py"
- return namedtuple("LambdaContext", lambda_context.keys())(*lambda_context.values())
-
- def test_lambda_handler(lambda_context):
- test_event = {'test': 'event'}
-
- # this will now have a Context object populated
- your_lambda_handler(test_event, lambda_context)
+ ```python
+ --8<-- "examples/logger/src/fake_lambda_context_for_logger_module.py"
```
???+ tip
@@ -1059,41 +590,18 @@ POWERTOOLS_LOG_DEDUPLICATION_DISABLED="1" pytest -o log_cli=1
You can enable the `botocore` and `boto3` logs by using the `set_stream_logger` method, this method will add a stream handler
for the given name and level to the logging module. By default, this logs all boto3 messages to stdout.
-```python hl_lines="6-7" title="Enabling AWS SDK logging"
-from typing import Dict, List
-from aws_lambda_powertools.utilities.typing import LambdaContext
-from aws_lambda_powertools import Logger
-
-import boto3
-boto3.set_stream_logger()
-boto3.set_stream_logger('botocore')
-
-logger = Logger()
-client = boto3.client('s3')
-
-
-def handler(event: Dict, context: LambdaContext) -> List:
- response = client.list_buckets()
-
- return response.get("Buckets", [])
+```python hl_lines="8-9" title="Enabling AWS SDK logging"
+---8<-- "examples/logger/src/enabling_boto_logging.py"
```
**How can I enable powertools logging for imported libraries?**
-You can copy the Logger setup to all or sub-sets of registered external loggers. Use the `copy_config_to_registered_logger` method to do this. By default all registered loggers will be modified. You can change this behaviour by providing `include` and `exclude` attributes. You can also provide optional `log_level` attribute external loggers will be configured with.
-
-```python hl_lines="10" title="Cloning Logger config to all other registered standard loggers"
-import logging
-
-from aws_lambda_powertools import Logger
-from aws_lambda_powertools.logging import utils
+You can copy the Logger setup to all or a subset of registered external loggers. Use the `copy_config_to_registered_loggers` method to do this.
-logger = Logger()
+By default, all registered loggers will be modified. You can change this behavior by providing `include` and `exclude` attributes. You can also provide an optional `log_level` attribute that external loggers will be configured with.
-external_logger = logging.logger()
-
-utils.copy_config_to_registered_loggers(source_logger=logger)
-external_logger.info("test message")
+```python hl_lines="10" title="Cloning Logger config to all other registered standard loggers"
+---8<-- "examples/logger/src/cloning_logger_config.py"
```
**What's the difference between `append_keys` and `extra`?**
@@ -1102,46 +610,16 @@ Keys added with `append_keys` will persist across multiple log messages while ke
Here's an example where we persist `payment_id` not `request_id`. Note that `payment_id` remains in both log messages while `charge_id` is only available in the first message.
-=== "lambda_handler.py"
-
- ```python hl_lines="6 10"
- from aws_lambda_powertools import Logger
-
- logger = Logger(service="payment")
-
- def handler(event, context):
- logger.append_keys(payment_id="123456789")
-
- try:
- booking_id = book_flight()
- logger.info("Flight booked successfully", extra={ "booking_id": booking_id})
- except BookingReservationError:
- ...
+=== "collect.py"
- logger.info("goodbye")
+ ```python hl_lines="16 23"
+ ---8<-- "examples/logger/src/append_keys_vs_extra.py"
```
+
=== "Example CloudWatch Logs excerpt"
- ```json hl_lines="8-9 18"
- {
- "level": "INFO",
- "location": ":10",
- "message": "Flight booked successfully",
- "timestamp": "2021-01-12 14:09:10,859",
- "service": "payment",
- "sampling_rate": 0.0,
- "payment_id": "123456789",
- "booking_id": "75edbad0-0857-4fc9-b547-6180e2f7959b"
- },
- {
- "level": "INFO",
- "location": ":14",
- "message": "goodbye",
- "timestamp": "2021-01-12 14:09:10,860",
- "service": "payment",
- "sampling_rate": 0.0,
- "payment_id": "123456789"
- }
+ ```json hl_lines="9-10 19"
+ ---8<-- "examples/logger/src/append_keys_vs_extra_output.json"
```
**How do I aggregate and search Powertools logs across accounts?**
diff --git a/docs/core/tracer.md b/docs/core/tracer.md
index 34eb1ed2b93..982e3aed942 100644
--- a/docs/core/tracer.md
+++ b/docs/core/tracer.md
@@ -21,7 +21,7 @@ Tracer is an opinionated thin wrapper for [AWS X-Ray Python SDK](https://github.
Before your use this utility, your AWS Lambda function [must have permissions](https://docs.aws.amazon.com/lambda/latest/dg/services-xray.html#services-xray-permissions) to send traces to AWS X-Ray.
```yaml hl_lines="9 12" title="AWS Serverless Application Model (SAM) example"
---8<-- "examples/tracer/template.yaml"
+--8<-- "examples/tracer/sam/template.yaml"
```
### Lambda handler
diff --git a/examples/tracer/template.yaml b/examples/logger/sam/template.yaml
similarity index 77%
rename from examples/tracer/template.yaml
rename to examples/logger/sam/template.yaml
index 504661d634d..3f702bfc041 100644
--- a/examples/tracer/template.yaml
+++ b/examples/logger/sam/template.yaml
@@ -9,15 +9,16 @@ Globals:
Tracing: Active
Environment:
Variables:
- POWERTOOLS_SERVICE_NAME: example
+ POWERTOOLS_SERVICE_NAME: payment
+ LOG_LEVEL: INFO
Layers:
# Find the latest Layer version in the official documentation
# https://awslabs.github.io/aws-lambda-powertools-python/latest/#lambda-layer
- !Sub arn:aws:lambda:${AWS::Region}:017000801446:layer:AWSLambdaPowertoolsPython:21
Resources:
- CaptureLambdaHandlerExample:
+ LoggerLambdaHandlerExample:
Type: AWS::Serverless::Function
Properties:
- CodeUri: src
- Handler: capture_lambda_handler.handler
+ CodeUri: ../src
+ Handler: inject_lambda_context.handler
diff --git a/examples/logger/src/append_keys.py b/examples/logger/src/append_keys.py
new file mode 100644
index 00000000000..0ef9cbe0f63
--- /dev/null
+++ b/examples/logger/src/append_keys.py
@@ -0,0 +1,15 @@
+from aws_lambda_powertools import Logger
+from aws_lambda_powertools.utilities.typing import LambdaContext
+
+logger = Logger()
+
+
+def handler(event: dict, context: LambdaContext) -> str:
+ order_id = event.get("order_id")
+
+ # this will ensure order_id key always has the latest value before logging
+    # alternatively, you can use the `clear_state=True` parameter in @inject_lambda_context
+ logger.append_keys(order_id=order_id)
+ logger.info("Collecting payment")
+
+ return "hello world"
diff --git a/examples/logger/src/append_keys_extra.py b/examples/logger/src/append_keys_extra.py
new file mode 100644
index 00000000000..0c66425f775
--- /dev/null
+++ b/examples/logger/src/append_keys_extra.py
@@ -0,0 +1,11 @@
+from aws_lambda_powertools import Logger
+from aws_lambda_powertools.utilities.typing import LambdaContext
+
+logger = Logger()
+
+
+def handler(event: dict, context: LambdaContext) -> str:
+ fields = {"request_id": "1123"}
+ logger.info("Collecting payment", extra=fields)
+
+ return "hello world"
diff --git a/examples/logger/src/append_keys_extra_output.json b/examples/logger/src/append_keys_extra_output.json
new file mode 100644
index 00000000000..b25abb226a1
--- /dev/null
+++ b/examples/logger/src/append_keys_extra_output.json
@@ -0,0 +1,8 @@
+{
+ "level": "INFO",
+ "location": "collect.handler:9",
+ "message": "Collecting payment",
+ "timestamp": "2021-05-03 11:47:12,494+0200",
+ "service": "payment",
+ "request_id": "1123"
+}
diff --git a/examples/logger/src/append_keys_output.json b/examples/logger/src/append_keys_output.json
new file mode 100644
index 00000000000..1e6d38bf785
--- /dev/null
+++ b/examples/logger/src/append_keys_output.json
@@ -0,0 +1,8 @@
+{
+ "level": "INFO",
+ "location": "collect.handler:11",
+ "message": "Collecting payment",
+ "timestamp": "2021-05-03 11:47:12,494+0200",
+ "service": "payment",
+ "order_id": "order_id_value"
+}
diff --git a/examples/logger/src/append_keys_vs_extra.py b/examples/logger/src/append_keys_vs_extra.py
new file mode 100644
index 00000000000..ab67ceb6932
--- /dev/null
+++ b/examples/logger/src/append_keys_vs_extra.py
@@ -0,0 +1,28 @@
+import os
+
+import requests
+
+from aws_lambda_powertools import Logger
+
+ENDPOINT = os.getenv("PAYMENT_API", "")
+logger = Logger(service="payment")
+
+
+class PaymentError(Exception):
+ ...
+
+
+def handler(event, context):
+ logger.append_keys(payment_id="123456789")
+ charge_id = event.get("charge_id", "")
+
+ try:
+ ret = requests.post(url=f"{ENDPOINT}/collect", data={"charge_id": charge_id})
+ ret.raise_for_status()
+
+        logger.info("Charge collected successfully", extra={"charge_id": charge_id})
+    except requests.HTTPError as e:
+        raise PaymentError(f"Unable to collect payment for charge {charge_id}") from e
+
+    logger.info("goodbye")
+    return ret.json()
diff --git a/examples/logger/src/append_keys_vs_extra_output.json b/examples/logger/src/append_keys_vs_extra_output.json
new file mode 100644
index 00000000000..444986d7714
--- /dev/null
+++ b/examples/logger/src/append_keys_vs_extra_output.json
@@ -0,0 +1,21 @@
+[
+ {
+ "level": "INFO",
+ "location": ":22",
+ "message": "Charge collected successfully",
+ "timestamp": "2021-01-12 14:09:10,859",
+ "service": "payment",
+ "sampling_rate": 0.0,
+ "payment_id": "123456789",
+ "charge_id": "75edbad0-0857-4fc9-b547-6180e2f7959b"
+ },
+ {
+ "level": "INFO",
+ "location": ":27",
+ "message": "goodbye",
+ "timestamp": "2021-01-12 14:09:10,860",
+ "service": "payment",
+ "sampling_rate": 0.0,
+ "payment_id": "123456789"
+ }
+]
diff --git a/examples/logger/src/bring_your_own_formatter.py b/examples/logger/src/bring_your_own_formatter.py
new file mode 100644
index 00000000000..1b85105f930
--- /dev/null
+++ b/examples/logger/src/bring_your_own_formatter.py
@@ -0,0 +1,13 @@
+from aws_lambda_powertools import Logger
+from aws_lambda_powertools.logging.formatter import LambdaPowertoolsFormatter
+
+
+class CustomFormatter(LambdaPowertoolsFormatter):
+ def serialize(self, log: dict) -> str:
+ """Serialize final structured log dict to JSON str"""
+ log["event"] = log.pop("message") # rename message key to event
+ return self.json_serializer(log) # use configured json serializer
+
+
+logger = Logger(service="payment", logger_formatter=CustomFormatter())
+logger.info("hello")
diff --git a/examples/logger/src/bring_your_own_formatter_from_scratch.py b/examples/logger/src/bring_your_own_formatter_from_scratch.py
new file mode 100644
index 00000000000..3088bf2a80f
--- /dev/null
+++ b/examples/logger/src/bring_your_own_formatter_from_scratch.py
@@ -0,0 +1,43 @@
+import json
+import logging
+from typing import Iterable, List, Optional
+
+from aws_lambda_powertools import Logger
+from aws_lambda_powertools.logging.formatter import BasePowertoolsFormatter
+
+
+class CustomFormatter(BasePowertoolsFormatter):
+ def __init__(self, log_record_order: Optional[List[str]], *args, **kwargs):
+ self.log_record_order = log_record_order or ["level", "location", "message", "timestamp"]
+ self.log_format = dict.fromkeys(self.log_record_order)
+ super().__init__(*args, **kwargs)
+
+ def append_keys(self, **additional_keys):
+ # also used by `inject_lambda_context` decorator
+ self.log_format.update(additional_keys)
+
+ def remove_keys(self, keys: Iterable[str]):
+ for key in keys:
+ self.log_format.pop(key, None)
+
+ def clear_state(self):
+ self.log_format = dict.fromkeys(self.log_record_order)
+
+ def format(self, record: logging.LogRecord) -> str: # noqa: A003
+ """Format logging record as structured JSON str"""
+ return json.dumps(
+ {
+ "event": super().format(record),
+ "timestamp": self.formatTime(record),
+ "my_default_key": "test",
+ **self.log_format,
+ }
+ )
+
+
+logger = Logger(service="payment", logger_formatter=CustomFormatter())
+
+
+@logger.inject_lambda_context
+def handler(event, context):
+ logger.info("Collecting payment")
diff --git a/examples/logger/src/bring_your_own_formatter_from_scratch_output.json b/examples/logger/src/bring_your_own_formatter_from_scratch_output.json
new file mode 100644
index 00000000000..147b4c1b443
--- /dev/null
+++ b/examples/logger/src/bring_your_own_formatter_from_scratch_output.json
@@ -0,0 +1,10 @@
+{
+ "event": "Collecting payment",
+ "timestamp": "2021-05-03 11:47:12,494",
+ "my_default_key": "test",
+ "cold_start": true,
+ "lambda_function_name": "test",
+ "lambda_function_memory_size": 128,
+ "lambda_function_arn": "arn:aws:lambda:eu-west-1:12345678910:function:test",
+ "lambda_request_id": "52fdfc07-2182-154f-163f-5f0f9a621d72"
+}
diff --git a/examples/logger/src/bring_your_own_formatter_output.json b/examples/logger/src/bring_your_own_formatter_output.json
new file mode 100644
index 00000000000..19869b7b885
--- /dev/null
+++ b/examples/logger/src/bring_your_own_formatter_output.json
@@ -0,0 +1,7 @@
+{
+ "level": "INFO",
+ "location": ":16",
+ "timestamp": "2021-12-30 13:41:53,413+0100",
+ "service": "payment",
+ "event": "hello"
+}
diff --git a/examples/logger/src/bring_your_own_handler.py b/examples/logger/src/bring_your_own_handler.py
new file mode 100644
index 00000000000..e70abca794f
--- /dev/null
+++ b/examples/logger/src/bring_your_own_handler.py
@@ -0,0 +1,11 @@
+import logging
+from pathlib import Path
+
+from aws_lambda_powertools import Logger
+
+log_file = Path("/tmp/log.json")
+log_file_handler = logging.FileHandler(filename=log_file)
+
+logger = Logger(service="payment", logger_handler=log_file_handler)
+
+logger.info("hello world")
diff --git a/examples/logger/src/bring_your_own_json_serializer.py b/examples/logger/src/bring_your_own_json_serializer.py
new file mode 100644
index 00000000000..204e131fb87
--- /dev/null
+++ b/examples/logger/src/bring_your_own_json_serializer.py
@@ -0,0 +1,17 @@
+import functools
+
+import orjson
+
+from aws_lambda_powertools import Logger
+
+custom_serializer = orjson.dumps
+custom_deserializer = orjson.loads
+
+logger = Logger(service="payment", json_serializer=custom_serializer, json_deserializer=custom_deserializer)
+
+# NOTE: when using parameters, you can pass a partial
+custom_serializer_with_parameters = functools.partial(orjson.dumps, option=orjson.OPT_SERIALIZE_NUMPY)
+
+logger_two = Logger(
+ service="payment", json_serializer=custom_serializer_with_parameters, json_deserializer=custom_deserializer
+)
diff --git a/examples/logger/src/clear_state.py b/examples/logger/src/clear_state.py
new file mode 100644
index 00000000000..ec842f034c1
--- /dev/null
+++ b/examples/logger/src/clear_state.py
@@ -0,0 +1,16 @@
+from aws_lambda_powertools import Logger
+from aws_lambda_powertools.utilities.typing import LambdaContext
+
+logger = Logger()
+
+
+@logger.inject_lambda_context(clear_state=True)
+def handler(event: dict, context: LambdaContext) -> str:
+ if event.get("special_key"):
+ # Should only be available in the first request log
+ # as the second request doesn't contain `special_key`
+ logger.append_keys(debugging_key="value")
+
+ logger.info("Collecting payment")
+
+ return "hello world"
diff --git a/examples/logger/src/clear_state_event_one.json b/examples/logger/src/clear_state_event_one.json
new file mode 100644
index 00000000000..0f051787013
--- /dev/null
+++ b/examples/logger/src/clear_state_event_one.json
@@ -0,0 +1,13 @@
+{
+ "level": "INFO",
+ "location": "collect.handler:10",
+ "message": "Collecting payment",
+ "timestamp": "2021-05-03 11:47:12,494+0200",
+ "service": "payment",
+ "special_key": "debug_key",
+ "cold_start": true,
+ "lambda_function_name": "test",
+ "lambda_function_memory_size": 128,
+ "lambda_function_arn": "arn:aws:lambda:eu-west-1:12345678910:function:test",
+ "lambda_request_id": "52fdfc07-2182-154f-163f-5f0f9a621d72"
+}
diff --git a/examples/logger/src/clear_state_event_two.json b/examples/logger/src/clear_state_event_two.json
new file mode 100644
index 00000000000..0f019adf3a5
--- /dev/null
+++ b/examples/logger/src/clear_state_event_two.json
@@ -0,0 +1,12 @@
+{
+ "level": "INFO",
+ "location": "collect.handler:10",
+ "message": "Collecting payment",
+ "timestamp": "2021-05-03 11:47:12,494+0200",
+ "service": "payment",
+ "cold_start": false,
+ "lambda_function_name": "test",
+ "lambda_function_memory_size": 128,
+ "lambda_function_arn": "arn:aws:lambda:eu-west-1:12345678910:function:test",
+ "lambda_request_id": "52fdfc07-2182-154f-163f-5f0f9a621d72"
+}
diff --git a/examples/logger/src/cloning_logger_config.py b/examples/logger/src/cloning_logger_config.py
new file mode 100644
index 00000000000..7472feee448
--- /dev/null
+++ b/examples/logger/src/cloning_logger_config.py
@@ -0,0 +1,11 @@
+import logging
+
+from aws_lambda_powertools import Logger
+from aws_lambda_powertools.logging import utils
+
+logger = Logger()
+
+external_logger = logging.getLogger()
+
+utils.copy_config_to_registered_loggers(source_logger=logger)
+external_logger.info("test message")
diff --git a/examples/logger/src/enabling_boto_logging.py b/examples/logger/src/enabling_boto_logging.py
new file mode 100644
index 00000000000..cce8dc6f8e7
--- /dev/null
+++ b/examples/logger/src/enabling_boto_logging.py
@@ -0,0 +1,18 @@
+from typing import Dict, List
+
+import boto3
+
+from aws_lambda_powertools import Logger
+from aws_lambda_powertools.utilities.typing import LambdaContext
+
+boto3.set_stream_logger()
+boto3.set_stream_logger("botocore")
+
+logger = Logger()
+client = boto3.client("s3")
+
+
+def handler(event: Dict, context: LambdaContext) -> List:
+ response = client.list_buckets()
+
+ return response.get("Buckets", [])
diff --git a/examples/logger/src/fake_lambda_context_for_logger.py b/examples/logger/src/fake_lambda_context_for_logger.py
new file mode 100644
index 00000000000..d3b3efc98f9
--- /dev/null
+++ b/examples/logger/src/fake_lambda_context_for_logger.py
@@ -0,0 +1,21 @@
+from dataclasses import dataclass
+
+import fake_lambda_context_for_logger_module # sample module for completeness
+import pytest
+
+
+@pytest.fixture
+def lambda_context():
+ @dataclass
+ class LambdaContext:
+ function_name: str = "test"
+ memory_limit_in_mb: int = 128
+ invoked_function_arn: str = "arn:aws:lambda:eu-west-1:809313241:function:test"
+ aws_request_id: str = "52fdfc07-2182-154f-163f-5f0f9a621d72"
+
+ return LambdaContext()
+
+
+def test_lambda_handler(lambda_context):
+ test_event = {"test": "event"}
+ fake_lambda_context_for_logger_module.handler(test_event, lambda_context)
diff --git a/examples/logger/src/fake_lambda_context_for_logger_module.py b/examples/logger/src/fake_lambda_context_for_logger_module.py
new file mode 100644
index 00000000000..fcb94f99db1
--- /dev/null
+++ b/examples/logger/src/fake_lambda_context_for_logger_module.py
@@ -0,0 +1,11 @@
+from aws_lambda_powertools import Logger
+from aws_lambda_powertools.utilities.typing import LambdaContext
+
+logger = Logger()
+
+
+@logger.inject_lambda_context
+def handler(event: dict, context: LambdaContext) -> str:
+ logger.info("Collecting payment")
+
+ return "hello world"
diff --git a/examples/logger/src/inject_lambda_context.py b/examples/logger/src/inject_lambda_context.py
new file mode 100644
index 00000000000..0bdf203565d
--- /dev/null
+++ b/examples/logger/src/inject_lambda_context.py
@@ -0,0 +1,13 @@
+from aws_lambda_powertools import Logger
+from aws_lambda_powertools.utilities.typing import LambdaContext
+
+logger = Logger()
+
+
+@logger.inject_lambda_context
+def handler(event: dict, context: LambdaContext) -> str:
+ logger.info("Collecting payment")
+
+ # You can log entire objects too
+ logger.info({"operation": "collect_payment", "charge_id": event["charge_id"]})
+ return "hello world"
diff --git a/examples/logger/src/inject_lambda_context_output.json b/examples/logger/src/inject_lambda_context_output.json
new file mode 100644
index 00000000000..edf2f7d6dc6
--- /dev/null
+++ b/examples/logger/src/inject_lambda_context_output.json
@@ -0,0 +1,29 @@
+[
+ {
+ "level": "INFO",
+ "location": "collect.handler:9",
+ "message": "Collecting payment",
+ "timestamp": "2021-05-03 11:47:12,494+0200",
+ "service": "payment",
+ "cold_start": true,
+ "lambda_function_name": "test",
+ "lambda_function_memory_size": 128,
+ "lambda_function_arn": "arn:aws:lambda:eu-west-1:12345678910:function:test",
+ "lambda_request_id": "52fdfc07-2182-154f-163f-5f0f9a621d72"
+ },
+ {
+ "level": "INFO",
+ "location": "collect.handler:12",
+ "message": {
+ "operation": "collect_payment",
+ "charge_id": "ch_AZFlk2345C0"
+ },
+ "timestamp": "2021-05-03 11:47:12,494+0200",
+ "service": "payment",
+ "cold_start": true,
+ "lambda_function_name": "test",
+ "lambda_function_memory_size": 128,
+ "lambda_function_arn": "arn:aws:lambda:eu-west-1:12345678910:function:test",
+ "lambda_request_id": "52fdfc07-2182-154f-163f-5f0f9a621d72"
+ }
+]
diff --git a/examples/logger/src/log_incoming_event.py b/examples/logger/src/log_incoming_event.py
new file mode 100644
index 00000000000..264a568c4ba
--- /dev/null
+++ b/examples/logger/src/log_incoming_event.py
@@ -0,0 +1,9 @@
+from aws_lambda_powertools import Logger
+from aws_lambda_powertools.utilities.typing import LambdaContext
+
+logger = Logger()
+
+
+@logger.inject_lambda_context(log_event=True)
+def handler(event: dict, context: LambdaContext) -> str:
+ return "hello world"
diff --git a/examples/logger/src/logger_reuse.py b/examples/logger/src/logger_reuse.py
new file mode 100644
index 00000000000..a232eadd979
--- /dev/null
+++ b/examples/logger/src/logger_reuse.py
@@ -0,0 +1,13 @@
+from logger_reuse_payment import inject_payment_id
+
+from aws_lambda_powertools import Logger
+from aws_lambda_powertools.utilities.typing import LambdaContext
+
+logger = Logger()
+
+
+@logger.inject_lambda_context
+def handler(event: dict, context: LambdaContext) -> str:
+ inject_payment_id(context=event)
+ logger.info("Collecting payment")
+ return "hello world"
diff --git a/examples/logger/src/logger_reuse_output.json b/examples/logger/src/logger_reuse_output.json
new file mode 100644
index 00000000000..15bc6e4fa88
--- /dev/null
+++ b/examples/logger/src/logger_reuse_output.json
@@ -0,0 +1,13 @@
+{
+ "level": "INFO",
+ "location": "collect.handler:12",
+ "message": "Collecting payment",
+ "timestamp": "2021-05-03 11:47:12,494+0200",
+ "service": "payment",
+ "cold_start": true,
+ "lambda_function_name": "test",
+ "lambda_function_memory_size": 128,
+ "lambda_function_arn": "arn:aws:lambda:eu-west-1:12345678910:function:test",
+ "lambda_request_id": "52fdfc07-2182-154f-163f-5f0f9a621d72",
+ "payment_id": "968adaae-a211-47af-bda3-eed3ca2c0ed0"
+}
diff --git a/examples/logger/src/logger_reuse_payment.py b/examples/logger/src/logger_reuse_payment.py
new file mode 100644
index 00000000000..00cad95d161
--- /dev/null
+++ b/examples/logger/src/logger_reuse_payment.py
@@ -0,0 +1,7 @@
+from aws_lambda_powertools import Logger
+
+logger = Logger()
+
+
+def inject_payment_id(context):
+ logger.append_keys(payment_id=context.get("payment_id"))
diff --git a/examples/logger/src/logging_exceptions.py b/examples/logger/src/logging_exceptions.py
new file mode 100644
index 00000000000..31df43cd663
--- /dev/null
+++ b/examples/logger/src/logging_exceptions.py
@@ -0,0 +1,18 @@
+import requests
+
+from aws_lambda_powertools import Logger
+from aws_lambda_powertools.utilities.typing import LambdaContext
+
+ENDPOINT = "http://httpbin.org/status/500"
+logger = Logger()
+
+
+def handler(event: dict, context: LambdaContext) -> str:
+ try:
+ ret = requests.get(ENDPOINT)
+ ret.raise_for_status()
+ except requests.HTTPError as e:
+ logger.exception("Received a HTTP 5xx error")
+ raise RuntimeError("Unable to fullfil request") from e
+
+ return "hello world"
diff --git a/examples/logger/src/logging_exceptions_output.json b/examples/logger/src/logging_exceptions_output.json
new file mode 100644
index 00000000000..8f3011e3a87
--- /dev/null
+++ b/examples/logger/src/logging_exceptions_output.json
@@ -0,0 +1,9 @@
+{
+ "level": "ERROR",
+ "location": "collect.handler:15",
+ "message": "Received a HTTP 5xx error",
+ "timestamp": "2021-05-03 11:47:12,494+0200",
+ "service": "payment",
+ "exception_name": "RuntimeError",
+ "exception": "Traceback (most recent call last):\n File \"\", line 2, in RuntimeError: Unable to fullfil request"
+}
diff --git a/examples/logger/src/logging_inheritance_bad.py b/examples/logger/src/logging_inheritance_bad.py
new file mode 100644
index 00000000000..18510720d9e
--- /dev/null
+++ b/examples/logger/src/logging_inheritance_bad.py
@@ -0,0 +1,16 @@
+from logging_inheritance_module import inject_payment_id
+
+from aws_lambda_powertools import Logger
+from aws_lambda_powertools.utilities.typing import LambdaContext
+
+# NOTE: the explicit service name differs from the child Logger's,
+# meaning we will have two Logger instances with different state
+# and an orphan child Logger that won't be able to manipulate state
+logger = Logger(service="payment")
+
+
+@logger.inject_lambda_context
+def handler(event: dict, context: LambdaContext) -> str:
+ inject_payment_id(context=event)
+
+ return "hello world"
diff --git a/examples/logger/src/logging_inheritance_good.py b/examples/logger/src/logging_inheritance_good.py
new file mode 100644
index 00000000000..f7e29d09df7
--- /dev/null
+++ b/examples/logger/src/logging_inheritance_good.py
@@ -0,0 +1,16 @@
+from logging_inheritance_module import inject_payment_id
+
+from aws_lambda_powertools import Logger
+from aws_lambda_powertools.utilities.typing import LambdaContext
+
+# NOTE: the service name matches the child Logger's
+# because both rely on the POWERTOOLS_SERVICE_NAME env var;
+# we could equally pass the same explicit service value, e.g. "payment"
+logger = Logger()
+
+
+@logger.inject_lambda_context
+def handler(event: dict, context: LambdaContext) -> str:
+ inject_payment_id(context=event)
+
+ return "hello world"
diff --git a/examples/logger/src/logging_inheritance_module.py b/examples/logger/src/logging_inheritance_module.py
new file mode 100644
index 00000000000..7891a972da6
--- /dev/null
+++ b/examples/logger/src/logging_inheritance_module.py
@@ -0,0 +1,7 @@
+from aws_lambda_powertools import Logger
+
+logger = Logger(child=True)
+
+
+def inject_payment_id(context):
+ logger.append_keys(payment_id=context.get("payment_id"))
diff --git a/examples/logger/src/overriding_log_records.py b/examples/logger/src/overriding_log_records.py
new file mode 100644
index 00000000000..f32da431158
--- /dev/null
+++ b/examples/logger/src/overriding_log_records.py
@@ -0,0 +1,12 @@
+from aws_lambda_powertools import Logger
+
+date_format = "%m/%d/%Y %I:%M:%S %p"
+location_format = "[%(funcName)s] %(module)s"
+
+# override location and timestamp format
+logger = Logger(service="payment", location=location_format, datefmt=date_format)
+
+# suppress the location key with a None value
+logger_two = Logger(service="payment", location=None)
+
+logger.info("Collecting payment")
diff --git a/examples/logger/src/overriding_log_records_output.json b/examples/logger/src/overriding_log_records_output.json
new file mode 100644
index 00000000000..ba2f1dfe8d5
--- /dev/null
+++ b/examples/logger/src/overriding_log_records_output.json
@@ -0,0 +1,7 @@
+{
+ "level": "INFO",
+ "location": "[] lambda_handler",
+ "message": "Collecting payment",
+ "timestamp": "02/09/2021 09:25:17 AM",
+ "service": "payment"
+}
diff --git a/examples/logger/src/powertools_formatter_setup.py b/examples/logger/src/powertools_formatter_setup.py
new file mode 100644
index 00000000000..b6f38a92bdd
--- /dev/null
+++ b/examples/logger/src/powertools_formatter_setup.py
@@ -0,0 +1,8 @@
+from aws_lambda_powertools import Logger
+from aws_lambda_powertools.logging.formatter import LambdaPowertoolsFormatter
+
+# NOTE: Check docs for all available options
+# https://awslabs.github.io/aws-lambda-powertools-python/latest/core/logger/#lambdapowertoolsformatter
+
+formatter = LambdaPowertoolsFormatter(utc=True, log_record_order=["message"])
+logger = Logger(service="example", logger_formatter=formatter)
diff --git a/examples/logger/src/remove_keys.py b/examples/logger/src/remove_keys.py
new file mode 100644
index 00000000000..763387d9399
--- /dev/null
+++ b/examples/logger/src/remove_keys.py
@@ -0,0 +1,14 @@
+from aws_lambda_powertools import Logger
+from aws_lambda_powertools.utilities.typing import LambdaContext
+
+logger = Logger()
+
+
+def handler(event: dict, context: LambdaContext) -> str:
+ logger.append_keys(sample_key="value")
+ logger.info("Collecting payment")
+
+ logger.remove_keys(["sample_key"])
+ logger.info("Collecting payment without sample key")
+
+ return "hello world"
diff --git a/examples/logger/src/remove_keys_output.json b/examples/logger/src/remove_keys_output.json
new file mode 100644
index 00000000000..4ec8740784e
--- /dev/null
+++ b/examples/logger/src/remove_keys_output.json
@@ -0,0 +1,17 @@
+[
+ {
+ "level": "INFO",
+ "location": "collect.handler:9",
+ "message": "Collecting payment",
+ "timestamp": "2021-05-03 11:47:12,494+0200",
+ "service": "payment",
+ "sample_key": "value"
+ },
+ {
+ "level": "INFO",
+ "location": "collect.handler:12",
+ "message": "Collecting payment without sample key",
+ "timestamp": "2021-05-03 11:47:12,494+0200",
+ "service": "payment"
+ }
+]
diff --git a/examples/logger/src/reordering_log_keys.py b/examples/logger/src/reordering_log_keys.py
new file mode 100644
index 00000000000..a3de53a6aed
--- /dev/null
+++ b/examples/logger/src/reordering_log_keys.py
@@ -0,0 +1,11 @@
+from aws_lambda_powertools import Logger
+
+# make message the first key
+logger = Logger(service="payment", log_record_order=["message"])
+
+# make request_id the first key, even though it is only appended later
+logger_two = Logger(service="order", log_record_order=["request_id"])
+logger_two.append_keys(request_id="123")
+
+logger.info("hello world")
+logger_two.info("hello world")
diff --git a/examples/logger/src/reordering_log_keys_output.json b/examples/logger/src/reordering_log_keys_output.json
new file mode 100644
index 00000000000..c89f7cb48bd
--- /dev/null
+++ b/examples/logger/src/reordering_log_keys_output.json
@@ -0,0 +1,17 @@
+[
+ {
+ "message": "hello world",
+ "level": "INFO",
+ "location": ":11",
+ "timestamp": "2022-06-24 11:25:40,143+0200",
+ "service": "payment"
+ },
+ {
+ "request_id": "123",
+ "level": "INFO",
+ "location": ":12",
+ "timestamp": "2022-06-24 11:25:40,144+0200",
+ "service": "order",
+ "message": "hello universe"
+ }
+]
diff --git a/examples/logger/src/sampling_debug_logs.py b/examples/logger/src/sampling_debug_logs.py
new file mode 100644
index 00000000000..3bbb1cdb920
--- /dev/null
+++ b/examples/logger/src/sampling_debug_logs.py
@@ -0,0 +1,13 @@
+from aws_lambda_powertools import Logger
+from aws_lambda_powertools.utilities.typing import LambdaContext
+
+# Emit debug logs for roughly 10% of invocations, e.g. sample_rate=0.1
+# NOTE: this evaluation will only occur at cold start
+logger = Logger(service="payment", sample_rate=0.1)
+
+
+def handler(event: dict, context: LambdaContext):
+ logger.debug("Verifying whether order_id is present")
+ logger.info("Collecting payment")
+
+ return "hello world"
diff --git a/examples/logger/src/sampling_debug_logs_output.json b/examples/logger/src/sampling_debug_logs_output.json
new file mode 100644
index 00000000000..f216753aea1
--- /dev/null
+++ b/examples/logger/src/sampling_debug_logs_output.json
@@ -0,0 +1,28 @@
+[
+ {
+ "level": "DEBUG",
+ "location": "collect.handler:7",
+ "message": "Verifying whether order_id is present",
+ "timestamp": "2021-05-03 11:47:12,494+0200",
+ "service": "payment",
+ "cold_start": true,
+ "lambda_function_name": "test",
+ "lambda_function_memory_size": 128,
+ "lambda_function_arn": "arn:aws:lambda:eu-west-1:12345678910:function:test",
+ "lambda_request_id": "52fdfc07-2182-154f-163f-5f0f9a621d72",
+ "sampling_rate": 0.1
+ },
+ {
+ "level": "INFO",
+ "location": "collect.handler:7",
+ "message": "Collecting payment",
+ "timestamp": "2021-05-03 11:47:12,494+0200",
+ "service": "payment",
+ "cold_start": true,
+ "lambda_function_name": "test",
+ "lambda_function_memory_size": 128,
+ "lambda_function_arn": "arn:aws:lambda:eu-west-1:12345678910:function:test",
+ "lambda_request_id": "52fdfc07-2182-154f-163f-5f0f9a621d72",
+ "sampling_rate": 0.1
+ }
+]
diff --git a/examples/logger/src/set_correlation_id.py b/examples/logger/src/set_correlation_id.py
new file mode 100644
index 00000000000..3aa0bc5f2be
--- /dev/null
+++ b/examples/logger/src/set_correlation_id.py
@@ -0,0 +1,12 @@
+from aws_lambda_powertools import Logger
+from aws_lambda_powertools.utilities.typing import LambdaContext
+
+logger = Logger()
+
+
+@logger.inject_lambda_context(correlation_id_path="headers.my_request_id_header")
+def handler(event: dict, context: LambdaContext) -> str:
+ logger.debug(f"Correlation ID => {logger.get_correlation_id()}")
+ logger.info("Collecting payment")
+
+ return "hello world"
diff --git a/examples/logger/src/set_correlation_id_event.json b/examples/logger/src/set_correlation_id_event.json
new file mode 100644
index 00000000000..e74f572f070
--- /dev/null
+++ b/examples/logger/src/set_correlation_id_event.json
@@ -0,0 +1,5 @@
+{
+ "headers": {
+ "my_request_id_header": "correlation_id_value"
+ }
+}
diff --git a/examples/logger/src/set_correlation_id_jmespath.py b/examples/logger/src/set_correlation_id_jmespath.py
new file mode 100644
index 00000000000..049bc70a957
--- /dev/null
+++ b/examples/logger/src/set_correlation_id_jmespath.py
@@ -0,0 +1,13 @@
+from aws_lambda_powertools import Logger
+from aws_lambda_powertools.logging import correlation_paths
+from aws_lambda_powertools.utilities.typing import LambdaContext
+
+logger = Logger()
+
+
+@logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST)
+def handler(event: dict, context: LambdaContext) -> str:
+ logger.debug(f"Correlation ID => {logger.get_correlation_id()}")
+ logger.info("Collecting payment")
+
+ return "hello world"
diff --git a/examples/logger/src/set_correlation_id_jmespath_event.json b/examples/logger/src/set_correlation_id_jmespath_event.json
new file mode 100644
index 00000000000..dc27e741882
--- /dev/null
+++ b/examples/logger/src/set_correlation_id_jmespath_event.json
@@ -0,0 +1,5 @@
+{
+ "requestContext": {
+ "requestId": "correlation_id_value"
+ }
+}
diff --git a/examples/logger/src/set_correlation_id_jmespath_output.json b/examples/logger/src/set_correlation_id_jmespath_output.json
new file mode 100644
index 00000000000..168cc238301
--- /dev/null
+++ b/examples/logger/src/set_correlation_id_jmespath_output.json
@@ -0,0 +1,13 @@
+{
+ "level": "INFO",
+ "location": "collect.handler:11",
+ "message": "Collecting payment",
+ "timestamp": "2021-05-03 11:47:12,494+0200",
+ "service": "payment",
+ "cold_start": true,
+ "lambda_function_name": "test",
+ "lambda_function_memory_size": 128,
+ "lambda_function_arn": "arn:aws:lambda:eu-west-1:12345678910:function:test",
+ "lambda_request_id": "52fdfc07-2182-154f-163f-5f0f9a621d72",
+ "correlation_id": "correlation_id_value"
+}
diff --git a/examples/logger/src/set_correlation_id_method.py b/examples/logger/src/set_correlation_id_method.py
new file mode 100644
index 00000000000..74eaa338df6
--- /dev/null
+++ b/examples/logger/src/set_correlation_id_method.py
@@ -0,0 +1,14 @@
+from aws_lambda_powertools import Logger
+from aws_lambda_powertools.utilities.data_classes import APIGatewayProxyEvent
+from aws_lambda_powertools.utilities.typing import LambdaContext
+
+logger = Logger()
+
+
+def handler(event: dict, context: LambdaContext) -> str:
+ request = APIGatewayProxyEvent(event)
+
+ logger.set_correlation_id(request.request_context.request_id)
+ logger.info("Collecting payment")
+
+ return "hello world"
diff --git a/examples/logger/src/set_correlation_id_method_event.json b/examples/logger/src/set_correlation_id_method_event.json
new file mode 100644
index 00000000000..dc27e741882
--- /dev/null
+++ b/examples/logger/src/set_correlation_id_method_event.json
@@ -0,0 +1,5 @@
+{
+ "requestContext": {
+ "requestId": "correlation_id_value"
+ }
+}
diff --git a/examples/logger/src/set_correlation_id_method_output.json b/examples/logger/src/set_correlation_id_method_output.json
new file mode 100644
index 00000000000..f78d26740ae
--- /dev/null
+++ b/examples/logger/src/set_correlation_id_method_output.json
@@ -0,0 +1,8 @@
+{
+ "level": "INFO",
+ "location": "collect.handler:13",
+ "message": "Collecting payment",
+ "timestamp": "2021-05-03 11:47:12,494+0200",
+ "service": "payment",
+ "correlation_id": "correlation_id_value"
+}
diff --git a/examples/logger/src/set_correlation_id_output.json b/examples/logger/src/set_correlation_id_output.json
new file mode 100644
index 00000000000..23a5040ad91
--- /dev/null
+++ b/examples/logger/src/set_correlation_id_output.json
@@ -0,0 +1,13 @@
+{
+ "level": "INFO",
+ "location": "collect.handler:10",
+ "message": "Collecting payment",
+ "timestamp": "2021-05-03 11:47:12,494+0200",
+ "service": "payment",
+ "cold_start": true,
+ "lambda_function_name": "test",
+ "lambda_function_memory_size": 128,
+ "lambda_function_arn": "arn:aws:lambda:eu-west-1:12345678910:function:test",
+ "lambda_request_id": "52fdfc07-2182-154f-163f-5f0f9a621d72",
+ "correlation_id": "correlation_id_value"
+}
diff --git a/examples/logger/src/setting_utc_timestamp.py b/examples/logger/src/setting_utc_timestamp.py
new file mode 100644
index 00000000000..a454e216d75
--- /dev/null
+++ b/examples/logger/src/setting_utc_timestamp.py
@@ -0,0 +1,7 @@
+from aws_lambda_powertools import Logger
+
+logger = Logger(service="payment")
+logger.info("Local time")
+
+logger_in_utc = Logger(service="order", utc=True)
+logger_in_utc.info("GMT time zone")
diff --git a/examples/logger/src/setting_utc_timestamp_output.json b/examples/logger/src/setting_utc_timestamp_output.json
new file mode 100644
index 00000000000..80083fbf61b
--- /dev/null
+++ b/examples/logger/src/setting_utc_timestamp_output.json
@@ -0,0 +1,16 @@
+[
+ {
+ "level": "INFO",
+ "location": ":4",
+ "message": "Local time",
+ "timestamp": "2022-06-24 11:39:49,421+0200",
+ "service": "payment"
+ },
+ {
+ "level": "INFO",
+ "location": ":7",
+ "message": "GMT time zone",
+ "timestamp": "2022-06-24 09:39:49,421+0100",
+ "service": "order"
+ }
+]
diff --git a/examples/logger/src/unserializable_values.py b/examples/logger/src/unserializable_values.py
new file mode 100644
index 00000000000..9ed196827b2
--- /dev/null
+++ b/examples/logger/src/unserializable_values.py
@@ -0,0 +1,19 @@
+from datetime import date, datetime
+
+from aws_lambda_powertools import Logger
+
+
+def custom_json_default(value: object) -> str:
+ if isinstance(value, (datetime, date)):
+ return value.isoformat()
+
+ return f""
+
+
+class Unserializable:
+ pass
+
+
+logger = Logger(service="payment", json_default=custom_json_default)
+
+logger.info({"ingestion_time": datetime.utcnow(), "serialize_me": Unserializable()})
diff --git a/examples/logger/src/unserializable_values_output.json b/examples/logger/src/unserializable_values_output.json
new file mode 100644
index 00000000000..ed7770cab03
--- /dev/null
+++ b/examples/logger/src/unserializable_values_output.json
@@ -0,0 +1,10 @@
+{
+ "level": "INFO",
+ "location": ":19",
+ "message": {
+ "ingestion_time": "2022-06-24T10:12:09.526365",
+ "serialize_me": ""
+ },
+ "timestamp": "2022-06-24 12:12:09,526+0200",
+ "service": "payment"
+}