**IMPORTANT NOTE: When upgrading, please ensure your forwarder Lambda function has [the latest Datadog Lambda Layer installed](https://github.com/DataDog/datadog-serverless-functions/tree/master/aws/logs_monitoring#3-add-the-datadog-lambda-layer).**

# Datadog Forwarder
AWS Lambda function to ship ELB, S3, CloudTrail, VPC, and CloudFront logs from S3 buckets, and Lambda metrics, traces, and logs from CloudWatch logs to Datadog.
## Features
- Forward CloudWatch, ELB, S3, CloudTrail, VPC, and CloudFront logs to Datadog
- Forward S3 events to Datadog
- Forward Kinesis data stream events to Datadog (only CloudWatch logs are supported)
- Forward custom metrics from AWS Lambda functions via CloudWatch logs
- Generate and submit enhanced Lambda metrics (`aws.lambda.enhanced.*`) parsed from the AWS REPORT log: duration, billed_duration, max_memory_used, and estimated_cost
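As an illustration of where those enhanced metric values come from, the sketch below parses a sample AWS Lambda REPORT log line with a regular expression. The REPORT line format is AWS's standard Lambda output; the parsing code itself is illustrative and not the forwarder's actual implementation.

```python
import re

# A sample AWS Lambda REPORT log line (values are illustrative).
report = (
    "REPORT RequestId: 6f512cea-0000-0000-0000-000000000000\t"
    "Duration: 1711.87 ms\tBilled Duration: 1800 ms\t"
    "Memory Size: 128 MB\tMax Memory Used: 90 MB"
)

# Extract the numeric fields the enhanced metrics are built from.
pattern = re.compile(
    r"Duration:\s+(?P<duration>[\d.]+)\s+ms.*?"
    r"Billed Duration:\s+(?P<billed_duration>[\d.]+)\s+ms.*?"
    r"Max Memory Used:\s+(?P<max_memory_used>[\d.]+)\s+MB"
)

match = pattern.search(report)
metrics = {name: float(value) for name, value in match.groupdict().items()}
print(metrics)
```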
## Install

Use the [AWS Serverless Repository](https://serverlessrepo.aws.amazon.com/applications/arn:aws:serverlessrepo:us-east-1:464622532012:applications~Datadog-Log-Forwarder) to deploy the Lambda in your AWS account.
## Settings

The basic settings can be configured through the SAM app installation interface. Some of them are simply mapped to environment variables that can be updated after the installation from the Lambda configuration console.

### Application Name
DO **NOT** change this unless required for advanced use cases. If changed, you **MUST** provide the same application name when upgrading in the future.
### Datadog API Key (REQUIRED)
The Datadog API key for your Datadog platform **MUST** be provided. It can be found here:
* Datadog US Site: https://app.datadoghq.com/account/settings#api
* Datadog EU Site: https://app.datadoghq.eu/account/settings#api
Provide the API key using one of the following SAM application settings:
* `DdApiKeySecretArn` (or environment variable `DD_API_KEY_SECRET_ARN`): Recommended. The Secret ARN to fetch the Datadog API key from Secrets Manager. `KMSKeyId` also needs to be provided if a customer managed CMK is used for encryption.
  * Permission `secretsmanager:GetSecretValue` to access the provided Secret ARN will be added automatically.
  * Permission `kms:Decrypt` to access the `KMSKeyId`, if provided, will be added automatically.
* `DdKmsApiKey` (or environment variable `DD_KMS_API_KEY`): Recommended. The Datadog API key encrypted by KMS. `KMSKeyId` also needs to be provided for decryption.
  * Permission `kms:Decrypt` to access the `KMSKeyId` will be added automatically.
* `DdApiKey` (or environment variable `DD_API_KEY`): NOT recommended. The Datadog API key in plain text.
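To make the relationship between these settings concrete, here is a minimal sketch of how a forwarder might decide which API key mechanism to use. The precedence order shown is an assumption (most secure first), and the real implementation performs the actual Secrets Manager and KMS calls rather than just naming them.

```python
def api_key_source(env):
    """Return which API key mechanism would be used for a given set of
    environment variables (illustrative precedence: most secure first)."""
    if env.get("DD_API_KEY_SECRET_ARN"):
        return "secrets-manager"  # would call secretsmanager:GetSecretValue
    if env.get("DD_KMS_API_KEY"):
        return "kms"  # would call kms:Decrypt on the ciphertext
    if env.get("DD_API_KEY"):
        return "plaintext"
    raise ValueError("No Datadog API key configured")

print(api_key_source({"DD_KMS_API_KEY": "AQICAH..."}))  # kms
```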
### Custom Tags

Add custom tags to logs forwarded by your function using the `DdTags` setting (or environment variable `DD_TAGS`). The value must be a comma-delimited string with no trailing comma, e.g., `env:prod,stack:classic`.
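A tag string in this format splits cleanly on commas. The helper below is illustrative (not the forwarder's actual parsing code) and simply discards empty segments such as an accidental trailing comma.

```python
def parse_tags(tag_string):
    """Split a DD_TAGS-style value ("env:prod,stack:classic") into a list,
    ignoring empty segments such as an accidental trailing comma."""
    return [tag.strip() for tag in tag_string.split(",") if tag.strip()]

print(parse_tags("env:prod,stack:classic"))  # ['env:prod', 'stack:classic']
```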
### Datadog Site
Define your Datadog Site to send data to using the `DdSite` setting (or environment variable `DD_SITE`). It must be either `datadoghq.com` for the Datadog US site or `datadoghq.eu` for the Datadog EU site.
### Function Name
You can optionally customize the Datadog Forwarder Lambda function name using the `FunctionName` setting. DO **NOT** change this unless you need to install multiple forwarders in the same region for advanced use cases. If changed, you **MUST** provide the same function name when upgrading in the future.
### Reserved Concurrency
You can optionally customize the reserved concurrency for the Datadog Forwarder Lambda function using the `ReservedConcurrency` setting.
### Log Retention

You can optionally customize the CloudWatch log retention for logs generated by the Datadog Forwarder Lambda function using the `LogRetentionInDays` setting.
## Advanced Settings
The advanced settings can be configured after the installation as environment variables from the Lambda configuration console.
### Send logs through TCP or HTTP

By default, the forwarder sends logs over HTTPS through port `443`. To send logs over an SSL-encrypted TCP connection instead, set the environment variable `DD_USE_TCP` to `true`.
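A minimal sketch of this switch is shown below. The TCP intake port is an assumption for illustration, not taken from this document.

```python
def pick_transport(env):
    """Choose the log transport based on DD_USE_TCP, defaulting to HTTPS on
    port 443. The TCP port below is a hypothetical example."""
    if env.get("DD_USE_TCP", "").lower() == "true":
        return ("tcp", 10516)  # hypothetical SSL/TCP intake port
    return ("https", 443)

print(pick_transport({}))                      # ('https', 443)
print(pick_transport({"DD_USE_TCP": "true"}))  # ('tcp', 10516)
```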
### Proxy
Ensure that you disable SSL between the Lambda function and your proxy by setting `DD_NO_SSL` to `true`.
Two environment variables can be used to forward logs through a proxy:

* `DD_URL`: Define the proxy endpoint to forward the logs to.
* `DD_PORT`: Define the proxy port to forward the logs to.
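Read together, a sketch of how these overrides could be applied follows; the default host shown is a hypothetical placeholder, not the forwarder's real intake endpoint.

```python
def intake_endpoint(env):
    """Resolve the destination host/port, honoring DD_URL and DD_PORT
    overrides; the defaults here are illustrative placeholders."""
    host = env.get("DD_URL", "intake.example.datadoghq.com")
    port = int(env.get("DD_PORT", "443"))
    return host, port

print(intake_endpoint({"DD_URL": "proxy.internal", "DD_PORT": "3128"}))
```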
### Fetch Lambda Tags (Beta)
If the `DD_FETCH_LAMBDA_TAGS` env variable is set to `true`, the log forwarder fetches Lambda tags using [GetResources](https://docs.aws.amazon.com/resourcegroupstagging/latest/APIReference/API_GetResources.html) API calls and applies them to the `aws.lambda.enhanced.*` metrics parsed from the REPORT log. For this to work, the log forwarder function needs the `tag:GetResources` permission. The tags are cached in memory so they are only fetched when the function cold starts or when the TTL (1 hour) expires. The log forwarder increments the `aws.lambda.enhanced.get_resources_api_calls` metric for each API call made.
### Scrubbing / Redaction rules

Multiple scrubbing options are available. `REDACT_IP` and `REDACT_EMAIL` match against hard-coded patterns, while `DD_SCRUBBING_RULE` allows users to supply a regular expression.

- To use `REDACT_IP`, add it as an environment variable and set the value to `true`.
- Scrubbing rules are applied to the full JSON-formatted log, including any metadata that is automatically added by the Lambda function.
- Each instance of a pattern match is replaced until no more matches are found in each log.
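As a sketch of how a `DD_SCRUBBING_RULE`-style regular expression could be applied (the replacement placeholder text is an assumption, not taken from this document):

```python
import re

def scrub(log, rule, placeholder="xxxxx"):
    """Replace every non-overlapping match of the user-supplied regex in
    the (JSON-formatted) log string with a placeholder."""
    return re.sub(rule, placeholder, log)

ip_rule = r"\d{1,3}(?:\.\d{1,3}){3}"  # naive IPv4 pattern, for illustration
print(scrub('{"message": "client 10.0.0.1 connected"}', ip_rule))
```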
### Filtering rules

Use the `EXCLUDE_AT_MATCH` OR `INCLUDE_AT_MATCH` environment variables to filter logs based on a regular expression match:

- If a log matches both the inclusion and exclusion criteria, it is excluded.
- Filtering rules are applied to the full JSON-formatted log, including any metadata that is automatically added by the function.
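The rules above can be sketched as follows; this is illustrative, not the forwarder's actual filtering code.

```python
import re

def should_forward(log, include=None, exclude=None):
    """Apply EXCLUDE_AT_MATCH / INCLUDE_AT_MATCH semantics: exclusion wins
    when both patterns match; with an include pattern, only matches pass."""
    if exclude and re.search(exclude, log):
        return False
    if include:
        return bool(re.search(include, log))
    return True

print(should_forward('{"path": "/health"}', exclude=r"/health"))  # False
```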
### Multiline Log Support for S3

If there are multiline logs in S3, set the `DD_MULTILINE_LOG_REGEX_PATTERN` environment variable to a regex pattern that detects the start of a new log line.

- Example: for multiline logs beginning with the pattern `11/10/2014`, use `DD_MULTILINE_LOG_REGEX_PATTERN="\d{2}\/\d{2}\/\d{4}"`
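For instance, here is a sketch of how such a pattern can delimit events when splitting a file's contents; the forwarder's actual splitting logic may differ.

```python
import re

pattern = r"\d{2}\/\d{2}\/\d{4}"  # matches leading dates like 11/10/2014
data = (
    "11/10/2014 first event\n"
    "  continuation of the first event\n"
    "12/10/2014 second event"
)

# Split immediately before each line that starts a new log event
# (requires Python 3.7+ for zero-width splits).
events = [e for e in re.split(r"(?m)^(?=" + pattern + ")", data) if e]
print(len(events))  # 2
```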
### Disable log forwarding

The Datadog Forwarder **ALWAYS** forwards logs by default. If you do NOT use the Datadog log management product, you **MUST** set the environment variable `DD_FORWARD_LOG` to `False` to avoid sending logs to Datadog. The forwarder then only forwards other observability data, such as metrics.
### Disable SSL validation

If you need to ignore SSL certificate validation when forwarding logs over HTTPS, set the environment variable `DD_SKIP_SSL_VALIDATION` to `True`. This still encrypts the traffic between the forwarder and the endpoint provided with `DD_URL`, but does not check whether the destination SSL certificate is valid.
## Test

Hit the `Test` button and select `CloudWatch Logs` as the sample event. If the test "succeeded", you are all set! Note that the test log does not show up in the platform.
## Triggers

Follow the steps below to set up triggers on the Datadog Forwarder.

1. Click "Publish new version" from the "Actions" menu. Put the Datadog Forwarder version (e.g., 3.0.1) in the description for reference.
1. Click "Create alias" from the "Actions" menu, name it `live`, and point it to the version that was just published.
1. Now you can set up triggers against the Datadog Forwarder `live` alias (e.g., the ARN should be `arn:aws:lambda:us-east-1:xxxx:function:serverlessrepo-Datadog-Forwarder-xxxx:live`), instead of the unqualified version. This ensures a safe and gradual upgrade in the future. Learn more about [AWS Lambda aliases](https://docs.aws.amazon.com/lambda/latest/dg/configuration-aliases.html).
1. You can either manually subscribe the Datadog Forwarder Lambda function to the desired CloudWatch Log groups or S3 buckets, or let Datadog [manage the triggers automatically for you](https://docs.datadoghq.com/integrations/amazon_web_services/?tab=allpermissions#automatically-setup-triggers).
## Upgrade

Follow the steps below to upgrade an existing installation of the Datadog Forwarder.

1. Make sure the triggers (S3 and CloudWatch Log events) are set against the Datadog Forwarder Lambda function alias, rather than the unqualified version. If not, follow the steps in the [Triggers section](#triggers) to create an alias and update the existing triggers to point at the alias. This ensures a [gradual upgrade](https://docs.aws.amazon.com/lambda/latest/dg/configuration-aliases.html) next.
1. Open a new tab, navigate to the [Datadog Forwarder SAM application](https://serverlessrepo.aws.amazon.com/applications/arn:aws:serverlessrepo:us-east-1:464622532012:applications~Datadog-Log-Forwarder), and click "Deploy".
1. Configure the application settings.
    1. The `Application name` **MUST** be the same as the existing Datadog Forwarder application, in order to *update* the CloudFormation stack behind the scenes. Ignore the SAM prefix `serverlessrepo-`; e.g., if your existing forwarder's application name is `serverlessrepo-DatadogForwarder`, enter `DatadogForwarder` as the application name.
    1. The `FunctionName` **MUST** be the same as the existing Datadog Forwarder Lambda function, which defaults to `DatadogForwarder`.
    1. Configure the rest of the application settings based on the settings (environment variables) of the existing Datadog Forwarder.
1. Click "Deploy", and the unqualified version (i.e., $LATEST) of `DatadogForwarder` will be updated in a few minutes. This shouldn't affect anything until you point the `live` alias to the new version of the Lambda.
1. Copy over any missing environment variables for advanced settings from the `live` forwarder to the new (unqualified) version.
    1. If you have directly customized the source code of the forwarder previously, you need to copy those changes over to the unqualified version manually.
1. Click "Publish new version" from the "Actions" menu. Put the Datadog Forwarder version (e.g., 3.0.1) in the description for reference.
1. Switch to the `live` alias, and set the newly published version as the additional version.
1. Adjust the weight gradually to shift traffic from the old version to the new.
1. If the new version works fine, point the `live` alias to the new version completely. Otherwise, point the `live` alias back to the old version and report the issue to Datadog support.
## Notes

* For S3 logs, there may be some latency between the time a first S3 log file is posted and the Lambda function wakes up.
## Release

The fragment below (from the release script) creates a GitHub release and uploads the CloudFormation template to S3. `VERSION` and `BUCKET` are set earlier in the script.

```bash
read -p "About to create a Github release aws-dd-forwarder-${VERSION} and upload the template.yaml to s3://${BUCKET}/templates/${VERSION}.yaml. Continue (y/n)?" CONT
if [ "$CONT" != "y" ]; then
    echo "Exiting"
    exit 1
fi

# Create a github release
echo "Release aws-dd-forwarder-${VERSION} to github"
go get github.com/github/hub
zip -r "aws-dd-forwarder-${VERSION}.zip" .
hub release create -a "aws-dd-forwarder-${VERSION}.zip" -m "aws-dd-forwarder-${VERSION}" "aws-dd-forwarder-${VERSION}"

# Upload the template to the S3 bucket
echo "Uploading template.yaml to s3://${BUCKET}/templates/${VERSION}.yaml"
```