
Commit ebacd32

Update README.md for azure log forwarder to point to public docs (#967)
* Update README.md to point to public docs. We want our docs in one place; these ones have become outdated.
* Update README.md
1 parent b057ba2 commit ebacd32

File tree

1 file changed: +1, -71 lines changed


azure/activity_logs_monitoring/README.md

Lines changed: 1 addition & 71 deletions
@@ -2,74 +2,4 @@

The Datadog-Azure function is used to forward Azure logs to Datadog, including Activity and Diagnostic logs from EventHub.

## Quick Start

The provided Node.js script must be deployed into your Azure Functions service. Follow the tutorial below to learn how to do so:
### 1. Create a new EventHub-triggered function

- Expand your function application and click the `+` button next to `Functions`. If this is the first function in your function application, select `Custom function`. This displays the complete set of function templates.
- In the search field, type `Event Hub` and choose `Event Hub Trigger`.
- Select the `JavaScript` language in the right menu.
- Enter a name for the function.
- Add the desired `Event Hub connection`, or create a new one if you don't have one already.
- Select the `Event Hub consumer group` and the `Event Hub Name` you want to pull logs from. (A sketch of the resulting trigger binding follows this list.)
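The portal persists these choices as an Event Hub trigger binding in the function's `function.json`. A minimal sketch, assuming the connection string is stored in an app setting named `EVENTHUB_CONNECTION_STRING` (the hub name and consumer group below are placeholders, not values from this repo):

```
{
    "bindings": [
        {
            "type": "eventHubTrigger",
            "direction": "in",
            "name": "eventHubMessages",
            "eventHubName": "<EVENT_HUB_NAME>",
            "connection": "EVENTHUB_CONNECTION_STRING",
            "consumerGroup": "$Default",
            "cardinality": "many"
        }
    ]
}
```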
### 2. Provide the code

- Copy-paste the code of the [Datadog-Azure function](./index.js).
- In the `Integrate` part:
  - `Event Hub Cardinality` must be set to `Many`.
  - Set the `Event Parameter Name` to `eventHubMessages`. (A sketch of the handler shape these settings imply follows this list.)
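With those settings, the runtime hands the function an array of events per invocation. A minimal sketch of the handler shape (illustrative only; the real forwarding logic lives in index.js):

```
module.exports = async function (context, eventHubMessages) {
    // Event Hub Cardinality "Many" means eventHubMessages is an array of events.
    eventHubMessages.forEach((message) => {
        context.log(`Received Event Hub message: ${JSON.stringify(message)}`);
    });
};
```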
### 3. (optional) Send logs to EU or to a proxy

#### Send logs to EU

Set the environment variable `DD_SITE` to `datadoghq.eu`, and logs are automatically forwarded to your EU platform.
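One way to set that application setting is through the Azure CLI (the function app and resource group names here are placeholders):

```
az functionapp config appsettings set \
    --name <FUNCTION_APP_NAME> \
    --resource-group <RESOURCE_GROUP> \
    --settings "DD_SITE=datadoghq.eu"
```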
## Parameters

- **API KEY**:

  There are two ways to set your [Datadog API key](https://app.datadoghq.com/organization-settings/api-keys):

  1. Replace `<DATADOG_API_KEY>` in the code with your API key value.
  2. Set the value through the `DD_API_KEY` environment variable. (See the sketch after this list.)
- **Custom Tags**:

  You have two options to add custom tags to your logs:

  - Manually, by editing the function code: replace the `<TAG_KEY>:<TAG_VALUE>` placeholder for the `DD_TAGS` variable with a comma-separated list of tags.
  - Automatically, with the `DD_TAGS` environment variable.

  Learn more about Datadog tagging in our main [Tagging documentation](https://docs.datadoghq.com/tagging/).
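A minimal sketch of how both parameters can be resolved from the environment (illustrative; index.js has its own lookup logic, and the example tag values are hypothetical):

```
const DD_API_KEY = process.env.DD_API_KEY || '<DATADOG_API_KEY>';
// e.g. DD_TAGS="env:production,team:platform" (hypothetical tag values)
const DD_TAGS = process.env.DD_TAGS || '<TAG_KEY>:<TAG_VALUE>';
```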
## Customization

- **Scrubbing PII**

  To scrub PII from your logs, uncomment the SCRUBBER_RULE_CONFIG code. If you'd like to scrub more than just emails and IP addresses, add your own config to this map in the format:

  ```
  {
      NAME: {
          pattern: <regex_pattern>,
          replacement: <string to replace matching text with>
      }
  }
  ```
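  For instance, a hypothetical extra rule (the name and pattern are illustrative, not part of index.js):

  ```
  REDACT_CARD_LIKE: {
      pattern: /\b(?:\d[ -]?){15}\d\b/g,
      replacement: '[REDACTED]'
  }
  ```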
- **Log Splitting**

  To split array-type fields in your logs into individual logs, add sections to the DD_LOG_SPLITTING_CONFIG map in the code, or set the DD_LOG_SPLITTING_CONFIG environment variable (which must be a JSON string in the same format). This creates an attribute in your logs called "parsed_arrays", which contains the fields of the original log together with the split log value.

  An example for an azure.datafactory use case is provided in the code and commented out. The format is as follows:

  ```
  {
      source_type: {
          paths: [list of [list of fields in the log payload to iterate through to find the one to split]],
          keep_original_log: bool, whether to preserve the original log in addition to the split ones,
          preserve_fields: bool, whether to keep the original log's fields in the new split logs
      }
  }
  ```
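  As a concrete illustration, the DD_LOG_SPLITTING_CONFIG environment variable could be set to a JSON string shaped like this (the path shown is illustrative, not necessarily the exact one from the commented-out example in index.js):

  ```
  {
      "azure.datafactory": {
          "paths": [["properties", "Output", "value"]],
          "keep_original_log": true,
          "preserve_fields": true
      }
  }
  ```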
## See our documentation: [Send Azure Logs to Datadog](https://docs.datadoghq.com/logs/guide/azure-logging-guide/).
