docs/getting-started/automatic-import.asciidoc

TIP: Click https://elastic.navattic.com/automatic-import[here] to access an interactive demo.

- A working <<llm-connector-guides, LLM connector>>. Recommended models: `Claude 3.5 Sonnet`; `GPT-4o`; `Gemini-1.5-pro-002`.
- An https://www.elastic.co/pricing[Enterprise] subscription.
- A sample of the data you want to import, in a structured or unstructured format (including JSON, NDJSON, and Syslog).
- To import data from a REST API, have its OpenAPI specification (OAS) file ready. A minimal sketch of such a file appears after this list.
--
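For orientation only, here is a minimal, hypothetical OAS file for an invented audit API (the title, path, and parameter names are illustrative, not required by Automatic Import):

[source,json]
----
{
  "openapi": "3.0.0",
  "info": { "title": "Example Audit API", "version": "1.0.0" },
  "paths": {
    "/v1/audit/events": {
      "get": {
        "summary": "List audit events",
        "parameters": [
          { "name": "since", "in": "query", "schema": { "type": "string", "format": "date-time" } }
        ],
        "responses": {
          "200": { "description": "A page of audit events" }
        }
      }
    }
  }
}
----

Only GET endpoints like the one sketched here are used, as described in the CEL note further below.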
IMPORTANT: Using Automatic Import allows users to create new third-party data integrations through the use of third-party generative AI models (“GAI models”). Any third-party GAI models that you choose to use are owned and operated by their respective providers. Elastic does not own or control these third-party GAI models, nor does it influence their design, training, or data-handling practices. Using third-party GAI models with Elastic solutions, and using your data with third-party GAI models is at your discretion. Elastic bears no responsibility or liability for the content, operation, or use of these third-party GAI models, nor for any potential loss or damage arising from their use. Users are advised to exercise caution when using GAI models with personal, sensitive, or confidential information, as data submitted may be used to train the models or for other purposes. Elastic recommends familiarizing yourself with the development practices and terms of use of any third-party GAI models before use. You are responsible for ensuring that your use of Automatic Import complies with the terms and conditions of any third-party platform you connect with.
6. Define your integration's package name, which will prefix the imported event fields.
7. Define your **Data stream title**, **Data stream description**, and **Data stream name**. These fields appear on the integration's configuration page to help identify the data stream it writes to.
8. Select your {filebeat-ref}/configuration-filebeat-options.html[**Data collection method**]. This determines how your new integration will ingest the data (for example, from an S3 bucket, an HTTP endpoint, or a file stream).
+
.Importing CEL data
[NOTE]
====
If you select *API (CEL input)*, you'll have the additional option to upload the API's OAS file here. After you do, the LLM will use it to determine which API endpoints (GET only), query parameters, and data structures to use in the new custom integration. You will then select which API endpoints to consume and your authentication method before uploading your sample data.
====
9. Upload a sample of your data. Make sure to include all the types of events that you want the new integration to handle (a short illustrative sample follows this step).
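+
A minimal, hypothetical NDJSON sample (one JSON event per line; the field names are invented purely for illustration and are not required by Automatic Import):
+
[source,json]
----
{"timestamp":"2024-05-01T12:00:00Z","user":"alice","action":"login","source_ip":"203.0.113.10","outcome":"success"}
{"timestamp":"2024-05-01T12:01:30Z","user":"bob","action":"file_download","source_ip":"198.51.100.7","outcome":"failure"}
----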
12. (Optional) After reviewing the proposed pipeline, you can fine-tune it by clicking **Edit pipeline**. Refer to the <<siem-field-reference,{elastic-sec} ECS reference>> to learn more about formatting field mappings. When you're satisfied with your changes, click **Save**. A minimal example of a pipeline edit appears after the screenshot below.
+
[NOTE]
.How to edit a CEL program
====
If your new integration collects data from an API, you can update the CEL input configuration (program and API authentication information) from the new integration's integration policy.
====
+
image::images/auto-import-edit-pipeline.gif[A gif showing the user clicking the edit pipeline button and viewing the ingest pipeline flyout]
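+
For instance, an edit might add standard Elasticsearch ingest processors such as `rename` and `set` (a hypothetical snippet; `mypackage.user` is an invented source field name):
+
[source,json]
----
{
  "processors": [
    { "rename": { "field": "mypackage.user", "target_field": "user.name", "ignore_missing": true } },
    { "set": { "field": "event.kind", "value": "event" } }
  ]
}
----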
13. Click **Add to Elastic**. After the **Success** message appears, your new integration will be available on the Integrations page.
14. Click **Add to an agent** to deploy your new integration and start collecting data, or click **View integration** to view detailed information about your new integration.
15. (Optional) Once you've added an integration, you can edit the ingest pipeline by going to **Project Settings → Stack Management → Ingest Pipelines**.
TIP: You can use the <<data-quality-dash, Data Quality dashboard>> to check the health of your data ingest pipelines and field mappings.

. Define your integration's package name, which will prefix the imported event fields.
. Define your **Data stream title**, **Data stream description**, and **Data stream name**. These fields appear on the integration's configuration page to help identify the data stream it writes to.
. Select your https://www.elastic.co/guide/en/beats/filebeat/current/configuration-filebeat-options.html[**Data collection method**]. This determines how your new integration will ingest the data (for example, from an S3 bucket, an HTTP endpoint, or a file stream).
+
.Importing CEL data
[NOTE]
====
If you select *API (CEL input)*, you'll have the additional option to upload the API's OAS file here. After you do, the LLM will use it to determine which API endpoints (GET only), query parameters, and data structures to use in the new custom integration. You will then select which API endpoints to consume and your authentication method before uploading your sample data.
====
. Upload a sample of your data. Make sure to include all the types of events that you want the new integration to handle.
image:images/auto-import-review-integration-page.png[The Automatic Import Review page showing proposed field mappings]
. (Optional) After reviewing the proposed pipeline, you can fine-tune it by clicking **Edit pipeline**. Refer to the https://www.elastic.co/guide/en/security/current/siem-field-reference.html[{elastic-sec} ECS reference] to learn more about formatting field mappings. When you're satisfied with your changes, click **Save**. For instance, you might map a raw timestamp field to `@timestamp`, as shown after the screenshot below.
+
[NOTE]
.How to edit a CEL program
====
If your new integration collects data from an API, you can update the CEL input configuration (program and API authentication information) from the new integration's integration policy.
====
+
[role="screenshot"]
image:images/auto-import-edit-pipeline.gif[A gif showing the user clicking the edit pipeline button and viewing the ingest pipeline flyout]
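+
A hypothetical example that parses an invented `mypackage.event_time` field into `@timestamp` with the standard Elasticsearch `date` processor:
+
[source,json]
----
{
  "date": {
    "field": "mypackage.event_time",
    "target_field": "@timestamp",
    "formats": ["ISO8601"],
    "ignore_failure": true
  }
}
----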
. Click **Add to Elastic**. After the **Success** message appears, your new integration will be available on the Integrations page.
. Click **Add to an agent** to deploy your new integration and start collecting data, or click **View integration** to view detailed information about your new integration.
. (Optional) Once you've added an integration, you can edit the ingest pipeline by going to **Project Settings → Stack Management → Ingest Pipelines**.
TIP: You can use the <<security-data-quality-dash,Data Quality dashboard>> to check the health of your data ingest pipelines and field mappings.