17 changes: 15 additions & 2 deletions docs/getting-started/automatic-import.asciidoc
@@ -22,6 +22,7 @@ TIP: Click https://elastic.navattic.com/automatic-import[here] to access an inte
- A working <<llm-connector-guides, LLM connector>>. Recommended models: `Claude 3.5 Sonnet`; `GPT-4o`; `Gemini-1.5-pro-002`.
- An https://www.elastic.co/pricing[Enterprise] subscription.
- A sample of the data you want to import, in a structured or unstructured format (including JSON, NDJSON, and Syslog).
- To import data from a REST API, have its OpenAPI specification (OAS) file ready.
--
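
For illustration only, a small NDJSON data sample covering two invented event types might look like the following (the field names and values here are hypothetical, not a required schema):

[source,json]
----
{"timestamp":"2024-08-01T12:00:00Z","event_type":"login","user":"alice","source_ip":"198.51.100.10","outcome":"success"}
{"timestamp":"2024-08-01T12:05:42Z","event_type":"file_access","user":"bob","file_path":"/srv/reports/q2.xlsx","action":"read"}
----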

IMPORTANT: Using Automatic Import allows users to create new third-party data integrations through the use of third-party generative AI models (“GAI models”). Any third-party GAI models that you choose to use are owned and operated by their respective providers. Elastic does not own or control these third-party GAI models, nor does it influence their design, training, or data-handling practices. Using third-party GAI models with Elastic solutions, and using your data with third-party GAI models is at your discretion. Elastic bears no responsibility or liability for the content, operation, or use of these third-party GAI models, nor for any potential loss or damage arising from their use. Users are advised to exercise caution when using GAI models with personal, sensitive, or confidential information, as data submitted may be used to train the models or for other purposes. Elastic recommends familiarizing yourself with the development practices and terms of use of any third-party GAI models before use. You are responsible for ensuring that your use of Automatic Import complies with the terms and conditions of any third-party platform you connect with.
@@ -40,6 +41,13 @@ image::images/auto-import-create-new-integration-button.png[The Integrations pag
6. Define your integration's package name, which will prefix the imported event fields.
7. Define your **Data stream title**, **Data stream description**, and **Data stream name**. These fields appear on the integration's configuration page to help identify the data stream it writes to.
8. Select your {filebeat-ref}/configuration-filebeat-options.html[**Data collection method**]. This determines how your new integration will ingest the data (for example, from an S3 bucket, an HTTP endpoint, or a file stream).
+
.Importing CEL data
[NOTE]
====
If you select *API (CEL input)*, you'll have the additional option to upload the API's OAS file here. After you do, the LLM will use it to determine which API endpoints (GET only), query parameters, and data structures to use in the new custom integration. You will then select which API endpoints to consume and your authentication method before uploading your sample data.
====
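+
As a rough sketch, the kind of OAS excerpt the LLM can work from might look like the following; the endpoint path, query parameter, and descriptions here are hypothetical:
+
[source,yaml]
----
# Hypothetical OpenAPI 3.0 excerpt describing one GET endpoint.
paths:
  /v1/audit/events:
    get:
      summary: List audit events
      parameters:
        - name: since          # a query parameter the LLM can surface in the integration
          in: query
          schema:
            type: string
            format: date-time
      responses:
        "200":
          description: JSON array of audit event objects
----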
+
9. Upload a sample of your data. Make sure to include all the types of events that you want the new integration to handle.
+
.Best practices for sample data
@@ -57,14 +65,19 @@ image::images/auto-import-review-integration-page.png[The Automatic Import Revie
+
12. (Optional) After reviewing the proposed pipeline, you can fine-tune it by clicking **Edit pipeline**. Refer to the <<siem-field-reference,{elastic-sec} ECS reference>> to learn more about formatting field mappings. When you're satisfied with your changes, click **Save**.
+
[NOTE]
.How to edit a CEL program
====
If your new integration collects data from an API, you can update the CEL input configuration (program and API authentication information) from the new integration's integration policy.
====
+
image::images/auto-import-edit-pipeline.gif[A gif showing the user clicking the edit pipeline button and viewing the ingest pipeline flyout]
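+
For example, a single processor in the edited pipeline might rename a vendor field to its ECS equivalent. This is a hypothetical snippet for illustration, not actual Automatic Import output:
+
[source,json]
----
{
  "rename": {
    "field": "source_ip",
    "target_field": "source.ip",
    "ignore_missing": true
  }
}
----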
+
13. Click **Add to Elastic**. After the **Success** message appears, your new integration will be available on the Integrations page.
+
image::images/auto-import-success-message.png[The automatic import success message]
+
14. Click **Add to an agent** to deploy your new integration and start collecting data, or click **View integration** to view detailed information about your new integration.

NOTE: Once you've added an integration, you can't edit any details other than the ingest pipeline, which you can edit by going to **Stack Management → Ingest Pipelines**.
15. (Optional) Once you've added an integration, you can edit the ingest pipeline by going to **Stack Management → Ingest Pipelines**.

TIP: You can use the <<data-quality-dash, Data Quality dashboard>> to check the health of your data ingest pipelines and field mappings.
27 changes: 17 additions & 10 deletions docs/serverless/ingest/auto-import.asciidoc
@@ -26,6 +26,7 @@ Click https://elastic.navattic.com/automatic-import[here] to access an interacti
* A working <<security-llm-connector-guides, LLM connector>>. Recommended models: `Claude 3.5 Sonnet`; `GPT-4o`; `Gemini-1.5-pro-002`.
* A https://www.elastic.co/pricing/serverless-security[Security Analytics Complete] subscription.
* A sample of the data you want to import, in a structured or unstructured format (such as JSON, NDJSON, or Syslog).
* To import data from a REST API, have its OpenAPI specification (OAS) file ready.
====
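
For instance, an unstructured Syslog sample (the hostnames, process names, and IP addresses below are invented) can be as simple as a few representative lines:

[source,txt]
----
Aug  1 12:00:00 fw01 exampled[2104]: Accepted connection from 198.51.100.10 to port 443
Aug  1 12:05:42 fw01 exampled[2104]: Dropped packet from 203.0.113.7: policy DENY-INBOUND
----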

[IMPORTANT]
@@ -50,6 +51,13 @@ image:images/auto-import-create-new-integration-button.png[The Integrations page
. Define your integration's package name, which will prefix the imported event fields.
. Define your **Data stream title**, **Data stream description**, and **Data stream name**. These fields appear on the integration's configuration page to help identify the data stream it writes to.
. Select your https://www.elastic.co/guide/en/beats/filebeat/current/configuration-filebeat-options.html[**Data collection method**]. This determines how your new integration will ingest the data (for example, from an S3 bucket, an HTTP endpoint, or a file stream).
+
.Importing CEL data
[NOTE]
====
If you select *API (CEL input)*, you'll have the additional option to upload the API's OAS file here. After you do, the LLM will use it to determine which API endpoints (GET only), query parameters, and data structures to use in the new custom integration. You will then select which API endpoints to consume and your authentication method before uploading your sample data.
====
+
. Upload a sample of your data. Make sure to include all the types of events that you want the new integration to handle.
+
.Best practices for sample data
@@ -67,20 +75,19 @@ image:images/auto-import-create-new-integration-button.png[The Integrations page
image:images/auto-import-review-integration-page.png[The Automatic Import Review page showing proposed field mappings]
. (Optional) After reviewing the proposed pipeline, you can fine-tune it by clicking **Edit pipeline**. Refer to the https://www.elastic.co/guide/en/security/current/siem-field-reference.html[{elastic-sec} ECS reference] to learn more about formatting field mappings. When you're satisfied with your changes, click **Save**.
+
[NOTE]
.How to edit a CEL program
====
If your new integration collects data from an API, you can update the CEL input configuration (program and API authentication information) from the new integration's integration policy.
====
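+
As a rough, hypothetical sketch (the URL is a placeholder), a minimal CEL program of the kind that appears in that policy fetches one page of results and emits the decoded response body as an event. Real programs generated by Automatic Import also handle paging, state, and authentication:
+
[source,yaml]
----
resource.url: https://api.example.com/v1/audit/events
program: |
  bytes(get(state.url).Body).as(body, {
    "events": [body.decode_json()]
  })
----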
+
[role="screenshot"]
image:images/auto-import-edit-pipeline.gif[A gif showing the user clicking the edit pipeline button and viewing the ingest pipeline flyout]
. Click **Add to Elastic**. After the **Success** message appears, your new integration will be available on the Integrations page.
+
[role="screenshot"]
image:images/auto-import-success-message.png[The Automatic Import success message]
. Click **Add to an agent** to deploy your new integration and start collecting data, or click **View integration** to view detailed information about your new integration.

[NOTE]
====
Once you've added an integration, you can't edit any details other than the ingest pipeline, which you can edit by going to **Project Settings → Stack Management → Ingest Pipelines**.
====
. Click **Add to an agent** to deploy your new integration and start collecting data, or click **View integration** to view detailed information about your new integration.
. (Optional) Once you've added an integration, you can edit the ingest pipeline by going to **Project Settings → Stack Management → Ingest Pipelines**.

[TIP]
====
You can use the <<security-data-quality-dash,Data Quality dashboard>> to check the health of your data ingest pipelines and field mappings.
====
TIP: You can use the <<security-data-quality-dash,Data Quality dashboard>> to check the health of your data ingest pipelines and field mappings.