# Azure Logs Ingestion API

The Azure Logs Ingestion plugin lets you send your records using the [Logs Ingestion API in Azure Monitor](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/logs-ingestion-api-overview) to supported [Azure tables](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/logs-ingestion-api-overview#supported-tables) or to [custom tables](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/create-custom-table#create-a-custom-table) that you create.

The Logs ingestion API requires the following components:

* A Data Collection Endpoint (DCE)
* A Data Collection Rule (DCR)
* A Log Analytics workspace

To visualize the basic logs ingestion operation, see the following image:

![Log ingestion overview](/files/Q4CskUET6ANJg1CdtExM)

To get more details about how to set up these components, refer to the following documentation:

* [Azure Logs Ingestion API](https://docs.microsoft.com/en-us/azure/log-analytics/)
* [Send data to Azure Monitor Logs with Logs ingestion API (setup DCE, DCR and Log Analytics)](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/tutorial-logs-ingestion-portal)

## Configuration parameters

| Key              | Description                                                                                                                                                                                              | Default      |
| ---------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ------------ |
| `tenant_id`      | The tenant ID of the Azure Active Directory (AAD) application.                                                                                                                                           | *none*       |
| `client_id`      | The client ID of the AAD application.                                                                                                                                                                    | *none*       |
| `client_secret`  | The client secret of the AAD application ([App Secret](https://docs.microsoft.com/en-us/azure/active-directory/develop/howto-create-service-principal-portal#option-2-create-a-new-application-secret)). | *none*       |
| `dce_url`        | The Data Collection Endpoint (DCE) URL.                                                                                                                                                                  | *none*       |
| `dcr_id`         | Data Collection Rule (DCR) [immutable ID](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/tutorial-logs-ingestion-portal#collect-information-from-the-dcr).                                   | *none*       |
| `table_name`     | The name of the log table (include the `_CL` suffix for custom tables).                                                                                                                                  | *none*       |
| `time_key`       | Optional. Specify the key name where the timestamp will be stored.                                                                                                                                       | `@timestamp` |
| `time_generated` | Optional. If enabled, generates a timestamp and appends it to the JSON payload. The key name is set by the `time_key` parameter.                                                                         | `true`       |
| `compress`       | Optional. Enable HTTP payload gzip compression.                                                                                                                                                          | `true`       |
| `workers`        | The number of [workers](/manual/4.0/administration/multithreading.md#outputs) to perform flush operations for this output.                                                                               | `0`          |
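The `dce_url`, `dcr_id`, and `table_name` parameters together determine where records are uploaded. As a rough sketch of how they combine (following the documented Logs Ingestion URL pattern; the `api-version` value and the `Custom-` stream-name prefix for custom tables are assumptions based on the Azure Monitor documentation, not plugin internals):

```python
def ingestion_url(dce_url: str, dcr_id: str, table_name: str,
                  api_version: str = "2023-01-01") -> str:
    """Build the Logs Ingestion upload URL for a table.

    Custom-table stream names are the table name prefixed with
    "Custom-", per the Logs Ingestion API conventions.
    """
    return (f"{dce_url.rstrip('/')}/dataCollectionRules/{dcr_id}"
            f"/streams/Custom-{table_name}?api-version={api_version}")

print(ingestion_url(
    "https://log-analytics-dce-XXXX.region-code.ingest.monitor.azure.com",
    "dcr-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
    "ladcr_CL"))
```
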

## Get started

To send records to Azure Log Analytics using the Logs Ingestion API, create the following resources:

* A Data Collection Endpoint (DCE) for ingestion
* A Data Collection Rule (DCR) for data transformation
* A supported [Azure table](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/logs-ingestion-api-overview#supported-tables) or a [custom table](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/create-custom-table#create-a-custom-table)
* An app registration with a client secret (for DCR access)

Follow [this tutorial](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/tutorial-logs-ingestion-portal) to set up the DCE, DCR, app registration, and a custom table.
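The `tenant_id`, `client_id`, and `client_secret` values from the app registration are used to authenticate against Microsoft Entra ID with the standard OAuth 2.0 client-credentials flow. A minimal sketch of the token request (the plugin performs this internally; the endpoint and `monitor.azure.com` scope follow the Azure AD documentation, and the function name here is illustrative):

```python
from urllib.parse import urlencode

def token_request(tenant_id: str, client_id: str, client_secret: str):
    """Return the Azure AD v2.0 token endpoint and the form body for
    the client-credentials grant used by the Logs Ingestion API."""
    endpoint = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        # The Logs Ingestion API is authorized under the monitor.azure.com scope.
        "scope": "https://monitor.azure.com/.default",
    })
    return endpoint, body
```

The body is POSTed with `Content-Type: application/x-www-form-urlencoded`; the JSON response contains the bearer token sent in the `Authorization` header of each upload.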

### Configuration file

Use this configuration file to get started:

{% tabs %}
{% tab title="fluent-bit.yaml" %}

```yaml
pipeline:
  inputs:
    - name: tail
      path: /path/to/your/sample.log
      tag: sample
      key: RawData

    # Or use other plugins
    #- name: cpu
    #  tag: sample

  filters:
    - name: modify
      match: sample
      # Add a json key named "Application":"fb_log"
      add: Application fb_log

  outputs:
    # Enable this section to see your json-log format
    #- name: stdout
    #  match: '*'

    - name: azure_logs_ingestion
      match: sample
      client_id: XXXXXXXX-xxxx-yyyy-zzzz-xxxxyyyyzzzzxyzz
      client_secret: some.secret.xxxzzz
      tenant_id: XXXXXXXX-xxxx-yyyy-zzzz-xxxxyyyyzzzzxyzz
      dce_url: https://log-analytics-dce-XXXX.region-code.ingest.monitor.azure.com
      dcr_id: dcr-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
      table_name: ladcr_CL
      time_generated: true
      time_key: Time
      compress: true
```

{% endtab %}

{% tab title="fluent-bit.conf" %}

```
[INPUT]
  Name    tail
  Path    /path/to/your/sample.log
  Tag     sample
  Key     RawData

# Or use other plugins
#[INPUT]
#  Name    cpu
#  Tag     sample

[FILTER]
  Name modify
  Match sample
  # Add a json key named "Application":"fb_log"
  Add Application fb_log

# Enable this section to see your json-log format
#[OUTPUT]
#  Name stdout
#  Match *

[OUTPUT]
  Name            azure_logs_ingestion
  Match           sample
  client_id       XXXXXXXX-xxxx-yyyy-zzzz-xxxxyyyyzzzzxyzz
  client_secret   some.secret.xxxzzz
  tenant_id       XXXXXXXX-xxxx-yyyy-zzzz-xxxxyyyyzzzzxyzz
  dce_url         https://log-analytics-dce-XXXX.region-code.ingest.monitor.azure.com
  dcr_id          dcr-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
  table_name      ladcr_CL
  time_generated  true
  time_key        Time
  compress        true
```

{% endtab %}
{% endtabs %}
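With the `tail` input and `modify` filter above, each record that reaches the output stage resembles the following JSON (field values are illustrative):

```json
{
  "Time": "2024-05-01T12:34:56.789Z",
  "RawData": "a line read from /path/to/your/sample.log",
  "Application": "fb_log"
}
```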

Set up your DCR transformation based on the JSON output from the Fluent Bit pipeline (input, parser, filter, output).
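For example, a DCR transformation for records shaped like the sample configuration's output might look like the following KQL (a sketch; adjust the column names to match your table schema):

```kusto
source
| extend TimeGenerated = todatetime(Time)
| project TimeGenerated, RawData, Application
```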

