Azure Logs Ingestion API

Send logs to Azure Log Analytics using Logs Ingestion API with DCE and DCR

The Azure Logs Ingestion plugin allows you to ingest your records into supported Azure tables, or into custom tables that you create, using the Logs Ingestion API in Azure Monitor.

The Logs Ingestion API requires the following components:

  • A Data Collection Endpoint (DCE)

  • A Data Collection Rule (DCR)

  • A Log Analytics Workspace

Note: According to the Azure documentation, all of these resources should be created in the same region.

For more details about how to set up these components, refer to the Azure Monitor documentation on the Logs Ingestion API.

Configuration Parameters

tenant_id
    Required - The tenant ID of the AAD application.

client_id
    Required - The client ID of the AAD application.

client_secret
    Required - The client secret of the AAD application (App Secret).

dce_url
    Required - The Data Collection Endpoint (DCE) URL.

dcr_id
    Required - The immutable ID of the Data Collection Rule (DCR). See the Azure documentation for how to retrieve the immutable ID.

table_name
    Required - The name of the custom log table (include the _CL suffix if applicable).

time_key
    Optional - The key name under which the timestamp is stored. Default: @timestamp

time_generated
    Optional - If enabled, a timestamp is generated and appended to the JSON record under the key set by time_key. Default: true

compress
    Optional - Enable gzip compression of the HTTP payload. Default: true
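
As a quick reference, a minimal output section using only the required parameters could look like the following; every value is a placeholder that you need to replace with your own:

[OUTPUT]
    Name            azure_logs_ingestion
    Match           *
    tenant_id       <tenant id of the AAD application>
    client_id       <client id of the AAD application>
    client_secret   <client secret of the AAD application>
    dce_url         https://<your-dce>.<region>.ingest.monitor.azure.com
    dcr_id          <immutable id of the DCR>
    table_name      <your custom table>_CL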

Getting Started

To send records into Azure Log Analytics using the Logs Ingestion API, the following resources need to be created:

  • A Data Collection Endpoint (DCE) for ingestion

  • A Data Collection Rule (DCR) for data transformation

  • An app registration with a client secret (for DCR access)

You can follow the Azure Monitor tutorial for the Logs Ingestion API to set up the DCE, the DCR, the app registration, and a custom table.
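
If you want to understand what the plugin does with these resources, the following Python sketch mirrors the flow at a high level: it requests an AAD token with the app registration's client credentials and then posts a batch of JSON records to the DCE stream that the DCR routes to your table. This is an illustration, not the plugin's actual code; the placeholder values, the Custom-<table_name> stream name, and the 2023-01-01 API version are assumptions that depend on how your DCR was created.

import json
import urllib.parse
import urllib.request

# Placeholder values; replace with your own (all hypothetical).
tenant_id     = "XXXXXXXX-xxxx-yyyy-zzzz-xxxxyyyyzzzzxyzz"
client_id     = "XXXXXXXX-xxxx-yyyy-zzzz-xxxxyyyyzzzzxyzz"
client_secret = "some.secret.xxxzzz"
dce_url       = "https://log-analytics-dce-XXXX.region-code.ingest.monitor.azure.com"
dcr_id        = "dcr-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
table_name    = "ladcr_CL"

# 1. Acquire an AAD token for the Azure Monitor resource using client credentials.
token_request = urllib.request.Request(
    f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token",
    data=urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "https://monitor.azure.com/.default",
    }).encode(),
)
token = json.load(urllib.request.urlopen(token_request))["access_token"]

# 2. Post a batch of JSON records to the DCE stream handled by the DCR.
records = [{"RawData": "a sample log line", "Application": "fb_log"}]
ingest_request = urllib.request.Request(
    f"{dce_url}/dataCollectionRules/{dcr_id}/streams/Custom-{table_name}"
    "?api-version=2023-01-01",
    data=json.dumps(records).encode(),
    headers={"Authorization": f"Bearer {token}",
             "Content-Type": "application/json"},
    method="POST",
)
urllib.request.urlopen(ingest_request)  # the API returns 204 No Content on success

The plugin performs the equivalent steps for you on every flush, optionally gzip-compressing the payload when compress is enabled.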

Configuration File

Use this configuration to quickly get started:

[INPUT]
    Name    tail
    Path    /path/to/your/sample.log
    Tag     sample
    Key     RawData 
# Or use another input plugin, for example:
# [INPUT]
#     Name    cpu
#     Tag     sample

[FILTER]
    Name modify
    Match sample
    # Add a json key named "Application":"fb_log"
    Add Application fb_log

# Enable this section to see your json-log format
#[OUTPUT]
#    Name stdout
#    Match *
[OUTPUT]
    Name            azure_logs_ingestion
    Match           sample
    client_id       XXXXXXXX-xxxx-yyyy-zzzz-xxxxyyyyzzzzxyzz
    client_secret   some.secret.xxxzzz
    tenant_id       XXXXXXXX-xxxx-yyyy-zzzz-xxxxyyyyzzzzxyzz
    dce_url         https://log-analytics-dce-XXXX.region-code.ingest.monitor.azure.com
    dcr_id          dcr-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
    table_name      ladcr_CL
    time_generated  true
    time_key        Time
    compress        true

Set up your DCR transformation based on the JSON output from Fluent Bit's pipeline (input, parser, filter, output).
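
For example, with the configuration above each record sent to the DCE would look roughly like the following: the RawData key comes from the tail input, Application from the modify filter, and Time from time_generated/time_key. The exact field names and timestamp format depend on your pipeline.

{
    "Time": "<timestamp added by time_generated>",
    "RawData": "a line read from /path/to/your/sample.log",
    "Application": "fb_log"
}

Your DCR's stream declaration should list these columns, and a transformation such as source | extend TimeGenerated = todatetime(Time) can be used to map the timestamp onto the table's TimeGenerated column.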
