Azure Logs Ingestion API
Send logs to Azure Log Analytics using Logs Ingestion API with DCE and DCR
The Azure Logs Ingestion plugin allows you to ingest your records into an Azure Log Analytics workspace using the Logs Ingestion API, either into supported Azure tables or into custom tables that you create.
The Logs Ingestion API requires the following components:

- A Data Collection Endpoint (DCE)
- A Data Collection Rule (DCR)
- A Log Analytics workspace
For more details about how to set up these components, refer to the Azure Monitor documentation for the Logs Ingestion API.
The plugin supports the following configuration parameters:

| Key | Description | Default |
| --- | --- | --- |
| `tenant_id` | Required - The tenant ID of the AAD application. | |
| `client_id` | Required - The client ID of the AAD application. | |
| `client_secret` | Required - The client secret of the AAD application (App Secret). | |
| `dce_url` | Required - Data Collection Endpoint (DCE) URL. | |
| `dcr_id` | Required - Data Collection Rule (DCR) immutable ID (see the Azure DCR documentation for how to collect the immutable ID). | |
| `table_name` | Required - The name of the custom log table (include the `_CL` suffix as well if applicable). | |
| `time_key` | Optional - Specify the key name where the timestamp will be stored. | `@timestamp` |
| `time_generated` | Optional - If enabled, will generate a timestamp and append it to JSON. The key name is set by the `time_key` parameter. | `true` |
| `compress` | Optional - Enable HTTP payload gzip compression. | `true` |
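As a rough illustration of how these parameters map onto a plugin configuration, the sketch below shows an `[OUTPUT]` section in classic configuration format. All IDs, the client secret, the DCE URL, and the table name are placeholder values; replace them with the values from your own Azure resources.

```
[OUTPUT]
    Name            azure_logs_ingestion
    Match           *
    # Credentials of the AAD app registration (placeholders)
    tenant_id       00000000-0000-0000-0000-000000000000
    client_id       00000000-0000-0000-0000-000000000000
    client_secret   replace-with-your-app-secret
    # Ingestion endpoint and rule (placeholders)
    dce_url         https://my-dce-abcd.eastus-1.ingest.monitor.azure.com
    dcr_id          dcr-00000000000000000000000000000000
    # Custom log table in the Log Analytics workspace
    table_name      mytable_CL
    # Optional settings shown with their default values
    time_key        @timestamp
    time_generated  true
    compress        true
```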
To send records to Azure Log Analytics using the Logs Ingestion API, the following resources need to be created:

- A Data Collection Endpoint (DCE) for ingestion
- A Data Collection Rule (DCR) for data transformation
- An app registration with client secrets (for DCR access)
- Either an Azure table or a custom table that you create

You can follow the Azure tutorial for the Logs Ingestion API to set up the DCE, DCR, app registration, and a custom table.
Use this configuration to quickly get started:
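The following is a minimal end-to-end sketch in classic configuration format, using a `dummy` input as a stand-in for your real input and the same placeholder Azure values as above:

```
[INPUT]
    # Emits a fixed JSON record; replace with your real input (tail, systemd, etc.)
    Name   dummy
    Dummy  {"message": "hello from fluent-bit"}
    Tag    app.logs

[OUTPUT]
    Name            azure_logs_ingestion
    Match           app.*
    tenant_id       00000000-0000-0000-0000-000000000000
    client_id       00000000-0000-0000-0000-000000000000
    client_secret   replace-with-your-app-secret
    dce_url         https://my-dce-abcd.eastus-1.ingest.monitor.azure.com
    dcr_id          dcr-00000000000000000000000000000000
    table_name      mytable_CL
    # Appends a timestamp under the key named by time_key; the DCR stream
    # declaration must account for this column as well as "message"
    time_generated  true
    time_key        Time
```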
Set up your DCR transformation based on the JSON output from Fluent Bit's pipeline (input, parser, filter, output).
Note: According to the Azure documentation, all of these resources should be in the same region.
To visualize the basic Logs Ingestion operation, see the following image: