Conditional processing

Conditional processing lets you selectively apply processors to logs based on the value of fields within those logs. This feature lets you create processing pipelines that only process records that meet certain criteria, and ignore the rest.

Conditional processing is available in Fluent Bit version 4.0 or later.

Configuration

You can turn a standard processor into a conditional processor by adding a condition block to the processor's YAML configuration settings.

These condition blocks use the following syntax:

pipeline:
  inputs:
    <...>

  processors:
    logs:
      - name: processor_name
        <...>
        condition:
          op: {and|or}
          rules:
          - field: {field_name1}
            op: {comparison_operator}
            value: {comparison_value1}
          - field: {field_name2}
            op: {comparison_operator}
            value: {comparison_value2}
        <...>

Each processor can only have a single condition block, but that condition can include multiple rules. These rules are stored as items in the condition.rules array.

Condition evaluation

The condition.op parameter specifies the condition's evaluation logic. It can have one of the following values:

  • and: A log entry meets this condition when all of the rules in the condition.rules array evaluate to true.

  • or: A log entry meets this condition when one or more of the rules in the condition.rules array evaluate to true.

Rules

Each item in the condition.rules array must include values for the following parameters:

  • field: The field within your logs to evaluate. The value of this parameter must use record accessor syntax to reference fields inside logs (see Field access).

  • op: The comparison operator that evaluates whether the rule is true. This parameter (condition.rules.op) is distinct from the condition.op parameter and has different possible values (see Comparison operators).

  • value: The value to compare the specified log field against. Optionally, you can provide an array that contains multiple values, typically with the in and not_in operators.

Rules are evaluated against each log that passes through your data pipeline. For example, consider a rule along the following lines (the field name and values shown are illustrative):
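
# Illustrative rule: compares the top-level status field to the number 200
- field: "$status"
  op: eq
  value: 200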

This rule evaluates to true for a log that contains {"status": 200}, but evaluates to false for a log that contains {"status": 403}.

Field access

The condition.rules.field parameter uses record accessor syntax to reference fields inside logs.

You can use $field syntax to access a top-level field, and $field['child']['subchild'] to access nested fields.
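
For example, given a log record like the following (field names here are illustrative), $status refers to the top-level status field and $request['path'] refers to the nested path field:

{"status": 200, "request": {"method": "POST", "path": "/api/v1/resource"}}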

Comparison operators

The condition.rules.op parameter has the following possible values:

  • eq: equal to

  • neq: not equal to

  • gt: greater than

  • lt: less than

  • gte: greater than or equal to

  • lte: less than or equal to

  • regex: matches a regular expression

  • not_regex: does not match a regular expression

  • in: is included in the specified array

  • not_in: is not included in the specified array

Examples

Basic condition

This example applies a condition that only processes logs that contain {"request": {"method": "POST"}}:
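
A configuration along these lines would do that; this sketch uses the dummy input to generate a matching record and the content_modifier processor to add a field, and the key name modified_if_post is illustrative:

pipeline:
  inputs:
    - name: dummy
      dummy: '{"request": {"method": "POST", "path": "/api/v1/resource"}}'

  processors:
    logs:
      - name: content_modifier
        action: insert
        key: modified_if_post
        value: true
        condition:
          op: and
          rules:
            - field: "$request['method']"
              op: eq
              value: "POST"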

Multiple conditions with and

This example applies a condition that only processes logs when all the specified rules are met:
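
For instance, the following sketch (again using content_modifier, with illustrative field names) only adds the flagged key when the log's level is error and its status is 400 or higher:

pipeline:
  inputs:
    <...>

  processors:
    logs:
      - name: content_modifier
        action: insert
        key: flagged
        value: true
        condition:
          op: and
          rules:
            # Both rules must evaluate to true for the processor to run
            - field: "$level"
              op: eq
              value: "error"
            - field: "$status"
              op: gte
              value: 400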

Multiple conditions with or

This example applies a condition that only processes logs when one or more of the specified rules are met:
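
A sketch with the same illustrative fields, where the processor runs if either rule evaluates to true:

pipeline:
  inputs:
    <...>

  processors:
    logs:
      - name: content_modifier
        action: insert
        key: needs_review
        value: true
        condition:
          op: or
          rules:
            - field: "$level"
              op: eq
              value: "error"
            - field: "$status"
              op: gte
              value: 500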

Array of values

This example uses an array for the value of condition.rules.value:
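
For example, this sketch (field and key names are illustrative) uses the in operator to match any of the listed HTTP methods:

pipeline:
  inputs:
    <...>

  processors:
    logs:
      - name: content_modifier
        action: insert
        key: is_write_request
        value: true
        condition:
          op: and
          rules:
            - field: "$request['method']"
              op: in
              value: ["POST", "PUT", "DELETE"]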

Multiple processors with conditions

This example uses multiple processors with conditional processing enabled for each:
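
One way to express that is shown below; the service names and message patterns are placeholders:

pipeline:
  inputs:
    <...>

  processors:
    logs:
      # Add an alert field to error logs from critical services
      - name: content_modifier
        action: insert
        key: alert
        value: true
        condition:
          op: and
          rules:
            - field: "$level"
              op: eq
              value: "error"
            - field: "$service"
              op: in
              value: ["payment", "auth", "database"]

      # Add a paging_required field to errors that match critical patterns
      - name: content_modifier
        action: insert
        key: paging_required
        value: true
        condition:
          op: and
          rules:
            - field: "$level"
              op: eq
              value: "error"
            - field: "$message"
              op: regex
              value: "(timeout|connection refused|out of memory)"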

This configuration adds an alert field to error logs from critical services, and adds a paging_required field to errors that contain specific critical patterns.
