Fluent Bit: Official Manual
Conditional processing


Conditional processing lets you selectively apply processors to logs based on the values of fields that those logs contain. This feature lets you create processing pipelines that only process records that meet certain criteria, and ignore the rest.

Conditional processing is available in Fluent Bit version 4.0 and greater.

Configuration

You can turn a standard processor into a conditional processor by adding a condition block to the processor's YAML configuration settings.

Conditional processing is only available for YAML configuration files, not classic configuration files.

These condition blocks use the following syntax:

pipeline:
  inputs:
    <...>
      processors:
        logs:
          - name: {processor_name}
            <...>
            condition:
              op: {and|or}
              rules:
                - field: {field_name1}
                  op: {comparison_operator}
                  value: {comparison_value1}
                - field: {field_name2}
                  op: {comparison_operator}
                  value: {comparison_value2}
            <...>

Each processor can only have a single condition block, but that condition can include multiple rules. These rules are stored as items in the condition.rules array.

Condition evaluation

The condition.op parameter specifies the condition's evaluation logic. It has two possible values:

  • and: A log entry meets this condition when all of the rules in the condition.rules array are truthy.

  • or: A log entry meets this condition when one or more of the rules in the condition.rules array are truthy.

Rules

Each item in the condition.rules array must include values for the following parameters:

  • field: The field within your logs to evaluate. The value of this parameter must use the correct record accessor syntax to access the fields inside logs.

  • op: The comparison operator to evaluate whether the rule is true. This parameter (condition.rules.op) is distinct from the condition.op parameter and has different possible values.

  • value: The value of the specified log field to use in your comparison. Optionally, you can provide an array that contains multiple values.

Rules are evaluated against each log that passes through your data pipeline. For example, given a rule with these parameters:

- field: "$status"
  op: eq
  value: 200

This rule evaluates to true for a log that contains {"status":200}, but evaluates to false for a log that contains {"status":403}.
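
As a minimal sketch for trying a rule like this locally, you can pair the dummy input with the stdout output. The matched key and test.log tag below are illustrative names:

pipeline:
  inputs:
    - name: dummy
      dummy: '{"status": 200}'
      tag: test.log
      processors:
        logs:
          - name: content_modifier
            action: insert
            key: matched
            value: true
            condition:
              op: and
              rules:
                - field: "$status"
                  op: eq
                  value: 200
  outputs:
    - name: stdout
      match: '*'

Records printed by stdout include the matched key only because the dummy status is 200; changing it to 403 leaves the records unmodified.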

Field access

The condition.rules.field parameter uses record accessor syntax to reference fields inside logs. You can use $field syntax to access a top-level field, and $field['child']['subchild'] to access nested fields.
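
For example, given a log such as {"request": {"headers": {"host": "example.com"}}}, a rule can reference the nested host field as follows (the field names and value here are illustrative):

- field: "$request['headers']['host']"
  op: eq
  value: "example.com"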

Comparison operators

The condition.rules.op parameter has the following possible values (a brief sketch using several of them follows this list):

  • eq: equal to

  • neq: not equal to

  • gt: greater than

  • lt: less than

  • gte: greater than or equal to

  • lte: less than or equal to

  • regex: matches a regular expression

  • not_regex: does not match a regular expression

  • in: is included in the specified array

  • not_in: is not included in the specified array
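
The examples in the next section demonstrate eq, gt, gte, regex, and in. As a sketch, the remaining operators use the same rule shape; the field names and values here are illustrative:

rules:
  - field: "$level"
    op: neq
    value: "debug"
  - field: "$request['path']"
    op: not_regex
    value: "\/internal-.*"
  - field: "$environment"
    op: not_in
    value: ["test", "staging"]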

Examples

Basic condition

This example applies a condition that only processes logs that contain the string {"request": {"method": "POST"}}:

pipeline:
  inputs:
    - name: dummy
      dummy: '{"request": {"method": "GET", "path": "/api/v1/resource"}}'
      tag: request.log
      processors:
        logs:
          - name: content_modifier
            action: insert
            key: modified_if_post
            value: true
            condition:
              op: and
              rules:
                - field: "$request['method']"
                  op: eq
                  value: "POST"

Multiple conditions with and

This example applies a condition that only processes logs when all of the specified rules are met:

pipeline:
  inputs:
    - name: dummy
      dummy: '{"request": {"method": "POST", "path": "/api/v1/sensitive-data"}}'
      tag: request.log
      processors:
        logs:
          - name: content_modifier
            action: insert
            key: requires_audit
            value: true
            condition:
              op: and
              rules:
                - field: "$request['method']"
                  op: eq
                  value: "POST"
                - field: "$request['path']"
                  op: regex
                  value: "\/sensitive-.*"

Multiple conditions with or

This example applies a condition that only processes logs when one or more of the specified rules are met:

pipeline:
  inputs:
    - name: dummy
      dummy: '{"request": {"method": "GET", "path": "/api/v1/resource", "status_code": 200, "response_time": 150}}'
      tag: request.log
      processors:
        logs:
          - name: content_modifier
            action: insert
            key: requires_performance_check
            value: true
            condition:
              op: or
              rules:
                - field: "$request['response_time']"
                  op: gt
                  value: 100
                - field: "$request['status_code']"
                  op: gte
                  value: 400

Array of values

This example uses an array for the value of condition.rules.value:

pipeline:
  inputs:
    - name: dummy
      dummy: '{"request": {"method": "GET", "path": "/api/v1/resource"}}'
      tag: request.log
      processors:
        logs:
          - name: content_modifier
            action: insert
            key: high_priority_method
            value: true
            condition:
              op: and
              rules:
                - field: "$request['method']"
                  op: in
                  value: ["POST", "PUT", "DELETE"]

Multiple processors with conditions

This example uses multiple processors with conditional processing enabled for each:

pipeline:
  inputs:
    - name: dummy
      dummy: '{"log": "Error: Connection refused", "level": "error", "service": "api-gateway"}'
      tag: app.log
      processors:
        logs:
          - name: content_modifier
            action: insert
            key: alert
            value: true
            condition:
              op: and
              rules:
                - field: "$level"
                  op: eq
                  value: "error"
                - field: "$service"
                  op: in
                  value: ["api-gateway", "authentication", "database"]

          - name: content_modifier
            action: insert
            key: paging_required
            value: true
            condition:
              op: and
              rules:
                - field: "$log"
                  op: regex
                  value: "(?i)(connection refused|timeout|crash)"
                - field: "$level"
                  op: in
                  value: ["error", "fatal"]

This configuration adds an alert field to error logs from critical services, and adds a paging_required field to errors that contain specific critical patterns.
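
If you add a stdout output to this pipeline, a record that satisfies both conditions is emitted with both inserted keys. The exact stdout format varies by Fluent Bit version, but the result looks roughly like this illustrative line:

[0] app.log: [1700000000.000000000, {"log"=>"Error: Connection refused", "level"=>"error", "service"=>"api-gateway", "alert"=>"true", "paging_required"=>"true"}]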
