Configuration File
This page describes the YAML configuration file used by Fluent Bit.
One of the ways to configure Fluent Bit is using a YAML configuration file that works at a global scope.
The YAML configuration file supports the following sections:
* Env
* Includes
* Service
* Pipeline
  * Inputs
  * Filters
  * Outputs
The YAML configuration file does not support the following sections yet:
* Parsers
The YAML configuration is used in the container smoke tests, so an up-to-date, working example is always available here: https://github.com/fluent/fluent-bit/blob/master/packaging/testing/smoke/container/fluent-bit.yaml.
Env
The env section allows the definition of configuration variables that will be used later in the configuration file.
Example:
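A minimal sketch, assuming an illustrative variable named flush_interval that is later referenced with the ${...} syntax:

```yaml
# Define a variable in the env section and reference it later with ${...}
env:
  flush_interval: 1

service:
  flush: ${flush_interval}
  log_level: info

pipeline:
  inputs:
    - name: random
  outputs:
    - name: stdout
      match: '*'
```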
Includes
The includes section allows additional YAML configuration files, specified as a list of filenames, to be merged into the current configuration. If no path is provided, the file is assumed to reside in a folder relative to the file that references it.
Example:
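A minimal sketch; the filenames are placeholders:

```yaml
# Files listed here are merged into this configuration.
# Relative entries are resolved against the folder of the referencing file.
includes:
  - inclusion-1.yaml
  - subdir/inclusion-2.yaml
```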
Service
The service section defines the global properties of the service. The Service keys available as of this version are described in the following table:
The following is an example of a service section:
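The values shown are illustrative:

```yaml
service:
  flush: 5
  daemon: off
  log_level: debug
```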
For scheduler and retry details, see scheduling and retries.
Pipeline
A pipeline section will define a complete pipeline configuration, including inputs, filters and outputs subsections.
Each of the inputs, filters and outputs subsections constitutes an array of maps containing the parameters for each plugin. Most properties are simple strings or numbers, so they can be defined directly, for example:
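A sketch of such a pipeline; the tags, path and port values are illustrative:

```yaml
pipeline:
  inputs:
    - name: tail
      tag: syslog
      path: /var/log/syslog
    - name: http
      tag: http_server
      port: 8080
  outputs:
    - name: stdout
      match: '*'
```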
This pipeline consists of two inputs: a tail plugin and an http server plugin. Each plugin has its own map in the array of inputs, consisting of simple properties. To use more advanced properties that consist of multiple values, the property itself can be defined as an array, for example the record and allowlist_key properties of the record_modifier filter:
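A sketch of a record_modifier filter using array-valued properties; the key/value pairs shown are illustrative:

```yaml
pipeline:
  inputs:
    - name: tail
      tag: syslog
      path: /var/log/syslog
  filters:
    - name: record_modifier
      match: syslog
      # Each record entry holds a key and a value separated by a space
      record:
        - powered_by calyptia
      allowlist_key:
        - powered_by
        - message
  outputs:
    - name: stdout
      match: '*'
```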
When each entry in a list requires two values, they must be separated by a space, as in the record property of the record_modifier filter.
Input
An input section defines a source (related to an input plugin). Here we describe the base configuration for each input section. Note that each input plugin may add its own configuration keys:
The Name is mandatory and it lets Fluent Bit know which input plugin should be loaded. The Tag is mandatory for all plugins except for the input forward plugin (as it provides dynamic tags).
Example input
The following is an example of an input section for the cpu plugin.
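A minimal sketch; the tag value is illustrative:

```yaml
pipeline:
  inputs:
    - name: cpu
      tag: my_cpu
```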
Filter
A filter section defines a filter (related to a filter plugin). Here we will describe the base configuration for each filter section. Note that each filter plugin may add its own configuration keys:
The Name is mandatory and it lets Fluent Bit know which filter plugin should be loaded. The Match or Match_Regex is mandatory for all plugins. If both are specified, Match_Regex takes precedence.
Example filter
The following is an example of a filter section for the grep plugin:
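A minimal sketch; the regex value (key log, pattern aa) is illustrative:

```yaml
pipeline:
  filters:
    - name: grep
      match: '*'
      # Keep only records whose "log" key matches the pattern "aa"
      regex: log aa
```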
Output
An output section specifies a destination that certain records should follow after a Tag match. Fluent Bit can currently route records to up to 256 output plugins. The configuration supports the following keys:
Example output
The following is an example of an output section:
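A minimal sketch; the match pattern is illustrative:

```yaml
pipeline:
  outputs:
    - name: stdout
      match: 'my*cpu'
```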
Example: collecting CPU metrics
The following configuration file example demonstrates how to collect CPU metrics and flush the results every five seconds to the standard output:
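A sketch of such a configuration; the tag and match values are illustrative:

```yaml
service:
  flush: 5
  daemon: off
  log_level: debug

pipeline:
  inputs:
    - name: cpu
      tag: my_cpu
  outputs:
    - name: stdout
      match: 'my*cpu'
```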
Processors
In recent versions of Fluent Bit, the input and output plugins can run in separate threads. Fluent Bit 2.1.2 introduced a new interface called "processor" to extend the processing capabilities of input and output plugins directly, without routing the data. This interface allows users to apply data transformations and filtering to incoming data records before they are processed further in the pipeline.
This functionality is only exposed in the YAML configuration and not in the classic configuration mode, due to its restriction on nested levels of configuration.
Example: Using processors.
The following configuration file example demonstrates the use of processors to change the log record in the input plugin section by adding a new key "hostname" with the value "monox", and a Lua processor to append the tag to the log record. In the output plugin section, another processor adds a new key named "output" with the value "new data". All of this happens without routing the logs further in the pipeline.
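A sketch of such a configuration, using a random input for demonstration; the modify and lua filters are applied as processors, and the tag, key names and Lua function names are illustrative:

```yaml
service:
  log_level: info

pipeline:
  inputs:
    - name: random
      tag: test-tag
      interval_sec: 1
      processors:
        # Processors attached to the input run before routing
        logs:
          - name: modify
            add: hostname monox
          - name: lua
            call: append_tag
            code: |
              function append_tag(tag, timestamp, record)
                  new_record = record
                  new_record["tag"] = tag
                  return 1, timestamp, new_record
              end
  outputs:
    - name: stdout
      match: '*'
      processors:
        # Processors attached to the output run right before delivery
        logs:
          - name: lua
            call: add_field
            code: |
              function add_field(tag, timestamp, record)
                  new_record = record
                  new_record["output"] = "new data"
                  return 1, timestamp, new_record
              end
```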