Kafka
The Kafka input plugin subscribes to one or more Kafka topics to collect messages from an Apache Kafka service. This plugin uses the official librdkafka library as a built-in dependency.
| Key | Description | Default |
| --- | --- | --- |
| `brokers` | Single or multiple list of Kafka Brokers. For example: `192.168.1.3:9092`, `192.168.1.4:9092`. | _none_ |
| `topics` | Single entry or list of comma-separated (`,`) topics that Fluent Bit will subscribe to. | _none_ |
| `format` | Serialization format of the messages. If set to `json`, the payload will be parsed as JSON. | _none_ |
| `client_id` | Client id passed to librdkafka. | _none_ |
| `group_id` | Group id passed to librdkafka. | `fluent-bit` |
| `poll_ms` | Kafka brokers polling interval in milliseconds. | `500` |
| `Buffer_Max_Size` | Maximum size of the buffer per cycle used to poll Kafka messages from subscribed topics. Specify a larger size to increase throughput. | `4M` |
| `rdkafka.{property}` | `{property}` can be any librdkafka configuration property. | _none_ |
| `threaded` | Indicates whether to run this input in its own thread. | `false` |
To subscribe to or collect messages from Apache Kafka, run the plugin from the command line or through the configuration file:
The Kafka plugin can read parameters through the `-p` argument (property):
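For example, the following command (broker address and topic name are illustrative, and a reachable Kafka broker is assumed) subscribes to a topic and prints each record to standard output:

```shell
fluent-bit -i kafka \
    -p brokers=192.168.1.3:9092 \
    -p topics=some-topic \
    -p poll_ms=100 \
    -o stdout
```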
In your main configuration file append the following `Input` and `Output` sections:
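A minimal sketch in classic configuration mode, assuming the same illustrative broker and topic as above, might look like this:

```
[INPUT]
    Name    kafka
    Brokers 192.168.1.3:9092
    Topics  some-topic

[OUTPUT]
    Name    stdout
    Match   *
```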
The Fluent Bit source repository contains a full example of using Fluent Bit to process Kafka records:
The previous example will connect to the broker listening on `kafka-broker:9092` and subscribe to the `fb-source` topic, polling for new messages every 100 milliseconds. Since the payload will be in JSON format, the plugin is configured with `format json` to parse it.
Every message received is then processed with `kafka.lua` and sent back to the `fb-sink` topic of the same broker.
The example can be executed locally with `make start` in the `examples/kafka_filter` directory (Docker Compose is used).
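The pipeline described above can be sketched as the following classic-mode configuration. This is a hedged reconstruction, not the repository's exact file: the Lua callback name `kafka_cb` is a placeholder, so check `examples/kafka_filter` for the actual script and callback:

```
[INPUT]
    Name    kafka
    Brokers kafka-broker:9092
    Topics  fb-source
    poll_ms 100
    format  json

[FILTER]
    Name    lua
    Match   *
    script  kafka.lua
    call    kafka_cb

[OUTPUT]
    Name    kafka
    Match   *
    Brokers kafka-broker:9092
    Topics  fb-sink
```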