HTTP

The HTTP input plugin lets Fluent Bit open an HTTP port that you can send data to, and then route that data dynamically through the pipeline.

Configuration parameters

| Key | Description | Default |
| --- | --- | --- |
| listen | The address to listen on. | 0.0.0.0 |
| port | The port for Fluent Bit to listen on. | 9880 |
| tag_key | Specify the key name to overwrite a tag. If set, the tag will be overwritten by the value of that key. | none |
| buffer_max_size | Specify the maximum buffer size in KB to receive a JSON message. | 4M |
| buffer_chunk_size | Sets the chunk size for incoming JSON messages. These chunks are then stored and managed in the space made available by buffer_max_size. | 512K |
| successful_response_code | Allows setting a successful response code. Supported values: 200, 201, and 204. | 201 |
| success_header | Add an HTTP header key/value pair on success. Multiple headers can be set. For example, X-Custom custom-answer. | none |
| threaded | Indicates whether to run this input in its own thread. | false |
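As a reference, the following minimal sketch combines several of the parameters from the table above in YAML format. All keys come from the table; the values are illustrative, not recommendations.

pipeline:
    inputs:
        - name: http
          listen: 0.0.0.0
          port: 9880
          buffer_max_size: 4M
          buffer_chunk_size: 512K
          successful_response_code: 200
          success_header: X-Custom custom-answer
          threaded: true
    outputs:
        - name: stdout
          match: '*'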

TLS / SSL

The HTTP input plugin supports TLS/SSL. For more details about the available properties and general configuration, refer to Transport Security.
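As a minimal sketch, assuming the standard tls, tls.crt_file, and tls.key_file properties described in Transport Security (the certificate paths are hypothetical), enabling TLS on the HTTP input might look like:

pipeline:
    inputs:
        - name: http
          listen: 0.0.0.0
          port: 9880
          tls: on
          tls.crt_file: /etc/fluent-bit/certs/server.crt
          tls.key_file: /etc/fluent-bit/certs/server.key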

gzipped content

In version 2.2.1 or later, the HTTP input plugin accepts and automatically handles gzipped content if the header Content-Encoding: gzip is set on the received data.
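For example, a gzipped payload can be sent with curl. This is a sketch that assumes an HTTP input listening on port 8888, as in the examples below:

echo '{"key1":"value1","key2":"value2"}' | gzip | curl --data-binary @- -XPOST -H "content-type: application/json" -H "Content-Encoding: gzip" http://localhost:8888/app.log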

Get started

This plugin supports dynamic tags which let you send data with different tags through the same input. See the following for an example:

Set a tag

The tag for the HTTP input plugin is set by adding the tag to the end of the request URL. This tag is then used to route the event through the system.

For example, in the following curl message the tag is set to app.log because the request path ends with /app.log:

curl -d '{"key1":"value1","key2":"value2"}' -XPOST -H "content-type: application/json" http://localhost:8888/app.log

Configuration file

[INPUT]
    name http
    listen 0.0.0.0
    port 8888

[OUTPUT]
    name stdout
    match app.log
The same configuration in YAML format:

pipeline:
    inputs:
        - name: http
          listen: 0.0.0.0
          port: 8888
    outputs:
        - name: stdout
          match: app.log

Configuration file http.0 example

If you don't set the tag, http.0 is automatically used. If you have multiple HTTP inputs, they will follow the pattern http.N, where N is an integer representing the input.

curl -d '{"key1":"value1","key2":"value2"}' -XPOST -H "content-type: application/json" http://localhost:8888

[INPUT]
    name http
    listen 0.0.0.0
    port 8888

[OUTPUT]
    name  stdout
    match  http.0
The same configuration in YAML format:

pipeline:
    inputs:
        - name: http
          listen: 0.0.0.0
          port: 8888
    outputs:
        - name: stdout
          match: http.0

Set tag_key

The tag_key configuration option lets you specify the key name that will be used to overwrite the tag. The tag's value will be replaced with the value associated with the specified key. For example, if tag_key is set to custom_tag and the log event contains a JSON field with the key custom_tag, Fluent Bit will use the value of that field as the new tag for routing the event through the system.

Curl request

curl -d '{"key1":"value1","key2":"value2"}' -XPOST -H "content-type: application/json" http://localhost:8888/app.log

Configuration file tag_key example

[INPUT]
    name http
    listen 0.0.0.0
    port 8888
    tag_key key1

[OUTPUT]
    name stdout
    match value1
The same configuration in YAML format:

pipeline:
    inputs:
        - name: http
          listen: 0.0.0.0
          port: 8888
          tag_key: key1
    outputs:
        - name: stdout
          match: value1

Set multiple custom HTTP headers on success

The success_header parameter lets you set multiple HTTP headers on success. The format is:

[INPUT]
    name http
    success_header X-Custom custom-answer
    success_header X-Another another-answer

The same configuration in YAML format:

pipeline:
    inputs:
        - name: http
          success_header: X-Custom custom-answer
          success_header: X-Another another-answer

Example curl message

curl -d @app.log -XPOST -H "content-type: application/json" http://localhost:8888/app.log

Configuration file example 3

[INPUT]
    name http
    listen 0.0.0.0
    port 8888

[OUTPUT]
    name stdout
    match *
The same configuration in YAML format:

pipeline:
    inputs:
        - name: http
          listen: 0.0.0.0
          port: 8888

    outputs:
        - name: stdout
          match: '*'

Command line

fluent-bit -i http -p port=8888 -o stdout
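Other plugin properties from the table above can also be passed with -p. For example, a sketch that sets tag_key on the input and matches the resulting tag on the stdout output (assuming the same JSON payload used in the tag_key example):

fluent-bit -i http -p port=8888 -p tag_key=key1 -o stdout -p match=value1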
