Content Modifier
The content modifier processor lets you manipulate the content, metadata, and attributes of logs and traces.
Like filters, this processor uses a unified mechanism to manipulate data. The most significant difference is that processors perform better than filters, and chaining them incurs no encoding/decoding performance penalty.
Contexts
The content modifier relies on context, meaning the place where the content modification will happen. Fluent Bit provides different contexts to manipulate the desired information.
The following contexts are available:
| Context | Signal | Description |
| --- | --- | --- |
| `attributes` | Logs | Modifies the attributes or metadata of a log record. |
| `body` | Logs | Modifies the content of a log record. |
| `span_name` | Traces | Modifies the name of a span. |
| `span_kind` | Traces | Modifies the kind of a span. |
| `span_status` | Traces | Modifies the status of a span. |
| `span_attributes` | Traces | Modifies the attributes of a span. |
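As a sketch of how a trace context is selected, the following configuration inserts an attribute into every span received by an OpenTelemetry input. The input plugin, port, and the attribute key and value are assumptions for illustration:

```yaml
pipeline:
  inputs:
    - name: opentelemetry
      # Assumed listener port for incoming OTLP traffic; adjust to your setup.
      port: 4318
      processors:
        traces:
          - name: content_modifier
            # Target the span attributes rather than a log record.
            context: span_attributes
            action: insert
            # Illustrative attribute name and value.
            key: "deployment.environment"
            value: "staging"
  outputs:
    - name: stdout
      match: '*'
```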
OpenTelemetry contexts
Additionally, Fluent Bit provides specific contexts for modifying data that follows the OpenTelemetry log schema. All of these contexts operate on shared data across a group of records.
The following contexts are available:
| Context | Signal | Description |
| --- | --- | --- |
| `otel_resource_attributes` | Logs | Modifies the attributes of the log resource. |
| `otel_scope_name` | Logs | Modifies the name of a log scope. |
| `otel_scope_version` | Logs | Modifies the version of a log scope. |
| `otel_scope_attributes` | Logs | Modifies the attributes of a log scope. |
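For example, a resource-level attribute can be set on incoming OpenTelemetry logs by targeting `otel_resource_attributes`. This is a minimal sketch; the input plugin, port, and attribute values are assumptions:

```yaml
pipeline:
  inputs:
    - name: opentelemetry
      # Assumed listener port; adjust as needed.
      port: 4318
      processors:
        logs:
          - name: content_modifier
            # Modify the resource attributes shared by the group of log records.
            context: otel_resource_attributes
            action: upsert
            # Illustrative attribute name and value.
            key: "service.name"
            value: "my-service"
  outputs:
    - name: stdout
      match: '*'
```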
Configuration parameters
The following configuration parameters are available:
| Parameter | Description |
| --- | --- |
| `context` | Specifies the context where the modifications will happen. |
| `key` | Specifies the name of the key that the modification will be applied to. |
| `value` | The role of this parameter changes based on the action type. |
| `pattern` | Defines a regular expression pattern. This property is only used by the `extract` action. |
| `converted_type` | Defines the target data type for the conversion. Possible values: `string`, `boolean`, `int`, and `double`. |
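The examples later on this page omit `context`; in those examples the modifications apply to the record content. To target a different context, set the parameter explicitly. The following sketch inserts a key into the log attributes instead; the key and value are illustrative:

```yaml
pipeline:
  inputs:
    - name: dummy
      dummy: '{"key1": "123.4"}'
      processors:
        logs:
          - name: content_modifier
            # Explicitly select the attributes (metadata) context.
            context: attributes
            action: insert
            # Illustrative metadata key and value.
            key: "environment"
            value: "production"
  outputs:
    - name: stdout
      match: '*'
      format: json_lines
```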
Actions
The actions specify the type of operation to run on a specific key or on the content of a log or trace. The following actions are available:

| Action | Description |
| --- | --- |
| `insert` | Inserts a new key with a value into the target context. The `key` and `value` parameters are required. |
| `upsert` | Given a specific key with a value, the `upsert` operation tries to update the value of the key. If the key doesn't exist, the key is created. The `key` and `value` parameters are required. |
| `delete` | Deletes a key from the target context. The `key` parameter is required. |
| `rename` | Changes the name of a key. The `value` set in the configuration becomes the new key name. The `key` and `value` parameters are required. |
| `hash` | Replaces the key value with a hash generated by the SHA-256 algorithm; the resulting binary value is stored as its hexadecimal string representation. The `key` parameter is required. |
| `extract` | Extracts the value of a single key as a list of key/value pairs. This action requires a regular expression to be configured in the `pattern` property. The `key` and `pattern` parameters are required. |
| `convert` | Converts the data type of a key value. The `key` and `converted_type` parameters are required. |
Insert example
The following example appends the key `color` with the value `blue` to the log stream.
```yaml
pipeline:
  inputs:
    - name: dummy
      dummy: '{"key1": "123.4"}'
      processors:
        logs:
          - name: content_modifier
            action: insert
            key: "color"
            value: "blue"
  outputs:
    - name: stdout
      match: '*'
      format: json_lines
```
Upsert example
Update the value of `key1` and insert `key2`:
```yaml
pipeline:
  inputs:
    - name: dummy
      dummy: '{"key1": "123.4"}'
      processors:
        logs:
          - name: content_modifier
            action: upsert
            key: "key1"
            value: "5678"
          - name: content_modifier
            action: upsert
            key: "key2"
            value: "example"
  outputs:
    - name: stdout
      match: '*'
      format: json_lines
```
Delete example
Delete `key2` from the stream:
```yaml
pipeline:
  inputs:
    - name: dummy
      dummy: '{"key1": "123.4", "key2": "example"}'
      processors:
        logs:
          - name: content_modifier
            action: delete
            key: "key2"
  outputs:
    - name: stdout
      match: '*'
      format: json_lines
```
Rename example
Change the name of `key2` to `test`:
```yaml
pipeline:
  inputs:
    - name: dummy
      dummy: '{"key1": "123.4", "key2": "example"}'
      processors:
        logs:
          - name: content_modifier
            action: rename
            key: "key2"
            value: "test"
  outputs:
    - name: stdout
      match: '*'
      format: json_lines
```
Hash example
Apply the SHA-256 algorithm to the value of the key `password`:
```yaml
pipeline:
  inputs:
    - name: dummy
      dummy: '{"username": "bob", "password": "12345"}'
      processors:
        logs:
          - name: content_modifier
            action: hash
            key: "password"
  outputs:
    - name: stdout
      match: '*'
      format: json_lines
```
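After this processor runs, the `password` value is replaced by the hexadecimal representation of the SHA-256 digest of the original string.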
Extract example
Using the URL stored in the `http.url` key, extract its components as a list of key/value pairs:
```yaml
pipeline:
  inputs:
    - name: dummy
      dummy: '{"http.url": "https://fluentbit.io/docs?q=example"}'
      processors:
        logs:
          - name: content_modifier
            action: extract
            key: "http.url"
            pattern: ^(?<http_protocol>https?):\/\/(?<http_domain>[^\/\?]+)(?<http_path>\/[^?]*)?(?:\?(?<http_query_params>.*))?
  outputs:
    - name: stdout
      match: '*'
      format: json_lines
```
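With this pattern, the named capture groups produce the keys `http_protocol`, `http_domain`, `http_path`, and `http_query_params` in the record.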
Convert example
Both keys in the example are strings. Convert the value of `key1` to a double (float) type and `key2` to a boolean:
```yaml
pipeline:
  inputs:
    - name: dummy
      dummy: '{"key1": "123.4", "key2": "true"}'
      processors:
        logs:
          - name: content_modifier
            action: convert
            key: key1
            converted_type: double
          - name: content_modifier
            action: convert
            key: key2
            converted_type: boolean
  outputs:
    - name: stdout
      match: '*'
      format: json_lines
```