The metric_selector processor allows you to select metrics to include or exclude (similar to the grep
filter for logs).
The native processor plugin supports the following configuration parameters:

Key | Description | Default |
---|---|---|
Metric_Name | Keep metrics whose name matches the given name or regular expression. | |
Context | Specify the matching context. Currently, only `metric_name` is supported. | `Metrics_Name` |
Action | Specify the action for the selected metrics. `INCLUDE` and `EXCLUDE` are allowed. | |
Operation_Type | Specify the operation type of the action for metrics payloads. `PREFIX` and `SUBSTRING` are allowed. | `PREFIX` |

All processors are only valid with the YAML configuration format. Processor configuration should be located under the relevant input or output plugin configuration.

The `Metric_Name` parameter treats strings quoted with slashes (`/.../`) as regular expressions. Without the slashes, users need to specify with `Operation_Type` whether to use prefix matching or substring matching; the default operation is prefix matching. For example, `/chunks/` will be translated as a regular expression.

Here is a basic configuration example.
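A basic configuration could look like the following sketch, where the `fluentbit_metrics` input, the tag, and the `/storage/` metric name are illustrative assumptions:

```yaml
# Sketch: keep only metrics whose name matches the /storage/ regular
# expression; input plugin, tag, and names are assumptions for illustration.
pipeline:
  inputs:
    - name: fluentbit_metrics
      tag: my_metrics
      processors:
        metrics:
          - name: metric_selector
            metric_name: /storage/
            action: include
  outputs:
    - name: stdout
      match: '*'
```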
The sql processor provides a simple interface to select content from Logs, with support for conditional expressions.
The SQL processor does not depend on a database or indexing; it runs everything on the fly. There is no concept of tables: the query runs against the STREAM.
Note that this processor differs from the "stream processor interface" that runs after the filters; this one can only be used in the processors section of the input plugins when using the YAML configuration mode.
The following example generates a sample message with two keys called `key` and `http.url`. By using a simple SQL statement, we will select only the key `http.url`.
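One way this pipeline could be written in YAML (a sketch; the dummy input plugin, sample message, and URL are illustrative assumptions):

```yaml
# Sketch: a dummy input emits a record with two keys; the sql processor
# keeps only http.url. Sample values are assumptions for illustration.
pipeline:
  inputs:
    - name: dummy
      dummy: '{"key": "some value", "http.url": "https://fluentbit.io/search?q=docs"}'
      processors:
        logs:
          - name: sql
            query: "SELECT http.url FROM STREAM;"
  outputs:
    - name: stdout
      match: '*'
```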
Similar to the example above, we will now extract the parts of `http.url` and select only the domain from the value. For that, we will use the content-modifier and sql processors together:
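A sketch of how the two processors could be chained; the sample record and the extraction regular expression (with its named capture groups) are assumptions for illustration:

```yaml
# Sketch: content_modifier extracts named groups (http_protocol,
# http_domain) from http.url, then sql selects only the domain.
pipeline:
  inputs:
    - name: dummy
      dummy: '{"key": "some value", "http.url": "https://fluentbit.io/search?q=docs"}'
      processors:
        logs:
          - name: content_modifier
            action: extract
            key: "http.url"
            pattern: ^(?<http_protocol>https?):\/\/(?<http_domain>[^\/\?]+)
          - name: sql
            query: "SELECT http_domain FROM STREAM;"
  outputs:
    - name: stdout
      match: '*'
```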
The expected output of this pipeline will be something like this:
Key | Description |
---|---|
query | Define the SQL statement to run on top of the Logs stream; it must end with `;`. |
The content_modifier processor allows you to manipulate the metadata/attributes and content of Logs and Traces.
Similar to the functionality exposed by filters, this processor presents a unified mechanism to perform such operations for data manipulation. The most significant difference is that processors perform better than filters, and when chaining them, there are no encoding/decoding performance penalties.
Note that processors and this specific component can only be enabled using the new YAML configuration format. Classic mode configuration format doesn't support processors.
The actions specify the type of operation to run on top of a specific key or content from a Log or a Trace. The available actions are described below.
The following example appends the key `color` with the value `blue` to the log stream.
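A minimal sketch of this, assuming a dummy input and sample record for illustration:

```yaml
# Sketch: add the key "color" with value "blue" to every log record.
pipeline:
  inputs:
    - name: dummy
      dummy: '{"message": "hello"}'
      processors:
        logs:
          - name: content_modifier
            action: insert
            key: "color"
            value: "blue"
  outputs:
    - name: stdout
      match: '*'
```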
Update the value of `key1` and insert `key2`:
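One way this could look, using `upsert` for both keys (the sample record and values are assumptions for illustration):

```yaml
# Sketch: upsert updates key1 (present in the sample record) and
# creates key2 (absent); sample values are assumptions.
pipeline:
  inputs:
    - name: dummy
      dummy: '{"key1": "old value"}'
      processors:
        logs:
          - name: content_modifier
            action: upsert
            key: "key1"
            value: "new value"
          - name: content_modifier
            action: upsert
            key: "key2"
            value: "example"
  outputs:
    - name: stdout
      match: '*'
```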
Delete `key2` from the stream:
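A sketch, assuming a dummy input that carries both keys:

```yaml
# Sketch: drop key2 from each record; the sample record is an assumption.
pipeline:
  inputs:
    - name: dummy
      dummy: '{"key1": "a", "key2": "b"}'
      processors:
        logs:
          - name: content_modifier
            action: delete
            key: "key2"
  outputs:
    - name: stdout
      match: '*'
```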
Change the name of `key2` to `test`:
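A sketch of the rename step; note that `value` holds the new key name (the sample record is an assumption):

```yaml
# Sketch: rename key2 to test; the value field holds the new name.
pipeline:
  inputs:
    - name: dummy
      dummy: '{"key1": "a", "key2": "b"}'
      processors:
        logs:
          - name: content_modifier
            action: rename
            key: "key2"
            value: "test"
  outputs:
    - name: stdout
      match: '*'
```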
Apply the SHA-256 algorithm to the value of the key `password`:
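A sketch, assuming a dummy record with a `password` key for illustration:

```yaml
# Sketch: replace the password value with its SHA-256 hex digest.
pipeline:
  inputs:
    - name: dummy
      dummy: '{"user": "bob", "password": "hunter2"}'
      processors:
        logs:
          - name: content_modifier
            action: hash
            key: "password"
  outputs:
    - name: stdout
      match: '*'
```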
Using a domain address, extract its components as a list of key/value pairs:
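A sketch of the extraction; the regular expression, its named capture groups, and the sample URL are assumptions for illustration:

```yaml
# Sketch: split http.url into named capture groups via the pattern
# property; regex and sample URL are assumptions for illustration.
pipeline:
  inputs:
    - name: dummy
      dummy: '{"http.url": "https://fluentbit.io/search?q=docs"}'
      processors:
        logs:
          - name: content_modifier
            action: extract
            key: "http.url"
            pattern: ^(?<http_protocol>https?):\/\/(?<http_domain>[^\/\?]+)
  outputs:
    - name: stdout
      match: '*'
```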
Both keys in the example are strings. Convert `key1` to a double/float type and `key2` to a boolean:
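A sketch of the two conversions, assuming a dummy record with string values for illustration:

```yaml
# Sketch: key1 becomes a double and key2 a boolean; the sample
# values are assumptions for illustration.
pipeline:
  inputs:
    - name: dummy
      dummy: '{"key1": "123.4", "key2": "true"}'
      processors:
        logs:
          - name: content_modifier
            action: convert
            key: "key1"
            converted_type: double
          - name: content_modifier
            action: convert
            key: "key2"
            converted_type: boolean
  outputs:
    - name: stdout
      match: '*'
```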
Key | Description |
---|---|
action | Define the operation to run on the target content. This field is mandatory; for more details about the available actions, check the actions table below. |
context | Specify which component of the Telemetry type will be affected. When processing Logs, the available contexts are `attributes` and `body`. When processing Traces, the available contexts are `span_name`, `span_kind`, `span_status`, and `span_attributes`. |
key | Specify the name of the key that will be used to apply the modification. |
value | Depending on the action type, `value` might be required and represent different things. Check the detailed information for the specific actions. |
pattern | Defines a regular expression pattern. This property is only used by the `extract` action. |
converted_type | Define the data type to perform the conversion; the available options are `string`, `boolean`, `int`, and `double`. |
Action | Description |
---|---|
insert | Insert a new key with a value into the target context. The `key` and `value` parameters are required. |
upsert | Given a specific key with a value, the `upsert` operation will try to update the value of the key. If the key does not exist, the key will be created. The `key` and `value` parameters are required. |
delete | Delete a key from the target context. The `key` parameter is required. |
rename | Change the name of a key. The `value` set in the configuration represents the new name. The `key` and `value` parameters are required. |
hash | Replace the key value with a hash generated by the SHA-256 algorithm; the generated binary value is set as a hex string representation. The `key` parameter is required. |
extract | Extract the value of a single key as a list of key/value pairs. This action requires a regular expression in the `pattern` property. The `key` and `pattern` parameters are required. For more details, check the examples above. |
convert | Convert the data type of a key value. The `key` and `converted_type` parameters are required. |