[id="cluster-logging-json-log-forwarding_{context}"]
= Parsing JSON logs

You can use a `ClusterLogForwarder` object to parse JSON logs into a structured object and forward them to a supported output.

To illustrate how this works, suppose that you have the following structured JSON log entry:

.Example structured JSON log entry
[source,yaml]
----
{"level":"info","name":"fred","home":"bedrock"}
----

To enable JSON log parsing, add `parse: json` to a pipeline in the `ClusterLogForwarder` CR, as shown in the following example:

.Example snippet showing `parse: json`
[source,yaml]
----
pipelines:
- inputRefs: [ application ]
  outputRefs: [ myFluentd ]
  parse: json
----
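
For context, the pipeline shown in the previous snippet sits under `spec` in the `ClusterLogForwarder` custom resource. The following is a minimal sketch of a complete CR that forwards parsed JSON logs to a Fluentd receiver, assuming a `fluentdForward` output named `myFluentd`; the CR name, namespace, and endpoint URL shown here are illustrative placeholders, not values required by this module.

.Example `ClusterLogForwarder` CR with a JSON-parsing pipeline (illustrative)
[source,yaml]
----
apiVersion: logging.openshift.io/v1
kind: ClusterLogForwarder
metadata:
  name: instance                     # commonly used CR name
  namespace: openshift-logging       # namespace where the logging components run
spec:
  outputs:
  - name: myFluentd
    type: fluentdForward
    url: 'tls://fluentd.example.com:24224'   # illustrative endpoint; replace with your receiver
  pipelines:
  - inputRefs: [ application ]
    outputRefs: [ myFluentd ]
    parse: json                      # enables JSON parsing for this pipeline
----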
When you enable JSON log parsing by using `parse: json`, the CR copies the JSON-structured log entry into a `structured` field, as shown in the following example:

.Example `structured` output containing the structured JSON log entry
[source,yaml]
----
{"structured": { "level": "info", "name": "fred", "home": "bedrock" },
"more fields..."}
----

[IMPORTANT]
====
If the log entry does not contain valid structured JSON, the `structured` field is absent.
====