// Module included in the following assemblies:
//
// * logging/cluster-logging-external.adoc
// Might not be needed. See https://issues.redhat.com/browse/LOG-654
[id="cluster-logging-log-forward-update_{context}"]
= Updating log forwarding custom resources

The {product-title} Log Forwarding API has been promoted from Technology Preview to Generally Available in {product-title} 4.6. The GA release contains improvements and enhancements that require you to make a change to your Cluster Logging custom resource (CR) and to replace your `LogForwarding` custom resource (CR) with a `ClusterLogForwarder` CR.

.Sample `ClusterLogForwarder` instance in {product-title} 4.6
[source,yaml]
----
apiVersion: logging.openshift.io/v1
kind: ClusterLogForwarder
metadata:
  name: instance
  namespace: openshift-logging
....
spec:
  outputs:
  - url: http://remote.elasticsearch.com:9200
    name: elasticsearch
    type: elasticsearch
  - url: tls://fluentdserver.example.com:24224
    name: fluentd
    type: fluentdForward
    secret:
      name: fluentdserver
  pipelines:
  - inputRefs:
    - infrastructure
    - application
    name: mylogs
    outputRefs:
    - elasticsearch
  - inputRefs:
    - audit
    name: auditlogs
    outputRefs:
    - fluentd
    - default
...
----

.Sample `LogForwarding` CR in {product-title} 4.5
[source,yaml]
----
apiVersion: logging.openshift.io/v1alpha1
kind: LogForwarding
metadata:
  name: instance
  namespace: openshift-logging
spec:
  disableDefaultForwarding: true
  outputs:
  - name: elasticsearch
    type: elasticsearch
    endpoint: remote.elasticsearch.com:9200
  - name: fluentd
    type: forward
    endpoint: fluentdserver.example.com:24224
    secret:
      name: fluentdserver
  pipelines:
  - inputSource: logs.infra
    name: infra-logs
    outputRefs:
    - elasticsearch
  - inputSource: logs.app
    name: app-logs
    outputRefs:
    - elasticsearch
  - inputSource: logs.audit
    name: audit-logs
    outputRefs:
    - fluentd
----

The following procedure shows each parameter you must change.

.Procedure

To update the `LogForwarding` CR in 4.5 to the `ClusterLogForwarder` CR for 4.6, make the following modifications:

. Edit the Cluster Logging custom resource (CR) to remove the `logforwardingtechpreview` annotation:
+
.Sample Cluster Logging CR
[source,yaml]
----
apiVersion: "logging.openshift.io/v1"
kind: "ClusterLogging"
metadata:
  annotations:
    clusterlogging.openshift.io/logforwardingtechpreview: enabled <1>
  name: "instance"
  namespace: "openshift-logging"
....
----
<1> Remove the `logforwardingtechpreview` annotation.
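+
For example, if your Cluster Logging CR is named `instance` in the `openshift-logging` project, as in the preceding sample, you can open it for editing with a command similar to the following:
+
[source,terminal]
----
$ oc edit ClusterLogging instance -n openshift-logging
----
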
. Export the `LogForwarding` CR to create a YAML file for the `ClusterLogForwarder` instance:
+
[source,terminal]
----
$ oc get LogForwarding instance -n openshift-logging -o yaml | tee ClusterLogForwarder.yaml
----

. Edit the YAML file to make the following modifications:
+
.Sample `ClusterLogForwarder` instance in {product-title} 4.6
[source,yaml]
----
apiVersion: logging.openshift.io/v1 <1>
kind: ClusterLogForwarder <2>
metadata:
  name: instance
  namespace: openshift-logging
....
spec: <3>
  outputs:
  - url: http://remote.elasticsearch.com:9200 <4>
    name: elasticsearch
    type: elasticsearch
  - url: tls://fluentdserver.example.com:24224
    name: fluentd
    type: fluentdForward <5>
    secret:
      name: fluentdserver
  pipelines:
  - inputRefs: <6>
    - infrastructure
    - application
    name: mylogs
    outputRefs:
    - elasticsearch
  - inputRefs:
    - audit
    name: auditlogs
    outputRefs:
    - fluentd
    - default <7>
...
----
<1> Change the `apiVersion` from `"logging.openshift.io/v1alpha1"` to `"logging.openshift.io/v1"`.
<2> Change the object kind from `kind: "LogForwarding"` to `kind: "ClusterLogForwarder"`.
<3> Remove the `disableDefaultForwarding: true` parameter.
<4> Change the output parameter from `spec.outputs.endpoint` to `spec.outputs.url`. Add a prefix to the URL, such as `https://`, `tcp://`, and so forth, if a prefix is not present.
<5> For Fluentd outputs, change the `type` from `forward` to `fluentdForward`.
<6> Change the pipelines:
* Change `spec.pipelines.inputSource` to `spec.pipelines.inputRefs`
* Change `logs.infra` to `infrastructure`
* Change `logs.app` to `application`
* Change `logs.audit` to `audit`
<7> Optional: Add a `default` pipeline to send logs to the internal Elasticsearch instance. You are not required to configure a `default` output.
+
[NOTE]
====
If you want to forward logs to only the internal {product-title} Elasticsearch instance, do not configure the Log Forwarding API.
====
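+
Optionally, you can ask the API server to validate the edited file without creating the object. For example:
+
[source,terminal]
----
$ oc create -f ClusterLogForwarder.yaml --dry-run=server
----
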
. Create the CR object:
+
[source,terminal]
----
$ oc create -f ClusterLogForwarder.yaml
----
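+
Optionally, verify that the new `ClusterLogForwarder` object was created. For example, if the object is named `instance` in the `openshift-logging` project, as in the preceding samples:
+
[source,terminal]
----
$ oc get ClusterLogForwarder instance -n openshift-logging -o yaml
----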