mirror of https://github.com/openshift/openshift-docs.git synced 2026-02-05 12:46:18 +01:00

Merge pull request #56099 from openshift-cherrypick-robot/cherry-pick-55902-to-enterprise-4.13

[enterprise-4.13] RHDEVDOCS-4427 - Forwarding logs to Splunk
This commit is contained in:
Brian Burt
2023-02-17 15:25:58 -05:00
committed by GitHub
2 changed files with 65 additions and 1 deletions


@@ -47,7 +47,7 @@ include::modules/cluster-logging-collector-log-forward-cloudwatch.adoc[leveloffs
[id="cluster-logging-collector-log-forward-sts-cloudwatch_{context}"]
=== Forwarding logs to Amazon CloudWatch from STS enabled clusters
For clusters with AWS Security Token Service (STS) enabled, you can create an AWS service account manually or create a credentials request by using the
ifdef::openshift-enterprise,openshift-origin[]
xref:../authentication/managing_cloud_provider_credentials/about-cloud-credential-operator.adoc[Cloud Credential Operator (CCO)]
endif::[]
@@ -180,6 +180,8 @@ include::modules/cluster-logging-troubleshooting-loki-entry-out-of-order-errors.
include::modules/cluster-logging-collector-log-forward-gcp.adoc[leveloffset=+1]
include::modules/logging-forward-splunk.adoc[leveloffset=+1]
include::modules/cluster-logging-collector-log-forward-project.adoc[leveloffset=+1]
include::modules/cluster-logging-collector-log-forward-logs-from-application-pods.adoc[leveloffset=+1]


@@ -0,0 +1,62 @@
// Module included in the following assemblies:
// cluster-logging-external.adoc
//
:_content-type: PROCEDURE
[id="logging-forward-splunk_{context}"]
= Forwarding logs to Splunk
You can forward logs to the link:https://docs.splunk.com/Documentation/Splunk/9.0.0/Data/UsetheHTTPEventCollector[Splunk HTTP Event Collector (HEC)] in addition to, or instead of, the internal default {product-title} log store.
[NOTE]
====
Using this feature with Fluentd is not supported.
====
.Prerequisites
* Red Hat OpenShift Logging Operator 5.6 or later
* A `ClusterLogging` instance with `vector` specified as the collector
* A Base64-encoded Splunk HEC token
.Procedure
. Create a secret by using your Base64-encoded Splunk HEC token.
+
[source,terminal]
----
$ oc -n openshift-logging create secret generic vector-splunk-secret --from-literal hecToken=<HEC_Token>
----
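+
If your HEC token is not already Base64 encoded, you can produce the encoded value with a standard shell utility. In this sketch, `MY-HEC-TOKEN` is a hypothetical placeholder for your actual token value:
+
[source,terminal]
----
$ echo -n "MY-HEC-TOKEN" | base64
----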
+
. Create or edit the `ClusterLogForwarder` custom resource (CR) by using the template below:
+
[source,yaml]
----
apiVersion: "logging.openshift.io/v1"
kind: "ClusterLogForwarder"
metadata:
  name: "instance" <1>
  namespace: "openshift-logging" <2>
spec:
  outputs:
  - name: splunk-receiver <3>
    secret:
      name: vector-splunk-secret <4>
    type: splunk <5>
    url: <http://your.splunk.hec.url:8088> <6>
  pipelines: <7>
  - inputRefs:
    - application
    - infrastructure
    name: <8>
    outputRefs:
    - splunk-receiver <9>
----
<1> The name of the `ClusterLogForwarder` CR must be `instance`.
<2> The namespace for the `ClusterLogForwarder` CR must be `openshift-logging`.
<3> Specify a name for the output.
<4> Specify the name of the secret that contains your HEC token.
<5> Specify the output type as `splunk`.
<6> Specify the URL (including port) of your Splunk HEC.
<7> Specify which log types to forward by using the pipeline: `application`, `infrastructure`, or `audit`.
<8> Optional: Specify a name for the pipeline.
<9> Specify the name of the output to use when forwarding logs with this pipeline.
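After the `ClusterLogForwarder` CR is created or updated, the Red Hat OpenShift Logging Operator redeploys the collector pods with the new Splunk output. As a sketch, assuming you saved the CR to a local file named `splunk-forwarder.yaml` (a hypothetical file name), you can apply it and then check the collector pods in the logging namespace:

[source,terminal]
----
$ oc apply -f splunk-forwarder.yaml
$ oc get pods -n openshift-logging
----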