:_content-type: PROCEDURE
[id="logging-getting-started_{context}"]
= Getting started with logging

This overview of the logging deployment process is provided for ease of reference. It is not a substitute for the full documentation. For new installations, Vector and LokiStack are recommended.

--
include::snippets/logging-under-construction-snip.adoc[]
--

--
include::snippets/logging-compatibility-snip.adoc[]
--

.Prerequisites
* Log store preference: Elasticsearch or LokiStack
* Collector implementation preference: Fluentd or Vector
* Credentials for your log forwarding outputs

.Procedure

--
include::snippets/logging-elastic-dep-snip.adoc[]
--

. Install the Operator for the log store that you want to use.
** For Elasticsearch, install the *OpenShift Elasticsearch Operator*.
** For LokiStack, install the *Loki Operator*.
+
[NOTE]
====
If you have installed the Loki Operator, create a `LokiStack` custom resource (CR) instance as well. An example `LokiStack` CR is sketched after this procedure.
====

. Install the *Red Hat OpenShift Logging* Operator.
. Create a `ClusterLogging` CR instance. An example `ClusterLogging` CR is sketched after this procedure.
** Select your collector implementation.
+
--
include::snippets/logging-fluentd-dep-snip.adoc[]
--

. Create a `ClusterLogForwarder` CR instance. An example `ClusterLogForwarder` CR is sketched after this procedure.
. Create a secret for the selected output pipeline. An example command is sketched after this procedure.
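
The following `LokiStack` CR is a minimal, illustrative sketch rather than a tested configuration. It assumes S3-compatible object storage, and the secret name `logging-loki-s3` and storage class `gp3-csi` are placeholders that you must replace with values from your environment.

.Example `LokiStack` CR (sketch)
[source,yaml]
----
apiVersion: loki.grafana.com/v1
kind: LokiStack
metadata:
  name: logging-loki
  namespace: openshift-logging
spec:
  size: 1x.small                # deployment sizing tier
  storage:
    secret:
      name: logging-loki-s3     # placeholder: secret that holds your object storage credentials
      type: s3
  storageClassName: gp3-csi     # placeholder: any storage class available in your cluster
  tenants:
    mode: openshift-logging
----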
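
The following `ClusterLogging` CR is a minimal sketch that selects the Vector collector and a LokiStack log store. Field names follow the logging 5.x API; if you chose Elasticsearch and Fluentd instead, the `logStore` and `collection` stanzas differ, so check the documentation for your Operator version.

.Example `ClusterLogging` CR (sketch)
[source,yaml]
----
apiVersion: logging.openshift.io/v1
kind: ClusterLogging
metadata:
  name: instance
  namespace: openshift-logging
spec:
  managementState: Managed
  logStore:
    type: lokistack
    lokistack:
      name: logging-loki   # name of the LokiStack CR created earlier
  collection:
    type: vector           # collector implementation: vector or fluentd
----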
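
The following `ClusterLogForwarder` CR is an illustrative sketch that forwards application logs to an external Elasticsearch instance. The output name `external-es`, the URL, and the secret name `es-secret` are placeholders; the secret itself is created in the final step.

.Example `ClusterLogForwarder` CR (sketch)
[source,yaml]
----
apiVersion: logging.openshift.io/v1
kind: ClusterLogForwarder
metadata:
  name: instance
  namespace: openshift-logging
spec:
  outputs:
  - name: external-es                            # placeholder output name
    type: elasticsearch
    url: https://elasticsearch.example.com:9200  # placeholder endpoint
    secret:
      name: es-secret                            # placeholder: secret that holds the output credentials
  pipelines:
  - name: application-logs
    inputRefs:
    - application                                # forward application logs only
    outputRefs:
    - external-es
----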
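
You can create the secret that the output references from local credential files, as in the following sketch. The required keys, such as `tls.crt`, `tls.key`, and `ca-bundle.crt`, or `username` and `password`, depend on what your output expects; the file paths shown are placeholders.

.Example secret creation (sketch)
[source,terminal]
----
$ oc create secret generic es-secret \
  --from-file=tls.crt=<path_to_certificate> \
  --from-file=tls.key=<path_to_key> \
  --from-file=ca-bundle.crt=<path_to_ca_bundle> \
  -n openshift-logging
----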